CN114680739B - Cleaning control method and device, intelligent equipment, mobile terminal and server


Info

Publication number
CN114680739B
CN114680739B
Authority
CN
China
Prior art keywords
area
cleaned
cleaning
determining
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011599703.2A
Other languages
Chinese (zh)
Other versions
CN114680739A (en)
Inventor
梁玉池
梁家勇
罗振宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Group Co Ltd
GD Midea Air Conditioning Equipment Co Ltd
Original Assignee
Midea Group Co Ltd
GD Midea Air Conditioning Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Group Co Ltd, GD Midea Air Conditioning Equipment Co Ltd filed Critical Midea Group Co Ltd
Priority to CN202011599703.2A priority Critical patent/CN114680739B/en
Publication of CN114680739A publication Critical patent/CN114680739A/en
Application granted granted Critical
Publication of CN114680739B publication Critical patent/CN114680739B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4008: Arrangements of switches, indicators or the like
    • A47L11/4061: Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection
    • A47L2201/06: Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The invention provides a cleaning control method and device, an intelligent device, a mobile terminal and a server. The cleaning control method comprises the following steps: determining, based on the movement track of a mobile terminal, an area to be cleaned that defines the cleaning range; and controlling a cleaning device to perform a cleaning task on that area. Because the area to be cleaned is determined from the movement track of the mobile terminal, the invention realizes "what you see is what you get" fixed-point cleaning, which effectively improves cleaning efficiency and reduces the power consumed by cleaning.

Description

Cleaning control method and device, intelligent equipment, mobile terminal and server
Technical Field
The invention relates to the technical field of intelligent processing, in particular to a cleaning control method, a cleaning control device, intelligent equipment, a mobile terminal and a server.
Background
Sweeping robots are in common use, and a user can set fixed cleaning areas (such as the living room or a bedroom) to be swept. For example, when a set area (such as the living room) needs cleaning, the sweeping robot can be controlled to perform cleaning work on that entire set area.
However, it is often the case that not every position in the set area needs cleaning, and sweeping the whole set area regardless is wasteful: it consumes both the sweeping robot's battery power and extra sweeping time.
Disclosure of Invention
Aiming at the problems in the prior art, embodiments of the present invention provide a cleaning control method and device, an intelligent device, a mobile terminal and a server, to solve the prior art's inability to perform accurate, targeted cleaning.
To solve the problems in the prior art, the embodiments of the present invention provide the following technical solutions:
in a first aspect, an embodiment of the present invention provides a cleaning control method, including:
determining a region to be cleaned for defining a cleaning range based on a moving track of the mobile terminal;
and controlling the cleaning equipment to execute cleaning tasks on the area to be cleaned based on the area to be cleaned.
Further, the determining, based on the movement track of the mobile terminal, the area to be cleaned for defining the cleaning range includes:
and determining the area to be cleaned for defining the cleaning range according to a moving track formed by moving the mobile terminal around the area to be cleaned.
Further, determining the area to be cleaned for defining the cleaning range according to a movement track formed by the movement of the mobile terminal around the area to be cleaned, including:
determining coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned;
and determining a region to be cleaned for defining a cleaning range according to the coordinates of the moving track.
Further, determining the area to be cleaned for defining the cleaning range according to the coordinates of the moving track, including:
and determining whether the moving track is a closed track according to the coordinates of the moving track, and if so, determining the closed track as an area to be cleaned.
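As an illustrative sketch (not part of the patent), the closed-track test can be implemented by checking whether the start and end points of the sampled track nearly coincide; the (x, y) coordinate format and the 0.3 m tolerance are assumptions introduced here:

```python
import math

def is_closed_track(points, tolerance=0.3):
    """Treat the sampled movement track as closed when its endpoints
    nearly coincide. tolerance is the largest start-to-end gap (in
    metres) still counted as closed -- the 0.3 m value is an assumption."""
    if len(points) < 3:
        return False
    (x0, y0), (xn, yn) = points[0], points[-1]
    return math.hypot(xn - x0, yn - y0) <= tolerance
```

A real system would also have to tolerate positioning noise along the whole track, not just at the endpoints.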
Further, determining the area to be cleaned for defining the cleaning range according to the coordinates of the moving track, including:
and determining whether the moving track is a closed track according to the coordinates of the moving track; if not, determining that the moving track is a non-closed track, and determining, according to the coordinates of the non-closed track, whether the non-closed track can form a closed track together with a fixed barrier in the room; if so, determining the composed closed track as the area to be cleaned.
Further, determining the composed closed track as the area to be cleaned includes any one of the following modes:
when it is determined that the non-closed track and the fixed barriers in the room form a plurality of closed tracks, displaying the candidate closed tracks for the user to select, and determining the closed track selected by the user as the area to be cleaned;
when it is determined that the non-closed track and the fixed barriers in the room form a plurality of closed tracks, determining the closed track with the smallest cleaning area as the area to be cleaned;
when it is determined that the non-closed track and the fixed barriers in the room form a plurality of closed tracks, determining, according to a pre-calibrated closed-track number, the closed track corresponding to that number as the area to be cleaned.
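The second selection mode presupposes computing each candidate track's enclosed area. A minimal sketch using the shoelace formula, with each closed track assumed to be given as a list of (x, y) vertices:

```python
def polygon_area(vertices):
    """Area enclosed by a closed track, via the shoelace formula."""
    n = len(vertices)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the loop
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0

def pick_smallest_track(closed_tracks):
    """Among several candidate closed tracks, return the one enclosing
    the smallest cleaning area (the second selection mode above)."""
    return min(closed_tracks, key=polygon_area)
```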
Further, determining coordinates of a movement track formed by moving the mobile terminal around the area to be cleaned includes:
and determining the coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned based on the ultra wideband UWB indoor positioning system.
Further, the determining, based on the movement track of the mobile terminal, the area to be cleaned for defining the cleaning range includes:
and determining the area to be cleaned for defining the cleaning range according to the delineating shape formed when the mobile terminal, held stationary in place, is pointed at the area to be cleaned to delineate its range, together with the angle between the mobile terminal and the cleaning surface and the distance between the mobile terminal and the cleaning surface.
Further, determining the area to be cleaned for defining the cleaning range according to the delineating shape formed by the stationary mobile terminal pointing at the area to be cleaned, the angle between the mobile terminal and the cleaning surface, and the distance between the mobile terminal and the cleaning surface specifically includes:
selecting a preset number of first coordinate points (x1, y1) on the delineating shape formed by pointing the stationary mobile terminal at the area to be cleaned;
determining, according to a first relationship model, a corresponding preset number of second coordinate points (x2, y2) on the cleaning surface;
determining the area to be cleaned for defining the cleaning range according to the preset number of second coordinate points;
wherein the first relationship model is:
x2 = x1 - L × cos θ, y2 = y1 - L × sin θ
where L represents the distance between the first coordinate point (x1, y1) and the second coordinate point (x2, y2); H1 represents the distance between the mobile terminal and the cleaning surface; θ1 represents the angle between the mobile terminal and the cleaning surface; and θ represents the angle between the line segment L and the x-axis of the ground coordinate system.
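The first relationship model can be sketched directly. The patent supplies only the x2/y2 equations; the relation L = H1 / tan(θ1) used below to obtain L from the terminal's height and tilt is an assumed geometric reading, not stated in the document:

```python
import math

def project_to_surface(x1, y1, h1, theta1, theta):
    """Map a first coordinate point (x1, y1) on the delineating shape
    to a second coordinate point (x2, y2) on the cleaning surface.

    h1: distance from the mobile terminal to the cleaning surface.
    theta1: angle (radians) between the terminal and the surface.
    theta: angle between segment L and the ground-frame x-axis.
    L = h1 / tan(theta1) is an ASSUMED reading of the geometry; the
    patent itself defines only the two subtraction equations.
    """
    L = h1 / math.tan(theta1)
    return x1 - L * math.cos(theta), y1 - L * math.sin(theta)
```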
Further, the cleaning control method further includes:
collecting an image of the area to be cleaned;
correcting the area to be cleaned based on the image;
accordingly, based on the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
and controlling the cleaning equipment to execute cleaning tasks on the modified area to be cleaned based on the modified area to be cleaned.
Further, correcting the area to be cleaned based on the image includes:
receiving a marking area obtained by marking operation on the image by a user;
and correcting the area to be cleaned according to the marking area of the user to obtain a corrected area to be cleaned.
Further, based on the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
identifying the ground type and/or dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type identification for the area to be cleaned includes:
collecting sound signals sent by cleaning equipment in the area to be cleaned;
determining the material type of the area to be cleaned according to the audio characteristics in the sound signals;
accordingly, according to the ground type recognition result, the cleaning device is controlled to carry out targeted cleaning treatment on the area to be cleaned, and the method comprises the following steps:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
Further, the method for identifying the dirt type of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image includes any one of the following modes:
inputting the image into a dirt type recognition model, and acquiring a dirt type recognition result of the area to be cleaned; the dirt type recognition model is obtained after training by adopting an image sample and a dirt type recognition result label of the image sample;
Comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
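The database-comparison mode can be sketched with a toy image descriptor; the grey-level histogram and L1 matching below are assumptions standing in for whatever image matching the real system uses:

```python
def grey_histogram(pixels, bins=8):
    """Normalised grey-level histogram of a flat 0-255 pixel list."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

def match_dirt_type(image_pixels, reference_db):
    """Return the dirt type whose reference image's histogram is
    closest (L1 distance) to the captured image's histogram."""
    query = grey_histogram(image_pixels)
    best_type, best_dist = None, float("inf")
    for dirt_type, ref_pixels in reference_db.items():
        ref = grey_histogram(ref_pixels)
        dist = sum(abs(a - b) for a, b in zip(query, ref))
        if dist < best_dist:
            best_type, best_dist = dirt_type, dist
    return best_type
```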
In a second aspect, an embodiment of the present invention further provides a cleaning control device, including:
the determining module is used for determining a to-be-cleaned area for defining a cleaning range based on the moving track of the mobile terminal;
and the control module is used for controlling the cleaning equipment to execute the cleaning task on the area to be cleaned based on the area to be cleaned.
In a third aspect, an embodiment of the present invention further provides an intelligent device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the cleaning control method according to the first aspect when executing the program.
In a fourth aspect, an embodiment of the present invention further provides a sweeping robot device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the sweeping control method according to the first aspect when executing the program.
In a fifth aspect, an embodiment of the present invention further provides a mobile terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the cleaning control method according to the first aspect when executing the program.
In a sixth aspect, an embodiment of the present invention provides a server, including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the cleaning control method according to the first aspect when the program is executed.
In a seventh aspect, embodiments of the present invention provide a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the cleaning control method according to the first aspect.
in an eighth aspect, an embodiment of the present invention further provides a cleaning control method, including:
receiving a cleaning instruction triggered by the user in real time through voice for arranging a random cleaning task; wherein a random cleaning task refers to a task that does not belong to any preset cleaning task;
Determining a region to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining an area to be cleaned for defining a cleaning range according to the cleaning instruction comprises:
determining a reference object and relative position area information of the reference object contained in the cleaning instruction, and determining a to-be-cleaned area for defining a cleaning range according to the position information of the reference object and the relative position area information of the reference object;
and/or,
determining absolute position coordinate information contained in the cleaning instruction, and determining a region to be cleaned for defining a cleaning range according to the absolute position coordinate information;
and/or,
and determining a reference object and relative coordinate information of the reference object contained in the cleaning instruction, and determining a region to be cleaned for defining a cleaning range according to the reference object and the relative coordinate information of the reference object.
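The first mode (a reference object plus a relative position) can be sketched as simple coordinate arithmetic; the square region and its 0.5 m half-size are invented here for illustration:

```python
def area_from_reference(ref_pos, offset, half_size=0.5):
    """Centre of the area = the reference object's position plus the
    relative offset (e.g. 'one metre in front of the sofa'); returned
    as an axis-aligned box (x_min, y_min, x_max, y_max). The square
    shape and 0.5 m half-size are illustrative assumptions."""
    cx, cy = ref_pos[0] + offset[0], ref_pos[1] + offset[1]
    return (cx - half_size, cy - half_size, cx + half_size, cy + half_size)
```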
Further, according to the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
identifying the ground type and/or dirt type of the area to be cleaned;
And controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type identification for the area to be cleaned includes:
collecting sound signals sent by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio characteristics in the sound signals.
Further, according to the ground type recognition result, the cleaning device is controlled to perform targeted cleaning treatment on the area to be cleaned, including:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
Further, the method for identifying the dirt type of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image information, including:
inputting the image into a dirt type recognition model, and acquiring a dirt type recognition result of the area to be cleaned;
the dirt type recognition model is obtained after training by adopting an image sample and a dirt type recognition result label of the image sample.
Further, identifying the dirt type of the area to be cleaned according to the image, including:
comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
In a ninth aspect, an embodiment of the present invention further provides a cleaning control method, including:
receiving a cleaning instruction triggered by the user in real time through limb actions for arranging a random cleaning task; wherein a random cleaning task refers to a task that does not belong to any preset cleaning task;
determining a region to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining an area to be cleaned for defining a cleaning range according to the cleaning instruction comprises:
determining limb pointing and limb actions sent by a user to an area to be cleaned;
determining a coordinate area range formed on the cleaning surface according to the limb direction and the limb action;
and determining a region to be cleaned for defining a cleaning range according to the coordinate region range.
Further, according to the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
identifying the ground type and/or dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type identification for the area to be cleaned includes:
collecting sound signals sent by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio characteristics in the sound signals.
Further, according to the ground type recognition result, the cleaning device is controlled to perform targeted cleaning treatment on the area to be cleaned, including:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
Further, the method for identifying the dirt type of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image information, including:
Inputting the image into a dirt type recognition model, and acquiring a dirt type recognition result of the area to be cleaned;
the dirt type recognition model is obtained after training by adopting an image sample and a dirt type recognition result label of the image sample.
Further, identifying the dirt type of the area to be cleaned according to the image, including:
comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
According to the above technical solutions, in the cleaning control method provided by the embodiments of the present invention, an area to be cleaned that defines the cleaning range is determined based on the movement track of a mobile terminal, and the cleaning device is then controlled to perform a cleaning task on that area. The area to be cleaned can thus be determined from the track formed by the mobile terminal the user carries, realizing "what you see is what you get" fixed-point cleaning, which effectively improves cleaning efficiency and reduces the power consumed by cleaning.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a cleaning control method according to an embodiment of the present invention;
FIG. 2a is a schematic diagram of a living room according to an embodiment of the present invention;
FIG. 2b is a second schematic diagram of designating an area to be cleaned in a living room according to an embodiment of the present invention;
FIG. 2c is a third schematic diagram of designating an area to be cleaned in a living room according to an embodiment of the present invention;
FIG. 2d is a fourth schematic diagram of designating an area to be cleaned in a living room according to an embodiment of the present invention;
FIG. 2e is a fifth schematic diagram of designating an area to be cleaned in a living room according to an embodiment of the present invention;
FIG. 2f is a sixth schematic diagram of designating an area to be cleaned in a living room according to an embodiment of the present invention;
FIG. 3 is a schematic view of a UWB positioning system architecture according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating a specific implementation of a cleaning control method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a user drawing an area to be cleaned without moving according to an embodiment of the present invention;
FIG. 6 is a top plan view of the floor of FIG. 5;
FIG. 7 is a schematic diagram of determining the area to be cleaned from the imaging field of view of a camera after photographing, according to an embodiment of the present invention;
FIGS. 8 and 9 are schematic diagrams of determining the area to be cleaned by marking it on a photographed image after photographing, according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of a cleaning control device according to an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of an intelligent device according to an embodiment of the present invention;
FIG. 12 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 13 is a schematic structural diagram of a server according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that, in the embodiments of the present invention, the term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone. The character "/" generally indicates an "or" relationship between the associated objects. Furthermore, the term "plurality" in the embodiments of the present invention means two or more, and similar quantifiers are to be understood likewise.
Fig. 1 shows a flowchart of a cleaning control method according to an embodiment of the present invention, and referring to fig. 1, the cleaning control method according to the embodiment of the present invention includes:
Step 101: based on the movement locus of the mobile terminal, a region to be cleaned for defining a cleaning range is determined.
Step 102: and controlling the cleaning equipment to execute cleaning tasks on the area to be cleaned based on the area to be cleaned.
In this embodiment, the execution body of the cleaning control method may be a cleaning control device for controlling the cleaning equipment, the cleaning equipment itself (such as a sweeping robot), a mobile terminal (such as a mobile phone, tablet or notebook computer), control software installed on the mobile terminal, or a server; this embodiment places no limitation on the execution body.
In this embodiment, it should be noted that when a range needing cleaning arises, an area to be cleaned that clearly defines that range is determined first, and then, based on that area, the cleaning device is controlled to execute a cleaning task on it, completing the targeted cleaning treatment. The key innovation is that the area to be cleaned is determined based on the movement track of the mobile terminal. This approach is free and flexible: the area the user can visually and intuitively see needs cleaning is accurately converted into the area the cleaning device must clean, realizing "what you see is what you get" fixed-point cleaning, which effectively improves cleaning efficiency while reducing the power consumed by cleaning.
In this embodiment, the area to be cleaned can be determined from the movement track of the mobile terminal in at least two ways. In the first, the user carries the mobile terminal and actually walks around the area to be cleaned to form a movement track, and the area to be cleaned is then determined from that track. In the second, the user does not need to walk around the area at all: staying in place, the user simply points the mobile terminal at the area to be cleaned and traces out a delineating shape, from which the area to be cleaned defining the cleaning range is determined.
In this embodiment, the first implementation uses a mobile terminal with a positioning function: the terminal is carried or held while the user moves around the area to be cleaned. In general a closed movement track should be formed, from which an area to be cleaned that accurately defines the cleaning range is derived. In special cases a closed track may not be formed; for example, a semi-closed movement track may be formed instead. This suits areas such as wall corners and sofa edges, where the semi-closed track can be completed into a closed one by the wall, the sofa edge and the like. It can be appreciated that this implementation has the advantages of easy operation and a relatively accurate determination of the area to be cleaned.
Therefore, the present embodiment provides a processing manner capable of flexibly determining the area to be cleaned, that is, determining the area to be cleaned that defines the cleaning range based on the movement track of the mobile terminal. The user can carry or hold the mobile terminal and move accurately around the area to be cleaned, thereby forming an area to be cleaned that accurately defines the cleaning range.
In this embodiment, the area to be cleaned may be an area surrounded by a moving track with an arbitrary shape. For example, the movement track may be circular, elliptical, triangular, rectangular, trapezoidal, diamond-shaped, irregular polygonal, free curve, etc., which is not limited in this embodiment.
It should be noted that, generally, the area to be cleaned is an area surrounded by a closed track; as shown in fig. 2a, the area to be cleaned is surrounded by a closed track in the living room. In special cases, such as wall corners or sofa corners, the area to be cleaned can also be an area surrounded by a semi-closed track.
It can be understood that, for the solution of this embodiment, the mobile terminal needs to carry positioning software or have a positioning function, so that when the mobile terminal moves, its movement track can be determined by that positioning capability, and the area to be cleaned that defines the cleaning range is then determined from the track. In one implementation, the region enclosed by the movement track may be used directly as the area to be cleaned. In other implementations, the region corresponding to the movement track may be slightly enlarged, for example by 0.1 or 0.2 times, and the enlarged region then used as the area to be cleaned. The benefit of slightly enlarging the region corresponding to the movement track is that the cleaning effect is ensured, avoiding missed spots or a poor cleaning effect at the edges of the area to be cleaned.
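The text does not give a formula for the "slight enlargement". As a non-authoritative sketch, one simple reading is to scale the track's vertices outward about their centroid; the `enlarge_region` helper and the 10% default below are illustrative assumptions, not part of the patent:

```python
def enlarge_region(track, factor=0.1):
    """Scale a closed track outward about its vertex centroid by `factor`
    (e.g. 0.1 enlarges the region by roughly 10%) so that edge spots of the
    area to be cleaned are not missed. `track` is a list of (x, y) vertices."""
    cx = sum(x for x, _ in track) / len(track)
    cy = sum(y for _, y in track) / len(track)
    scale = 1.0 + factor
    return [(cx + (x - cx) * scale, cy + (y - cy) * scale) for x, y in track]

# Example: a 2 m x 2 m square track grown by 10% about its centre (1, 1).
square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
bigger = enlarge_region(square, 0.1)
```

A production implementation would more likely offset the polygon edges outward by a fixed margin; centroid scaling is just the shortest way to show the idea.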
In this embodiment, as for the second implementation manner of determining the area to be cleaned that defines the cleaning range based on the mobile terminal, the area to be cleaned may instead be determined from the delineated shape formed by pointing the mobile terminal at the area to be cleaned, together with the angle between the mobile terminal and the cleaning surface and the distance between the mobile terminal and the cleaning surface.
Therefore, according to the cleaning control method provided by this embodiment, the area to be cleaned that defines the cleaning range is determined from the movement track of the mobile terminal, and the cleaning device is then controlled to execute a cleaning task on that area, so that it can be cleaned accurately and in a targeted manner. By adopting this method, the area to be cleaned can be determined in real time by means of the mobile terminal before cleaning, which can effectively improve cleaning efficiency while reducing the electric power consumed by cleaning.
It can be appreciated that determining the area to be cleaned from the movement track of the mobile terminal is convenient and flexible, and enables fixed-point cleaning on the user's demand, so that cleaning efficiency can be effectively improved.
In this embodiment, it should be noted that "determining the area to be cleaned for defining the cleaning range" here differs from the fixed cleaning areas set in the prior art. For example, an existing cleaning robot generally presets several fixed cleaning areas, such as the living room, the master bedroom and the secondary bedroom, so that the user can subsequently select one or more of them for cleaning, and the robot then cleans according to the one or more fixed areas the user selected.
In contrast, "determining the area to be cleaned for defining the cleaning range based on the movement track of the mobile terminal" in the present application means that the area to be cleaned is generated in real time according to the actual range that needs cleaning, utilizing the fact that the mobile terminal can position itself and is easy to move.
For example, if a user only needs to clean a small area in front of the living-room sofa, an area to be cleaned defining that small area can be determined from the movement track of the mobile terminal, and cleaning is then performed on that area alone (i.e., the small area in front of the sofa) without cleaning the whole living room. This improves cleaning efficiency, shortens cleaning time and reduces cleaning energy consumption while still meeting the requirement.
It follows that the present application's "determining the area to be cleaned for defining the cleaning range" differs from the fixed cleaning areas set in the prior art mainly in that: (1) the area to be cleaned in the application is generated in real time; (2) the area to be cleaned is targeted, generated for the area that currently needs cleaning; (3) the area to be cleaned may lie within one existing fixed cleaning area or span several of them, and it is unrelated to whether fixed cleaning areas are preset at all, so long as the user generates an area to be cleaned matching the required cleaning range from the movement track of the mobile terminal before use. It can be understood that these three differences do not exist independently but complement each other: the area to be cleaned in the application is delimited according to what currently needs cleaning, using neither the room nor the functional room as the division standard but the range that currently needs cleaning. The user can therefore determine the area to be cleaned accurately from the current range, the movement track of the mobile terminal clearly defines the cleaning range corresponding to the user's requirement, and cleaning can then be performed on that area, completing the fixed-point cleaning task in a targeted manner, improving cleaning efficiency, reducing energy consumption and shortening cleaning time while meeting the current cleaning requirement.
For example, suppose the user needs to clean the transition area between the master bedroom and the living room, i.e. an area that includes a small portion of the bedroom and a small portion of the living room. With an existing sweeping robot, one operating mode is whole-house cleaning (covering every room), and another is cleaning the living room and the master bedroom in full. Either way the cleaning efficiency is low, since in practice only the transition area needs cleaning. According to the scheme of this embodiment, the area to be cleaned defining the transition area can be determined from the movement track of the mobile terminal: the user can walk around the transition area holding the mobile phone, the area to be cleaned is determined from the phone's movement track, and the cleaning robot is then controlled to clean according to the determined area. The transition area between the master bedroom and the living room is thus cleaned in a targeted manner, effectively improving cleaning efficiency, reducing energy consumption and shortening cleaning time.
For another example, suppose the user needs to clean the right-hand area of the balcony. With an existing sweeping robot, one mode is whole-house cleaning (covering every room) and another is cleaning the balcony as a designated area, i.e. cleaning the entire balcony functional room. Either way the cleaning efficiency is low, since in practice only the right-hand area of the balcony needs cleaning. According to the scheme of this embodiment, the area to be cleaned defining the right-hand area of the balcony can be determined: the user can walk around that area holding the mobile phone, the area to be cleaned is determined from the phone's movement track, and the cleaning robot is then controlled to clean according to the determined area. The right-hand area of the balcony is thus cleaned in a targeted manner, effectively improving cleaning efficiency, reducing energy consumption and shortening cleaning time. For the sweeping robot in particular, this effectively reduces its power consumption and improves its battery endurance.
Therefore, an area of arbitrary shape can be designated in real time and precisely cleaned in a targeted manner, realizing point-and-clean control in which what the user sees is what gets cleaned, thereby improving cleaning efficiency.
In this embodiment, it should be noted that the cleaning control method provided in this embodiment may be used for common cleaning devices, such as a cleaning robot, a floor cleaner, a table cleaner, a bed-sheet cleaner, and the like. It may also be used for other intelligent cleaning devices, such as a wall-cleaning robot, an indoor ceiling-cleaning robot, an outdoor roof-cleaning robot, and the like; this embodiment places no limitation here. In addition, the method may be widely used in households, factories, airports, outdoors, etc., and this embodiment is not limited thereto.
Based on the foregoing embodiment, in this embodiment, the determining, based on the movement track of the mobile terminal, the area to be cleaned for defining the cleaning range includes:
and determining the area to be cleaned for defining the cleaning range according to a moving track formed by moving the mobile terminal around the area to be cleaned.
In this embodiment, the area to be cleaned only needs to be determined in real time before cleaning, from the movement track formed by carrying the mobile terminal around the area; the cleaning device is then controlled to perform targeted, accurate cleaning of that area, so cleaning efficiency can be effectively improved while the electric power consumed by cleaning is reduced.
Therefore, by moving the mobile terminal around the area to be cleaned, the area that the user can see with the naked eye can be accurately converted into the area that the cleaning device needs to clean, realizing on-demand fixed-point cleaning. This greatly improves the flexibility of determining the area to be cleaned, and the determination is both real-time and accurate.
In this embodiment, the mobile terminal carries positioning software or has a positioning function, so that when it moves, its movement track can be determined through that positioning capability, and the area to be cleaned for defining the cleaning range is then determined from the track. For example, the coordinates of the movement track formed by the mobile terminal actually moving around the area to be cleaned may be determined based on an ultra-wideband (UWB, Ultra Wide Band) positioning system, and the area to be cleaned is then determined from those coordinates.
When determining a moving track based on an ultra wideband UWB positioning system, firstly, the UWB positioning system as shown in fig. 3 needs to be established, and the process of establishing the UWB positioning system is as follows:
First, a reference tag needs to be provided for calibrating the reference position; at least one reference tag is required. Positioning base stations are also required; each base station has UWB transceiving functions and is used for receiving positioning-tag information and forwarding it to the mobile phone or the sweeping robot. At least three positioning base stations are needed. In addition, a switch is required for aggregating the connections of the positioning base stations; the switch is connected to a positioning-system server, which is required for processing the positioning data.
It can be appreciated that the robot can be a WiFi intelligent robot integrated with a UWB positioning transceiver, and the mobile phone can be a smartphone integrated with a UWB positioning transceiver.
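As a rough illustration of how a system with at least three base stations can yield the tag coordinates used below, the following sketch linearizes the range (circle) equations against the first anchor and solves the resulting 2x2 least-squares system. The anchor layout and function names are assumptions for illustration, not the patent's implementation:

```python
import math

def trilaterate(anchors, dists):
    """Estimate a 2D tag position from >=3 base-station positions and
    measured ranges. Subtracting the first circle equation from the others
    gives linear equations A @ [x, y] = b, solved via the normal equations."""
    (x1, y1), d1 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append(d1 ** 2 - di ** 2 + xi ** 2 - x1 ** 2 + yi ** 2 - y1 ** 2)
    # Normal equations (A^T A) p = A^T b, expanded for the 2x2 case.
    s_aa = sum(a * a for a, _ in A)
    s_ab = sum(a * c for a, c in A)
    s_bb = sum(c * c for _, c in A)
    t_a = sum(a * v for (a, _), v in zip(A, b))
    t_b = sum(c * v for (_, c), v in zip(A, b))
    det = s_aa * s_bb - s_ab * s_ab  # nonzero for non-collinear anchors
    return ((s_bb * t_a - s_ab * t_b) / det,
            (s_aa * t_b - s_ab * t_a) / det)

# Example with an assumed base-station layout and a tag at (1, 2).
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
ranges = [math.dist((1.0, 2.0), a) for a in anchors]
est = trilaterate(anchors, ranges)
```

Real UWB ranges are noisy, so a deployed system would filter the raw estimates; this sketch only shows why three non-collinear base stations suffice.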
The specific implementation process is shown in fig. 4, and comprises the following steps:
A. a UWB positioning system, comprising a server, a switch, positioning base stations and a reference tag, is installed in the home so that it covers every position of the house;
B. the mobile phone enters the sweeping-area mode of the sweeping robot through the APP;
C. a closed graph is drawn around the area to be cleaned according to the mobile phone's positioning track;
D. the coordinate information of the drawn graph is transmitted directly to the sweeping robot through UWB;
E. the sweeping robot interrupts its current task and stores the current state, then automatically moves to the area indicated by the graph coordinates to perform the sweeping operation;
F. after the graph area has been cleaned, the robot returns to the state before the interruption (cleaning or standby charging).
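Steps D–F above can be sketched as a minimal save/clean/restore state machine. The class, state names and stubbed navigation logic are illustrative assumptions only:

```python
class SweepingRobot:
    """Minimal sketch of steps D-F: on receiving a drawn region the robot
    saves its current state, performs spot cleaning, then restores the
    saved state (cleaning or standby charging)."""

    def __init__(self):
        self.state = "standby_charging"   # could also be "cleaning"
        self.saved_state = None

    def on_region_received(self, region_coords):
        # E. interrupt and store the current state, then clean the region.
        self.saved_state = self.state
        self.state = "spot_cleaning"
        self.clean(region_coords)
        # F. return to the pre-interruption state.
        self.state, self.saved_state = self.saved_state, None

    def clean(self, region_coords):
        pass  # placeholder: navigate to the region and cover it

robot = SweepingRobot()
robot.on_region_received([(0, 0), (1, 0), (1, 1), (0, 1)])
```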
In this embodiment, it can be understood that a "clean a real-time designated area" setting (that is, the cleaning control method provided in this embodiment) may be added to the sweeping-robot settings of the mobile phone APP. After the user confirms the start of area drawing, the mobile phone is moved so that its track outlines the area to be cleaned; once a closed shape is drawn, the APP prompts the user to confirm the cleaning area. After the user confirms, the coordinates of the cleaning area are calculated by the UWB server and an instruction is sent to the sweeping robot. The robot stores its current position and state, moves to the cleaning area and performs the cleaning operation according to the instruction; after the whole designated area has been covered, the robot feeds information back to the APP to prompt that the cleaning task is complete, and returns to the position and state it was in before cleaning. It will be appreciated that this implementation has the advantage of easy operation and a relatively accurate determination of the area to be cleaned.
Based on the foregoing embodiment, in this embodiment, the determining, according to a movement track formed by moving the mobile terminal around the area to be cleaned, the area to be cleaned for defining the cleaning range includes:
determining coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned;
and determining a region to be cleaned for defining a cleaning range according to the coordinates of the moving track.
In this embodiment, the coordinates of a movement track formed by moving the mobile terminal around the area to be cleaned may be determined based on an indoor positioning system or other positioning methods, and then the area to be cleaned for defining the cleaning range is determined according to the coordinates of the movement track.
Based on the content of the above embodiment, in the present embodiment, determining the area to be cleaned for defining the cleaning range according to the coordinates of the movement locus includes:
and determining whether the moving track is a closed track according to the coordinates of the moving track, and if so, determining the closed track as an area to be cleaned.
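A minimal sketch of the closed-track test described above: the track is treated as closed when its first and last sampled coordinates coincide within a positioning tolerance. The 0.3 m tolerance and the function name are assumptions, not values from the patent:

```python
import math

def is_closed(track, tol=0.3):
    """Return True when the first and last points of the sampled movement
    track coincide within `tol` (metres), i.e. the coordinates form a
    closed curve connected end to end. `tol` absorbs positioning noise."""
    (x0, y0), (xn, yn) = track[0], track[-1]
    return math.hypot(xn - x0, yn - y0) <= tol

# A track that returns near its start vs. one that stops short.
loop = [(0, 0), (2, 0), (2, 2), (0, 2), (0.1, 0.05)]
open_path = [(0, 0), (2, 0), (2, 2)]
```

If the test passes, the enclosed region becomes the area to be cleaned; if not, the non-closed-track handling of the following paragraphs applies.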
In this embodiment, it should be noted that, when the moving track is a closed track, the closed track may be directly determined as the area to be cleaned, so as to control the sweeping robot to sweep according to the closed track.
Based on the foregoing embodiment, in this embodiment, determining the area to be cleaned for defining the cleaning range according to a movement track formed by moving the mobile terminal around the area to be cleaned includes any one of the following modes:
determining that a moving track formed by moving the mobile terminal around an area to be cleaned is a closed track, and determining the closed track as the area to be cleaned;
determining whether a moving track formed by moving the mobile terminal around the area to be cleaned is a non-closed track, if so, determining whether the non-closed track can form a closed track with a fixed barrier in a room, and if so, determining the formed closed track as the area to be cleaned.
In this embodiment, the coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned may be determined based on the indoor positioning system, then, according to the coordinates of the moving track, whether the moving track is a closed track is determined, if not, the moving track is determined to be a non-closed track, and if so, whether the non-closed track can form a closed track with a fixed barrier in the room is determined according to the coordinates corresponding to the non-closed track, and if so, the formed closed track is determined to be the area to be cleaned.
In the present embodiment, when determining an area to be cleaned using a movement locus formed by movement of a mobile terminal around the area to be cleaned, there are at least two cases: A. the moving track formed by the mobile terminal moving around the area to be cleaned is a closed track (as shown in fig. 2 a). B. The moving track formed by the mobile terminal moving around the area to be cleaned is a non-closed track (as shown in fig. 2b, 2c, 2d, 2e and 2 f).
For case A, as shown in fig. 2a, since the movement track formed by the mobile terminal moving around the area to be cleaned in the living room is a closed track, the area enclosed by the closed track may be directly determined as the area Q1 to be cleaned (shown by hatching).
For case B, as shown in fig. 2b, since the movement tracks (L1 and L2, represented by dotted lines for convenience of presentation) formed by the mobile terminal moving in the living room are non-closed tracks, it should be determined whether the non-closed tracks L1 and L2 can form closed tracks with a fixed barrier in the room; if so, the formed closed tracks are determined as areas to be cleaned. Referring to fig. 2b, the non-closed tracks L1 and L2 form closed tracks precisely with the A-face and B-face of the sofa, so the areas enclosed by those closed tracks can be determined as the areas Q2 to be cleaned (shown by shading). In this embodiment, the coordinates of the fixed barriers in the room are assumed known and pre-stored. Here, determining whether a non-closed track forms a closed track with a fixed barrier means determining, from the coordinates of the non-closed track and the coordinates of the barrier, whether a closed curve connected end to end can be formed; if so, the non-closed track and the barrier are determined to form a closed track.
Similarly, as shown in fig. 2c, since the movement track (L3, represented by a dotted line for convenience of presentation) formed by the mobile terminal moving in the living room is a non-closed track, it should be determined whether the non-closed track L3 forms a closed track with a fixed barrier in the room; if so, the formed closed track is determined as the area to be cleaned. Referring to fig. 2c, the non-closed track L3 forms a closed track precisely with the C-face of the sofa and the D-face of bedroom 2, so the area enclosed by the closed track can be determined as the area Q3 to be cleaned (shown by hatching).
In contrast, as shown in fig. 2d, the movement track (L4, represented by a dotted line for convenience of presentation) formed by the mobile terminal moving in the living room is also a non-closed track, so it should be determined whether the non-closed track L4 forms a closed track with a fixed barrier in the room. Referring to fig. 2d, the non-closed track L4 cannot form a closed track with any fixed barrier, so L4 is an invalid movement track and cannot form a valid area to be cleaned. At this time, the user can be prompted that the current movement track neither forms a closed curve by itself nor forms one with a fixed barrier in the room, reminding the user to operate again.
Based on the foregoing embodiment, in this embodiment, it is determined that a movement track formed by moving a mobile terminal around an area to be cleaned is a closed track, and determining the closed track as the area to be cleaned includes:
based on the indoor positioning system, determining coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned;
and determining whether the moving track is a closed track according to the coordinates of the moving track, and if so, determining the closed track as an area to be cleaned.
In this embodiment, it should be noted that, when determining whether a movement track formed by moving the mobile terminal around the area to be cleaned is a closed track, the coordinates of the movement track formed by moving the mobile terminal around the area to be cleaned may be determined based on the indoor positioning system, and then, according to the coordinates of the movement track, whether the movement track is a closed track is determined, if so, the closed track is determined as the area to be cleaned. Generally, if the coordinates of the moving track are determined to form a closed curve end to end, the moving track can be determined to be a closed track.
Based on the content of the above embodiment, in this embodiment, based on the area to be cleaned, controlling the cleaning apparatus to perform a cleaning task on the area to be cleaned includes:
and controlling the cleaning equipment to clean the area in the closed track based on the coordinates corresponding to the closed track.
In this embodiment, when the area to be cleaned is determined, the cleaning device may be controlled to clean the area in the closed track according to the coordinates corresponding to the closed track, so as to complete the fixed-point cleaning operation. Therefore, the control mode of the embodiment is simple and convenient.
Based on the content of the above embodiments, in the present embodiment, the closed-type trajectory of the composition is determined as the area to be cleaned, including any one of the following modes:
(1) determining that a plurality of closed tracks consisting of the non-closed track and the fixed barriers in the room are arranged, displaying the closed tracks consisting of the closed tracks for a user to select, and determining the closed track selected by the user as an area to be cleaned;
(2) and determining that a plurality of closed tracks consisting of the non-closed tracks and fixed barriers in the room exist, and determining the closed track with smaller cleaning area as the area to be cleaned.
In this embodiment, as shown in fig. 2e, since the movement track (L5, represented by a dashed line for convenience of presentation) formed by the mobile terminal moving in the living room is a non-closed track, it should be determined whether the non-closed track L5 can form a closed track with a fixed barrier in the room; if so, the formed closed track is determined as the area to be cleaned. Referring to fig. 2e, the non-closed track L5 can form a closed track with the E-face of bedroom 1, the D-face of bedroom 2, and the door and sofa, so the area enclosed by that closed track can be determined as the area Q4 to be cleaned (shown by hatching). Meanwhile, the non-closed track L5 can also form a closed track with the F-face of bedroom 1, the G-face of bedroom 2, the H-face of the kitchen and the I-face of the bathroom, so the area enclosed by that closed track can be determined as the area Q5 to be cleaned (also shown by hatching). Therefore, when the non-closed track and the fixed barriers in the room form multiple closed tracks, in order to let the user decide which closed track forms the intended area to be cleaned, the candidate closed tracks can be displayed for the user to choose from, and the area enclosed by the user-selected closed track is determined as the area to be cleaned. For example, for fig. 2e, the areas Q4 and Q5 may be displayed simultaneously, and the area selected by the user is determined as the area to be cleaned. It can be seen that this processing improves the user experience, since the user can select the appropriate area to be cleaned.
In this embodiment, as shown in fig. 2f, since the movement track (L6, represented by a dashed line for convenience of presentation) formed by the mobile terminal moving in the living room is a non-closed track, it should be determined whether the non-closed track L6 can form a closed track with a fixed barrier in the room; if so, the formed closed track is determined as the area to be cleaned. Referring to fig. 2f, the non-closed track L6 forms a closed track precisely with the E-face of bedroom 1 and the door and sofa, so the area enclosed by that closed track can be determined as the area Q6 to be cleaned (shown by hatching). Meanwhile, the non-closed track L6 can also form a closed track with the F-face of bedroom 1, the D- and G-faces of bedroom 2, the H-face of the kitchen and the I-face of the bathroom, so the area enclosed by that closed track can be determined as the area Q7 to be cleaned (also shown by hatching). In such a case, a targeted fixed-point cleaning requirement generally means that the smaller area Q6 is the one to be cleaned (for example, snack residue such as melon-seed shells around the sofa and the television after watching television in the living room). The area Q7, whose area is significantly larger than Q6, is generally not the target of fixed-point cleaning: if such a large area were the target, the user could simply set a whole-house cleaning instead. This is not necessarily always the case, but it fits the typical user requirement.
Therefore, when the non-closed track and the fixed barriers in the room form multiple closed tracks, the area enclosed by the closed track with the smaller cleaning area can be determined as the area to be cleaned by default, saving the user the trouble of selecting.
It can be understood that (3) when there are multiple closed tracks composed of the non-closed track and fixed barriers in the room, the corresponding area can also be selected automatically for cleaning according to a pre-calibrated area number. For example, with respect to fig. 2f, when the user forms the movement track L6 with the mobile phone, the areas Q6 and Q7 to be cleaned appear at the same time; in this case, the area Q6 (or, alternatively, the area Q7) may be preset as the area the user wants cleaned at a fixed point, which spares the user from selecting each time while avoiding errors caused by automatic machine selection.
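Mode (2) above, defaulting to the candidate with the smaller enclosed cleaning area, can be sketched with the shoelace formula; the helper names are assumptions for illustration:

```python
def polygon_area(poly):
    """Shoelace formula: area enclosed by a closed track given as a list
    of (x, y) vertices (the closing edge last->first is implicit)."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1] - poly[(i + 1) % n][0] * poly[i][1]
            for i in range(n))
    return abs(s) / 2.0

def pick_smaller_region(candidates):
    """Default rule of mode (2): among several candidate closed tracks
    (e.g. Q6 and Q7 in fig. 2f), choose the one with the smaller area."""
    return min(candidates, key=polygon_area)

# Illustrative stand-ins for Q6 (small) and Q7 (large).
q6 = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]   # area 4
q7 = [(0.0, 0.0), (6.0, 0.0), (6.0, 5.0), (0.0, 5.0)]   # area 30
```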
Based on the foregoing embodiment, in this embodiment, the determining, based on the movement track of the mobile terminal, the area to be cleaned for defining the cleaning range includes:
and determining the area to be cleaned for defining the cleaning range according to the delineated shape formed by the mobile terminal, which stays in place and points at the area to be cleaned to delineate its range, together with the angle between the mobile terminal and the cleaning surface and the distance between the mobile terminal and the cleaning surface.
In this embodiment, unlike the above embodiment, the present embodiment does not require that the user carries the mobile terminal to actually move around the area to be cleaned, that is, the user can stay in place, and only needs to point the mobile terminal to the area to be cleaned to perform area delineating to form a delineating shape, and then the area to be cleaned for defining the cleaning range can be determined according to the delineating shape, as shown in fig. 5.
In one implementation, the area to be cleaned for defining the cleaning range may be determined according to a delineating shape formed by the mobile terminal by pointing to the area to be cleaned and an angle between the mobile terminal and the cleaning surface and a distance between the mobile terminal and the cleaning surface.
In this implementation manner, it may be understood that when an angle exists between the mobile terminal and the cleaning surface, according to a distance between the mobile terminal and the cleaning surface and an angle between the mobile terminal and the cleaning surface, a delineating shape formed by area delineating by pointing to an area to be cleaned by the mobile terminal is mapped onto the cleaning surface, and an area to be cleaned corresponding to the delineating shape on the cleaning surface is determined.
In other implementations, it will be appreciated that when the mobile terminal is facing the cleaning surface (i.e., the mobile terminal is parallel to the cleaning surface), the area to be cleaned for defining the cleaning range may be determined directly according to the area delineating and delineating of the mobile terminal to the area to be cleaned, i.e., the two are equivalent. For example, when the wall surface needs to be cleaned, the mobile terminal can be moved over the wall surface, the area is defined to the area to be cleaned to form a defined shape, for example, the defined shape is a circle, and then the defined shape can be directly used as the area to be cleaned for defining the cleaning range. In addition, when the ground is required to be cleaned, the mobile terminal can be moved over the ground, the area is delineated to the area to be cleaned to form a delineated shape, for example, the delineated shape is a square shape, and then the delineated shape can be directly used as the area to be cleaned for defining the cleaning range.
It can be appreciated that, by adopting the manner of this embodiment, an area can be delineated (the area to be cleaned can be demarcated) without the user moving, which is more convenient for the user and allows the user to direct the cleaning device to clean the delineated area in a targeted way at any time and place.
Based on the foregoing embodiment, in this embodiment, determining the area to be cleaned for defining the cleaning range according to the delineating shape formed by pointing the stationary mobile terminal at the area to be cleaned, the angle between the mobile terminal and the cleaning surface, and the distance between the mobile terminal and the cleaning surface specifically includes:
selecting a preset number (such as 10, 20 or 30) of first coordinate points (x1, y1) on the delineating shape formed by pointing at the area to be cleaned in situ; it should be noted that, to ensure the accuracy of the area to be cleaned, the selected preset number of first coordinate points are preferably uniformly distributed on the delineating shape;
determining a preset number of second coordinate points (x 2, y 2) corresponding to the preset number of first coordinate points on the cleaning surface according to a first relation model;
determining the area to be cleaned for defining the cleaning range according to the preset number of second coordinate points;
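Once the preset number of second coordinate points is known, the area to be cleaned can be taken as the polygon they enclose. A minimal ray-casting point-in-polygon sketch, where the polygon representation and all names are illustrative assumptions:

```python
def point_in_polygon(px, py, polygon):
    """Ray-casting test: is (px, py) inside the polygon formed by the
    (uniformly distributed) second coordinate points? The polygon
    representation is an illustrative assumption."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

# Square area to be cleaned delimited by four mapped second coordinate points:
square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
```

The cleaning device can then restrict its path planning to cells for which `point_in_polygon` returns true.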
wherein the first relationship model is:
x2=x1-L×cosθ,y2=y1-L×sinθ
wherein L represents the distance on the cleaning surface between the first coordinate point (x1, y1) and the second coordinate point (x2, y2), obtainable from H1 and θ1 (geometrically, L = H1/tanθ1); H1 represents the distance between the mobile terminal and the cleaning surface; θ1 represents the angle between the mobile terminal and the cleaning surface; and θ represents the angle between the line segment L and the x-axis of the ground coordinate system.
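As a minimal sketch, the first relationship model can be coded as follows. Deriving L from H1 and θ1 as L = H1/tanθ1 is a geometric assumption consistent with the intersection-point calculation of this embodiment, and the function and parameter names are illustrative:

```python
import math

def map_point_to_surface(x1, y1, h1, theta1, theta):
    """Map one point of the in-air delineating shape onto the cleaning surface.

    x1, y1 : located coordinate of the first point (ground coordinate system)
    h1     : distance H1 between the mobile terminal and the cleaning surface
    theta1 : angle between the terminal and the cleaning surface (radians)
    theta  : angle between segment L and the x-axis of the ground system (radians)
    """
    L = h1 / math.tan(theta1)  # horizontal reach implied by height and tilt
    return (x1 - L * math.cos(theta), y1 - L * math.sin(theta))

# Terminal 1.0 m above the floor, tilted 45 degrees toward the ground,
# pointing along the negative x-axis (theta = 0): the point lands 1 m away.
x2, y2 = map_point_to_surface(2.0, 3.0, 1.0, math.pi / 4, 0.0)
```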
In this embodiment, the attitude of the mobile phone can be identified by its gyroscope, and the delineating shape traced by the phone in the air is then mapped onto the cleaning surface using the distance between the phone and the ground, so that the user can demarcate the floor cleaning area without walking, which is convenient for the user.
In this embodiment, the drawing of the delineating shape refers to the computer automatically drawing the shape, and may be understood as generating the shape.
The scheme provided by this embodiment is described below with reference to figs. 5 and 6. In this embodiment, it may be realized by means of a UWB positioning system. Specifically, four or more UWB positioning base stations may be provided so that three-dimensional coordinates can be located, identifying the height of the mobile phone above the ground. The orientation of the phone is identified by its gyroscope, the phone being tilted toward the ground. The coordinate of the intersection of the phone's X axis with the ground is calculated from the phone's positioning coordinate, its height above the ground, and the angle θ1 (first angle) between the phone's X axis and the ground. When the phone traces a small circle in the air, the intersection of its X axis with the ground simultaneously traces a large area on the ground, which can serve as the area to be cleaned by the sweeper. The purpose of demarcating the area is thus achieved without the user needing to move (see fig. 5). The coordinate calculation is as follows: UWB positioning gives the coordinate position (x1, y1), and L is the distance on the ground between coordinates (x1, y1) and (x2, y2):
The UWB positioning system has already established a ground coordinate system; the phone's orientation and the included angle θ between line segment L and the x-axis of that coordinate system are identified by the phone's gyroscope, so the ground-plane coordinate (x2, y2) is calculated (as shown in fig. 6):
x2=x1-L×cosθ, y2=y1-L×sinθ, i.e. (x2, y2) = (x1-L×cosθ, y1-L×sinθ)
Based on the foregoing embodiment, in this embodiment, the cleaning control method further includes the following steps:
step 103: collecting an image of the area to be cleaned;
step 104: correcting the area to be cleaned based on the image;
accordingly, step 102 of controlling the cleaning device to perform the cleaning task on the area to be cleaned includes:
controlling the cleaning device to execute the cleaning task on the corrected area to be cleaned, based on the corrected area to be cleaned.
In this embodiment, in one implementation manner, the correction of the area to be cleaned based on the image may be implemented in the following manner:
Receiving a marking area obtained by marking operation on the image by a user;
and correcting the area to be cleaned according to the marking area of the user to obtain a corrected area to be cleaned.
For example, in one implementation, correcting the area to be cleaned according to the user's marked area may mean taking the union of the marked area and the area to be cleaned as the corrected area to be cleaned. In other implementations, the area to be cleaned may be locally corrected (locally reduced or enlarged, etc.) according to the user's marked area to obtain the corrected area to be cleaned.
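The correction implementations above can be sketched with regions represented as sets of grid cells on the cleaning-surface map; the set representation, mode names and function names are assumptions for illustration:

```python
# Regions are sets of (row, col) grid cells on the cleaning-surface map.
def correct_area(area_to_clean, marked_area, mode="union"):
    """Correct the area to be cleaned with the user's marked area.

    mode="union"  : take the union of the two regions (first implementation)
    mode="expand" : same as union, a local enlargement by the marked cells
    mode="shrink" : locally reduce the area by removing the marked cells
    """
    if mode in ("union", "expand"):
        return area_to_clean | marked_area
    if mode == "shrink":
        return area_to_clean - marked_area
    raise ValueError(mode)

area = {(0, 0), (0, 1), (1, 0)}
mark = {(1, 1), (0, 1)}
corrected = correct_area(area, mark)           # union of both regions
reduced = correct_area(area, mark, "shrink")   # local reduction
```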
Based on the foregoing embodiment, in this embodiment, according to the area to be cleaned, controlling the cleaning apparatus to perform a cleaning task on the area to be cleaned includes:
identifying the ground type and/or dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
In this embodiment, the floor type includes different types of floors, carpets, tiles, and the like. The types of dirt include dust, debris, hard shells, water stains, oil stains, and the like.
In this embodiment, the cleaning treatment is performed specifically on the area to be cleaned according to the floor type and/or the dirt type recognition result. For example, if the floor type is determined to be a carpet material, a cleaning mode matching the carpet material may be used, such as increasing the adsorption force. If the floor type is determined to be the floor material, a cleaning mode matched with the floor material, such as a mode of adsorbing by adopting standard adsorption force and then mopping by adopting micro-wet cloth, can be adopted. If the floor type is determined to be the floor tile material, a cleaning mode matched with the floor tile material, such as a cleaning mode of adopting a smaller adsorption force to adsorb, then adopting wet cloth to mop and adopting dry cloth to mop, can be adopted.
In addition, different cleaning modes are adopted for different dirt types. For example, if the determined dirt type is dust, wet-cloth mopping may be used. If the determined dirt type is debris, suction may be used. If the determined dirt type is water stains, dry-cloth mopping may be used. If the determined dirt type is oil stains, a detergent may be sprayed first and the area then wiped with a wet cloth; if the oil stain is on a carpet, a detergent may be sprayed first and the area then suctioned, or sprayed first and then washed.
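The matching of cleaning modes to floor and dirt types described above can be sketched as a simple lookup; the mode names and parameter values are illustrative assumptions, not fixed by this embodiment:

```python
# Illustrative lookup tables for the modes described above.
FLOOR_MODES = {
    "carpet": {"suction": "high"},
    "floor":  {"suction": "standard", "mop": "slightly damp cloth"},
    "tile":   {"suction": "low", "mop": "wet cloth then dry cloth"},
}

DIRT_MODES = {
    "dust":        ["wet-cloth mopping"],
    "debris":      ["suction"],
    "water stain": ["dry-cloth mopping"],
    "oil stain":   ["spray detergent", "wet-cloth wiping"],
}

def select_cleaning_plan(floor_type=None, dirt_type=None):
    """Combine the floor-matched mode and the dirt-matched steps, if known."""
    plan = {}
    if floor_type in FLOOR_MODES:
        plan["floor_mode"] = FLOOR_MODES[floor_type]
    if dirt_type in DIRT_MODES:
        plan["dirt_steps"] = DIRT_MODES[dirt_type]
    return plan

plan = select_cleaning_plan("carpet", "debris")
```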
It will be appreciated that floor type recognition may be performed by means of image acquisition plus machine learning, or by means of sound acquisition plus sound characteristics, where the sound refers to that emitted by the cleaning device as it passes over the floor. As for the specific recognition process, a neural network model may be used for training and recognition, which this embodiment will not describe in detail.
It will be appreciated that dirt type identification may be performed by means of image acquisition plus machine learning. As for the specific recognition process, a neural network model may be used for training and recognition, which this embodiment will not describe in detail.
Based on the foregoing embodiment, in this embodiment, performing the ground type identification on the area to be cleaned includes:
collecting sound signals sent by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio characteristics in the sound signals.
In this embodiment, it may be understood that, as the cleaning device travels, internal components such as the motor operate and generate sound, which propagates through the environment the device is in. A sound collector provided on the cleaning device can therefore collect the sound signal of that environment, and audio feature extraction and analysis on the signal determines the floor material type of the environment the device is currently in.
The sound generated by the cleaning device propagates along various paths, one of which is directed at the ground and is reflected by it. Different floor materials affect the sound differently; for example, a carpet tends to absorb much of the sound's energy, so the energy of the signal it reflects is severely attenuated. Because different floor materials thus leave different audio features in the reflected signal, the floor material type can be accurately identified from the audio features of the ground-reflected sound. Accordingly, collecting the sound signal emitted by the cleaning device in the area to be cleaned may be: collecting, during cleaning, the sound signal emitted by the cleaning device and reflected by the ground.
In the following, collection of the ground-reflected sound signal emitted by the cleaning device during cleaning is taken as an example, but it is understood that collecting the sound signal of the environment the cleaning device is in is not limited to this example.
In this embodiment, the shape and structure of the cleaning device are not specifically limited. It can be understood that when the cleaning device is placed on the ground to perform tasks such as sweeping and mopping, there is often a gap of a certain height between the bottom of its machine body and the ground, so that the device can travel smoothly and friction with the ground is reduced. Accordingly, a sound collector, such as a microphone, may be provided at the bottom of the machine body to collect the ground-reflected sound signals emitted by the device during cleaning. Before describing the floor material identification method of this embodiment in detail, the types of floor material are explained. Cleaning devices are often used in home, airport or factory settings, where the floor environment is often complex: in some homes, tiles are laid in the kitchen and toilet while floors are laid in other rooms, and small carpets, foot mats and the like may be laid in the living room or certain rooms. Therefore, in this embodiment, floor materials may be classified into carpet, floor and tile according to the conditions common in actual home scenes.
The floor material in this embodiment refers to the material of the medium on which the cleaning device directly travels; that is, the ground here means that medium. For example, when a carpet is laid on a floor, the cleaning device travels directly over the carpet while cleaning it, so the floor material in that case is the carpet, not the floor beneath it.
It will be appreciated that, however the floor material types are divided in advance, each type affects the sound signal emitted by the cleaning device differently, especially the ground-reflected signal, and this difference shows up in the audio feature information contained in the reflected signal. This embodiment can therefore determine the material type of the area to be cleaned from the audio features of the sound signal.
Specifically, when determining the material type of the area to be cleaned from the audio features of the sound signal, automatic recognition with a neural network model or matching recognition against a database may be used; this embodiment does not limit which.
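One hedged sketch of the audio-feature route, using a toy RMS-energy threshold in place of a trained model or database. The thresholds, the single-feature rule and the synthetic signals are all assumptions for illustration, resting on the observation above that carpet attenuates the reflected signal most:

```python
import math
import random

def rms_energy(signal):
    """Root-mean-square energy of one frame of the reflected sound signal."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

def classify_floor(signal, carpet_max=0.2, floor_max=0.6):
    """Toy threshold classifier: carpet absorbs much of the sound energy,
    so its reflection carries the least energy. Thresholds are illustrative
    assumptions; the embodiment may equally use a neural network model or
    database matching."""
    energy = rms_energy(signal)
    if energy < carpet_max:
        return "carpet"
    if energy < floor_max:
        return "floor"
    return "tile"

random.seed(0)
quiet = [0.1 * random.gauss(0, 1) for _ in range(1000)]  # heavily attenuated
loud = [0.8 * random.gauss(0, 1) for _ in range(1000)]   # barely attenuated
```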
Therefore, by identifying the floor type, a cleaning mode suitable for the current floor can be selected, which improves both the cleaning effect and the user experience.
Based on the foregoing embodiment, in this embodiment, according to a ground type identification result, a cleaning apparatus is controlled to perform targeted cleaning processing on an area to be cleaned, including:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
In this embodiment, according to the floor type recognition result, the cleaning device may be controlled to perform targeted cleaning of the area to be cleaned using the matched cleaning mode. For example, as described above, a carpet floor may use a mode with increased suction; a wooden floor may use standard suction followed by slightly damp mopping; and a tile floor may use lower suction followed by wet-cloth and then dry-cloth mopping. Controlling the cleaning device to use the mode matched to the floor type recognition result thus effectively improves the cleaning effect.
Based on the foregoing embodiment, in this embodiment, performing the dirt type recognition on the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
In this embodiment, when the dirt type of the area to be cleaned is identified, an effective means is to determine the dirt type of the area to be cleaned by means of image identification. For example, different types of dust, chips, hard shells, water stains, oil stains and the like have obvious image distinguishing characteristics, so that the corresponding dirt types can be accurately identified according to an image identification mode.
Based on the foregoing embodiment, in this embodiment, identifying, according to the image information, the type of dirt in the area to be cleaned includes:
inputting the image into a dirt type recognition model, and acquiring a dirt type recognition result of the area to be cleaned;
the dirt type recognition model is obtained by training an image sample and a dirt type recognition result label of the image sample based on a neural network.
In this embodiment, a specific recognition manner is provided, and the embodiment uses a dirt type recognition model implemented based on a neural network to recognize the dirt type, so that the dirt type recognition result of the area to be cleaned can be accurately obtained. Since conventional algorithms can be employed for the construction and training of the neural network model, details are not described herein.
Based on the foregoing embodiment, in this embodiment, identifying, based on the image, a type of dirt in the area to be cleaned includes:
comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
In this embodiment, a processing manner different from the foregoing embodiment is provided: the collected image is compared with each reference image of known dirt type stored in a database, and the dirt type corresponding to the reference image that matches the image is taken as the dirt type of the area to be cleaned. The advantage of this manner is simpler implementation, with no complex model training process.
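The database-matching route can be sketched as a nearest-neighbour comparison over a toy image descriptor; the 3-bin intensity histogram and all names here are illustrative assumptions, since the embodiment does not fix how images are compared:

```python
def histogram3(pixels):
    """Toy descriptor: 3-bin histogram of 0-255 grayscale values, normalised."""
    bins = [0, 0, 0]
    for p in pixels:
        bins[min(p // 86, 2)] += 1
    total = len(pixels)
    return [b / total for b in bins]

def match_dirt_type(image_pixels, database):
    """Return the dirt type of the reference whose descriptor is nearest
    (squared Euclidean distance) to the collected image's descriptor."""
    feat = histogram3(image_pixels)

    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(feat, database[ref]))

    return min(database, key=dist)

# Reference images of known dirt types, reduced to descriptors:
database = {
    "dust":      histogram3([30] * 90 + [120] * 10),
    "oil stain": histogram3([200] * 80 + [120] * 20),
}
result = match_dirt_type([35] * 85 + [110] * 15, database)
```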
The invention further provides a cleaning control method offering the user a novel way to determine the area to be cleaned: the user can demarcate the cleaning area by photographing it, without walking, which is convenient for the user. The cleaning control method provided in this embodiment is explained below with reference to specific steps.
A1: determining position and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned;
in this step, the mobile terminal may refer to a mobile smart device such as a mobile phone, a PAD, a notebook computer, a smart bracelet, a smart watch or a smart reader.
In this step, when the mobile terminal shoots an area to be cleaned, the position and posture information of the mobile terminal includes: position information when the mobile terminal performs shooting, angle information between the mobile terminal and a cleaning surface (typically, the ground), distance information between the mobile terminal and the cleaning surface when the mobile terminal performs shooting, and the like.
It can be understood that the purpose of acquiring the position and posture information of the mobile terminal at the moment of shooting is to automatically convert, in a subsequent step, the image the terminal captured of the area to be cleaned into the area to be cleaned on the cleaning surface, realizing what-you-see-is-what-you-get fixed-point cleaning.
It should be noted that the advantage of this embodiment is that, after the user photographs the area to be cleaned at a fixed point, the captured image can be converted directly into the area to be cleaned on the cleaning surface, realizing targeted fixed-point cleaning. It should be emphasized that, unlike the prior art, in which a room or some region of it (containing both areas that need cleaning and areas that do not) is photographed and the area requiring cleaning is then automatically identified from the image, this embodiment does not need to identify the area requiring cleaning from the captured image. Instead, the smart device photographs the area requiring cleaning directly and at a fixed point, and the captured image is converted directly into the area to be cleaned on the cleaning surface, realizing what-you-see-is-what-you-get fixed-point cleaning; this is the innovation of the present application. The user can thus simply pick up the phone and photograph the area needing cleaning. Suppose, for example, that the user has eaten melon seeds in front of the sofa while watching television, leaving husks behind. The user only needs to photograph the husk area in front of the sofa at a fixed point; the scheme of the application automatically converts that image into the area the cleaning device needs to clean on the cleaning surface (i.e., the area to be cleaned), realizing what-you-see-is-what-you-get fixed-point cleaning while greatly simplifying otherwise unnecessary image recognition processing.
A2: determining an imaging capturing angle of a camera on the mobile terminal when the mobile terminal shoots an area to be cleaned;
in this step, it should be noted that the imaging capture angle of the camera on the mobile terminal is related to the photographing parameters (such as the wide-angle parameter) set during image capture, and can be obtained by reading the camera's relevant photographing parameter settings (lens wide angle). It will be appreciated that, for the same camera, different lens wide angles capture different ranges.
It should be noted that steps A1 and A2 are not restricted to any particular order.
A3: determining the area to be cleaned according to the shot image, the position and posture information and the imaging capturing angle;
in this step, according to the position and posture information of the mobile terminal when it photographs the area to be cleaned, the imaging capture angle of its camera at that moment, and the image obtained by photographing, the area to be cleaned that the user sees visually is accurately converted into the area the cleaning device needs to clean on the cleaning surface. This realizes fixed-point cleaning of the area, effectively improving cleaning efficiency while reducing the power consumed by cleaning.
In this embodiment, the cleaning surface refers to a surface on which the cleaning device performs a cleaning task. For example, the cleaning surface is the floor surface if the cleaning device cleans the floor surface, and the cleaning surface is the wall surface if the cleaning device cleans the wall surface.
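The conversion in step A3 can be sketched in two dimensions under a pinhole-style assumption: the near and far edges of the photographed strip are projected onto the ground from the terminal's height, tilt and imaging capture angle. The 2-D simplification and all names are assumptions, not the embodiment's exact algorithm:

```python
import math

def ground_footprint(x_cam, height, tilt_deg, fov_deg):
    """Project the near and far edges of the photographed strip onto the
    ground along the camera's viewing direction.

    tilt_deg : angle of the optical axis below the horizontal
    fov_deg  : imaging capture angle (vertical field of view) of the camera
    """
    near_angle = math.radians(tilt_deg + fov_deg / 2)  # steeper ray, closer
    far_angle = math.radians(tilt_deg - fov_deg / 2)   # shallower ray, farther
    near = x_cam + height / math.tan(near_angle)
    far = x_cam + height / math.tan(far_angle)
    return near, far

# Phone 1.5 m above the floor, optical axis 45 degrees down, 30 degree capture angle:
near, far = ground_footprint(0.0, 1.5, 45.0, 30.0)
```

The strip between `near` and `far` (extended analogously in the horizontal image direction) would bound the area to be cleaned on the cleaning surface.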
A4: and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned.
In this step, after the area to be cleaned is obtained, the cleaning device is controlled to travel to it and execute the cleaning task. It should be noted that, when doing so, the cleaning device may be controlled to adopt a default working mode, or to select a working mode adapted to the dirt type in the current area to be cleaned.
In this embodiment, the execution subject of the cleaning control method may be a cleaning control device for controlling the cleaning apparatus, the cleaning apparatus itself (such as a cleaning robot), a mobile device such as a mobile phone or a PAD, control software installed on the mobile device, or a server; this embodiment is not limited in this respect.
According to the cleaning control method provided by this embodiment, the area to be cleaned is accurately determined from the position and posture information of the mobile terminal when it photographs the area, the imaging capture angle of its camera at that moment, and the captured image. The area to be cleaned that the user sees visually is thereby accurately converted into the area the cleaning device needs to clean, realizing what-you-see-is-what-you-get fixed-point cleaning. Thus, with this method, the area to be cleaned can be determined in real time by photographing it with the mobile terminal before cleaning, and the cleaning device can then be controlled to clean it accurately and specifically, effectively improving cleaning efficiency while reducing the power consumed by cleaning.
As described above, it is emphasized that when the mobile terminal is used to photograph the area to be cleaned in this embodiment, unlike the prior art, in which a room or some region of it is photographed (containing both areas that need cleaning and areas that do not), the area requiring cleaning does not need to be identified from the captured image: the photograph itself is aimed at the area requiring cleaning and contains no area that does not require cleaning. In other words, the smart device photographs the area requiring cleaning directly and at a fixed point, and the captured image is converted directly into the area to be cleaned on the cleaning surface, realizing fixed-point cleaning. This is the innovation of this embodiment over the prior art's approach of photographing a larger region and then automatically identifying the area requiring cleaning within it.
In addition, it should also be noted that the "area to be cleaned" in this embodiment differs from the "preset fixed cleaning areas" of the prior art. For example, an existing sweeping robot generally presets several fixed cleaning areas, such as the living room, the master bedroom and the second bedroom, so that the user can later select one or more of them to clean, and the robot then cleans according to the one or more areas the user selected.
In contrast, the "area to be cleaned" in the present application refers to an area generated in real time according to the actual range that needs cleaning; it is understood that places inside the area need to be cleaned, and places outside it do not.
For example, assume the user leaves some snack residue in front of the living-room sofa after watching television; the area to be cleaned is then a small region in front of the sofa. With the scheme of this embodiment, the phone can photograph that small region specifically, yielding an image containing almost only it. According to the phone's position, height and angle information at the moment of capture and the imaging capture angle of its camera, algorithmic processing maps the small region in the image to the area to be cleaned on the cleaning surface (the small region of floor in front of the sofa), and the robot is then controlled to travel there and clean.
Therefore, this embodiment does not need to clean the whole living room but realizes what-you-see-is-what-you-get fixed-point cleaning, which improves cleaning efficiency, shortens cleaning time and reduces cleaning energy consumption while meeting the user's needs and improving user experience; the user can arrange temporary fixed-point cleaning tasks at any time according to the needs in front of them.
It follows that "determining the area to be cleaned for defining the cleaning range" in the present application differs from the prior art's preset fixed cleaning areas mainly in that: (1) the area to be cleaned in the application is generated in real time; (2) it is a targeted area generated for the region currently requiring cleaning; (3) it may lie within an existing fixed cleaning area or span several of them, and it is independent of whether fixed cleaning areas are preset, so long as the user generates it for the range to be cleaned before use. It can be understood that these three differences do not exist independently but complement each other: the area to be cleaned in the application is a pattern that defines the cleaning range in real time, taking neither the room nor the functional room as the division standard, but the region currently requiring cleaning. The user can therefore determine a matching area to be cleaned according to the cleaning range of the current cleaning requirement, the area clearly defines that range, and the cleaning process can be carried out on the corresponding region, completing the task in a targeted manner, improving cleaning efficiency, reducing energy consumption and shortening cleaning time while meeting the current cleaning requirement. It is particularly emphasized that the area to be cleaned that the user sees visually can be accurately converted into the area the cleaning device needs to clean on the cleaning surface, realizing fixed-point cleaning (the key point of the application), which effectively improves cleaning efficiency while reducing the power consumed by cleaning.
It can be seen that this embodiment differs both from the prior-art processing scheme of photographing a large area and then identifying the small area to be cleaned within it, and from the prior-art scheme of presetting fixed cleaning areas. The innovation of this embodiment is that the visually observed area to be cleaned can be accurately converted into the region on the cleaning surface that the cleaning device needs to clean, thereby realizing what-you-see-is-what-you-get fixed-point cleaning.
For the above example, assume a user only needs to clean a small area of melon-seed shells in front of the living-room sofa. The user can photograph that area in a targeted manner with a mobile phone, obtaining an image of the area alone (areas that do not need cleaning are outside the image range); through processing, the area to be cleaned on the image is then accurately converted into the region on the cleaning surface that the cleaning device needs to clean, realizing what-you-see-is-what-you-get fixed-point cleaning. That is, this embodiment accurately determines the area that needs cleaning and performs the cleaning process on it (the small area in front of the living-room sofa) without cleaning other areas, thereby improving cleaning efficiency, shortening cleaning time and reducing cleaning energy consumption while meeting the requirement.
In this embodiment, it should be noted that the cleaning control method provided herein may be used for common cleaning devices such as cleaning robots, floor cleaners, table cleaners, bed-sheet cleaners and the like. It may also be used for other intelligent cleaning devices, such as robots for cleaning walls, indoor ceilings or outdoor roofs, which is not limited in this embodiment.
In addition, the cleaning control method provided in this embodiment may be widely used in households, factories, airports, outdoor, etc., and this embodiment is not limited thereto.
Based on the foregoing embodiment, in this embodiment, determining position and orientation information of a mobile terminal when the mobile terminal photographs an area to be cleaned includes:
determining the position coordinates of the mobile terminal when the mobile terminal shoots an area to be cleaned based on an indoor positioning system;
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots an area to be cleaned based on an indoor positioning system;
and determining a first angle between the mobile terminal and the cleaning surface when the mobile terminal shoots the area to be cleaned based on an angular velocity sensor carried on the mobile terminal.
In this embodiment, the position and posture information of the mobile terminal includes the position coordinates of the mobile terminal when it photographs the area to be cleaned, the first distance between the mobile terminal and the cleaning surface at that moment, and the first angle between the mobile terminal and the cleaning surface at that moment. To simplify processing, in one implementation an indoor positioning system may be used directly to determine both the position coordinates and the first distance, while the first angle is determined directly from an angular velocity sensor carried on the mobile terminal.
In this embodiment, it will be appreciated that when the cleaning surface is the floor, the first distance is actually the height of the mobile terminal from the floor.
In this embodiment, the indoor positioning system may be an ultra-wideband UWB (Ultra Wide Band) positioning system. Further, a global positioning system GPS (Global Positioning System) is also possible. In addition, the current position information of the mobile terminal can also be obtained through a human body infrared sensor (pyroelectric infrared sensor).
In the present embodiment, the angular velocity sensor may be implemented using a gyro sensor.
Based on the content of the above embodiment, in the present embodiment, determining the area to be cleaned according to the photographed image, the position and orientation information, and the imaging capturing angle includes:
according to the shot image, the position coordinates, the first distance and the first angle, determining that a center point on the image is mapped to a center coordinate on the cleaning surface;
and determining a region to be cleaned corresponding to the image on the cleaning surface according to the center coordinates and the imaging capturing angle.
In this embodiment, the scheme is described with reference to fig. 7 and may be implemented by means of a UWB positioning system. Specifically, four or more UWB positioning base stations may be provided so that three-dimensional coordinates can be located, giving the position of the mobile phone and its height above the ground. The phone gyroscope identifies the orientation of the phone, and the phone is pointed at the ground for photographing. The coordinates of the center point A of the image mapped to the ground are calculated from the phone's positioning coordinates, its height above the ground and the angle θ2 (first angle) between the phone's Z axis and the ground; the area to be cleaned on the ground corresponding to the area on the image can then be determined from the coordinates of point A and the capture angle θ3 (imaging capturing angle) of the phone camera (as shown in fig. 7).
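As an illustrative sketch (not part of this application) of the geometry just described, the mapping of the image center to point A on the ground can be computed from the phone's UWB coordinates, its height above the ground and the angle θ2; the horizontal azimuth of the camera is assumed here to be available from the phone's orientation sensors:

```python
import math

def center_point_on_floor(px, py, height, theta2_deg, heading_deg):
    """Map the optical-axis centre of the photo to floor coordinates.

    px, py      -- phone position from the UWB system (metres)
    height      -- first distance: phone height above the floor (metres)
    theta2_deg  -- first angle: angle between the phone's Z axis and the floor
    heading_deg -- horizontal azimuth of the camera (assumed available
                   from the phone's orientation sensors)
    """
    theta2 = math.radians(theta2_deg)
    # Horizontal distance from the point directly below the phone to A.
    d = height / math.tan(theta2)
    heading = math.radians(heading_deg)
    return (px + d * math.cos(heading), py + d * math.sin(heading))
```

For instance, a phone held 1 m above the floor and tilted 45° towards it places point A 1 m ahead of the point directly below the phone.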
It can be appreciated that by this processing manner of the present embodiment, a novel manner of determining the area to be cleaned is provided for the user, and the user does not need to walk, but only needs to photograph the area to be cleaned to define the cleaning area (as shown in fig. 7), thereby facilitating the use of the user.
Based on the foregoing embodiment, in this embodiment, determining, according to the center coordinate and the imaging capturing angle, a region to be cleaned corresponding to the image on the cleaning surface includes:
determining a circular arc-shaped area of the image on the cleaning surface according to the center coordinates and the imaging capturing angle;
determining a first region coordinate range of the image in the circular arc region according to a corresponding cutting processing process when the image is mapped to the cleaning surface;
and determining the first area coordinate range as an area to be cleaned.
In this embodiment, the scheme is described with reference to fig. 7 and may be implemented by means of a UWB positioning system. Specifically, four or more UWB positioning base stations may be provided so that three-dimensional coordinates can be located, giving the position of the mobile phone and its height above the ground. The phone gyroscope identifies the orientation of the phone, and the phone is pointed at the ground for photographing. The coordinates of the center point A of the image mapped to the ground are calculated from the phone's positioning coordinates, its height above the ground and the angle θ2 (first angle) between the phone's Z axis and the ground. Then, from the coordinates of point A and the capture angle θ3 (imaging capturing angle) of the phone camera, a circular arc-shaped area is delineated; the coordinate range of the actual area covered by the image is calculated according to the cropping performed when the camera lens's imaging is converted into a rectangular image, and this range is taken as the area to be cleaned by the sweeper (as shown in fig. 7).
Therefore, the image photographed of the area to be cleaned can be effectively converted into the area to be cleaned on the ground, so that what-you-see-is-what-you-get fixed-point cleaning control can be completed simply and accurately.
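Under the same simplified pinhole-camera assumptions, the ground extent covered by the camera's field of view follows from θ2 and the capture angle θ3. The sketch below (illustrative only, not from this application) gives the near and far ground distances of the imaged strip along the optical axis:

```python
import math

def imaged_floor_extent(height, theta2_deg, theta3_deg):
    """Near and far ground distances covered by the camera's vertical field.

    height     -- phone height above the floor (metres)
    theta2_deg -- tilt of the optical axis towards the floor (first angle)
    theta3_deg -- full imaging capture angle of the camera lens
    Returns (near, far) horizontal distances from the point directly
    below the phone; far is infinite if the upper ray does not reach
    the floor.
    """
    half = theta3_deg / 2.0
    near = height / math.tan(math.radians(theta2_deg + half))
    upper = theta2_deg - half
    far = height / math.tan(math.radians(upper)) if upper > 0 else math.inf
    return near, far
```

With a 45° tilt and a 30° capture angle at 1 m height, the imaged strip runs from roughly 0.58 m to 1.73 m ahead of the phone's ground projection.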
Based on the same inventive concept, another embodiment of the present invention provides a cleaning control method, including:
s101, determining position and posture information of a mobile terminal when the mobile terminal shoots an area;
in this step, it should be noted that the "area" here is different from the "area to be cleaned" in the above embodiment, and the "area" here refers to an area having a larger range than the area to be cleaned. For example, assuming that the area to be cleaned is a small area in front of the sofa, the "area" herein may refer to the living room area, or to an area containing the contents of the sofa, television, and tea table.
S102, determining an imaging capturing angle of a camera on the mobile terminal when the mobile terminal shoots the area;
In this step, it should be noted that the imaging capturing angle of the camera on the mobile terminal is generally fixed when the mobile device leaves the factory, typically in the range of 60°-80°. It can be acquired by reading the corresponding setting on the mobile terminal. It is understood that steps S101 and S102 may be performed in either order.
S103, receiving a marked area of a user on the shot image;
in this step, it should be noted that, in order to facilitate the user to define the cleaning area more accurately, the area marking or the area defining may be performed on the captured image (as shown in fig. 8 and 9). It can be understood that the manner of carrying out the region marking or the region delineating on the shot image can provide more operation freedom for the user, so that the user has more time to select the proper region needing cleaning. In addition, the manner of marking or delineating the area on the photographed image may mark or delineate one or more areas to be cleaned according to the need, which is not limited in this embodiment.
S104, determining a region to be cleaned according to a mark region, the position and posture information and the imaging capturing angle of the user on the shot image;
in this step, according to the position and posture information of the mobile terminal when the mobile terminal shoots, the imaging capturing angle of the camera on the mobile terminal, and the marked area of the user on the shot image, the marked area is accurately converted into the area where the cleaning equipment needs to clean on the cleaning surface, so that the cleaning can be performed at a precise fixed point, the cleaning efficiency can be effectively improved, and meanwhile, the electric quantity consumption caused by cleaning can be reduced.
And S105, controlling the cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned.
In the step, after the area to be cleaned is obtained, the cleaning equipment is controlled to run to the area to be cleaned to execute the cleaning task.
In this embodiment, it should be noted that, when determining that the mobile terminal photographs the area, the position and posture information of the mobile terminal may be: determining the position coordinates of the mobile terminal when the mobile terminal shoots the area based on an indoor positioning system; determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area based on an indoor positioning system; and determining a first angle between the mobile terminal and the cleaning surface when the mobile terminal shoots the area based on an angular velocity sensor carried on the mobile terminal.
Based on the above-described embodiments, in the present embodiment, determining an area to be cleaned according to a mark area, the position and orientation information, and the imaging capturing angle of a user on a photographed image includes:
determining a center coordinate of a center point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance and the first angle;
And determining a corresponding area to be cleaned on the cleaning surface according to the center coordinates, the imaging capturing angle and the marking area.
In this embodiment, in order to facilitate the user to define the cleaning area more accurately, an area mark or area definition may be performed on the photographed image (as shown in fig. 8 and 9). It can be understood that the manner of carrying out the region marking or the region delineating on the shot image can provide more operation freedom for the user, so that the user has more time to select the proper region needing cleaning. In addition, the manner of marking or delineating the area on the photographed image may mark or delineate one or more areas to be cleaned according to the need, which is not limited in this embodiment.
The scheme provided by this embodiment is described below with reference to figs. 8 and 9. On the photographed image, an area can be defined as the cleaning area of the sweeper by sliding a finger on the phone screen (or by other marking means). This may be implemented by means of a UWB positioning system: four or more UWB positioning base stations may be provided so that three-dimensional coordinates can be located, giving the position of the mobile phone and its height above the ground. The phone gyroscope identifies the orientation of the phone, and the phone is pointed at the ground for photographing. The coordinates of the center point A of the image mapped to the ground are calculated from the phone's positioning coordinates, its height above the ground and the angle θ2 (first angle) between the phone's Z axis and the ground; the area to be cleaned on the ground corresponding to the marked area on the image is then determined from the coordinates of point A, the capture angle θ3 (imaging capturing angle) of the phone camera and the marked area (as shown in fig. 9).
It should be noted that, because this embodiment determines the area to be cleaned by area marking in the photographed image, the processing load is greatly reduced compared with the prior-art approach of determining the area to be cleaned by image recognition; the processing in this embodiment is simple and convenient, and the area to be cleaned can be determined rapidly.
Based on the foregoing embodiment, in this embodiment, determining, according to the center coordinates, the imaging capturing angle, and the marker region, a region to be cleaned corresponding to the image on the cleaning surface includes:
determining a circular arc-shaped area of the image on the cleaning surface according to the center coordinates and the imaging capturing angle;
determining a sub-imaging capturing angle corresponding to the marking area according to the distance and the direction relation between the marking area and the central point of the image; wherein the sub-imaging capture angle is less than the imaging capture angle;
determining a sub-circular arc area of the mark area in the circular arc area according to the center coordinates and the sub-imaging capturing angle;
determining a second region coordinate range of the marking region in the sub-circular arc region according to a corresponding cutting processing process when the marking region is mapped to the cleaning surface;
And determining the second area coordinate range as an area to be cleaned.
The scheme provided by this embodiment is described below with reference to figs. 8 and 9. On the photographed image, an area can be defined as the cleaning area of the sweeper by sliding a finger on the phone screen (or by other marking means). This may be implemented by means of a UWB positioning system: four or more UWB positioning base stations may be provided so that three-dimensional coordinates can be located, giving the position of the mobile phone and its height above the ground. The phone gyroscope identifies the orientation of the phone, and the phone is pointed at the ground for photographing. The coordinates of the center point A of the image mapped to the ground are calculated from the phone's positioning coordinates, its height above the ground and the angle θ2 (first angle) between the phone's Z axis and the ground. The circular arc-shaped area of the image on the cleaning surface is determined from the coordinates of point A and the capture angle θ3 (imaging capturing angle) of the phone camera; a sub-imaging capturing angle θ4 (with θ4 ≤ θ3, since the marked area lies within the image) is determined from the distance and direction of the marked area relative to the center point of the image; and the second region coordinate range of the marked area within the sub-circular-arc area is determined according to the cropping performed when the camera lens's imaging is converted into a rectangular image, that is, according to the cropping process corresponding to mapping the marked area onto the cleaning surface. This second region coordinate range is determined as the area to be cleaned (as shown in fig. 9).
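The sub-imaging capturing angle θ4 described above can be estimated, under a simple pinhole-camera assumption that this application does not spell out, from the fraction of the image that the marked area spans:

```python
import math

def sub_capture_angle(theta3_deg, marker_px, image_px):
    """Approximate the sub-imaging capture angle theta4 for a marked region.

    For a marker spanning marker_px pixels of an image_px-pixel-tall photo,
    a pinhole-camera estimate scales the tangent of the half-angle rather
    than the angle itself (the two coincide for small angles). This is an
    illustrative assumption, not a formula from the application.
    """
    half3 = math.radians(theta3_deg) / 2.0
    half4 = math.atan(math.tan(half3) * (marker_px / image_px))
    return math.degrees(2.0 * half4)
```

A marker spanning the whole image recovers θ4 = θ3, and any smaller marker yields θ4 < θ3, consistent with the constraint stated above.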
Based on the foregoing embodiment, in this embodiment, determining, according to the center coordinates, the imaging capturing angle, and the marker region, a region to be cleaned corresponding to the image on the cleaning surface includes:
determining a circular arc-shaped area of the image on the cleaning surface according to the center coordinates and the imaging capturing angle;
determining a first region coordinate range of the image in the circular arc region according to a corresponding cutting processing process when the image is mapped to the cleaning surface;
determining a second region coordinate range of the marking region on the cleaning surface according to the relative position relation and the relative size relation of the marking region and the image and the first region coordinate range;
and determining the second area coordinate range as an area to be cleaned.
In this embodiment, the processing procedure is similar to that of the foregoing embodiment, except that here the second region coordinate range of the marked area on the cleaning surface is determined from the relative position relationship and relative size relationship between the marked area and the image, together with the first region coordinate range. This embodiment thus makes full use of those two relationships: they are preserved when the marked area and the image are mapped onto the cleaning surface, so the second region coordinate range of the marked area on the cleaning surface, which is the area to be cleaned, can be determined accurately.
The relative positional relationship indicates where the marked area is located in the image, for example 3 cm from the left edge, 4 cm from the right edge and 6 cm from the upper edge; the relative size relationship indicates the area ratio of the marked area to the image. Alternatively, the relative positional relationship may indicate which region of the image the marked area occupies: for example, the image may be divided in advance into 20 tiles in sequence, and determining which tiles the marked area falls on determines its relative position in the image.
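A minimal sketch of this relative-position/relative-size mapping, under the simplifying assumption (not made in this application, where the mapped region is arc-shaped) that the first region coordinate range is an axis-aligned rectangle on the floor:

```python
def marker_to_floor_rect(floor_rect, marker_norm):
    """Map a marked image region to a floor region by relative position/size.

    floor_rect  -- (x0, y0, x1, y1): the first-region coordinate range on
                   the floor, simplified here to an axis-aligned rectangle
    marker_norm -- (u0, v0, u1, v1): marker corners as fractions of the
                   image width/height (0..1), i.e. the relative position
                   and relative size relationships described above
    """
    x0, y0, x1, y1 = floor_rect
    u0, v0, u1, v1 = marker_norm
    w, h = x1 - x0, y1 - y0
    # Linear interpolation: the marker keeps on the floor the same
    # relative placement it had in the image.
    return (x0 + u0 * w, y0 + v0 * h, x0 + u1 * w, y0 + v1 * h)
```

For a 4 m × 2 m floor patch, a marker covering the right half of the lower half of the image maps to the corresponding 2 m × 1 m sub-rectangle of the patch.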
It can be appreciated that by this processing manner of the present embodiment, a novel manner of determining the area to be cleaned is provided for the user, and the user does not need to walk, but only needs to take a photograph of the area to be cleaned to define the cleaning area (as shown in fig. 7), thereby facilitating the use of the user.
Based on the foregoing embodiments, in this embodiment, the determining the first coordinate of the location of the mobile terminal specifically includes:
and determining a first coordinate of the position of the mobile terminal based on the ultra wideband UWB positioning system.
In this embodiment, as described above, when determining the first coordinate of the location of the mobile terminal, the UWB positioning system may be used to perform positioning, so as to determine the first coordinate of the location of the mobile terminal.
It can be appreciated that the UWB positioning system has the advantages of strong penetration, low power consumption, good multipath resistance, high safety, low system complexity, and capability of providing accurate positioning accuracy. Therefore, the ultra wideband technology can be applied to indoor stationary or moving objects and people for positioning tracking and navigation, and can provide very accurate positioning accuracy, so that the embodiment can adopt the UWB positioning system for positioning.
The embodiment of the invention also provides a cleaning control method, which comprises the following steps:
receiving a cleaning instruction which is triggered by a user through voice in real time and is used for arranging a random cleaning task; wherein, the random cleaning task refers to a task which does not belong to a preset cleaning task;
determining a region to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining an area to be cleaned for defining a cleaning range according to the cleaning instruction comprises:
determining a reference object and relative position area information of the reference object contained in the cleaning instruction, and determining a to-be-cleaned area for defining a cleaning range according to the position information of the reference object and the relative position area information of the reference object;
and/or,
determining absolute position coordinate information contained in the cleaning instruction, and determining a region to be cleaned for defining a cleaning range according to the absolute position coordinate information;
and/or,
and determining a reference object and relative coordinate information of the reference object contained in the cleaning instruction, and determining a region to be cleaned for defining a cleaning range according to the reference object and the relative coordinate information of the reference object.
Further, according to the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
identifying the ground type and/or dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type identification for the area to be cleaned includes:
collecting sound signals sent by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio characteristics in the sound signals.
Further, according to the ground type recognition result, the cleaning device is controlled to perform targeted cleaning treatment on the area to be cleaned, including:
And controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
Further, the method for identifying the dirt type of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image information, including:
inputting the image into a dirt type recognition model, and acquiring a dirt type recognition result of the area to be cleaned;
the dirt type recognition model is obtained after training by adopting an image sample and a dirt type recognition result label of the image sample.
Further, identifying the dirt type of the area to be cleaned according to the image, including:
comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
In this embodiment, the area to be cleaned may also be determined from instructed voice information. This processing manner offers greater convenience: the user only needs to issue a voice instruction, without walking to the area to be cleaned, photographing it, delineating it or performing any limb action. For example, the user may say "clean the area in front of the television"; after receiving the voice instruction, the cleaning control device matches the voice (or a keyword in it, such as "in front of the television") against the voices or keywords in a database, determines the corresponding area to be cleaned (which may be understood as cleaning position information), and then controls the cleaning device to move to the corresponding position to perform the cleaning operation. It is to be understood that the database stores the area to be cleaned (cleaning position information) matching each voice instruction.
By adopting the treatment mode of the embodiment, the operation burden is relieved, and any area can be designated for targeted cleaning, so that the treatment flexibility is improved.
Based on the foregoing embodiment, in this embodiment, the determining, according to the command voice information, the area to be cleaned for defining the cleaning range specifically includes one or more of the following three manners:
(1) determining a reference object and relative position area information of the reference object contained in the instruction voice information, and determining a to-be-cleaned area for defining a cleaning range according to the position information of the reference object and the relative position area information of the reference object;
(2) determining absolute position coordinate information contained in the instruction voice information, and determining a region to be cleaned for defining a cleaning range according to the absolute position coordinate information;
(3) and determining a reference object and relative coordinate information of the reference object contained in the command voice information, and determining a region to be cleaned for defining a cleaning range according to the reference object and the relative coordinate information of the reference object.
In the present embodiment, when the area to be cleaned for defining the cleaning range is determined by voice, in order to improve the processing efficiency, the determination may be performed in several ways.
One implementation is to determine the reference object contained in the instructed voice information (such as a door, table, television, dining table, sofa, bed or stool), then determine the relative position area information with respect to that reference object (such as the area in front of the door, in front of the television, in front of the sofa or behind the bed), and finally determine the area to be cleaned for defining the cleaning range from the position information of the reference object and the relative position area information. It will be appreciated that the coordinates of these reference objects may be pre-stored in a database.
Another implementation way is to determine absolute position coordinate information contained in the instruction voice information, and determine an area to be cleaned for defining a cleaning range according to the absolute position coordinate information. For example, assuming that a voice command issued by a user is "please clean an area in a room determined by 1 m radius around coordinates (X, Y)" according to the voice command, an area to be cleaned for defining a cleaning range can be accurately determined. For another example, the voice command sent by the user is "please clean the area surrounded by the indoor coordinates (X1, Y1), (X2, Y2), (X3, Y3)", and according to the voice command, the area to be cleaned for defining the cleaning range can be accurately determined.
In still another implementation, the reference object and the relative coordinate information with respect to it contained in the instructed voice information are determined, and the area to be cleaned for defining the cleaning range is determined from them. For example, if the user's voice instruction is "please clean the area 1.3 meters in front of the television", the reference object is determined to be the television, and the area to be cleaned is then determined accurately from the position of the television and the relative distance of 1.3 meters. For another example, if the instruction is "taking the dining table as the center, please clean the area within a radius of 0.5 meters around it", the area to be cleaned can likewise be determined accurately, so that food residue, paper scraps and the like that fall around the dining table after meals can be cleaned precisely.
Therefore, the area needing targeted cleaning can be flexibly and efficiently determined in a voice mode, and therefore accurate cleaning is completed.
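As a toy illustration of the reference-object case, the sketch below parses a command of the form "… N meters in front of the <reference>" against a hypothetical database of reference-object coordinates; the regular expression, the 0.5 m default radius and the convention that "in front of" means the +y direction are all assumptions made for illustration, not details from this application:

```python
import re

# Hypothetical reference-object coordinates, as would be pre-stored
# in the database mentioned above (metres, room coordinates).
REFERENCES = {"television": (5.0, 2.0), "dining table": (2.0, 4.0)}

def area_from_voice(command):
    """Return a circular to-be-cleaned area (centre_x, centre_y, radius)
    for commands like "clean the area 1.3 meters in front of the television",
    or None if the command does not match. A real system would use a
    proper speech/NLU pipeline instead of one regular expression.
    """
    m = re.search(r"([\d.]+) meters? in front of the (\w[\w ]*)", command)
    if not m:
        return None
    distance, ref = float(m.group(1)), m.group(2).strip()
    if ref not in REFERENCES:
        return None
    x, y = REFERENCES[ref]
    # Assume "in front of" means along the +y axis of the reference object.
    return (x, y + distance, 0.5)
```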
In this embodiment, reference may be made to the description of the foregoing embodiments for the schemes of ground type recognition and dirt recognition, and the description thereof will not be repeated here.
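For the audio-based ground-type identification listed above (collecting the sound emitted during cleaning and classifying by audio features), a deliberately crude sketch: a single RMS-energy feature with an illustrative threshold, standing in for whatever audio features a real implementation would extract:

```python
def classify_floor(samples, threshold=0.05):
    """Toy floor-material classifier from the cleaner's microphone signal.

    Carpet absorbs the brush noise while hard floors reflect it, so a
    crude RMS-energy threshold already separates the two. The 0.05
    threshold and the two-class split are illustrative assumptions,
    not values from this application.
    """
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    return "hard_floor" if rms >= threshold else "carpet"
```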
The embodiment of the invention also provides a cleaning control method, which comprises the following steps:
receiving a cleaning instruction which is triggered by a user through limb actions in real time and is used for arranging a random cleaning task, wherein the random cleaning task refers to a task which does not belong to a preset cleaning task;
determining a region to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining an area to be cleaned for defining a cleaning range according to the cleaning instruction comprises:
determining the limb pointing direction and the limb action made by the user toward the area to be cleaned;
determining a coordinate area range formed on the cleaning surface according to the limb direction and the limb action;
and determining a region to be cleaned for defining a cleaning range according to the coordinate region range.
Further, according to the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
identifying the ground type and/or dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type identification for the area to be cleaned includes:
collecting sound signals sent by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio characteristics in the sound signals.
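One hedged sketch of this step: extract crude audio features from a frame of the sound emitted while the device runs over the surface, and match them against per-material reference profiles. The features (RMS energy and zero-crossing rate) and the profile table are illustrative assumptions, not the patent's method:

```python
import math

def audio_features(samples):
    """Crude features of one audio frame: RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / (len(samples) - 1)
    return rms, zcr

def classify_floor(samples, reference_profiles):
    """Nearest-centroid match against per-material (rms, zcr) profiles."""
    f = audio_features(samples)
    return min(reference_profiles,
               key=lambda material: math.dist(f, reference_profiles[material]))
```

A real system would likely use richer spectral features and a trained classifier; the nearest-centroid rule just makes the matching idea concrete.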
Further, according to the ground type recognition result, the cleaning device is controlled to perform targeted cleaning treatment on the area to be cleaned, including:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
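Matching a cleaning mode to the recognized floor type can be sketched as a simple lookup. The mode table below is assumed for illustration; the patent does not enumerate concrete modes or parameters:

```python
# Assumed, illustrative mapping from recognized floor type to mode parameters.
CLEANING_MODES = {
    "carpet": {"suction": "high", "mop": False},
    "tile":   {"suction": "medium", "mop": True},
    "wood":   {"suction": "low", "mop": True},
}

def select_cleaning_mode(floor_type):
    """Return the cleaning-mode parameters matched to the recognized floor type,
    falling back to a conservative default for unrecognized surfaces."""
    return CLEANING_MODES.get(floor_type, {"suction": "medium", "mop": False})
```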
Further, the method for identifying the dirt type of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image, including:
inputting the image into a dirt type recognition model, and acquiring a dirt type recognition result of the area to be cleaned;
the dirt type recognition model is obtained after training by adopting an image sample and a dirt type recognition result label of the image sample.
Further, identifying the dirt type of the area to be cleaned according to the image, including:
comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
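A minimal sketch of this database-comparison variant, assuming grayscale images represented as nested lists of 0-255 values and a simple histogram distance (a real system would use more robust image features):

```python
def gray_histogram(image, bins=16):
    """Normalized intensity histogram of a grayscale image (0-255 pixel values)."""
    hist = [0] * bins
    count = 0
    for row in image:
        for px in row:
            hist[min(px * bins // 256, bins - 1)] += 1
            count += 1
    return [h / count for h in hist]

def match_dirt_type(image, references, bins=16):
    """Pick the dirt type of the reference image whose histogram is closest (L1)."""
    target = gray_histogram(image, bins)
    def dist(ref_img):
        ref = gray_histogram(ref_img, bins)
        return sum(abs(a - b) for a, b in zip(target, ref))
    return min(references, key=lambda label: dist(references[label]))
```

Here `references` maps each known dirt-type label to its stored reference image, mirroring the database lookup described above.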
In this embodiment, the area to be cleaned for defining the cleaning range may be determined according to the limb information of the user. For example, when the user points to a certain area range through a limb (finger, head, arm, leg, foot, etc.), an area to be cleaned for defining a cleaning range is determined according to the area range to which the user's limb points.
In this embodiment, it may be understood that the position of the user's limb, together with the pointing direction and pointing angle of the limb, may be determined by a sensing device, so that the area range pointed to by the limb can be determined from the limb position, pointing direction and pointing angle, and the area to be cleaned can be determined accordingly.
For example, assuming that the user stands at point A on the ground and then points a finger toward the ground, with a direction of 30° east and an angle of 45° between the finger and the ground, the extent of the area pointed to on the ground can be determined from this information. It will be appreciated that this determination typically yields a single point on the ground; a pattern (e.g. a circle or a square) may then be drawn centered at that point. When the pattern is a circle, the radius may take a predetermined value; when the pattern is a square, the side length may take a predetermined value. It can be understood that in this determination mode the resulting area range is approximate, so to avoid missed cleaning the area of the drawn pattern may be made larger than a preset threshold, allowing the cleaning task assigned by the user to be completed as fully as possible.
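The geometry of this example can be sketched as follows, under a simplified model that assumes a flat floor, a known height of the finger above the floor, and angles expressed in the ground coordinate system (all names are illustrative):

```python
import math

def pointed_ground_point(px, py, finger_height, bearing_deg, elevation_deg):
    """Project a pointing ray onto the floor.

    (px, py): user's position on the floor; finger_height: height of the
    finger above the floor; bearing_deg: horizontal direction of the ray in
    the ground coordinate system; elevation_deg: angle between the ray and
    the floor (45 degrees in the example above).
    """
    # Horizontal distance travelled by the ray before it reaches the floor.
    horizontal = finger_height / math.tan(math.radians(elevation_deg))
    gx = px + horizontal * math.cos(math.radians(bearing_deg))
    gy = py + horizontal * math.sin(math.radians(bearing_deg))
    return gx, gy
```

A circle or square of predetermined size, as described above, can then be drawn around the returned point to form the area to be cleaned.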
Based on the foregoing embodiments, in this embodiment, the determining, according to limb information of a user, a to-be-cleaned area for defining a cleaning range specifically includes:
determining the limb pointing direction and the limb action made by the user toward the area to be cleaned;
determining a third pattern formed on the cleaning surface according to the limb direction and the limb action;
and determining a region to be cleaned for defining a cleaning range according to the third pattern.
In this embodiment, it should be noted that the scheme of this embodiment is similar to the scheme, provided in the foregoing embodiment (see fig. 5 and 6), of determining the area to be cleaned for defining the cleaning range according to the delineating pattern formed by pointing to and delineating the area to be cleaned; the difference is that in this embodiment the delineating pattern is formed not by the user holding the mobile device, but by a body action. For example, when it is desired to define a circular area on the floor surface to be cleaned, the toe may be used to trace a circle toward the floor surface, thereby forming a third pattern on the floor surface.
In addition, a hand gesture may also be used to perform the area-delineating limb action toward the ground; a third pattern formed on the cleaning surface is then determined from information such as the angle between the finger and the ground and the height of the finger, and the area to be cleaned for defining the cleaning range is determined according to the third pattern.
It can be understood that, since the scheme of the present embodiment is similar to the scheme of determining the area to be cleaned for defining the cleaning range according to the delineation pattern formed by the area delineation of the area to be cleaned by pointing to the area to be cleaned provided by the foregoing embodiment, the specific principle will not be described in detail, and reference will be made to the description of the foregoing embodiment.
In this embodiment, it should be noted that, in general, the third pattern may be used as the area to be cleaned for defining the cleaning range, and in special cases, in order to ensure the cleaning effect, a predetermined amount of extension may be performed on the basis of the third pattern, so as to determine the area to be cleaned for defining the cleaning range.
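The predetermined extension can be sketched as follows. This is illustrative only; the patent does not specify how the extension is computed, so a fixed radial margin for circles and a centroid scaling for polygons are assumed:

```python
def expand_circle(cx, cy, radius, margin):
    """Grow a circular cleaning area by a fixed safety margin."""
    return cx, cy, radius + margin

def expand_polygon(vertices, factor=1.1):
    """Approximate an outward extension by scaling the polygon about its centroid."""
    cx = sum(x for x, _ in vertices) / len(vertices)
    cy = sum(y for _, y in vertices) / len(vertices)
    return [(cx + (x - cx) * factor, cy + (y - cy) * factor) for x, y in vertices]
```

Note that centroid scaling is only a rough proxy for a true geometric buffer; it is adequate for convex traces like the circle and square discussed above.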
In this embodiment, reference may be made to the description of the foregoing embodiments for the schemes of ground type recognition and dirt recognition, and the description thereof will not be repeated here.
Based on the same inventive concept, another embodiment of the present invention provides a cleaning control device, referring to fig. 10, where the cleaning control device provided in this embodiment includes: the determining module 21 and the control module 22, wherein:
A determining module 21, configured to determine a to-be-cleaned area for defining a cleaning range based on a movement track of the mobile terminal;
and the control module 22 is used for controlling the cleaning equipment to execute cleaning tasks on the area to be cleaned based on the area to be cleaned.
Since the cleaning control device provided in this embodiment may be used to execute the cleaning control method described in the foregoing embodiment, the working principle and the beneficial effects thereof are similar, so that details will not be described herein, and reference will be made to the description of the foregoing embodiment.
Based on the same inventive concept, another embodiment of the present invention provides an intelligent device comprising the cleaning control apparatus as described in the above embodiment.
In this embodiment, it can be understood that, because the processing procedure of the cleaning control device may be implemented on an intelligent device, this embodiment provides an intelligent device including the cleaning control device, so as to implement the cleaning control procedure. It is to be understood that the intelligent device may be a sweeping robot, an intelligent sweeping device, or the like, which is not limited in this embodiment.
Since the intelligent device provided in this embodiment includes the cleaning control device described in the above embodiment, the working principle and the beneficial effects thereof are similar, so that details will not be described here, and the detailed contents can be referred to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides a mobile terminal including the cleaning control device as described in the above embodiment.
In this embodiment, it can be understood that, since the processing procedure of the cleaning control device can be implemented on the mobile terminal, this embodiment provides a mobile terminal including the cleaning control device, thereby implementing the cleaning control procedure. It will be appreciated that the mobile terminal may be any of a variety of devices, such as a cell phone, pad, smart watch, notebook, etc., and the present embodiment is not limited thereto.
Since the mobile terminal provided in this embodiment includes the cleaning control device described in the above embodiment, the working principle and the beneficial effects thereof are similar, so that details will not be described here, and the detailed description will be made with reference to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides a server including the cleaning control apparatus as described in the above embodiment.
In this embodiment, it can be understood that, since the processing procedure of the cleaning control device may be implemented on a server, this embodiment provides a server including the cleaning control device, thereby implementing the cleaning control procedure. In this embodiment, the server may be a cloud server or another server, which is not limited in this embodiment. In the case of a cloud server, the cloud server has the advantages of high processing speed, high security and the like.
Since the server provided in this embodiment includes the cleaning control device described in the above embodiment, the working principle and the beneficial effects thereof are similar, so that details will not be described here, and reference will be made to the description of the above embodiment for details.
Based on the same inventive concept, a further embodiment of the present invention provides an intelligent device, see fig. 11, which specifically includes the following: a processor 301, a memory 302, a communication interface 303, and a communication bus 304;
wherein, the processor 301, the memory 302, and the communication interface 303 complete communication with each other through the communication bus 304; the communication interface 303 is used for realizing transmission between related devices such as modeling software, an intelligent manufacturing equipment module library and the like;
the processor 301 is configured to invoke a computer program in the memory 302, and when executing the computer program the processor implements all the steps of the cleaning control method described above; for example, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range according to a cleaning instruction; and controlling cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned. Or, when executing the computer program, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range based on a moving track of the mobile terminal; and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned based on the area to be cleaned.
It will be appreciated that the refinement and expansion functions that the computer program may perform are as described with reference to the above embodiments.
It is to be understood that the intelligent device may be a sweeping robot, a ground sweeping device, a wall surface sweeping device, etc., which is not limited in this embodiment.
Based on the same inventive concept, a further embodiment of the present invention provides a sweeping robot apparatus, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the above-mentioned sweeping control method when executing the program. It can be understood that when the sweeping robot device implements the steps of the sweeping control method, the sweeping device in the method for controlling the sweeping device to perform the sweeping task on the area to be swept according to the area to be swept is the sweeping robot, that is, the sweeping robot itself can complete the control and the sweeping functions.
Based on the same inventive concept, a further embodiment of the present invention provides a mobile terminal, see fig. 12, which specifically includes the following: a processor 401, a memory 402, a communication interface 403, and a communication bus 404;
Wherein, the processor 401, the memory 402, the communication interface 403 complete the communication with each other through the communication bus 404; the communication interface 403 is used for implementing transmission between related devices such as each modeling software and an intelligent manufacturing equipment module library;
the processor 401 is configured to invoke a computer program in the memory 402, and when executing the computer program the processor implements all the steps of the cleaning control method described above; for example, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range according to a cleaning instruction; and controlling cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned. Or, when executing the computer program, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range based on a moving track of the mobile terminal; and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned based on the area to be cleaned.
It will be appreciated that the refinement and expansion functions that the computer program may perform are as described with reference to the above embodiments.
It will be appreciated that the mobile terminal may be any of a variety of devices, such as a cell phone, pad, smart watch, notebook, etc., and the present embodiment is not limited thereto.
Based on the same inventive concept, a further embodiment of the present invention provides a server, see fig. 13, which comprises in particular: a processor 501, a memory 502, a communication interface 503, and a communication bus 504;
wherein the processor 501, the memory 502, and the communication interface 503 perform communication with each other through the communication bus 504; the communication interface 503 is used for implementing transmission between related devices such as modeling software and intelligent manufacturing equipment module libraries;
the processor 501 is configured to invoke a computer program in the memory 502, and when executing the computer program the processor implements all the steps of the cleaning control method described above; for example, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range according to a cleaning instruction; and controlling cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned. Or, when executing the computer program, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range based on a moving track of the mobile terminal; and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned based on the area to be cleaned.
It will be appreciated that the refinement and expansion functions that the computer program may perform are as described with reference to the above embodiments.
In this embodiment, the server may be a cloud server or another server, which is not limited in this embodiment. In the case of a cloud server, the cloud server has the advantages of high processing speed, high security and the like.
Based on the same inventive concept, a further embodiment of the present invention provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements all the steps of the above-described cleaning control method; for example, the processor implements the following steps when executing the computer program: determining an area to be cleaned for defining a cleaning range according to a cleaning instruction; and controlling cleaning equipment to execute a cleaning task on the area to be cleaned according to the area to be cleaned. Or, when executing the computer program, the processor implements the following steps: determining an area to be cleaned for defining a cleaning range based on a moving track of the mobile terminal; and controlling the cleaning equipment to execute a cleaning task on the area to be cleaned based on the area to be cleaned.
It will be appreciated that the refinement and expansion functions that the computer program may perform are as described with reference to the above embodiments.
Further, the logic instructions in the memory described above may be implemented in the form of software functional units and, when sold or used as a stand-alone product, stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The apparatus embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the embodiments of the present invention. Those of ordinary skill in the art can understand and implement the invention without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, etc., and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the cleaning control method described in the respective embodiments or some parts of the embodiments.
In the description of the present invention, it should be noted that orientations or positional relationships indicated by terms such as "upper" and "lower" are based on the orientations or positional relationships shown in the drawings; they are used merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or element in question must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the present invention. Unless otherwise specifically stated or limited, the terms "mounted", "connected" and "coupled" are to be construed broadly; for example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
Furthermore, in the present disclosure, terms such as "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, the meaning of "plurality" is at least two, for example, two, three, etc., unless specifically defined otherwise.
Moreover, in the present invention, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Furthermore, in the description herein, reference to the terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (19)

1. A cleaning control method, characterized by comprising:
determining a region to be cleaned for defining a cleaning range based on a moving track of the mobile terminal;
based on the area to be cleaned, controlling cleaning equipment to execute a cleaning task on the area to be cleaned;
wherein the moving track is a closed track or a non-closed track;
if the moving track is a non-closed track, the to-be-cleaned area is a closed track formed by the non-closed track and a fixed barrier in a room when the non-closed track and the fixed barrier in the room form the closed track;
when the area to be cleaned is a closed track consisting of the non-closed track and a fixed barrier in a room, the determining mode of the area to be cleaned comprises any one of the following modes:
determining that there are a plurality of closed tracks formed by the non-closed track and the fixed barriers in the room, displaying the plurality of closed tracks for the user to select, and determining the closed track selected by the user as the area to be cleaned;
determining that there are a plurality of closed tracks formed by the non-closed track and the fixed barriers in the room, and determining the closed track with the smaller cleaning area as the area to be cleaned;
and determining that there are a plurality of closed tracks formed by the non-closed track and the fixed barriers in the room, and determining, according to a pre-calibrated closed track number, the closed track corresponding to the pre-calibrated number as the area to be cleaned.
2. The cleaning control method according to claim 1, wherein the determining the area to be cleaned for defining the cleaning range based on the movement locus of the mobile terminal includes:
and determining the area to be cleaned for defining the cleaning range according to a moving track formed by moving the mobile terminal around the area to be cleaned.
3. The cleaning control method according to claim 2, wherein the determining the area to be cleaned for defining the cleaning range according to the movement locus formed by the movement of the mobile terminal around the area to be cleaned includes:
determining coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned;
and determining a region to be cleaned for defining a cleaning range according to the coordinates of the moving track.
4. The cleaning control method according to claim 3, wherein determining the area to be cleaned for defining the cleaning range based on the coordinates of the movement locus includes:
And determining whether the moving track is a closed track according to the coordinates of the moving track, and if so, determining the closed track as an area to be cleaned.
5. The cleaning control method according to claim 3, wherein determining the area to be cleaned for defining the cleaning range based on the coordinates of the movement locus includes:
and determining whether the moving track is a closed track or not according to the coordinates of the moving track, if not, determining that the moving track is a non-closed track, and determining whether the non-closed track and a fixed barrier in a room can form a closed track according to the coordinates corresponding to the non-closed track, if so, determining the formed closed track as a region to be cleaned.
6. The cleaning control method according to any one of claims 3 to 5, characterized in that determining coordinates of a movement locus formed by movement of the mobile terminal around the area to be cleaned, includes:
and determining the coordinates of a moving track formed by moving the mobile terminal around the area to be cleaned based on the ultra wideband UWB indoor positioning system.
7. The cleaning control method according to claim 1, wherein the determining the area to be cleaned for defining the cleaning range based on the movement locus of the mobile terminal includes:
and determining the area to be cleaned for defining the cleaning range according to the delineating shape formed by the mobile terminal, while remaining stationary in place, delineating a range by pointing to the area to be cleaned, the angle between the mobile terminal and the cleaning surface, and the distance between the mobile terminal and the cleaning surface.
8. The cleaning control method according to claim 7, wherein determining the area to be cleaned for defining the cleaning range according to the delineating shape formed by the mobile terminal, while remaining stationary in place, delineating a range by pointing to the area to be cleaned, the angle between the mobile terminal and the cleaning surface, and the distance between the mobile terminal and the cleaning surface specifically comprises:
selecting a preset number of first coordinate points (x1, y1) on the delineating shape formed by the mobile terminal, while stationary in place, delineating a range by pointing to the area to be cleaned;
determining a preset number of second coordinate points (x2, y2) on the cleaning surface corresponding to the preset number of first coordinate points according to a first relation model;
determining a region to be cleaned for defining a cleaning range according to the second coordinate points of the preset number;
wherein the first relationship model is:
x2=x1-L×cosθ,y2=y1-L×sinθ
wherein L represents the distance between the first coordinate point (x1, y1) and the second coordinate point (x2, y2); H1 represents the distance between the mobile terminal and the cleaning surface; θ1 represents the angle between the mobile terminal and the cleaning surface; and θ represents the angle between the line segment L and the x-axis of the ground coordinate system.
9. The cleaning control method according to claim 1 or 2 or 7, characterized by further comprising:
collecting an image of the area to be cleaned;
correcting the area to be cleaned based on the image;
accordingly, based on the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned, including:
and controlling the cleaning equipment to execute cleaning tasks on the modified area to be cleaned based on the modified area to be cleaned.
10. The cleaning control method according to claim 9, characterized in that correcting the area to be cleaned based on the image includes:
receiving a marking area obtained by marking operation on the image by a user;
and correcting the area to be cleaned according to the marking area of the user to obtain a corrected area to be cleaned.
11. The cleaning control method according to claim 1, characterized in that controlling a cleaning apparatus to perform a cleaning task on the area to be cleaned based on the area to be cleaned, comprises:
Identifying the ground type and/or dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
12. The cleaning control method according to claim 11, characterized in that the identification of the floor type of the area to be cleaned includes:
collecting a sound signal produced by the cleaning device while operating in the area to be cleaned;
determining the material type of the area to be cleaned according to audio features in the sound signal;
accordingly, controlling the cleaning device to perform targeted cleaning treatment on the area to be cleaned according to the floor type identification result includes:
controlling the cleaning device to perform targeted cleaning treatment on the area to be cleaned in a matched cleaning mode according to the floor type identification result.
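The audio-based material identification of claim 12 can be sketched with a simple feature. The zero-crossing rate below stands in for the unspecified "audio features" (sliding on hard floor tends to produce brighter, higher-frequency noise than carpet), and the 1000 Hz threshold and the two class labels are illustrative assumptions.

```python
import math

def classify_floor(samples: list[float], sample_rate: int) -> str:
    """Guess the floor material from the sound the cleaning device
    makes while working, using the zero-crossing rate as a crude
    stand-in for a real audio feature set."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a >= 0) != (b >= 0)
    )
    # a pure tone at f Hz crosses zero about 2*f times per second
    dominant_freq = crossings * sample_rate / (2.0 * len(samples))
    return "hard_floor" if dominant_freq > 1000.0 else "carpet"
```

A production system would use richer spectral features and a trained classifier; this only shows the shape of the mapping from sound to material type.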
13. The cleaning control method according to claim 11, characterized in that the identification of the type of dirt on the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
14. The cleaning control method according to claim 13, characterized in that the identification of the type of dirt in the area to be cleaned from the image includes any one of the following means:
inputting the image into a dirt type recognition model, and obtaining a dirt type recognition result of the area to be cleaned; wherein the dirt type recognition model is obtained by training with image samples and dirt type recognition result labels of the image samples;
comparing the image with each reference image of known dirt type stored in a database, and taking the dirt type corresponding to the reference image in the database that matches the image as the dirt type of the area to be cleaned.
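The database-comparison alternative of claim 14 can be sketched as a nearest-neighbour match against the reference images. The grayscale-histogram feature and the L1 distance are assumed similarity measures; the claim names neither.

```python
def grayscale_histogram(image: list[list[int]], bins: int = 8) -> list[float]:
    """Normalised histogram of 0-255 grayscale pixel values."""
    counts = [0] * bins
    total = 0
    for row in image:
        for px in row:
            counts[min(px * bins // 256, bins - 1)] += 1
            total += 1
    return [c / total for c in counts]

def identify_dirt_type(image: list[list[int]],
                       references: dict[str, list[list[int]]]) -> str:
    """Match the captured image against reference images of known dirt
    types and return the label of the closest reference."""
    query = grayscale_histogram(image)

    def distance(ref_image: list[list[int]]) -> float:
        ref = grayscale_histogram(ref_image)
        return sum(abs(q - r) for q, r in zip(query, ref))

    return min(references, key=lambda label: distance(references[label]))
```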
15. A smart device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the cleaning control method of any one of claims 1 to 14 when executing the program.
16. A robot cleaner comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the cleaning control method according to any one of claims 1 to 14 when the program is executed.
17. A mobile terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the cleaning control method of any one of claims 1 to 14 when the program is executed.
18. A server comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the cleaning control method of any one of claims 1 to 14 when the program is executed.
19. A non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor, implements the steps of the cleaning control method according to any one of claims 1 to 14.
CN202011599703.2A 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile terminal and server Active CN114680739B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011599703.2A CN114680739B (en) 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile terminal and server

Publications (2)

Publication Number Publication Date
CN114680739A CN114680739A (en) 2022-07-01
CN114680739B true CN114680739B (en) 2023-08-04

Family

ID=82131683

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011599703.2A Active CN114680739B (en) 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile terminal and server

Country Status (1)

Country Link
CN (1) CN114680739B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02241421A (en) * 1989-03-15 1990-09-26 Matsushita Electric Ind Co Ltd Self-running cleaner
CN105266721A (en) * 2014-06-26 2016-01-27 Lg电子株式会社 Robot cleaner and control method thereof
CN107283429A (en) * 2017-08-23 2017-10-24 北京百度网讯科技有限公司 Control method, device, system and terminal based on artificial intelligence
CN108903816A (en) * 2018-06-21 2018-11-30 上海与德通讯技术有限公司 A kind of cleaning method, controller and intelligent cleaning equipment
CN108958253A (en) * 2018-07-19 2018-12-07 北京小米移动软件有限公司 The control method and device of sweeping robot
CN113679289A (en) * 2020-05-18 2021-11-23 云米互联科技(广东)有限公司 Sweeper control method, sweeper control equipment and computer readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102094347B1 (en) * 2013-07-29 2020-03-30 삼성전자주식회사 Auto-cleaning system, cleaning robot and controlling method thereof
KR102070068B1 (en) * 2017-11-30 2020-03-23 엘지전자 주식회사 Moving Robot and controlling method
US10575699B2 (en) * 2018-01-05 2020-03-03 Irobot Corporation System for spot cleaning by a mobile robot

Similar Documents

Publication Publication Date Title
US11709497B2 (en) Method for controlling an autonomous mobile robot
CN109998428B (en) Cleaning method, system and device for sweeping robot
CN109890573B (en) Control method and device for mobile robot, mobile robot and storage medium
CN106983460B (en) A kind of sweeping robot region cleaning display control method
CN108247647B (en) Cleaning robot
CN110313867B (en) Autonomous mobile cleaner, cleaning method for autonomous mobile cleaner, and recording medium
CN109392308B (en) Scheduling and control system for autonomous cleaning robot
CN111096714B (en) Control system and method of sweeping robot and sweeping robot
CN110313863B (en) Autonomous mobile cleaning machine, cleaning method for autonomous mobile cleaning machine, and program
CN112739244A (en) Mobile robot cleaning system
JP2020124508A (en) Computer-implemented method of operating cleaning robot
CN114680740B (en) Cleaning control method and device, intelligent equipment, mobile equipment and server
WO2019114219A1 (en) Mobile robot and control method and control system thereof
WO2015039621A1 (en) Method for controlling cleaning robot by smart phone
KR20160100315A (en) Robotic cleaning device with perimeter recording function
US11269350B2 (en) Method for creating an environment map for a processing unit
CN112890680B (en) Follow-up cleaning operation method, control device, robot and storage medium
CN110897567A (en) Cleaning method based on target object recognition and cleaning robot
CN111142531A (en) Household appliance linkage-based cleaning robot control method and cleaning robot
WO2023125698A1 (en) Cleaning device, and control method and control apparatus therefor
CN109978173B (en) Machine learning DIY method for indoor surveying and mapping and positioning
CN115033002A (en) Mobile robot control method and device, electronic device and storage medium
CN112890690B (en) Robot sweeping control method and device and sweeping robot
CN114680739B (en) Cleaning control method and device, intelligent equipment, mobile terminal and server
CN117297449A (en) Cleaning setting method, cleaning apparatus, computer program product, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant