CN114680740A - Cleaning control method and device, intelligent equipment, mobile equipment and server - Google Patents

Cleaning control method and device, intelligent equipment, mobile equipment and server

Info

Publication number
CN114680740A
CN114680740A
Authority
CN
China
Prior art keywords
area
cleaned
cleaning
determining
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011599705.1A
Other languages
Chinese (zh)
Other versions
CN114680740B (en)
Inventor
梁玉池
梁家勇
罗振宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Group Co Ltd
GD Midea Air Conditioning Equipment Co Ltd
Original Assignee
Midea Group Co Ltd
GD Midea Air Conditioning Equipment Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Group Co Ltd, GD Midea Air Conditioning Equipment Co Ltd filed Critical Midea Group Co Ltd
Priority to CN202011599705.1A priority Critical patent/CN114680740B/en
Publication of CN114680740A publication Critical patent/CN114680740A/en
Application granted granted Critical
Publication of CN114680740B publication Critical patent/CN114680740B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4008: Arrangements of switches, indicators or the like
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061: Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/61: Control of cameras or camera modules based on recognised objects
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635: Region indicators; Field of view indicators
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06: Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B40/00: Technologies aiming at improving the efficiency of home appliances, e.g. induction cooking or efficient technologies for refrigerators, freezers or dish washers

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a cleaning control method and device, an intelligent device, a mobile device and a server. The method comprises the following steps: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned; determining an imaging capture angle of a camera on the mobile terminal at the moment of the shot; determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned. Because the area the user can visually see through the mobile terminal is used directly to designate the area the cleaning device must clean, the invention realizes what-you-see-is-what-you-get fixed-point cleaning, which effectively improves cleaning efficiency and reduces the power consumption of cleaning.

Description

Cleaning control method and device, intelligent equipment, mobile equipment and server
Technical Field
The invention relates to the technical field of intelligent processing, in particular to a cleaning control method and device, intelligent equipment, mobile equipment and a server.
Background
Sweeping robots are widely used: a user can define fixed sweeping areas (such as a living room or a bedroom) and, when a given set area needs cleaning, control the robot to sweep that area.
However, often not every position in the set area actually needs cleaning, and sweeping the whole set area anyway wastes both the robot's power and its sweeping time.
Disclosure of Invention
To address the problems in the prior art, embodiments of the invention provide a cleaning control method and device, an intelligent device, a mobile device and a server, which solve the prior art's inability to clean precisely.
To this end, the embodiments of the invention provide the following technical solutions:
In a first aspect, an embodiment of the present invention provides a cleaning control method, including:
shooting an area to be cleaned by using a mobile terminal;
determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned;
determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots an area to be cleaned;
determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining the position information and the posture information of the mobile terminal when the mobile terminal shoots the area to be cleaned includes:
determining the position coordinate of the mobile terminal when the mobile terminal shoots an area to be cleaned;
and determining a pointing direction of the mobile terminal and a first angle between the mobile terminal and a cleaning surface when the mobile terminal shoots the area to be cleaned.
Further, determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle comprises:
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the pointing direction and the first angle;
and determining the corresponding area to be cleaned of the image on the cleaning surface according to the central coordinate and the imaging capturing angle.
Further, determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle comprises:
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area;
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance, the pointing direction and the first angle; and determining the corresponding area to be cleaned of the image on the cleaning surface according to the central coordinate and the imaging capturing angle.
Further, determining the area to be cleaned corresponding to the image on the cleaning surface according to the central coordinate and the imaging capture angle includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a first area coordinate range of the image in the arc-shaped area according to the corresponding cutting process applied when the image is mapped to the cleaning surface;
and determining the first area coordinate range as an area to be cleaned.
Further, determining the position coordinate of the mobile terminal when the mobile terminal shoots the area to be cleaned includes:
determining the position coordinate of the mobile terminal when the mobile terminal shoots the area to be cleaned based on an ultra-wideband (UWB) indoor positioning system.
In a second aspect, an embodiment of the present invention further provides a cleaning control method, including:
determining position information and posture information of a mobile terminal when the mobile terminal shoots an area;
determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area;
receiving an area marked by a user on a shot image;
determining an area to be cleaned according to the area marked by the user on the shot image, the position information, the posture information and the imaging capture angle;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining the position information and the posture information of the mobile terminal when the mobile terminal shoots an area comprises:
determining the position coordinates of the mobile terminal when the mobile terminal shoots the area;
and determining the pointing direction of the mobile terminal and a first angle between the mobile terminal and a cleaning surface when the mobile terminal shoots the area.
Further, determining the area to be cleaned according to the area marked by the user on the shot image, the position information, the posture information and the imaging capture angle includes:
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate and the first angle;
and determining a corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capturing angle and the marking area.
Further, determining the area to be cleaned according to the area marked by the user on the shot image, the position information, the posture information and the imaging capture angle includes:
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area;
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance and the first angle;
and determining a corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capturing angle and the marking area.
Further, determining the corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capture angle and the marked area includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a sub-imaging capture angle corresponding to the marked area according to the distance and direction relation between the marked area and the center point of the image, wherein the sub-imaging capture angle is smaller than the imaging capture angle;
determining a sub-arc area of the marked area within the arc-shaped area according to the central coordinate and the sub-imaging capture angle;
determining a second area coordinate range of the marked area in the sub-arc area according to the corresponding cutting process applied when the marked area is mapped to the cleaning surface;
and determining the second area coordinate range as the area to be cleaned.
Further, determining the corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capture angle and the marked area includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a first area coordinate range of the image in the arc-shaped area according to the corresponding cutting process applied when the image is mapped to the cleaning surface;
determining a second area coordinate range of the marked area on the cleaning surface according to the relative position relationship and the relative size relationship between the marked area and the image and the first area coordinate range;
and determining the second area coordinate range as an area to be cleaned.
Further, controlling the cleaning device to perform the cleaning task on the area to be cleaned according to the area to be cleaned includes:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, identifying the ground type of the area to be cleaned includes:
collecting sound signals emitted by the cleaning device in the area to be cleaned;
and determining the material type of the area to be cleaned according to audio features in the sound signals.
Correspondingly, controlling the cleaning device to perform targeted cleaning treatment on the area to be cleaned according to the ground type recognition result includes:
controlling the cleaning device to clean the area to be cleaned in a cleaning mode matched to the recognized ground type.
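As an illustration of the sound-based recognition above, the following Python sketch classifies the floor material from a single audio feature (the spectral centroid). It is a minimal sketch, not the patent's implementation: the reference centroid values, the feature choice and the nearest-neighbour rule are all assumptions made for the example.

    import numpy as np

    # Illustrative reference spectral centroids per floor material (Hz).
    # Real profiles would be measured from recordings of the cleaning
    # device running on each surface type.
    MATERIAL_PROFILES = {"tile": 3200.0, "hardwood": 2100.0, "carpet": 900.0}

    def spectral_centroid(signal, sample_rate):
        """Frequency 'centre of mass' of the signal, a simple audio feature."""
        magnitudes = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        return float(np.sum(freqs * magnitudes) / (np.sum(magnitudes) + 1e-12))

    def classify_floor(signal, sample_rate):
        """Pick the material whose reference centroid is nearest the signal's."""
        c = spectral_centroid(signal, sample_rate)
        return min(MATERIAL_PROFILES, key=lambda m: abs(MATERIAL_PROFILES[m] - c))

The matched cleaning mode would then be looked up from the returned material type (for example, a gentler mode on carpet than on tile).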
Further, identifying the dirt type of the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image includes either of the following modes:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned; the dirt type identification model is obtained by training an image sample and a dirt type identification result label of the image sample;
or comparing the image with the reference images of known dirt types stored in a database, and taking the dirt type of the reference image that matches the image as the dirt type of the area to be cleaned.
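The database-comparison mode in the preceding claim can be sketched as a nearest-neighbour search over reference images. The distance metric (mean absolute pixel difference on small grayscale images) and the reference set below are illustrative assumptions, not the patent's stated method:

    import numpy as np

    # Hypothetical database: dirt type -> reference image (32x32 grayscale).
    # Real entries would be curated photos of each dirt type.
    REFERENCE_IMAGES = {
        "dust": np.full((32, 32), 0.5),
        "liquid_stain": np.full((32, 32), 0.2),
        "solid_debris": np.full((32, 32), 0.8),
    }

    def identify_dirt_type(image):
        """Return the dirt type whose reference image best matches `image`,
        using mean absolute pixel difference as the (assumed) match score."""
        return min(REFERENCE_IMAGES,
                   key=lambda k: float(np.mean(np.abs(image - REFERENCE_IMAGES[k]))))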
In a third aspect, an embodiment of the present invention further provides a cleaning control apparatus, including:
a first determining module, configured to determine position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned;
a second determining module, configured to determine an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area to be cleaned;
a third determining module, configured to determine the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle;
and a cleaning control module, configured to control a cleaning device to perform a cleaning task on the area to be cleaned.
In a fourth aspect, an embodiment of the present invention further provides a cleaning control apparatus, including:
a fourth determining module, configured to determine position information and posture information of a mobile terminal when the mobile terminal shoots an area;
a fifth determining module, configured to determine an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area;
a receiving module, configured to receive an area marked by a user on the shot image;
a sixth determining module, configured to determine the area to be cleaned according to the marked area, the position information, the posture information and the imaging capture angle;
and a task control module, configured to control a cleaning device to perform a cleaning task on the area to be cleaned.
In a fifth aspect, an embodiment of the present invention further provides an intelligent device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the cleaning control method according to any one of the first aspect when executing the program.
In a sixth aspect, an embodiment of the present invention further provides a sweeping robot device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the sweeping control method according to any one of the first aspect when executing the program.
In a seventh aspect, an embodiment of the present invention further provides a mobile terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the cleaning control method according to any one of the first aspect when executing the program.
In an eighth aspect, an embodiment of the present invention further provides a server, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the cleaning control method according to any one of the first aspect when executing the program.
In a ninth aspect, embodiments of the present invention further provide a non-transitory computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the cleaning control method according to any one of the first aspect.
In a tenth aspect, an embodiment of the present invention further provides a cleaning control method, including:
receiving a cleaning instruction triggered by a user in real time through voice and used for arranging a random cleaning task, the random cleaning task being a task that does not belong to any preset cleaning task;
determining an area to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining the area to be cleaned for defining the cleaning range according to the cleaning instruction includes:
determining a reference object and relative position area information of the reference object contained in the cleaning instruction, and determining the area to be cleaned for defining the cleaning range according to the position information of the reference object and the relative position area information;
and/or,
determining absolute position coordinate information contained in the cleaning instruction, and determining the area to be cleaned for defining the cleaning range according to the absolute position coordinate information;
and/or,
determining a reference object contained in the cleaning instruction and relative coordinate information of the reference object, and determining the area to be cleaned for defining the cleaning range according to the reference object and the relative coordinate information.
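To make the reference-object mode concrete, here is a minimal Python sketch that turns a voice command such as "clean in front of the sofa" into a coordinate range. The object map, the relation offsets and the fixed square size are all hypothetical values introduced for the example:

    import re

    # Hypothetical (x, y) positions of reference objects on the robot's map.
    REFERENCE_OBJECTS = {"sofa": (2.0, 1.0), "tea table": (2.5, 2.0)}

    # Offset (metres) applied to the object position for each relation phrase.
    RELATIONS = {"in front of": (0.0, 0.8), "behind": (0.0, -0.8),
                 "left of": (-0.8, 0.0), "right of": (0.8, 0.0)}

    def area_from_voice(command, half_size=0.5):
        """Map a transcribed voice command to a square area to be cleaned.
        Returns ((x_min, y_min), (x_max, y_max)) or None if nothing matched."""
        for obj, (ox, oy) in REFERENCE_OBJECTS.items():
            for rel, (dx, dy) in RELATIONS.items():
                if rel in command and re.search(rf"\b{re.escape(obj)}\b", command):
                    cx, cy = ox + dx, oy + dy
                    return ((cx - half_size, cy - half_size),
                            (cx + half_size, cy + half_size))
        return None  # fall back to absolute coordinates contained in the command

For instance, area_from_voice("clean in front of the sofa") yields a square centred 0.8 m in front of the sofa's map position.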
Further, controlling the cleaning device to perform the cleaning task on the area to be cleaned according to the area to be cleaned includes:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, identifying the ground type of the area to be cleaned includes:
collecting sound signals emitted by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio features in the sound signals.
Further, controlling the cleaning device to perform targeted cleaning treatment on the area to be cleaned according to the ground type recognition result includes:
controlling the cleaning device to clean the area to be cleaned in a cleaning mode matched to the recognized ground type.
Further, identifying the dirt type of the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image includes:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned;
the dirt type identification model is obtained by training an image sample and a dirt type identification result label of the image sample.
Further, according to the image, identifying the dirt type of the area to be cleaned comprises:
and comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
In an eleventh aspect, an embodiment of the present invention further provides a sweeping control method, including:
receiving a cleaning instruction triggered by a user in real time through limb actions and used for arranging a random cleaning task, the random cleaning task being a task that does not belong to any preset cleaning task;
determining an area to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining the area to be cleaned for defining the cleaning range according to the cleaning instruction includes:
determining the limb pointing direction and limb action the user directs at the area needing cleaning;
determining the coordinate area range formed on the cleaning surface by the limb pointing direction and limb action;
and determining the area to be cleaned for defining the cleaning range according to the coordinate area range.
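One way to realise the limb-pointing mode is to intersect the pointing ray with the cleaning surface. The sketch below assumes the hand position and a unit pointing vector are already available from some body-tracking sensor; the sensing method and the fixed square size are assumptions of the example, not details fixed by the patent:

    import numpy as np

    def pointed_floor_spot(hand_pos, direction):
        """Intersect the pointing ray with the floor plane z = 0.
        Returns the (x, y) floor coordinate, or None if the user is not
        pointing downward."""
        hand_pos = np.asarray(hand_pos, dtype=float)
        direction = np.asarray(direction, dtype=float)
        if direction[2] >= 0:            # ray runs parallel to or away from floor
            return None
        t = -hand_pos[2] / direction[2]  # ray parameter where z reaches 0
        hit = hand_pos + t * direction
        return float(hit[0]), float(hit[1])

    def area_from_gesture(hand_pos, direction, half_size=0.5):
        """Square coordinate area range centred on the pointed spot."""
        spot = pointed_floor_spot(hand_pos, direction)
        if spot is None:
            return None
        x, y = spot
        return ((x - half_size, y - half_size), (x + half_size, y + half_size))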
Further, controlling the cleaning device to perform the cleaning task on the area to be cleaned according to the area to be cleaned includes:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to perform targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, identifying the ground type of the area to be cleaned includes:
collecting sound signals emitted by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio features in the sound signals.
Further, controlling the cleaning device to perform targeted cleaning treatment on the area to be cleaned according to the ground type recognition result includes:
controlling the cleaning device to clean the area to be cleaned in a cleaning mode matched to the recognized ground type.
Further, identifying the dirt type of the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image includes:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned;
the dirt type identification model is obtained by training an image sample and a dirt type identification result label of the image sample.
Further, according to the image, identifying the dirt type of the area to be cleaned comprises:
and comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
According to the above technical solutions, the cleaning control method and device, intelligent device, mobile device and server provided by the invention accurately determine the area to be cleaned from the position information and posture information of the mobile terminal at the moment it shoots the area to be cleaned, the imaging capture angle of the camera on the mobile terminal at that moment, and the image the mobile terminal shoots of the area. Therefore, with the cleaning control method provided by the invention, the area needing cleaning can be determined in real time simply by shooting it with the mobile terminal before cleaning, and the cleaning device can then be controlled to clean that area precisely and specifically, which effectively improves cleaning efficiency and reduces the power consumption of cleaning.
It is to be understood that additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
To illustrate the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and persons skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a cleaning control method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a room with a designated area to be cleaned according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a UWB positioning system architecture according to an embodiment of the present invention;
fig. 4 is a flowchart illustrating a specific implementation of a cleaning control method according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a user delineating a region to be cleaned without having to walk to it, according to an embodiment of the present invention;
FIG. 6 is a top plan view of the ground of FIG. 5;
Fig. 7 is a schematic diagram of the principle of determining the area to be cleaned from a photograph according to the imaging view angle of the camera, according to an embodiment of the present invention;
Figs. 8 and 9 are schematic diagrams of the principle of determining the area to be cleaned from the cleaning area marked on the photographed image after photographing, according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a cleaning control device according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of an intelligent device according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a mobile device according to an embodiment of the present invention;
fig. 13 is a schematic structural diagram of a server according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, in the embodiments of the present invention, the term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. In addition, the term "plurality" in the embodiments of the present invention means two or more, and other terms are to be understood similarly. In the present invention, "shooting" refers to the process of recording an image with a dedicated apparatus, and covers shooting a still picture, recording a moving image, and live view used for purposes other than recording.
Fig. 1 is a flowchart illustrating a cleaning control method according to an embodiment of the present invention, and referring to fig. 1, the cleaning control method according to the embodiment of the present invention includes:
step 101: shooting an area to be cleaned by using a mobile terminal, and determining position information and posture information of the mobile terminal when the mobile terminal shoots the area to be cleaned;
In this step, the mobile terminal may be a mobile phone, a tablet (PAD), a laptop, a smart bracelet, a smart watch, a smart reader or another mobile intelligent device. The area to be cleaned is shot with the mobile terminal, and the position information and posture information of the mobile terminal at the moment of the shot are then determined.
In this step, the position information is the location of the mobile terminal when the shot is taken, and the posture information includes the angle between the mobile terminal and the cleaning surface (generally the ground) and the pointing direction of the mobile terminal at that moment.
It can be understood that the purpose of acquiring the position information and posture information of the mobile terminal during shooting is to convert, in a subsequent step, the image the mobile terminal shot of the area to be cleaned automatically into the area to be cleaned on the cleaning surface, realizing what-you-see-is-what-you-get fixed-point cleaning.
It should be noted that this embodiment has the advantage that, after the user shoots the area to be cleaned at a fixed point, the shot image can be converted directly into the area to be cleaned on the cleaning surface, achieving targeted fixed-point cleaning. This must be distinguished from the prior art, which photographs a whole room or a large area of a room (containing both places that need cleaning and places that do not) and then automatically identifies the part needing cleaning from the image: this embodiment does not need to identify the area to be cleaned within the shot image at all. The intelligent device shoots the area needing cleaning directly at a fixed point, and the shot image is converted directly into the area to be cleaned on the cleaning surface. This is convenient for the user. Suppose, for example, that the user has been watching TV and eating melon seeds in front of the sofa, leaving shells behind; the user only needs to pick up the phone and photograph the area in front of the sofa. The scheme then converts that fixed-point image automatically into the area on the cleaning surface that the cleaning device must clean, so the user gets exactly the fixed-point cleaning he sees, while the otherwise necessary image recognition processing is eliminated.
It follows that the term "area to be cleaned" means the actual area that needs cleaning. If, for example, that is a small patch in front of the sofa, then the "area to be cleaned" of this embodiment is exactly that patch and nothing beyond it; the embodiment does not identify the patch within a wider image but shoots it directly at a fixed point and converts the shot image directly into the area to be cleaned on the cleaning surface (as shown in fig. 7), realizing what-you-see-is-what-you-get fixed-point cleaning.
Step 102: determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots an area to be cleaned;
In this step, it should be noted that the imaging capture angle of the camera on the mobile terminal depends on the camera's parameter settings at the time of photographing (e.g., the wide angle of the lens) and can be obtained by reading those settings when the shot is taken. It will be appreciated that, on the same camera, lenses of different wide angles capture different ranges.
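For instance, if the lens focal length and sensor width are readable from the shot's parameters (EXIF-style metadata), the horizontal capture angle follows from the standard pinhole-camera relation; the example numbers below are merely typical phone values, not figures from the patent:

    import math

    def imaging_capture_angle(focal_length_mm, sensor_width_mm):
        """Horizontal field of view in degrees, from
        FOV = 2 * atan(sensor_width / (2 * focal_length))."""
        return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

    # A 4.2 mm lens over a 6.4 mm-wide sensor gives roughly a 75-degree angle:
    # imaging_capture_angle(4.2, 6.4) -> about 74.6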
It should be noted that steps 101 and 102 may be performed in either order.
Step 103: determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle;
In this step, from the position information and posture information of the mobile terminal when it shoots the area to be cleaned, together with the imaging capture angle of the camera on the mobile terminal at that moment, the image the mobile terminal shot of the area to be cleaned is accurately converted into the area on the cleaning surface that the cleaning device must clean. The area the user visually sees is thus converted precisely into the cleaning device's area to be cleaned, realizing what-you-see-is-what-you-get fixed-point cleaning, which effectively improves cleaning efficiency and reduces the power consumption of cleaning.
In this embodiment, the cleaning surface refers to a surface on which the cleaning device performs a cleaning task. For example, if the cleaning device is cleaning on the floor surface, the cleaning surface is the floor surface, and if the cleaning device is cleaning on the wall surface, the cleaning surface is the wall surface.
Step 104: and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
In this step, after the area to be cleaned is obtained, the cleaning device is controlled to travel to it and perform the cleaning task. The cleaning device may be controlled to use its default working mode, or to select a working mode adapted to the type of dirt in the current area to be cleaned.
In this embodiment, it should be noted that the cleaning control method may be executed by a cleaning control device that controls the operation of the cleaning equipment, by the cleaning equipment itself (such as a cleaning robot), by a mobile device (such as a mobile phone or a tablet), by control software installed on the mobile device, or by a server; this embodiment is not limited in this respect.
According to the cleaning control method provided by this embodiment, the area to be cleaned is determined accurately from the position information and posture information of the mobile terminal when it shoots the area to be cleaned, the imaging capture angle of the camera on the mobile terminal at that moment, and the image the mobile terminal shoots of the area. Therefore, with this method the area needing cleaning can be determined in real time simply by shooting it with the mobile terminal before cleaning, and the cleaning device is then controlled to clean it precisely and specifically, which effectively improves cleaning efficiency and reduces the power consumption of cleaning.
As described above, it should be emphasized that when the mobile terminal is used to shoot the area to be cleaned, this embodiment, unlike the prior art, does not photograph a whole room or a large area of a room (one that contains both places needing cleaning and places that do not) and then identify the part needing cleaning from the image. Instead, the intelligent device shoots the area needing cleaning directly and purposefully at a fixed point, and the shot image is converted directly into the area to be cleaned on the cleaning surface, achieving what-you-see-is-what-you-get fixed-point cleaning. No recognition of the area to be cleaned within the shot image is required.
In addition, it should be further noted that the "area to be cleaned" in this embodiment differs from the fixed cleaning areas set in the prior art. An existing sweeping robot typically presets several fixed cleaning areas, such as the living room, the master bedroom and the secondary bedroom, from which the user later selects one or more; the robot then cleans according to that selection.
Unlike this, the "area to be cleaned" in the present application is generated in real time from the actual range that needs cleaning: places inside the area need cleaning, and places outside it do not.
For example, suppose a user leaves some snack residue in front of the couch after watching television, so the area needing cleaning is a small patch in front of the couch. With the scheme of this embodiment, the user can pick up the phone and photograph just that patch, obtaining an image containing only it. From the position, height and angle information of the phone at the moment of the shot and the imaging capture angle of the phone's camera, algorithmic processing maps the patch in the image onto the cleaning surface (the ground) as the area to be cleaned, and the sweeping robot is then controlled to move there and clean.
This embodiment therefore achieves what-you-see-is-what-you-get fixed-point cleaning without sweeping the whole living room, which improves cleaning efficiency, shortens cleaning time, reduces cleaning energy consumption, meets the demand and improves the user experience.
It can be seen that the area to be cleaned used to define the cleaning range differs from the fixed cleaning areas of the prior art in three ways: first, the area to be cleaned in the present application is generated in real time; second, it is generated specifically for the area that currently needs cleaning; third, it may lie within one existing fixed cleaning area or span several of them. These three differences are not independent but work together: the area to be cleaned is delimited in real time by the range that currently needs cleaning, taking neither the room nor the functional room as the division standard but the current cleaning range itself. The user can therefore determine an area to be cleaned that matches the cleaning range of the current demand, the area clearly defines that range, and cleaning is carried out on the corresponding region, completing the task in a targeted way, improving cleaning efficiency, reducing energy consumption and shortening cleaning time. It is particularly emphasized that this embodiment accurately converts the region the user visually sees into the region on the cleaning surface that the cleaning device must clean, realizing what-you-see-is-what-you-get fixed-point cleaning (the key point of the present application), which effectively improves cleaning efficiency while reducing the power consumption of cleaning.
This embodiment thus differs both from prior art schemes that photograph a wide area and then identify a smaller area to be cleaned within it, and from schemes that preset fixed cleaning areas. Its innovation is that the area to be cleaned that the user visually sees is converted accurately into the area on the cleaning surface that the cleaning device must clean, realizing what-you-see-is-what-you-get fixed-point cleaning.
For the example above, if the user only needs the small patch of melon-seed shells in front of the couch cleaned, he can photograph that patch with the phone, obtaining an image of it alone (areas not needing cleaning fall outside the frame); processing then converts the area in the image accurately into the area the cleaning device must clean on the cleaning surface. That is, the embodiment has already determined exactly the region needing cleaning, performs the cleaning on it (the small patch in front of the living-room sofa) and nowhere else, and thereby improves cleaning efficiency, shortens cleaning time and reduces energy consumption while meeting the demand.
In this embodiment, it should be noted that the cleaning control method can be applied to common cleaning equipment such as sweeping robots, floor cleaners, tabletop cleaners and bed cleaners. It can also be applied to other intelligent cleaning devices, such as robots for cleaning walls and robots for cleaning indoor or outdoor roofs; this embodiment is not limited in this respect.
It should be noted that the cleaning control method provided in the present embodiment can be widely used in a home, a factory, an airport, outdoors, and the like, and the present embodiment is not limited thereto.
Based on the content of the foregoing embodiment, in this embodiment, determining the position information and the posture information of the mobile terminal when the mobile terminal photographs the area to be cleaned includes:
determining the position coordinate of the mobile terminal when the mobile terminal shoots an area to be cleaned;
and determining the pointing direction of the mobile terminal and a first angle between the mobile terminal and a cleaning surface when the mobile terminal shoots the area to be cleaned.
In this embodiment, to simplify processing, the position coordinate of the mobile terminal when it shoots the area to be cleaned may be determined by an indoor positioning system, while the pointing direction of the mobile terminal and the first angle between the mobile terminal and the cleaning surface may be determined from a direction sensor and an angular velocity sensor carried on the mobile terminal.
In this embodiment, the indoor positioning system may be an ultra-wideband (UWB) positioning system. Alternatively, a Global Positioning System (GPS) may be used. In addition, the current position information of the mobile terminal can be acquired through a human-body infrared sensor (a pyroelectric infrared sensor).
In the present embodiment, the angular velocity sensor may be implemented as a gyro sensor.
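The pose data these sensors supply can be bundled as in the sketch below. This is a minimal sketch: `uwb` and `orientation_sensor` stand in for platform-specific positioning and sensor APIs and are assumptions of the example, not interfaces named by the patent.

    from dataclasses import dataclass

    @dataclass
    class ShotPose:
        """What the method needs to know about the phone at the shutter moment."""
        x: float        # position on the cleaning surface (UWB fix), metres
        y: float
        height: float   # first distance: height above the cleaning surface
        azimuth: float  # pointing direction of the terminal, degrees
        tilt: float     # first angle between the terminal's Z axis and the
                        # cleaning surface, degrees (gyroscope reading)

    def capture_pose(uwb, orientation_sensor):
        """Snapshot position and posture when the photo is taken."""
        x, y, z = uwb.position()          # 3-D fix needs four or more stations
        azimuth, tilt = orientation_sensor.read()
        return ShotPose(x, y, z, azimuth, tilt)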
Based on the content of the above embodiment, in the present embodiment, determining the area to be cleaned according to the captured image, the position information, the posture information, and the imaging capture angle includes:
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the pointing direction and the first angle;
and determining the corresponding area to be cleaned of the image on the cleaning surface according to the central coordinate and the imaging capturing angle.
In this embodiment, a preset camera height may be used together with the shot image, the position coordinate, the pointing direction and the first angle to determine the center coordinate at which the center point of the image maps onto the cleaning surface; the area to be cleaned corresponding to the image on the cleaning surface is then determined from the center coordinate and the imaging capture angle.
Based on the content of the above embodiment, in the present embodiment, determining the area to be cleaned according to the captured image, the position information, the posture information, and the imaging capture angle includes:
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area;
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance, the pointing direction and the first angle;
and determining the corresponding area to be cleaned of the image on the cleaning surface according to the central coordinate and the imaging capturing angle.
In this embodiment, it is understood that when the cleaning surface is the ground, the first distance is simply the height of the mobile terminal above the ground. By combining the shot image, the position coordinate, the first distance, the pointing direction and the first angle, the center coordinate at which the image's center point maps onto the cleaning surface can be determined more accurately, and the area to be cleaned corresponding to the image on the cleaning surface can then be determined precisely from the center coordinate and the imaging capture angle.
In this embodiment, the scheme is described with reference to fig. 7 and can be realized by means of a UWB positioning system. Specifically, four or more UWB positioning base stations may be deployed so that three-dimensional coordinates can be resolved, which gives the height of the phone above the ground. The phone's gyroscope identifies its orientation while the phone shoots toward the ground. The coordinate of the center point A that the image maps to on the ground is calculated from the phone's positioning coordinates, its height above the ground and the angle θ2 (the first angle) between the phone's Z axis and the ground; the area to be cleaned on the ground corresponding to the area in the image is then determined from the coordinate of point A and the shooting angle θ3 (the imaging capture angle) of the phone camera (as shown in fig. 7).
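The computation of point A reduces to one right triangle: the phone sits `height` above the floor and its optical axis meets the floor at the angle θ2, so A lies height / tan(θ2) metres ahead along the pointing direction. A minimal sketch, assuming θ2 is the depression angle of the optical axis:

    import math

    def center_point_on_floor(x, y, height, azimuth_deg, tilt_deg):
        """Map the image centre to floor coordinates (point A in fig. 7).
        tilt_deg is theta_2, the first angle between the phone's Z axis and
        the ground; azimuth_deg is the pointing direction from the gyroscope."""
        reach = height / math.tan(math.radians(tilt_deg))  # horizontal distance
        az = math.radians(azimuth_deg)
        return x + reach * math.cos(az), y + reach * math.sin(az)

    # E.g. a phone 1.5 m up, tilted 45 degrees, pointing along +x:
    # center_point_on_floor(0, 0, 1.5, 0, 45) -> (1.5, 0.0)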
It can be understood that this processing offers the user a novel way of designating the area to be cleaned: the user can delimit it simply by photographing it (as shown in fig. 7), without walking over to it, which makes the method convenient to use.
Based on the content of the above embodiment, in this embodiment, determining the area to be cleaned corresponding to the image on the cleaning surface according to the center coordinate and the imaging capture angle includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a first area coordinate range of the image in the arc-shaped area according to the corresponding cutting process applied when the image is mapped to the cleaning surface;
and determining the first area coordinate range as an area to be cleaned.
In this embodiment, the scheme is likewise described with reference to fig. 7 and can be realized by means of a UWB positioning system: with four or more UWB base stations, three-dimensional coordinates are resolved and the height of the phone above the ground is known, while the gyroscope identifies the phone's orientation as it shoots toward the ground. The coordinate of the center point A mapped onto the ground is calculated from the phone's positioning coordinates, its height above the ground and the angle θ2 (the first angle) between the phone's Z axis and the ground. The shooting angle θ3 (the imaging capture angle) of the phone camera is then read and an arc-shaped area is drawn around point A. According to the cutting applied when the camera lens image is turned into the final picture, the actual area coordinate range of the image within the arc is calculated, and that range is taken as the area the sweeping robot must clean (as shown in fig. 7).
In this way, an image shot of the area to be cleaned can be effectively converted into an area to be cleaned on the ground, and what-you-see-is-what-you-get fixed-point cleaning control can be completed easily and accurately.
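To make the geometry above concrete, here is a minimal sketch of the photo-to-floor mapping, assuming the phone's planar UWB coordinate, its height above the floor, its compass heading, the tilt angle θ2, and the capture angle θ3 are all available, and that θ2 is measured between the optical axis and the ground; all function and variable names are illustrative, not from the patent.

```python
import math

def photo_footprint(x, y, h, heading, theta2, theta3):
    """Sketch of mapping a downward-tilted photo onto the floor.

    x, y    : phone's planar UWB coordinate (meters)
    h       : phone's height above the floor (meters)
    heading : horizontal pointing direction (radians)
    theta2  : angle between the optical axis and the floor (radians)
    theta3  : camera's imaging capture angle (radians)

    Assumes theta2 - theta3 / 2 > 0, so even the shallowest ray
    still intersects the floor.
    """
    # Center point A: where the optical axis meets the floor.
    d_center = h / math.tan(theta2)
    a = (x + d_center * math.cos(heading), y + d_center * math.sin(heading))

    # Rays at the edges of the capture angle strike nearer / farther,
    # bounding the arc-shaped footprint along the pointing direction.
    d_near = h / math.tan(theta2 + theta3 / 2)
    d_far = h / math.tan(theta2 - theta3 / 2)
    return a, d_near, d_far

# Example: phone 1.5 m up, tilted 50 degrees, 40-degree capture angle.
center, near, far = photo_footprint(0.0, 0.0, 1.5, math.radians(90),
                                    math.radians(50), math.radians(40))
```

The cropping step described above would then trim this arc-shaped footprint to the rectangular extent of the final picture.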
Based on the same inventive concept, another embodiment of the present invention provides a sweeping control method, including:
S101, shooting an area by using a mobile terminal, and determining position information and posture information of the mobile terminal when the mobile terminal shoots the area;
in this step, note that the "area" here differs from the "area to be cleaned" in step 101 of the above embodiment ("shooting the area to be cleaned by using a mobile terminal, and determining the position information and the posture information of the mobile terminal when the mobile terminal shoots the area to be cleaned"). There, "area to be cleaned" refers to the actual area to be cleaned; here, "area" refers to a region larger than the actual area to be cleaned, i.e., one containing both the actual area to be cleaned and areas that do not need cleaning.
For example, assume the area to be cleaned is a small patch in front of the sofa. In step 101 of the above embodiment, the "area to be cleaned" is that small patch itself: as shown in fig. 7, the patch in front of the sofa is photographed directly and the photo is converted directly into the area to be cleaned on the cleaning surface, achieving fixed-point cleaning. In this embodiment, by contrast, the "area" may be a large part of the living room, containing the sofa, the television, the tea table, and so on, as shown in fig. 8; the area to be cleaned is then marked within the captured image (the dashed region marked by the user in fig. 8).
S102, determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area;
in this step, note that the imaging capture angle of the camera on the mobile terminal depends on the camera's parameter settings at the time of shooting (e.g., the lens's wide-angle setting), so it can be obtained by reading those settings. Understandably, the same camera covers different shooting ranges under different wide-angle settings.
S103, receiving an area marked by the user on the captured image;
in this step, to help the user define the cleaning area more precisely, area marking or delineation may be performed on the captured image (as shown in figs. 8 and 9). Marking or delineating on the captured image gives the user more freedom of operation and more time to select a suitable area to be cleaned. Moreover, one or more areas to be cleaned may be marked or delineated as needed; this embodiment does not limit the number.
S104, determining a region to be cleaned according to a mark region of a user on a shot image, the position information, the posture information and the imaging capture angle;
in this step, the marked area is accurately converted into the region of the cleaning surface that actually needs cleaning, based on the position information and posture information of the mobile terminal, the imaging capture angle of its camera at the moment of shooting, and the user's marked area on the captured image. Precise fixed-point cleaning thus becomes possible, which effectively improves cleaning efficiency and reduces the power consumed by cleaning.
And S105, controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
In this step, once the area to be cleaned is obtained, the cleaning equipment is controlled to travel to it and perform the cleaning task.
In this embodiment, it should be noted that the position information and posture information of the mobile terminal at the moment it shoots the area may be determined as follows: determining the position coordinate of the mobile terminal when it shoots the area based on an indoor positioning system; and determining the pointing direction of the mobile terminal and a first angle between the mobile terminal and the cleaning surface when it shoots the area, based on a direction sensor and an angular velocity sensor carried on the mobile terminal.
In this embodiment, it should also be noted that the position information and posture information of the mobile terminal at the moment it shoots the area may alternatively be determined as follows: determining the position coordinate of the mobile terminal when it shoots the area based on an indoor positioning system; determining a first distance between the mobile terminal and the cleaning surface when it shoots the area based on the indoor positioning system; and determining the pointing direction of the mobile terminal and a first angle between the mobile terminal and the cleaning surface when it shoots the area, based on a direction sensor and an angular velocity sensor carried on the mobile terminal.
Based on the content of the above embodiment, in the present embodiment, determining the area to be cleaned according to the mark area of the user on the captured image, the position information, the posture information, and the imaging capture angle includes:
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the image obtained by shooting, the position coordinate, the pointing direction and the first angle;
and determining a corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capturing angle and the marking area.
Based on the content of the above embodiment, in the present embodiment, determining the area to be cleaned according to the mark area of the user on the captured image, the position information, the posture information, and the imaging capture angle includes:
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area;
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance, the pointing direction and the first angle;
and determining a corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capturing angle and the marking area.
In this embodiment, to help the user define the cleaning area more precisely, area marking or delineation may be performed on the captured image (as shown in figs. 8 and 9). Marking or delineating on the captured image gives the user more freedom of operation and more time to select a suitable area to be cleaned, and one or more areas may be marked or delineated as needed; this embodiment does not limit the number.
The scheme of this embodiment is described below with reference to figs. 8 and 9. On the phone screen, the user can slide a finger to circle part of the captured image and define it as the sweeper's cleaning area (other marking modes may also be used). This can be realized by means of a UWB positioning system: with four or more positioning base stations, three-dimensional coordinates can be resolved and the phone's height above the ground identified. The phone's pointing direction is identified by its gyroscope while the phone shoots toward the ground. The coordinate of the center point A mapped onto the ground is calculated from the phone's positioning coordinate, its height above the ground, and the angle θ2 (the first angle) between the phone's Z axis and the ground; the area to be cleaned on the ground corresponding to the marked area on the image is then determined from the coordinate of point A, the capture angle θ3 (the imaging capture angle) of the phone camera, and the marked area (as shown in fig. 9).
It should be noted that, compared with prior-art approaches that recognize the area to be cleaned from an image automatically, the approach of this embodiment, in which the user marks the area within the captured image and the marked area is then converted into the area to be cleaned, greatly reduces the processing load.
Based on the content of the above embodiment, in this embodiment, determining the area to be cleaned corresponding to the image on the cleaning surface according to the center coordinate, the imaging capture angle and the mark area includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a sub-imaging capture angle corresponding to the marked area according to the distance and direction of the marked area relative to the center point of the image, wherein the sub-imaging capture angle is smaller than the imaging capture angle;
determining a sub-circular arc area of the marking area in the circular arc area according to the central coordinate and the sub-imaging capturing angle;
determining a second area coordinate range of the marked area within the sub-arc-shaped area according to the cropping process applied when the marked area is mapped onto the cleaning surface;
and determining the second area coordinate range as an area to be cleaned.
The scheme is described below with reference to figs. 8 and 9. On the phone screen, the user slides a finger to circle part of the captured image as the sweeper's cleaning area (other marking modes may also be used). This can be realized by means of a UWB positioning system: with four or more positioning base stations, three-dimensional coordinates can be resolved and the phone's height above the ground identified. The phone's pointing direction is identified by its gyroscope while the phone shoots toward the ground. The coordinate of the center point A mapped onto the ground is calculated from the phone's positioning coordinate, its height above the ground, and the angle θ2 (the first angle) between the phone's Z axis and the ground. The arc-shaped area of the image on the cleaning surface is determined from the coordinate of point A and the capture angle θ3 (the imaging capture angle) of the phone camera. The sub-imaging capture angle θ4 (with θ4 no greater than θ3, since the marked area lies within the image) is then determined from the distance and direction of the marked area relative to the image's center point. Finally, the second area coordinate range of the marked area within the sub-arc-shaped area is determined according to the cropping applied when the camera-lens imaging is converted into the final picture, i.e., the cropping applied when the marked area is mapped onto the cleaning surface, and this second area coordinate range is determined as the area to be cleaned (as shown in fig. 9).
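As a rough illustration of how a sub-imaging capture angle θ4 might be derived from the marked area's extent in the picture, the sketch below scales θ3 by the marked region's pixel span. This is a linear approximation that ignores lens projection nonlinearity, and the names are illustrative, not from the patent.

```python
def sub_capture_angle(theta3, img_width_px, mark_left_px, mark_right_px):
    """Approximate theta4 for a marked region from its pixel span.

    Because the marked region lies inside the image, the returned
    angle never exceeds theta3.
    """
    span_fraction = (mark_right_px - mark_left_px) / img_width_px
    return theta3 * span_fraction
```

A production implementation would additionally use the offset of the marked region's center from the image center to orient the sub-arc within the full arc-shaped area.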
Based on the content of the foregoing embodiment, in this embodiment, determining the area to be cleaned corresponding to the image on the cleaning surface according to the center coordinate, the imaging capture angle and the mark area includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a first area coordinate range of the image within the arc-shaped area according to the cropping process applied when the image is mapped onto the cleaning surface;
determining a second area coordinate range of the marked area on the cleaning surface according to the relative position relationship and the relative size relationship between the marked area and the image and the first area coordinate range;
and determining the second area coordinate range as an area to be cleaned.
In this embodiment, the processing is similar to the foregoing embodiment, except that the second area coordinate range of the marked area on the cleaning surface is determined from the relative position and relative size of the marked area with respect to the image, together with the first area coordinate range. By fully exploiting these relative relationships, the position and size of the marked area when mapped onto the cleaning surface can be derived from those of the whole image when mapped onto the cleaning surface, so the second area coordinate range, i.e., the area to be cleaned, can be determined accurately.
The relative positional relationship here refers to where the marked region sits within the image, e.g., 3 cm from the image's left edge, 4 cm from the right edge, and 6 cm from the top edge; the relative size relationship refers to the ratio of the marked region's area to the image's area. Alternatively, the relative positional relationship may refer to which region of the image the mark occupies: for example, the image may be divided in advance into 20 small squares, and the squares covered by the marked region then identify its position relative to the image.
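The relative-position/relative-size mapping can be sketched as a simple linear interpolation, under the simplifying assumption that the image's first area coordinate range is an axis-aligned rectangle on the floor (the arc-shaped footprint and perspective distortion described above are ignored here); all names are illustrative.

```python
def mark_to_ground(mark_box_px, img_size_px, ground_box):
    """Map a marked rectangle (pixels) onto the floor using only its
    relative position and relative size within the image.

    mark_box_px : (left, top, right, bottom) of the mark, in pixels
    img_size_px : (width, height) of the captured image, in pixels
    ground_box  : (x0, y0, x1, y1) first area coordinate range (meters)
    """
    mx0, my0, mx1, my1 = mark_box_px
    iw, ih = img_size_px
    gx0, gy0, gx1, gy1 = ground_box

    def lerp(a, b, t):
        return a + (b - a) * t

    return (lerp(gx0, gx1, mx0 / iw), lerp(gy0, gy1, my0 / ih),
            lerp(gx0, gx1, mx1 / iw), lerp(gy0, gy1, my1 / ih))

# A mark covering the central quarter of a 4000x3000 photo, where the
# photo spans a 2 m x 1.5 m patch of floor:
area = mark_to_ground((1000, 750, 3000, 2250), (4000, 3000),
                      (0.0, 0.0, 2.0, 1.5))
```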
Based on the content of the above embodiment, in this embodiment, controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned includes:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
In this embodiment, the types of floor surfaces include different types of floors, carpets, tiles, and the like. The types of dirt include dust, debris, hard shells, water dirt, oil dirt, and the like.
In this embodiment, targeted cleaning is performed on the area to be cleaned according to the floor type and/or dirt type recognition result. For example, if the floor type is identified as carpet, a matching cleaning mode such as increased suction may be used. If it is identified as wooden floor, a matching mode may be used, e.g., standard suction followed by mopping with a slightly damp cloth. If it is identified as floor tile, a matching mode may be used, e.g., light suction, then wet-cloth mopping, then dry-cloth mopping.
Different cleaning methods can likewise be used for different dirt types. If the dirt is dust, wet-cloth mopping can be used. If it is debris, suction can be used. If it is water staining, dry-cloth mopping can be used. If it is oil staining, detergent can be sprayed first and followed by wet-cloth mopping; on carpet, detergent can be sprayed first and followed by suction or by brushing, and so on.
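These floor-type and dirt-type pairings amount to a lookup table. A minimal sketch follows; the mode names and table contents are invented for illustration, with only the pairings described above taken from the text.

```python
# Hypothetical mode tables mirroring the examples in the text.
FLOOR_MODES = {
    "carpet":     {"suction": "high",     "mop": None},
    "wood_floor": {"suction": "standard", "mop": "slightly_damp_cloth"},
    "floor_tile": {"suction": "low",      "mop": "wet_then_dry_cloth"},
}

DIRT_STEPS = {
    "dust":        ["wet_mop"],
    "debris":      ["suction"],
    "water_stain": ["dry_mop"],
    "oil_stain":   ["spray_detergent", "wet_mop"],
}

def plan_cleaning(floor_type, dirt_type=None):
    """Build a targeted cleaning plan from the recognition results."""
    plan = dict(FLOOR_MODES.get(floor_type,
                                {"suction": "standard", "mop": None}))
    if dirt_type is not None:
        plan["extra_steps"] = DIRT_STEPS.get(dirt_type, [])
    return plan

print(plan_cleaning("floor_tile", "oil_stain"))
```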
Understandably, floor type can be identified either by image acquisition plus machine learning or by sound acquisition plus acoustic features, where the sound is that emitted by the cleaning equipment as it travels over the floor. For the specific recognition process, a neural network model can be used for training and recognition, which this embodiment does not elaborate.
It can be understood that, when the dirt type is identified, the identification can be performed in a mode of image acquisition and machine learning. With respect to a specific recognition process, a neural network model can be used for training and recognition, and the embodiment is not described in detail.
Based on the content of the foregoing embodiment, in this embodiment, performing ground type identification on the area to be cleaned includes:
collecting sound signals emitted by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio features in the sound signals.
In this embodiment, as the cleaning device travels, internal components such as the motor generate sound that propagates through the device's environment. By mounting a sound collector on the cleaning device, the sound signal of the environment can therefore be captured, and extracting and analyzing audio features from it determines the floor material type of the device's current environment.
It should be noted that the sound generated by the cleaning device propagates along multiple paths, one of which is directed at the floor and reflected by it. Different floor materials affect the sound differently: a carpet, for instance, absorbs much of the sound's energy, so the signal it reflects is significantly attenuated. Because different floor materials thus imprint different audio features on the reflected signal, the floor material type can be identified accurately from the audio features of the floor-reflected sound. Accordingly, collecting the sound signal emitted by the cleaning device in the area to be cleaned may specifically be: collecting the sound signal emitted by the cleaning device during cleaning and reflected by the floor.
The following description takes as its example the collection of the floor-reflected sound emitted by the cleaning device during cleaning, though collecting the sound signal of the device's environment is not limited to this example.
In this embodiment, the shape and structure of the cleaning device are not specifically limited, but when the device is placed on the floor for sweeping or mopping, there is usually a gap of a certain height between the bottom of its body and the floor, which lets the device travel smoothly and reduces friction against the floor. A sound collector such as a microphone can therefore be arranged at the bottom of the device's body to collect the floor-reflected sound the device emits while cleaning.

Before describing the floor material identification method of this embodiment, the floor material types are described. Sweeping devices are commonly used in homes, airports, or factories, and in these settings, especially homes, the floor environment is often complex: some homes lay floor tiles in the kitchen and bathroom and wooden flooring in the other rooms, with small carpets or mats in the living room or bedrooms. Accordingly, in this embodiment the floor material types are divided into carpet, wooden floor, and floor tile, matching common home scenarios.
It should be noted that the material of the floor in this embodiment refers to the material of the medium on which the cleaning device directly travels, that is, the floor in this embodiment refers to the medium on which the cleaning device directly travels. For example, when a carpet is laid on a floor surface, the cleaning device is directly driven on the carpet when cleaning the carpet, and thus the floor material at this time is the carpet, not the floor under the carpet.
Understandably, however the floor material types are divided in advance, each type affects the sound emitted by the cleaning device differently, especially the floor-reflected sound, and these differences show up as different audio features in the reflected signals. This embodiment can therefore determine the material type of the area to be cleaned from the audio features in the sound signal.
Specifically, when the material type of the area to be cleaned is determined according to the audio features in the sound signal, a neural network model may be used for automatic identification, or a database matching identification may be used, which is not limited in this embodiment.
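As one concrete, deliberately simple reading of "audio features plus database matching", the sketch below computes two features of a floor-reflected sound frame and matches them against stored per-material profiles. In practice the profiles would be learned from labeled recordings, and a neural network could replace the nearest-profile step; the sketch uses NumPy, and all names are illustrative.

```python
import numpy as np

def audio_features(frame, sample_rate):
    """Two coarse features of a floor-reflected sound frame:
    RMS energy (carpet absorbs energy, lowering it) and spectral
    centroid (hard tile reflects more high-frequency content)."""
    rms = float(np.sqrt(np.mean(frame ** 2)))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-12))
    return np.array([rms, centroid])

def classify_floor(frame, sample_rate, profiles):
    """Nearest-profile matching, e.g.
    profiles = {"carpet": np.array([0.02, 800.0]),
                "floor_tile": np.array([0.08, 2500.0])}"""
    feats = audio_features(frame, sample_rate)
    return min(profiles, key=lambda m: np.linalg.norm(feats - profiles[m]))
```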
Therefore, by identifying the floor type, this embodiment allows a cleaning mode adapted to the current floor to be selected, which improves the cleaning effect and benefits user experience.
Based on the content of the foregoing embodiment, in this embodiment, controlling the cleaning device to perform targeted cleaning processing on the area to be cleaned according to the ground type identification result includes:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
In this embodiment, according to the floor type recognition result, the cleaning device is controlled to perform targeted cleaning on the area to be cleaned using a matching cleaning mode. For example, if the floor is identified as carpet, increased suction may be used; if wooden floor, standard suction followed by mopping with a slightly damp cloth; if floor tile, light suction followed by wet-cloth and then dry-cloth mopping. Controlling the device to use the matching mode according to the floor type recognition result thus effectively improves the cleaning effect.
Based on the content of the foregoing embodiment, in this embodiment, identifying the type of dirt in the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
In this embodiment, when identifying the dirt type of the area to be cleaned, an effective means is image recognition: different kinds of dust, debris, hard shells, water stains, oil stains, and the like have clearly distinguishable image features, so the corresponding dirt type can be identified accurately from an image.
Based on the content of the foregoing embodiment, in this embodiment, identifying the dirt type of the area to be cleaned according to the image includes:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned;
the dirt type recognition model is obtained by training an image sample and a dirt type recognition result label of the image sample based on a neural network.
This embodiment provides a specific identification approach: the dirt type is identified with a neural-network-based dirt type recognition model, so the recognition result for the area to be cleaned can be obtained accurately. Since conventional algorithms can be used to construct and train the neural network model, the details are not repeated here.
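Purely to illustrate the shape of such a model, here is a toy PyTorch classifier and inference call; the architecture, class list, and names are invented, and a real model would be trained on the image samples and dirt-type labels described above.

```python
import torch
import torch.nn as nn

DIRT_CLASSES = ["dust", "debris", "hard_shell", "water_stain", "oil_stain"]

class DirtClassifier(nn.Module):
    """Toy stand-in for the dirt-type recognition model."""
    def __init__(self, num_classes=len(DIRT_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):              # x: (batch, 3, H, W) image tensor
        return self.head(self.features(x).flatten(1))

def identify_dirt(model, image_tensor):
    """Return the predicted dirt-type label for one (3, H, W) image."""
    model.eval()
    with torch.no_grad():
        logits = model(image_tensor.unsqueeze(0))
    return DIRT_CLASSES[int(logits.argmax(dim=1))]

# e.g. identify_dirt(DirtClassifier(), torch.rand(3, 64, 64))
```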
Based on the content of the above embodiment, in the present embodiment, identifying the dirt type of the area to be cleaned according to the image includes:
and comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
This embodiment provides a processing manner different from the above: the pixel features of the acquired image are compared with each reference image of known dirt type stored in a database, and the dirt type of the matching reference image is taken as the dirt type of the area to be cleaned.
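A minimal sketch of this database-matching variant follows, realizing "pixel feature comparison" as a color-histogram distance; this is a deliberate simplification, and the names are illustrative.

```python
import numpy as np

def color_histogram(image, bins=16):
    """Per-channel color histogram of an (H, W, 3) uint8 image."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 255),
                          density=True)[0] for c in range(3)]
    return np.concatenate(hists)

def match_dirt_type(image, reference_db):
    """reference_db: {dirt_type: reference image as (H, W, 3) array}."""
    feats = color_histogram(image)
    return min(reference_db, key=lambda t:
               np.linalg.norm(feats - color_histogram(reference_db[t])))
```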
Another embodiment of the present invention provides a sweeping control method, including the following processing steps:
a1, determining an area to be cleaned for defining a cleaning range based on the moving track of the mobile terminal;
and A2, controlling a cleaning device to perform a cleaning task on the area to be cleaned based on the area to be cleaned.
In this embodiment, the determining, based on the moving track of the mobile terminal, an area to be cleaned for defining a cleaning range includes:
and determining the area to be cleaned for defining the cleaning range according to a movement track formed by the movement of the mobile terminal around the area to be cleaned.
In this embodiment, the determining the area to be cleaned for defining the cleaning range according to the movement track formed by the mobile terminal moving around the area to be cleaned includes any one of the following manners:
determining that a moving track formed by the movement of the mobile terminal around the area to be cleaned is a closed track, and determining the closed track as the area to be cleaned;
determining that a moving track formed by the mobile terminal moving around the area to be cleaned is a non-closed track, determining whether the non-closed track and a fixed barrier in the room together form a closed track, and if so, determining the formed closed track as the area to be cleaned.
In this embodiment, determining that a moving track formed by the mobile terminal moving around the area to be cleaned is a closed track, and determining the closed track as the area to be cleaned includes:
determining the coordinates of a moving track formed by the movement of the mobile terminal around the area to be cleaned based on an indoor positioning system;
and determining whether the moving track is a closed track or not according to the coordinates of the moving track, and if so, determining the closed track as an area to be cleaned.
In this embodiment, based on the area to be cleaned, controlling the cleaning device to perform a cleaning task on the area to be cleaned includes:
and controlling a cleaning device to clean the area in the closed track based on the coordinate corresponding to the closed track.
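A minimal sketch of the closed-track test and of operating within the enclosed region, assuming the track arrives as a list of planar UWB fixes in meters; the tolerance value and all names are illustrative.

```python
import math

def is_closed(track, tol=0.3):
    """A track (list of (x, y) UWB fixes) counts as closed when its
    endpoints come back within `tol` meters of each other."""
    (x0, y0), (xn, yn) = track[0], track[-1]
    return math.hypot(xn - x0, yn - y0) <= tol

def enclosed_area(track):
    """Shoelace formula for the area bounded by a closed track."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(track, track[1:] + track[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def point_inside(track, px, py):
    """Ray-casting test: is (px, py) inside the closed-track polygon?
    Useful for keeping the sweeper within the area to be cleaned."""
    inside = False
    for (x1, y1), (x2, y2) in zip(track, track[1:] + track[:1]):
        if (y1 > py) != (y2 > py):
            if px < x1 + (py - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

track = [(0, 0), (3, 0), (3, 2), (0, 2), (0.1, 0.05)]
if is_closed(track):
    print(enclosed_area(track), point_inside(track, 1.5, 1.0))
```

The same shoelace area would also let the controller pick the smaller of several candidate closed tracks, as described below.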
In this embodiment, determining that a movement track formed by the mobile terminal moving around the area to be cleaned is a non-closed track, determining whether the non-closed track and a fixed barrier in a room form a closed track, and if so, determining the formed closed track as the area to be cleaned includes:
determining the coordinates of a moving track formed by the movement of the mobile terminal around the area to be cleaned based on an indoor positioning system;
and determining, from the coordinates of the moving track, whether it is closed; if not, determining that it is a non-closed track and, from the coordinates of the non-closed track, determining whether it forms a closed track together with a fixed barrier in the room; if so, determining the formed closed track as the area to be cleaned.
In the present embodiment, the composed closed-type trajectory is determined as the area to be cleaned, and includes any one of the following manners:
determining that a plurality of closed tracks formed by the non-closed tracks and fixed barriers in the room exist, displaying the plurality of closed tracks for a user to select, and determining the closed tracks selected by the user as an area to be cleaned;
and determining that the non-closed track forms a plurality of closed tracks with fixed barriers in the room, and determining the closed track with the smaller cleaning area as the area to be cleaned.
In this embodiment, the area to be cleaned may be a circle, an ellipse, a triangle, a rectangle, a trapezoid, a diamond, an irregular polygon, etc., which is not limited in this embodiment. In addition, the region to be cleaned may be a closed pattern surrounded by free curves (as shown in fig. 2, a closed pattern surrounded by curves in a living room, and a region in the pattern belongs to a region that needs to be cleaned), or may be a closed pattern surrounded by straight lines and curves.
In general, the area to be cleaned is an area surrounded by a closed track, and in a special case, for example, in a corner area, the area to be cleaned may be an area surrounded by a semi-closed track.
In this embodiment, the mobile device may be an ordinary mobile phone, tablet (PAD), smart band, computer, or the like.
In the embodiment, an implementation manner of determining the area to be cleaned is given, that is, the area to be cleaned for defining the cleaning range is determined according to the movement positioning track of the mobile device.
It can be understood that, because the area to be cleaned is determined from the mobile device's movement positioning track, the device can be carried or held while walking exactly around the area that needs cleaning, yielding an area to be cleaned that precisely defines the cleaning range; the implementation is simple and convenient.
It can be understood that, for the solution of this embodiment, it is necessary that the mobile device carries the positioning software or has a positioning function, so that when the mobile device moves, the mobile positioning track of the mobile device can be determined through the positioning software or the positioning function, and then the area to be cleaned for defining the cleaning range is determined according to the track.
For example, suppose the user needs to clean the transition zone between the master bedroom and the living room, i.e., an area spanning a small part of each. An existing sweeping robot would either clean the whole house (every room) or clean the living room and master bedroom in full, i.e., both rooms entirely. Either way, cleaning efficiency is low, since only the transition zone actually needs cleaning. With the scheme of this embodiment, an area to be cleaned delimiting the transition zone can be determined: the user walks around the transition zone holding a mobile phone, the area to be cleaned is determined from the phone's movement track, and the sweeping robot is then controlled to clean that determined area. The transition zone is thus cleaned in a targeted manner, which effectively improves cleaning efficiency, reduces energy consumption, and shortens cleaning time.
For another example, suppose the user needs to clean only the right-hand area of the balcony. An existing sweeping robot would either clean the whole house or clean the balcony as a whole functional room. Either way, efficiency is low, since only the right-hand balcony area needs cleaning. With the scheme of this embodiment, the user can walk around that area holding a mobile phone, the area to be cleaned is determined from the phone's movement track, and the robot is controlled to clean it, finishing the right-hand balcony area in a targeted manner. This effectively improves cleaning efficiency, reduces energy consumption, and shortens cleaning time; for the sweeping robot in particular, power consumption is reduced and endurance improved.
Therefore, the random area with any shape can be designated in real time, and the designated area can be cleaned accurately in a targeted manner, so that the cleaning efficiency is improved.
It will be appreciated that in one implementation, the movement positioning track may be taken directly as the area to be cleaned. In other implementations, the pattern corresponding to the track may be slightly enlarged, e.g., by a factor of 0.1 or 0.2 (10%-20%), and the enlarged pattern taken as the area to be cleaned. Using the slightly enlarged pattern helps guarantee the cleaning effect and avoids missed spots or poor cleaning at the edges of the area.
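The slight enlargement can be sketched as scaling the track about its centroid; the vertex-average centroid is used here as a simple approximation, and all names are illustrative.

```python
def enlarge(track, factor=0.1):
    """Scale a closed track (list of (x, y) points) about its
    vertex-average centroid by (1 + factor), e.g. factor=0.1 grows
    the region by roughly 10%."""
    cx = sum(x for x, _ in track) / len(track)
    cy = sum(y for _, y in track) / len(track)
    return [(cx + (x - cx) * (1 + factor), cy + (y - cy) * (1 + factor))
            for x, y in track]

# enlarged = enlarge([(0, 0), (3, 0), (3, 2), (0, 2)], factor=0.2)
```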
Based on the content of the foregoing embodiment, in this embodiment, the determining, according to the movement positioning track of the mobile device, an area to be cleaned for defining a cleaning range specifically includes:
and determining an area to be cleaned for defining a cleaning range according to a movement positioning track formed by moving the mobile equipment around the area to be cleaned.
In this embodiment, at least two implementations can determine the area to be cleaned from the mobile device's movement positioning track. In the first, the user carries the device around the area that needs cleaning, forming a movement positioning track from which the area to be cleaned is determined. In the second, the user need not walk around the area at all: staying in place, the user simply points the device at the area and delineates it, forming a delineation pattern from which the area to be cleaned for defining the cleaning range is determined.
This embodiment mainly describes the first implementation; for the second, see the later embodiments. Here, a mobile device with a positioning function is carried or held while moving around the area to be cleaned. In special cases a closed track may not form; for example, a semi-closed track may form instead, which suits areas bounded by walls, sofa edges, and the like, where the semi-closed track can be completed into a closed area to be cleaned by those fixed boundaries.
It will be appreciated that this implementation of the present embodiment has the advantage of being easy to operate and the area to be cleaned being determined more accurately.
Based on the content of the foregoing embodiment, in this embodiment, the determining, according to a movement positioning track formed by moving the mobile device around the area needing to be cleaned, the area to be cleaned for defining the cleaning range specifically includes:
determining the coordinates of a mobile positioning track formed by the mobile equipment moving around the area needing to be cleaned based on an ultra-wideband UWB positioning system;
and determining an area to be cleaned for defining a cleaning range according to the coordinates of the mobile positioning track.
In this embodiment, the coordinates of a movement positioning track formed by the mobile device moving around the area that needs cleaning can be determined based on an Ultra-Wideband (UWB) positioning system, and the area to be cleaned for defining the cleaning range can then be determined from those coordinates.
In this embodiment, firstly, the UWB positioning system shown in fig. 3 needs to be established, and the procedure for establishing the UWB positioning system is as follows:
First, at least one reference tag must be set to calibrate the reference position. Positioning base stations with UWB transceiving capability are also required, to receive positioning tag information and send it to the mobile phone or the sweeping robot; at least three are needed. In addition, a switch must be provided to aggregate the base stations' connections, linked to the positioning system server, and a server must be provided for positioning data processing.
It can be understood that the sweeping robot may be a WiFi intelligent sweeping robot integrated with a UWB positioning transceiver, and the mobile phone may be a smart phone integrated with a UWB positioning transceiver.
The specific implementation process is shown in fig. 4, and comprises the following steps:
A. a UWB positioning system, comprising a server, a switch, positioning base stations and a reference tag, is arranged in the home so that it covers every position of the house;
B. through the APP, the mobile phone enters the sweeping robot's set-sweeping-area mode;
C. the mobile phone is moved, drawing a closed figure around the area that needs cleaning according to the phone's positioning track;
D. the coordinate information of the drawn figure is transmitted directly to the sweeping robot through UWB;
E. according to the figure's coordinate information, the sweeping robot interrupts and stores its current state, then automatically moves to the area to perform the sweeping operation;
F. after cleaning of the figure's area is completed, the robot returns to its state before the interruption (cleaning, or standby on the charger).
In this embodiment, a real-time designated-area sweeping option (i.e., the sweeping control method provided by this embodiment) can be added to the phone APP's robot settings. After area drawing is enabled, the user moves the phone so that its track delimits the area to be swept; once a closed shape has been drawn, the APP prompts the user to confirm the area. After the user confirms, the UWB server computes the area's coordinates and sends an instruction to the sweeper, which stores its current position and state and moves to the area to perform the cleaning operation. When sweeping has covered the whole designated area, the robot reports back to the APP that the area cleaning task is complete and returns to its position and state from before the task.
It will be appreciated that this implementation of the present embodiment has the advantage of being easy to operate and the area to be cleaned is determined more accurately.
Based on the content of the foregoing embodiment, in this embodiment, the determining, according to the movement location track of the mobile device, an area to be cleaned for defining a cleaning range specifically includes:
and determining the area to be cleaned for defining the cleaning range according to a delineation pattern formed by performing area delineation by pointing to the area to be cleaned by the mobile equipment.
In this embodiment, different from the above embodiments, in this embodiment, it is not necessary for the user to carry the mobile device to move around the area to be cleaned, that is, the user may stay in place, and only needs to point the mobile device to the area to be cleaned to perform area delineation to form a delineation pattern, and then the area to be cleaned for defining the cleaning range may be determined according to the delineation pattern, as shown in fig. 5.
In one implementation, the area to be cleaned for defining the cleaning range is determined from the delineation shape formed by pointing the mobile terminal at the area needing cleaning and circling in place, together with the angle between the mobile terminal and the cleaning surface and the distance between them.
In this implementation manner, it can be understood that, when there is an angle between the mobile device and the cleaning surface, a delineation pattern formed by performing in-situ area delineation on the mobile device by pointing to an area to be cleaned may be mapped onto the cleaning surface according to a distance between the mobile device and the cleaning surface and the angle between the mobile device and the cleaning surface, and a to-be-cleaned area corresponding to the delineation pattern on the cleaning surface may be determined.
In other implementations, when the mobile device directly faces the cleaning surface (i.e., is parallel to it), the delineation pattern formed by pointing the device at the area needing cleaning can be used directly as the area to be cleaned, since the two are then equivalent. For example, to clean a wall surface, the device can be moved over the wall and the area circled, say as a circle, and that circle used directly as the area to be cleaned for defining the cleaning range. Likewise, to clean the floor, the device can be moved over the floor and the area circled, say as a square, and that square used directly as the area to be cleaned.
It can be understood that, with the method of this embodiment, the user can define the area (the area to be cleaned) without moving, which makes use more convenient and lets the user direct the cleaning device to targeted cleaning of a defined area anytime, anywhere.
Based on the content of the foregoing embodiment, in this embodiment, the determining, according to a delineation pattern formed by performing area delineation by pointing to an area that needs to be cleaned by a mobile device, an area to be cleaned for defining a cleaning range specifically includes:
according to a delineation pattern formed by delineating an area by pointing to the area needing to be cleaned by the mobile equipment, the angle between the mobile equipment and a cleaning surface and the distance between the mobile equipment and the cleaning surface, the area to be cleaned for defining a cleaning range is determined.
In this embodiment, when there is an angle between the mobile device and the cleaning surface, a delineation pattern formed by performing area delineation on the mobile device by pointing to an area to be cleaned may be mapped onto the cleaning surface according to a distance between the mobile device and the cleaning surface and the angle between the mobile device and the cleaning surface, and a to-be-cleaned area corresponding to the delineation pattern on the cleaning surface may be determined.
In this embodiment, the phone's angle can be identified by its gyroscope and combined with the phone's distance from the ground, so that the delineation pattern traced in the air is mapped onto the cleaning surface; the user can thus demarcate the floor cleaning area without walking, which is convenient.
Based on the content of the above embodiment, in this embodiment, determining an area to be cleaned for defining a cleaning range according to a delineation pattern formed by a mobile device by pointing to an area to be cleaned to perform area delineation, an angle between the mobile device and a cleaning surface, and a distance between the mobile device and the cleaning surface specifically includes:
selecting a preset number (e.g., 10, 20, or 30) of first coordinate points (x1, y1) on the delineation shape formed by pointing at the area needing cleaning while the mobile terminal stays in place; it should be noted that, to ensure the accuracy of the subsequently determined area to be cleaned, the selected first coordinate points are preferably distributed uniformly over the delineation shape;
determining a preset number of second coordinate points (x2, y2) on the cleaning surface corresponding to the preset number of first coordinate points respectively according to the first relation model;
determining an area to be cleaned for defining a cleaning range according to the preset number of second coordinate points;
wherein the first relationship model is:

L = H1/tan(θ1)

x2 = x1 - L × cosθ, y2 = y1 - L × sinθ

wherein L represents the distance between the first coordinate point (x1, y1) and the second coordinate point (x2, y2); H1 represents the distance between the mobile terminal and the cleaning surface; θ1 represents the angle between the mobile terminal and the cleaning surface; and θ represents the angle between the line segment L and the x axis of the ground coordinate system.
In this embodiment, "drawing" a pattern refers to the pattern being drawn automatically by a computer, which may be understood as generating the pattern.
The scheme of this embodiment is described below with reference to figs. 5 and 6 and can be realized by means of a UWB positioning system. Specifically, four or more UWB positioning base stations may be deployed so that three-dimensional coordinates can be resolved and the phone's height above the ground identified. The phone's orientation is identified by its gyroscope, with the phone inclined toward the ground. The coordinate of the intersection of the phone's X axis with the ground is calculated from the phone's positioning coordinate, its height above the ground, and the angle θ1 between the phone's X axis and the ground. As the phone sweeps out a small circle in the air, the intersection of its X axis with the ground traces a large area on the floor, which can be taken as the area for the sweeper to clean; the user thus defines the area without moving (as shown in fig. 5). The coordinates are calculated as follows. UWB positioning gives the known coordinate (x1, y1), and the ground distance L between (x1, y1) and (x2, y2) is:

L = H1/tan(θ1)

The UWB positioning system has already established a ground coordinate system, and the phone's gyroscope identifies its orientation and the angle θ between the segment L and the coordinate system's x axis, from which the ground coordinates are calculated (as shown in fig. 6):

(x2, y2) = (x1 - L × cosθ, y1 - L × sinθ)
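The two formulas translate directly into code. The sketch below assumes each gyroscope/UWB sample provides (x1, y1), the height H1, and the angles θ1 and θ, all in radians; the function names are illustrative.

```python
import math

def delineated_point(x1, y1, h1, theta1, theta):
    """Project the phone's X axis onto the floor, per
    L = H1 / tan(theta1) and
    (x2, y2) = (x1 - L*cos(theta), y1 - L*sin(theta))."""
    L = h1 / math.tan(theta1)
    return (x1 - L * math.cos(theta), y1 - L * math.sin(theta))

def delineated_region(samples):
    """samples: per-reading tuples (x1, y1, h1, theta1, theta);
    returns the ground polygon traced while the user circles in place."""
    return [delineated_point(*s) for s in samples]

# e.g. one sample: phone at (2.0, 1.0), 1.4 m up, X axis 40 degrees to
# the ground, segment L at 120 degrees to the x axis:
p = delineated_point(2.0, 1.0, 1.4, math.radians(40), math.radians(120))
```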
in this embodiment, the phone's angle can be identified by its gyroscope and combined with the phone's distance from the ground, so that the delineation pattern traced in the air is mapped onto the cleaning surface; the user can thus demarcate the floor cleaning area without walking, which is convenient.
Based on the content of the foregoing embodiment, in this embodiment, the determining the first coordinate of the location of the mobile device specifically includes:
based on an ultra-wideband UWB positioning system, a first coordinate of a position where a mobile device is located is determined.
In this embodiment, as described above, when determining the first coordinate of the location of the mobile device, the UWB positioning system may be used to perform positioning, so as to determine the first coordinate of the location of the mobile device.
It can be understood that UWB positioning offers strong penetration, low power consumption, good multipath resistance, high security, low system complexity, and highly accurate positioning. Ultra-wideband technology can therefore be applied to indoor positioning, tracking, and navigation of stationary or moving objects and people with very high accuracy, which is why this embodiment adopts a UWB positioning system for positioning.
The embodiment of the invention also provides a sweeping control method, which comprises the following steps:
receiving a cleaning instruction which is triggered by a user in real time through voice and is used for arranging a random cleaning task; the random cleaning task is a task which does not belong to a preset cleaning task;
determining an area to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, the step of determining the area to be cleaned for defining the cleaning range according to the cleaning instruction comprises the following steps:
determining a reference object and relative position area information of the reference object contained in the cleaning command, and determining a to-be-cleaned area for defining a cleaning range according to the position information of the reference object and the relative position area information of the reference object;
and/or,
determining absolute position coordinate information contained in the cleaning instruction, and determining the area to be cleaned for defining the cleaning range according to the absolute position coordinate information;
and/or,
and determining a reference object contained in the cleaning instruction and relative coordinate information of the reference object, and determining an area to be cleaned for defining a cleaning range according to the reference object and the relative coordinate information of the reference object.
Further, according to the area to be cleaned, the cleaning equipment is controlled to carry out cleaning tasks on the area to be cleaned, and the cleaning tasks comprise:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type recognition of the area to be cleaned comprises:
collecting sound signals emitted by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio features in the sound signals.
Further, according to the ground type recognition result, the control cleaning equipment carries out targeted cleaning treatment on the area to be cleaned, and the method comprises the following steps:
and controlling the cleaning equipment to perform targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
Further, the dirt type identification of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image comprises:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned;
the dirt type identification model is obtained by training an image sample and a dirt type identification result label of the image sample.
Further, according to the image, identifying the dirt type of the area to be cleaned comprises:
and comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
In this embodiment, the area to be cleaned can be determined from instruction voice information, which offers the user even greater convenience: there is no need to walk around the area, take photos, perform delineation gestures, or make body movements; issuing a voice instruction suffices. For example, the user may say "clean the area in front of the television". On receiving the instruction, the cleaning control device matches the voice (or keywords in it, such as "in front of the television") against the voices or keywords in a database, determines the corresponding area to be cleaned (which may be understood as cleaning location information), and controls the cleaning device to move to that location and clean. It is understood that the database stores the area to be cleaned (cleaning location information) matching each voice instruction.
This processing mode is simple and convenient, reduces the operation burden, and allows any area to be designated for targeted cleaning, thereby improving processing flexibility.
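A minimal sketch of such a keyword-to-location lookup is given below; the keyword table and coordinates are hypothetical examples, not data from this disclosure.

```python
# Hypothetical lookup table: keyword in the voice command -> stored
# cleaning position (x, y) and radius in meters. Values are examples only.
VOICE_AREA_DB = {
    "in front of the television": ((3.2, 1.5), 1.0),
    "in front of the sofa":       ((2.0, 4.0), 1.2),
    "around the dining table":    ((5.5, 2.8), 1.5),
}

def area_from_voice(transcript: str):
    """Match recognized speech against stored keywords and return the
    corresponding area to be cleaned, or None if nothing matches."""
    text = transcript.lower()
    for keyword, (center, radius) in VOICE_AREA_DB.items():
        if keyword in text:
            return {"center": center, "radius_m": radius}
    return None

print(area_from_voice("Please clean the area in front of the television"))
```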
Based on the content of the above embodiment, in this embodiment, the determining, according to the instruction voice information, the area to be cleaned for defining the cleaning range specifically includes one or more of the following three ways:
firstly, determining a reference object and relative position area information of the reference object contained in the instruction voice information, and determining the area to be cleaned for defining the cleaning range according to the position information of the reference object and the relative position area information of the reference object;
secondly, determining absolute position coordinate information contained in the instruction voice information, and determining the area to be cleaned for defining the cleaning range according to the absolute position coordinate information;
and thirdly, determining a reference object contained in the instruction voice information and relative coordinate information of the reference object, and determining the area to be cleaned for defining the cleaning range according to the reference object and the relative coordinate information of the reference object.
In the present embodiment, when the area to be cleaned for defining the cleaning range is determined by voice, in order to improve the processing efficiency, the determination may be made in several ways.
One implementation is to determine a reference object (such as a door, a table, a television, a dining table, a sofa, a bed, or a stool) contained in the instruction voice information, then determine the relative position area information with respect to the reference object (such as the area in front of the door, in front of the television, in front of the sofa, or at the foot of the bed), and finally determine the area to be cleaned for defining the cleaning range from the position information of the reference object and the relative position area information. It will be appreciated that the coordinates of these reference objects may be pre-stored in a database.
Another implementation is to determine absolute position coordinate information contained in the instruction voice information, and determine the area to be cleaned for defining the cleaning range according to the absolute position coordinate information. For example, suppose the voice command issued by the user is "please clean the circular area centered at coordinates (X, Y) with a radius of 1 meter"; the area to be cleaned for defining the cleaning range can be accurately determined from this command. For another example, if the voice command is "please clean the area enclosed by the indoor coordinates (X1, Y1), (X2, Y2), (X3, Y3)", the area to be cleaned for defining the cleaning range can likewise be accurately determined.
Still another implementation is to determine a reference object contained in the instruction voice information and relative coordinate information of the reference object, and determine the area to be cleaned for defining the cleaning range according to the reference object and the relative coordinate information. For example, suppose the voice command issued by the user is "please clean the area 1.3 meters in front of the television"; from this command the reference object can be determined to be the television, and the area to be cleaned for defining the cleaning range is then accurately determined from the position of the television and the relative coordinate information (1.3 meters) with respect to it. For another example, if the voice command is "please clean the area within a radius of 0.5 meter around the dining table", the area to be cleaned for defining the cleaning range can likewise be accurately determined, so that food residues, paper scraps and the like around the dining table can be cleaned precisely after meals.
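The reference-object-plus-relative-coordinate case can be sketched as follows; the command grammar, the stored object positions, the assumed pointing direction of "in front of", and the default radius are all assumptions of this illustration.

```python
import re

# Hypothetical positions of reference objects on the indoor map (meters).
REFERENCE_POSITIONS = {"television": (3.2, 0.4), "dining table": (5.5, 2.8)}

def parse_relative_command(transcript: str):
    """Parse commands of the form 'clean the area <d> meters in front of
    the <object>' into a circular area to be cleaned."""
    m = re.search(r"([\d.]+)\s*meters?\s*(?:before|in front of)\s*the\s*(\w[\w ]*)",
                  transcript.lower())
    if not m:
        return None
    distance = float(m.group(1))
    obj = m.group(2).strip()
    if obj not in REFERENCE_POSITIONS:
        return None
    x, y = REFERENCE_POSITIONS[obj]
    # Assume "in front of" means along +y from the object on the floor map.
    return {"center": (x, y + distance), "radius_m": 0.5}

print(parse_relative_command("please clean the area 1.3 meters in front of the television"))
```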
Therefore, the embodiment can flexibly and efficiently determine the area needing to be cleaned in a targeted manner in a voice mode, so that accurate cleaning is completed.
In this embodiment, for the scheme of ground type identification and dirt identification, reference may be made to the description of the foregoing embodiments, and details are not described here.
The embodiment of the invention also provides a cleaning control method, which comprises the following steps:
receiving a cleaning instruction which is triggered by a user in real time through limb actions and is used for arranging random cleaning tasks; the random cleaning task is a task which does not belong to a preset cleaning task;
determining an area to be cleaned for defining a cleaning range according to the cleaning instruction;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
Further, determining the area to be cleaned for defining the cleaning range according to the cleaning instruction comprises:
determining the limb pointing and limb actions made by the user toward the area needing cleaning;
determining a coordinate area range formed on a cleaning surface according to the limb direction and the limb action;
and determining an area to be cleaned for defining a cleaning range according to the coordinate area range.
Further, controlling the cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned comprises:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
Further, the ground type recognition of the area to be cleaned comprises:
collecting sound signals emitted by cleaning equipment in the area to be cleaned;
and determining the material type of the area to be cleaned according to the audio features in the sound signals.
Further, controlling the cleaning device to perform targeted cleaning treatment on the area to be cleaned according to the ground type recognition result comprises:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
Further, the dirt type identification of the area to be cleaned comprises the following steps:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
Further, identifying the dirt type of the area to be cleaned according to the image comprises:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned;
the dirt type identification model is obtained by training an image sample and a dirt type identification result label of the image sample.
Further, according to the image, identifying the dirt type of the area to be cleaned comprises:
and comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
In this embodiment, the area to be cleaned for defining the cleaning range may be determined from the limb information of the user. For example, when the user points to a certain area range with a limb (a finger, the head, an arm, a leg, a foot, or the like), the area to be cleaned for defining the cleaning range is determined from the area range the limb points to.
In this embodiment, it can be understood that the position of the user's limb and its pointing direction and angle can be determined by a sensing device; the range of the area the limb points to is then determined from the limb position together with the pointing direction and angle information, and from this the area to be cleaned is determined.
For example, assume the user stands at point A on the ground, with a finger pointing 30° off due east and at an angle of 45° to the ground; the range of the ground area the limb points to can be determined from this information. It will be appreciated that what is determined is generally a single point on the ground, around which a figure (such as a circle or a square) may then be drawn with the point as its center. When the figure is a circle, the radius may take a preset value; when it is a square, the side length may take a preset value. It can be understood that the area range determined in this manner is an approximation, so to avoid missed cleaning, the area of the drawn figure may be made larger than a preset threshold, ensuring as far as possible that the cleaning task assigned by the user is completed.
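The geometry implied by this example can be sketched as follows, assuming the floor is the plane z = 0 and that the sensing device reports the finger position, azimuth, and depression angle; all numeric values are illustrative.

```python
import math

def pointed_ground_point(finger_pos, azimuth_deg, depression_deg):
    """Intersect the pointing ray from the user's finger with the floor
    plane z = 0. finger_pos = (x, y, z) with z the finger height in meters;
    azimuth is measured from east, depression is the angle below horizontal."""
    x, y, z = finger_pos
    horizontal = z / math.tan(math.radians(depression_deg))  # run along the floor
    gx = x + horizontal * math.cos(math.radians(azimuth_deg))
    gy = y + horizontal * math.sin(math.radians(azimuth_deg))
    return gx, gy

def circular_area(center, radius_m=0.5, min_area_m2=0.5):
    """Draw a circle around the pointed spot; enlarge it so its area is
    not below a preset threshold, to avoid missing dirt at the fringe."""
    area = math.pi * radius_m ** 2
    if area < min_area_m2:
        radius_m = math.sqrt(min_area_m2 / math.pi)
    return {"center": center, "radius_m": radius_m}

# User at the origin, finger 1.1 m above the floor, pointing 30° from east,
# 45° below horizontal:
spot = pointed_ground_point((0.0, 0.0, 1.1), azimuth_deg=30.0, depression_deg=45.0)
print(circular_area(spot))
```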
Based on the content of the foregoing embodiment, in this embodiment, the determining, according to the limb information of the user, the area to be cleaned for defining the cleaning range specifically includes:
determining the limb pointing and limb actions made by the user toward the area needing cleaning;
determining a third pattern formed on the cleaning surface according to the limb direction and the limb action;
according to the third pattern, an area to be cleaned for defining a cleaning range is determined.
It should be noted that the solution of this embodiment is similar to the solution of the previous embodiment, in which the area to be cleaned for defining the cleaning range is determined from a delineation pattern formed by pointing the mobile device at the area to be cleaned. The difference is that here the delineation pattern is formed not by a handheld mobile device but by a limb action. For example, to define a circular area to be cleaned on the ground, the user may trace a circle toward the ground with a toe, thereby forming the third pattern on the ground.
In addition, the delineation motion may be performed toward the ground as a gesture; the third pattern formed on the cleaning surface is then determined from information such as the angle between the finger and the ground and the height of the finger, and the area to be cleaned for defining the cleaning range is determined from the third pattern.
Since the solution of this embodiment is similar to that of the previous embodiment, the specific principle is not described in detail here; reference may be made to the description of that embodiment.
In this embodiment, it should be noted that, in general, the third pattern may be used directly as the area to be cleaned for defining the cleaning range; in special cases, to ensure the cleaning effect, the pattern may be extended outward by a preset amount on the basis of the third pattern, and the extended region used as the area to be cleaned for defining the cleaning range.
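One way such a preset extension might be realized is sketched below; the outward-scaling approach and the 10 cm margin are assumptions of this illustration, not a method fixed by the disclosure.

```python
import math

def extend_polygon(vertices, margin_m=0.1):
    """Push each vertex of the delineated pattern outward from the
    centroid by a preset margin, a crude stand-in for the 'preset
    amount of extension' mentioned above."""
    cx = sum(x for x, _ in vertices) / len(vertices)
    cy = sum(y for _, y in vertices) / len(vertices)
    extended = []
    for x, y in vertices:
        d = math.hypot(x - cx, y - cy) or 1e-9
        scale = (d + margin_m) / d
        extended.append((cx + (x - cx) * scale, cy + (y - cy) * scale))
    return extended

# A roughly circular pattern traced with a toe, widened by 10 cm:
pattern = [(math.cos(t), math.sin(t)) for t in
           [i * math.pi / 4 for i in range(8)]]
print(extend_polygon(pattern, margin_m=0.1))
```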
In this embodiment, for the scheme of ground type identification and dirt identification, reference may be made to the description of the foregoing embodiments, and details are not described here.
Based on the same inventive concept, another embodiment of the present invention provides a cleaning control device. Referring to fig. 10, the cleaning control device provided in this embodiment includes: a first determining module 21, a second determining module 22, a third determining module 23, and a cleaning control module 24, wherein:
the first determining module 21 is configured to determine position information and posture information of the mobile terminal when the mobile terminal shoots a to-be-cleaned area;
the second determining module 22 is configured to determine an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots a region to be cleaned;
a third determining module 23, configured to determine the area to be cleaned according to the captured image, the position information, the posture information, and the imaging capturing angle;
and the cleaning control module 24 is used for controlling the cleaning equipment to perform cleaning tasks on the area to be cleaned according to the area to be cleaned.
Based on the same inventive concept, another embodiment of the present invention provides a cleaning control device, including: a fourth determining module, a fifth determining module, a receiving module, a sixth determining module, and a task control module, wherein:
the fourth determining module is used for determining the position information and the posture information of the mobile terminal when the mobile terminal shoots an area;
a fifth determining module, configured to determine an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area;
the receiving module is used for receiving a marked area made by the user on the captured image;
the sixth determining module is used for determining the area to be cleaned according to the marked area on the captured image, the position information, the posture information, and the imaging capture angle;
and the task control module is used for controlling the cleaning equipment to carry out a cleaning task on the area to be cleaned according to the area to be cleaned.
Since the cleaning control device provided in this embodiment can be used to execute the cleaning control method described in the above embodiment, and the operation principle and the beneficial effect are similar, detailed descriptions are omitted here, and specific contents can be referred to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides an intelligent device including the cleaning control device as described in the above embodiments.
In this embodiment, it can be understood that, since the processing procedure of the cleaning control device can be implemented on an intelligent device, this embodiment provides an intelligent device including the cleaning control device, and further implements the cleaning control procedure. It is understood that the intelligent device may be a sweeping robot, an intelligent sweeping device, etc., which is not limited in this embodiment.
Since the intelligent device provided by this embodiment includes the cleaning control device described in the above embodiment, the operation principle and the beneficial effect thereof are similar, and therefore detailed description is omitted here, and specific contents can be referred to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides a mobile device including the cleaning control device as described in the above embodiments.
In this embodiment, it can be understood that, since the processing procedure of the cleaning control device can be implemented on a mobile device, this embodiment provides a mobile device including the cleaning control device, thereby implementing the cleaning control procedure. It is understood that the mobile device may be various devices, such as a mobile phone, a pad, a smart watch, a notebook, and the like, which is not limited in this embodiment.
Since the mobile device provided by this embodiment includes the cleaning control device described in the above embodiment, the operation principle and the beneficial effects thereof are similar, and therefore detailed description thereof is omitted here, and specific contents thereof can be referred to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides a server including the cleaning control device as described in the above embodiments.
In this embodiment, it can be understood that, since the processing procedure of the cleaning control device can be implemented on a server, this embodiment provides a server including the cleaning control device, thereby implementing the cleaning control procedure. The server may be a cloud server or another server, which is not limited in this embodiment. A cloud server offers advantages such as high processing speed and high security.
Since the server provided in this embodiment includes the cleaning control device described in the above embodiment, the operation principle and the beneficial effects thereof are similar, and therefore detailed description thereof is omitted here, and specific contents thereof can be referred to the description of the above embodiment.
Based on the same inventive concept, another embodiment of the present invention provides an intelligent device, which specifically includes the following contents, with reference to fig. 11: a processor 301, a memory 302, a communication interface 303, and a communication bus 304;
the processor 301, the memory 302 and the communication interface 303 complete mutual communication through the communication bus 304; the communication interface 303 is used for realizing transmission between related devices such as modeling software, an intelligent manufacturing equipment module library and the like;
the processor 301 is configured to call a computer program in the memory 302, and when executing the computer program the processor implements all the steps of the cleaning control method. For example, the processor implements the following steps when executing the computer program: determining an area to be cleaned for defining a cleaning range; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned. Or, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area to be cleaned; determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned. Or, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area; receiving a mark area of a user on a shot image; determining the area to be cleaned according to the mark area of the user on the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
It will be appreciated that the detailed functions and extended functions that the computer program may perform may be as described with reference to the above embodiments.
It can be understood that the intelligent device may be a sweeping robot, a ground sweeping device, a wall sweeping device, or the like, which is not limited in this embodiment.
Based on the same inventive concept, another embodiment of the present invention provides a sweeping robot device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above cleaning control method when executing the program. It can be understood that when the sweeping robot device implements the steps of the sweeping control method, the "sweeping device" in the method, that is, the sweeping robot itself, may perform the control and sweeping functions in the step of "controlling the sweeping device to perform the sweeping task on the area to be swept according to the area to be swept".
Based on the same inventive concept, another embodiment of the present invention provides a mobile device, which specifically includes the following components, with reference to fig. 12: a processor 401, a memory 402, a communication interface 403, and a communication bus 404;
the processor 401, the memory 402 and the communication interface 403 complete mutual communication through the communication bus 404; the communication interface 403 is used for realizing transmission between related devices such as modeling software, an intelligent manufacturing equipment module library and the like;
the processor 401 is configured to call a computer program in the memory 402, and when executing the computer program the processor implements all the steps of the cleaning control method. For example, the processor implements the following steps when executing the computer program: determining an area to be cleaned for defining a cleaning range; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned. Or, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area to be cleaned; determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned. Or, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area; receiving a mark area of a user on a shot image; determining the area to be cleaned according to the mark area of the user on the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
It will be appreciated that the detailed functions and extended functions that the computer program may perform may be as described with reference to the above embodiments.
It is understood that the mobile device may be various devices, such as a mobile phone, a pad, a smart watch, a notebook, and the like, which is not limited in this embodiment.
Based on the same inventive concept, another embodiment of the present invention provides a server, referring to fig. 13, where the server specifically includes the following contents: a processor 501, a memory 502, a communication interface 503, and a communication bus 504;
the processor 501, the memory 502 and the communication interface 503 complete mutual communication through the communication bus 504; the communication interface 503 is used for realizing transmission between related devices such as modeling software, an intelligent manufacturing equipment module library and the like;
the processor 501 is configured to call a computer program in the memory 502, and when the processor executes the computer program, the processor implements all the steps of the cleaning control method, for example, when the processor executes the computer program, the processor implements the following steps: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots a region to be cleaned; determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle; controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned; or, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area; receiving a mark area of a user on a shot image; determining an area to be cleaned according to a mark area of a user on a shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
It will be appreciated that the detailed functions and extended functions that the computer program may perform may be as described with reference to the above embodiments.
In this embodiment, the server may be a cloud server or another server, which is not limited in this embodiment. A cloud server offers advantages such as high processing speed and high security.
Based on the same inventive concept, another embodiment of the present invention provides a non-transitory computer-readable storage medium having a computer program stored thereon, which when executed by a processor implements all the steps of the above cleaning control method. For example, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area to be cleaned; determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned. Or, the processor implements the following steps when executing the computer program: determining position information and posture information of a mobile terminal when the mobile terminal shoots an area; determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area; receiving a mark area of a user on a shot image; determining the area to be cleaned according to the mark area of the user on the shot image, the position information, the posture information and the imaging capture angle; and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
It will be appreciated that the detailed functions and extended functions that the computer program may perform may be as described with reference to the above embodiments.
In addition, the logic instructions in the memory may be implemented in the form of software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. Based on such understanding, the above technical solutions may be essentially or partially implemented in the form of software products, which may be stored in computer readable storage media, such as ROM/RAM, magnetic disk, optical disk, etc., and include several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the cleaning control method according to the embodiments or some parts of the embodiments.
In the description of the present invention, it should be noted that terms such as "upper" and "lower" indicate orientations or positional relationships based on those shown in the drawings; they are used merely for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention. Unless expressly stated or limited otherwise, the terms "mounted", "connected", and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable, or integral; mechanical or electrical; direct or through an intermediate medium; or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In addition, in the present invention, terms such as "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Moreover, in the present invention, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Furthermore, in the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (21)

1. A cleaning control method is characterized by comprising:
shooting an area to be cleaned by using a mobile terminal;
determining position information and posture information of a mobile terminal when the mobile terminal shoots an area to be cleaned;
determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots an area to be cleaned;
determining the area to be cleaned according to the shot image, the position information, the posture information and the imaging capture angle;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
2. The cleaning control method according to claim 1, wherein determining the position information and the posture information of the mobile terminal when the mobile terminal photographs the area to be cleaned comprises:
determining the position coordinate of the mobile terminal when the mobile terminal shoots an area to be cleaned;
the method comprises the steps of determining the pointing direction of the mobile terminal and a first angle between the mobile terminal and a cleaning surface when the mobile terminal shoots an area to be cleaned.
3. The cleaning control method according to claim 2, wherein determining the area to be cleaned based on the captured image, the position information, the orientation information, and the imaging capture angle includes:
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the image obtained by shooting, the position coordinate, the pointing direction and the first angle;
and determining the corresponding area to be cleaned of the image on the cleaning surface according to the central coordinate and the imaging capturing angle.
4. The cleaning control method according to claim 2, wherein determining the area to be cleaned based on the captured image, the position information, the posture information, and the imaging capture angle includes:
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area;
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance, the pointing direction and the first angle; and determining the corresponding area to be cleaned of the image on the cleaning surface according to the central coordinate and the imaging capturing angle.
5. The cleaning control method according to claim 3 or 4, wherein determining the area to be cleaned corresponding to the image on the cleaning surface according to the center coordinate and the imaging capture angle comprises:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a first area coordinate range of the image in the arc-shaped area according to a corresponding cutting processing process when the image is mapped to the cleaning surface;
and determining the first area coordinate range as an area to be cleaned.
6. The cleaning control method according to claim 2, wherein determining the position coordinates of the mobile terminal when the mobile terminal photographs the area to be cleaned comprises:
based on an ultra-wideband UWB indoor positioning system, the position coordinates of the mobile terminal are determined when the mobile terminal shoots an area to be cleaned.
7. A cleaning control method is characterized by comprising:
determining position information and posture information of a mobile terminal when the mobile terminal shoots an area;
determining an imaging capture angle of a camera on the mobile terminal when the mobile terminal shoots the area;
receiving a mark area of a user on a shot image;
determining an area to be cleaned according to a mark area of a user on a shot image, the position information, the posture information and the imaging capture angle;
and controlling a cleaning device to perform a cleaning task on the area to be cleaned according to the area to be cleaned.
8. The cleaning control method according to claim 7, wherein determining the position information and the orientation information of the mobile terminal when the mobile terminal photographs an area, includes:
determining the position coordinates of the mobile terminal when the mobile terminal shoots the area;
the method comprises the steps of determining the pointing direction of the mobile terminal and the first angle between the mobile terminal and a cleaning surface when the mobile terminal shoots an area to be cleaned.
9. The cleaning control method according to claim 8, wherein determining an area to be cleaned based on a mark area of a user on a captured image, the position information, the posture information, and the imaging capturing angle, includes:
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the pointing direction and the first angle;
and determining a corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capturing angle and the marking area.
10. The cleaning control method according to claim 8, wherein determining the area to be cleaned based on the mark area of the user on the captured image, the position information, the posture information, and the imaging capturing angle includes:
determining a first distance between the mobile terminal and a cleaning surface when the mobile terminal shoots the area;
determining a central coordinate of a central point on the image mapped to the cleaning surface according to the shot image, the position coordinate, the first distance, the pointing direction and the first angle;
and determining a corresponding area to be cleaned on the cleaning surface according to the central coordinate, the imaging capturing angle and the marking area.
11. The cleaning control method according to claim 9 or 10, wherein determining a corresponding area to be cleaned on the cleaning surface based on the center coordinates, the imaging capturing angle, and the mark area includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a sub-imaging capture angle corresponding to the mark region according to the distance and the direction relation between the mark region and the central point of the image; wherein the sub-imaging capture angle is less than the imaging capture angle;
determining a sub-circular arc area of the marking area in the circular arc area according to the central coordinate and the sub-imaging capturing angle;
determining a second area coordinate range of the marked area in the sub-arc-shaped area according to a corresponding cutting processing process when the marked area is mapped to the cleaning surface;
and determining the coordinate range of the second area as an area to be cleaned.
12. The cleaning control method according to claim 9 or 10, wherein determining a corresponding area to be cleaned on the cleaning surface based on the center coordinates, the imaging capturing angle, and the mark area includes:
determining an arc-shaped area of the image on the cleaning surface according to the central coordinate and the imaging capturing angle;
determining a first area coordinate range of the image in the arc-shaped area according to a corresponding cutting processing process when the image is mapped to the cleaning surface;
determining a second area coordinate range of the marked area on the cleaning surface according to the relative position relationship and the relative size relationship between the marked area and the image and the first area coordinate range;
and determining the coordinate range of the second area as an area to be cleaned.
13. The cleaning control method according to claim 1, wherein controlling a cleaning apparatus to perform a cleaning task on the area to be cleaned in accordance with the area to be cleaned includes:
identifying the ground type and/or the dirt type of the area to be cleaned;
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned according to the ground type and/or dirt type identification result.
14. The cleaning control method according to claim 13, wherein performing ground type recognition on the area to be cleaned includes:
collecting sound signals emitted by the cleaning equipment in the area to be cleaned,
determining the material type of the area to be cleaned according to the audio features in the sound signals;
correspondingly, according to the ground type recognition result, the cleaning equipment is controlled to carry out targeted cleaning treatment on the area to be cleaned, and the method comprises the following steps:
and controlling the cleaning equipment to carry out targeted cleaning treatment on the area to be cleaned by adopting a matched cleaning mode according to the ground type identification result.
15. The cleaning control method according to claim 13, wherein performing dirt type identification on the area to be cleaned includes:
collecting an image of the area to be cleaned;
and identifying the dirt type of the area to be cleaned according to the image.
16. The cleaning control method according to claim 15, wherein identifying the dirt type of the area to be cleaned according to the image includes any one of:
inputting the image into a dirt type identification model, and acquiring a dirt type identification result of the area to be cleaned; the dirt type identification model is obtained by training an image sample and a dirt type identification result label of the image sample;
and comparing the image with each reference image of known dirt types stored in a database, and taking the dirt type corresponding to the reference image matched with the image in the database as the dirt type of the area to be cleaned.
17. A smart device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the cleaning control method according to any one of claims 1 to 16 are implemented when the program is executed by the processor.
18. A sweeping robot apparatus comprising a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor when executing the program implements the steps of the sweeping control method of any one of claims 1 to 16.
19. A mobile terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the cleaning control method according to any one of claims 1 to 16 when executing the program.
20. A server comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the cleaning control method according to any one of claims 1 to 16 are implemented when the program is executed by the processor.
21. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the cleaning control method according to any one of claims 1 to 16.
CN202011599705.1A 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile equipment and server Active CN114680740B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011599705.1A CN114680740B (en) 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile equipment and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011599705.1A CN114680740B (en) 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile equipment and server

Publications (2)

Publication Number Publication Date
CN114680740A (en) 2022-07-01
CN114680740B CN114680740B (en) 2023-08-08

Family

ID=82132701

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011599705.1A Active CN114680740B (en) 2020-12-29 2020-12-29 Cleaning control method and device, intelligent equipment, mobile equipment and server

Country Status (1)

Country Link
CN (1) CN114680740B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103784079A (en) * 2012-10-26 2014-05-14 Lg电子株式会社 Robot cleaner system and control method of the same
US20150032260A1 (en) * 2013-07-29 2015-01-29 Samsung Electronics Co., Ltd. Auto-cleaning system, cleaning robot and method of controlling the cleaning robot
CN105310604A (en) * 2014-07-30 2016-02-10 Lg电子株式会社 Robot cleaning system and method of controlling robot cleaner
CN106444786A (en) * 2016-11-29 2017-02-22 北京小米移动软件有限公司 Control method, device and electronic equipment of floor mopping robot
EP3508935A1 (en) * 2018-01-05 2019-07-10 iRobot Corporation System for spot cleaning by a mobile robot
CN109276190A (en) * 2018-10-23 2019-01-29 中国人民解放军陆军工程大学 UWB-based floor sweeping robot monitoring method and equipment thereof
CN110916574A (en) * 2019-10-18 2020-03-27 上海善解人意信息科技有限公司 Sweeping robot system and sweeping robot control method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022210911A1 (en) 2022-10-17 2024-04-18 Robert Bosch Gesellschaft mit beschränkter Haftung Method for determining a selection area in an environment for a mobile device
CN115500740A (en) * 2022-11-18 2022-12-23 科大讯飞股份有限公司 Cleaning robot and cleaning robot control method

Also Published As

Publication number Publication date
CN114680740B (en) 2023-08-08

Similar Documents

Publication Publication Date Title
US11709497B2 (en) Method for controlling an autonomous mobile robot
CN109643127B (en) Map construction, positioning, navigation and control method and system, and mobile robot
WO2019114219A1 (en) Mobile robot and control method and control system thereof
US10545497B1 (en) Control method and device for mobile robot, mobile robot
CN108247647B (en) Cleaning robot
CN110989631B (en) Self-moving robot control method, device, self-moving robot and storage medium
WO2019232806A1 (en) Navigation method, navigation system, mobile control system, and mobile robot
WO2022027869A1 (en) Robot area dividing method based on boundary, chip, and robot
CN105407774B (en) Automatic sweeping system, sweeping robot and the method for controlling sweeping robot
US11116374B2 (en) Self-actuated cleaning head for an autonomous vacuum
CN114680740A (en) Cleaning control method and device, intelligent equipment, mobile equipment and server
CN109421067A (en) Robot virtual boundary
CN105760106A (en) Interaction method and interaction device of intelligent household equipment
WO2003102706A1 (en) Remotely-operated robot, and robot self position identifying method
CN207115193U (en) A kind of mobile electronic device for being used to handle the task of mission area
CN108903816A (en) A kind of cleaning method, controller and intelligent cleaning equipment
CN109200576A (en) Somatic sensation television game method, apparatus, equipment and the storage medium of robot projection
CN207067803U (en) A kind of mobile electronic device for being used to handle the task of mission area
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
CN111142531A (en) Household appliance linkage-based cleaning robot control method and cleaning robot
CN108089713A (en) A kind of interior decoration method based on virtual reality technology
Xompero et al. Multi-view shape estimation of transparent containers
CN114680739B (en) Cleaning control method and device, intelligent equipment, mobile terminal and server
CN110122958A (en) Three-dimensional scanner and application method
CN107145220A (en) Man-machine interaction self-adapting regulation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant