CN108903816A - A kind of cleaning method, controller and intelligent cleaning equipment - Google Patents


Info

Publication number
CN108903816A
Authority
CN
China
Prior art keywords
user
position coordinates
house
cleaned
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810642601.0A
Other languages
Chinese (zh)
Inventor
王文斌
李亚军
黄俊岚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Wind Communication Technologies Co Ltd
Original Assignee
Shanghai Wind Communication Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Wind Communication Technologies Co Ltd filed Critical Shanghai Wind Communication Technologies Co Ltd
Priority to CN201810642601.0A priority Critical patent/CN108903816A/en
Publication of CN108903816A publication Critical patent/CN108903816A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061: Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures


Abstract

Embodiments of the present invention relate to the technical field of smart homes, and disclose a cleaning method, a controller, and an intelligent cleaning device. The cleaning method provided by the invention includes: building a house model of the house to be cleaned, in which a reference coordinate system is preset; acquiring the spatial position coordinates of several parts of the user's hand in the reference coordinate system; determining the user's gesture vector from the acquired spatial position coordinates; determining the intersection coordinates of the gesture vector and the house model; calculating the area to be cleaned from the intersection coordinates; and sending the area to be cleaned to the intelligent cleaning device so that the device cleans it. The intelligent cleaning device cleans the range delimited by the user's gesture, so only the places in the house that need cleaning are cleaned; the method is better targeted and more intelligent, and because the user delimits the cleaning range with a gesture, it is convenient to use and improves the user experience.

Description

Cleaning method, controller and intelligent cleaning equipment
Technical Field
The embodiment of the invention relates to the technical field of intelligent home furnishing, in particular to a cleaning method, a controller and intelligent cleaning equipment.
Background
A floor-sweeping robot, also called a lazy man's sweeper, is an intelligent household appliance that automatically vacuums dust from the floor. It is called a robot because it can detect factors such as room size, furniture placement, and floor cleanliness, plan a reasonable cleaning route by means of a built-in program, and thereby exhibit a certain degree of intelligence. As a forerunner of the new concept of the smart home, the sweeping robot injects forward momentum into the goal of robots eventually entering thousands of households. Of course, in addition to automatic vacuuming, existing sweeping robots also have mechanisms such as a water tank for mopping the floor.
The inventor has found at least the following problem in the prior art: when an existing sweeping robot cleans, it cannot clean only a particular region; it is not well targeted and not intelligent enough, which degrades the user experience.
Disclosure of Invention
Embodiments of the present invention aim to provide a cleaning method, a controller, and an intelligent cleaning device, so that the intelligent cleaning device cleans a cleaning range delimited by the user's gestures. Only the places in the house that need cleaning are then cleaned, the method is better targeted and more intelligent, and the user can delimit the cleaning range by gesture, which is convenient for the user and improves the user experience.
In order to solve the above technical problem, an embodiment of the present invention provides a cleaning method, including: building a house model of a house to be cleaned, wherein a reference coordinate system is preset in the house model; acquiring spatial position coordinates of a plurality of parts of the hand of the user in a reference coordinate system; determining a gesture vector of the user according to the acquired space position coordinates; determining the coordinates of the intersection point of the gesture vector and the house model; calculating an area to be cleaned according to the position coordinates of the intersection points; the area to be cleaned is sent to the intelligent cleaning equipment for the intelligent cleaning equipment to clean.
An embodiment of the present invention also provides a controller, including: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the cleaning method.
The embodiment of the invention also provides intelligent cleaning equipment, and the intelligent cleaning equipment comprises the controller.
Compared with the prior art, the embodiment of the invention provides a cleaning method, which comprises the steps of establishing a house model for a house to be cleaned, presetting a reference coordinate system in the house model, acquiring spatial position coordinates of a plurality of parts of a hand of a user in the reference coordinate system, determining a gesture vector of the user according to the acquired spatial position coordinates, determining intersection point position coordinates of the gesture vector and the house model, calculating a region to be cleaned according to the intersection point position coordinates, namely calculating a cleaning range defined by a gesture of the user, and sending the region to be cleaned to intelligent cleaning equipment for cleaning. Because current robot of sweeping floor cleans when cleaning, can not only clean to a certain region, and the pertinence is not strong, and is not intelligent enough, influences user experience. And intelligence cleans the scope of cleaning that equipment can be demarcated to user's gesture in this application, has realized only cleaning the place that need clean in the house, and the pertinence is stronger, and more intelligence, and the user with the gesture demarcation clean the scope, convenience of customers's use has improved user experience.
In addition, the step of calculating the area to be cleaned according to the coordinates of the intersection position specifically comprises the following steps: if the intersection point position coordinate is one, the area to be cleaned is an area which takes the intersection point position coordinate as an original point and takes the preset distance as a radius. A way of determining the area to be cleaned from the position coordinates of the single intersection point is provided.
In addition, the step of calculating the area to be cleaned according to the coordinates of the intersection position specifically comprises the following steps: if the intersection point position coordinates are multiple, the area to be cleaned is an area defined by the intersection point position coordinates. A manner is provided by which to determine an area to be cleaned based on a plurality of intersection location coordinates.
In addition, before the step of obtaining the spatial position coordinates of a plurality of parts of the hand of the user in the reference coordinate system, the method further comprises the following steps: acquiring voice information of a user; and when the voice information is recognized to contain the preset keywords, the step of acquiring the space position coordinates of a plurality of parts of the hand of the user in the reference coordinate system is executed. When the preset keywords are identified, the spatial position coordinates of the parts of the user hand in the house reference coordinate system are obtained, and the problem of high energy consumption caused by obtaining the spatial position coordinates in real time is solved.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals denote similar elements; the figures are not drawn to scale unless otherwise specified.
FIG. 1 is a detailed flowchart of a sweeping method according to a first embodiment of the present invention;
FIG. 2 is a detailed flowchart of a sweeping method according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of a controller according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an intelligent sweeping apparatus according to a fourth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in these embodiments to help the reader understand the present application; the claimed technical solution can nevertheless be implemented without some of these details, and with various changes and modifications based on the following embodiments.
The first embodiment of the invention relates to a cleaning method: a house model is built for the house to be cleaned, with a reference coordinate system preset in the model; the spatial position coordinates of a plurality of parts of the user's hand in that coordinate system are acquired; the user's gesture vector is determined from the acquired coordinates; the intersection coordinates of the gesture vector and the house model are determined; the area to be cleaned is calculated from the intersection coordinates; and the area to be cleaned is sent to the intelligent cleaning device for cleaning. An existing sweeping robot cannot clean only a particular region when it cleans; it is not well targeted or intelligent enough, which degrades the user experience. In this application, the intelligent cleaning device cleans the range delimited by the user's gesture, so only the places in the house that need cleaning are cleaned; the method is better targeted and more intelligent, and delimiting the cleaning range by gesture is convenient for the user and improves the user experience. The implementation of the cleaning method of this embodiment is described in detail below; these details are provided only for ease of understanding and are not essential to practising the embodiment.
A specific flowchart of a cleaning method in the present embodiment is shown in fig. 1, and specifically includes:
step 101: and establishing a house model for the house to be cleaned, wherein a reference coordinate system is preset in the house model.
Specifically, a house model, i.e. a three-dimensional model of the house, is built in advance for the house to be cleaned. There are two ways to build the model. The first uses cameras inside the house to obtain photographs of the house from different angles, from which the house model is derived. For example, two cameras are arranged at two corners of the house to be cleaned; each camera photographs the house, yielding two photographs from different angles that together cover the whole house, and a computing device performs image processing and three-dimensional computation on these photographs to generate the three-dimensional model of the house. Preferably, four cameras are arranged at the four corners of the house, yielding four photographs from different angles for the image processing and three-dimensional computation; building the model from photographs taken at more angles makes the resulting three-dimensional model more accurate. The second way uses a mobile terminal to obtain a panoramic photograph of the house interior, from which the house model is derived: the camera of a mobile device such as a mobile phone or an iPad scans the whole house to obtain a panoramic photograph, and image processing and three-dimensional computation on that photograph generate the three-dimensional model of the house.
It should be noted that building a three-dimensional model of a house from photographs using image-based modelling techniques is prior art and is not described further here. After the three-dimensional model of the house to be cleaned is established, a reference coordinate system is established with a chosen point in the house model as the coordinate origin, so that the spatial position coordinates of the hand in the house can be determined subsequently.
Step 102: spatial position coordinates of a plurality of parts of the hand of the user in a reference coordinate system are obtained.
Specifically, at least two cameras at different angles in the house obtain pictures of the user, the user's hand is identified using image-recognition technology, the camera angles are adjusted so that the cameras are aimed at the hand, and the distance between each camera and the hand is measured with an infrared sensor on the camera. Because the positions of the cameras in the house are fixed, the spatial position coordinates of the user's hand in the house reference coordinate system can be computed from the angle of the camera-to-hand line relative to the camera's horizontal plane and the measured camera-to-hand distance.
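The angle-plus-distance geometry above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, a single camera, and degree-valued angles are assumptions, and a real system would fuse readings from at least two cameras.

```python
import math

def hand_position(cam_pos, azimuth_deg, elevation_deg, distance):
    """Convert one camera's pointing angles and its infrared range reading
    into hand coordinates in the house reference coordinate system.

    cam_pos       -- (x, y, z) of the camera, fixed and known in advance
    azimuth_deg   -- bearing of the camera-to-hand line in the horizontal plane
    elevation_deg -- angle of that line above the camera's horizontal plane
    distance      -- camera-to-hand distance reported by the infrared sensor
    """
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    dx = distance * math.cos(el) * math.cos(az)
    dy = distance * math.cos(el) * math.sin(az)
    dz = distance * math.sin(el)
    return (cam_pos[0] + dx, cam_pos[1] + dy, cam_pos[2] + dz)
```

In practice the coordinates obtained from the two (or four) cameras would be averaged or otherwise fused to reduce measurement noise.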
Further, before the step of obtaining the spatial position coordinates of the plurality of parts of the hand of the user in the reference coordinate system, the method further comprises the following steps: acquiring voice information of a user; and when the voice information is recognized to contain the preset keywords, the step of acquiring the space position coordinates of a plurality of parts of the hand of the user in the reference coordinate system is executed.
Specifically, before the spatial position coordinates of the parts of the user's hand in the house reference coordinate system are acquired, i.e. before the house cameras are switched on for that purpose, the user's voice information is acquired first and semantically recognised to determine whether it contains a preset keyword; the cameras are switched on only when it does, which avoids the high energy consumption of keeping the cameras on at all times. A preset keyword is a word indicating that the user wants a specific area cleaned. For example, it may be a single word such as "this place" or "sweep", or a short sentence such as "sweep this place". These keywords are only examples and can be set by the user. Preferably, the keywords should fit the user's speaking habits as closely as possible, for the user's convenience.
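The keyword gate above can be sketched minimally as follows, assuming the speech recogniser already yields an English transcript; the keyword list and function name are illustrative, not part of the patent.

```python
# Illustrative preset keywords; in the patent these are user-configurable
# words or short sentences such as "this place" or "sweep this place".
WAKE_KEYWORDS = ("sweep", "this place", "clean here")

def should_locate_hand(transcript):
    """Return True when the recognised speech contains a preset keyword,
    i.e. when the house cameras should be switched on to locate the hand."""
    text = transcript.lower()
    return any(keyword in text for keyword in WAKE_KEYWORDS)
```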
Step 103: and determining the gesture vector of the user according to the acquired space position coordinates.
Specifically, after the spatial position coordinates of a plurality of parts of the user's hand in the reference coordinate system have been acquired using at least two cameras at different angles in the house, the user's gesture vector is determined from those coordinates. Acquiring the coordinates specifically means acquiring the spatial position coordinates of the wrist and of a finger in the reference coordinate system; the gesture vector is then the line pointing from the wrist coordinate to the finger coordinate. One feature point is taken on the user's wrist and one on the finger, and their spatial position coordinates in the reference coordinate system are acquired, which determines the gesture vector of the user's hand in the house. Preferably, several feature points are taken on the wrist, the palm, and the fingers, and their positions in the reference coordinate system are obtained, so that the plane formed by the hand in the house can be located; the resulting pointing direction of the hand, i.e. the gesture vector, is then more accurate.
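The wrist-to-finger construction can be sketched as a normalised direction vector. This is a minimal two-point illustration; a multi-point version would fit a direction through the several feature points mentioned above.

```python
import math

def gesture_vector(wrist, finger):
    """Unit vector pointing from the wrist feature point to the finger
    feature point, both (x, y, z) in the house reference coordinate system."""
    v = (finger[0] - wrist[0], finger[1] - wrist[1], finger[2] - wrist[2])
    norm = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    if norm == 0.0:
        raise ValueError("wrist and finger coincide; direction undefined")
    return (v[0] / norm, v[1] / norm, v[2] / norm)
```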
Step 104: and determining the coordinates of the intersection point of the gesture vector and the house model.
Specifically, because the house model has been obtained in advance and the reference coordinate system established in it, once the gesture vector of the user's hand in the reference coordinate system is determined, the vector is extended along its direction and the position where the extension line intersects the house model is determined from the established reference coordinate system. The intersection coordinate is thus the coordinate of the position in the house at which the user's gesture points.
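A simplified sketch of the ray-extension step: as an illustrative assumption the floor of the house model is taken to be the plane z = 0, whereas the patent intersects the ray with the full three-dimensional house model.

```python
def floor_intersection(origin, direction):
    """Extend the gesture vector from `origin` along `direction` and return
    where the extension line meets the floor plane z = 0, or None if the ray
    is parallel to the floor or points upwards.  A full implementation would
    intersect the ray with the whole 3-D house model (walls and furniture
    included), not just one plane."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz >= 0.0:
        return None
    t = -oz / dz          # ray parameter at which z reaches 0
    return (ox + t * dx, oy + t * dy, 0.0)
```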
Step 105: and calculating the area to be cleaned according to the intersection position coordinates, wherein the area to be cleaned is an area which takes the intersection position coordinates as an original point and takes the preset distance as a radius.
Specifically, after the gesture vector has been determined from the acquired spatial position coordinates and its intersection with the house model computed, if there is exactly one intersection coordinate, the area to be cleaned is the area whose origin is that intersection coordinate and whose radius is a preset distance. This provides one way of determining the area to be cleaned from a single intersection coordinate. The preset distance can be set by the user; a larger distance helps ensure that the place in the house needing cleaning is covered as completely as possible.
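The single-intersection case reduces to a circle on the floor; a membership test for that region can be sketched as follows (names are illustrative).

```python
def in_circular_region(point, center, radius):
    """True when floor point `point` lies within the area to be cleaned,
    defined by the single intersection coordinate `center` (the origin of
    the area) and the user-preset `radius`; all values are floor-plane
    (x, y) coordinates in the reference coordinate system."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return dx * dx + dy * dy <= radius * radius
```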
Step 106: the area to be cleaned is sent to the intelligent cleaning equipment for the intelligent cleaning equipment to clean.
Specifically, the determined origin position coordinates (i.e., intersection position coordinates) of the area to be cleaned and the preset distance value are sent to the intelligent cleaning equipment, so that the intelligent cleaning equipment can clean the area.
Compared with the prior art, this embodiment provides a cleaning method: a house model is built for the house to be cleaned, with a reference coordinate system preset in the model; the spatial position coordinates of a plurality of parts of the user's hand in that coordinate system are acquired; the user's gesture vector is determined from the acquired coordinates; the intersection coordinates of the gesture vector and the house model are determined; the area to be cleaned is calculated from the intersection coordinates, i.e. the cleaning range delimited by the user's gesture; and the area to be cleaned is sent to the intelligent cleaning device for cleaning. An existing sweeping robot cannot clean only a particular region when it cleans; it is not well targeted or intelligent enough, which degrades the user experience. In this application, by contrast, the intelligent cleaning device cleans the range delimited by the user's gesture, so only the places in the house that need cleaning are cleaned; the method is better targeted and more intelligent, and delimiting the cleaning range by gesture is convenient for the user and improves the user experience.
A second embodiment of the present invention relates to a cleaning method. The second embodiment is substantially the same as the first embodiment, and mainly differs in that: in the first embodiment, the step of calculating the area to be cleaned according to the coordinates of the intersection position specifically includes: if the intersection point position coordinate is one, the area to be cleaned is an area which takes the intersection point position coordinate as an original point and takes the preset distance as a radius. In this embodiment, the step of calculating the area to be cleaned according to the coordinates of the intersection position specifically includes: if the intersection point position coordinates are multiple, the area to be cleaned is an area defined by the intersection point position coordinates. Another way of determining the area to be cleaned based on the coordinates of the location of the plurality of intersection points is provided.
Step 201: and establishing a house model for the house to be cleaned, wherein a reference coordinate system is preset in the house model.
Step 202: spatial position coordinates of a plurality of parts of the hand of the user in a reference coordinate system are obtained.
Step 203: and determining the gesture vector of the user according to the acquired space position coordinates.
Step 204: and determining the coordinates of the intersection point of the gesture vector and the house model.
The steps 201 to 204 are substantially the same as the steps 101 to 104 in the first embodiment, and are not repeated herein.
Step 205: and calculating the area to be cleaned according to the intersection position coordinates, wherein the area to be cleaned is defined by a plurality of intersection position coordinates.
Specifically, after the gesture vector has been determined from the acquired spatial position coordinates and its intersections with the house model computed, if there are multiple intersection coordinates, the area to be cleaned is the area enclosed by them. This provides a way of determining the area to be cleaned from a plurality of intersection coordinates. The user's hand can move, the intersection coordinates of the different hand positions with the house model during the movement are acquired, and the determined area to be cleaned is the area enclosed by those coordinates. For example, if the user's hand traces a circle, the determined area to be cleaned is the circular area enclosed by the resulting intersection coordinates.
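For the multi-intersection case, the enclosed area can be treated as a polygon on the floor. A standard ray-casting membership test is sketched below; the patent does not prescribe this particular algorithm, so it is only one plausible realisation.

```python
def in_polygon_region(point, vertices):
    """Ray-casting point-in-polygon test.  `vertices` are the intersection
    coordinates traced by the user's hand, as (x, y) floor points in order;
    returns True when `point` lies inside the enclosed area to be cleaned."""
    x, y = point
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # Count edges crossed by a horizontal ray cast to the right of `point`
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```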
Step 206: the area to be cleaned is sent to the intelligent cleaning equipment for the intelligent cleaning equipment to clean.
Specifically, a plurality of determined intersection position coordinates of the area to be cleaned are sent to the intelligent cleaning equipment, and the intelligent cleaning equipment can clean according to the intersection position coordinates.
Compared with the prior art, the embodiment provides a cleaning method, and the step of calculating the area to be cleaned according to the coordinates of the intersection position specifically includes: if the intersection point position coordinates are multiple, the area to be cleaned is an area defined by the intersection point position coordinates. A manner is provided by which to determine an area to be cleaned based on a plurality of intersection location coordinates.
The steps of the above methods are divided as they are for clarity of description; in implementation they may be combined into a single step, or a step may be split into several, as long as the same logical relationship is preserved, and all such variants fall within the protection scope of this patent. Adding insignificant modifications to the algorithm or process, or introducing insignificant design changes without altering the core design, likewise falls within the protection scope of the patent.
A third embodiment of the present invention relates to a controller including: at least one processor 301; and a memory 302 communicatively coupled to the at least one processor 301; the memory 302 stores instructions executable by the at least one processor 301, and the instructions are executed by the at least one processor 301, so that the at least one processor 301 can execute the cleaning method according to any of the above embodiments.
The memory 302 and the processor 301 are coupled by a bus, which may comprise any number of interconnected buses and bridges linking the various circuits of the processor 301 and the memory 302. The bus may also connect various other circuits, such as peripherals, voltage regulators, and power-management circuits, which are well known in the art and therefore not described further herein. A bus interface provides an interface between the bus and a transceiver. The transceiver may be one element or a plurality of elements, such as multiple receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. Data processed by the processor 301 is transmitted over a wireless medium through an antenna, which also receives data and forwards it to the processor 301.
The processor 301 is responsible for managing the bus and for general processing, and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. The memory 302 may be used to store data used by the processor 301 in performing operations.
A fourth embodiment of the present invention relates to an intelligent cleaning apparatus including the controller of the above embodiment.
Those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the relevant hardware, where the program is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (9)

1. A cleaning method, comprising:
building a house model of a house to be cleaned, wherein a reference coordinate system is preset in the house model;
acquiring spatial position coordinates of a plurality of parts of the hand of the user in the reference coordinate system;
determining a gesture vector of the user according to the acquired spatial position coordinates;
determining intersection point position coordinates of the gesture vector and the house model;
calculating an area to be cleaned according to the position coordinates of the intersection points;
and sending the area to be cleaned to intelligent cleaning equipment for cleaning by the intelligent cleaning equipment.
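The geometric core of the claimed steps can be sketched as follows; the helper names are hypothetical (not from the patent), and the sketch assumes the floor of the house model is the plane z = 0 in the reference coordinate system:

```python
def gesture_vector(wrist, fingertip):
    """Direction from the wrist coordinate toward the fingertip coordinate."""
    return tuple(f - w for w, f in zip(wrist, fingertip))

def intersect_floor(origin, direction):
    """Intersection of the gesture ray with the floor plane z = 0, or None."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None          # ray is parallel to the floor
    t = -oz / dz
    if t < 0:
        return None          # floor lies behind the pointing direction
    return (ox + t * dx, oy + t * dy)

# Example: wrist at (0, 0, 1.2) m, fingertip at (0.2, 0, 1.1) m;
# the ray meets the floor at approximately (2.4, 0.0).
v = gesture_vector((0.0, 0.0, 1.2), (0.2, 0.0, 1.1))
print(intersect_floor((0.0, 0.0, 1.2), v))
```

A full implementation would intersect the ray against every surface of the house model, not only the floor plane, and keep the nearest hit.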
2. The cleaning method according to claim 1, wherein the step of calculating the area to be cleaned from the intersection position coordinates specifically includes:
if there is a single intersection position coordinate, the area to be cleaned is a circular area centered on that coordinate with a preset distance as its radius.
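The single-point case of claim 2 reduces to a point-in-disc test; a minimal sketch with hypothetical names (the center and radius values are illustrative only):

```python
import math

def in_circular_region(point, center, radius):
    """True if a floor point lies within the preset radius of the
    single gesture/floor intersection coordinate."""
    return math.dist(point, center) <= radius

center = (2.4, 0.0)      # example single intersection coordinate
assert in_circular_region((2.4, 0.5), center, radius=1.0)
assert not in_circular_region((4.0, 0.0), center, radius=1.0)
```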
3. The cleaning method according to claim 1, wherein the step of calculating the area to be cleaned from the intersection position coordinates specifically includes:
if there are multiple intersection position coordinates, the area to be cleaned is the area enclosed by the multiple intersection position coordinates.
4. The cleaning method according to claim 1, further comprising, before the step of obtaining spatial position coordinates of a plurality of portions of the user's hand in the reference coordinate system:
acquiring voice information of a user;
when the voice information is recognized to contain a preset keyword, executing the step of acquiring spatial position coordinates of a plurality of parts of the user's hand in the reference coordinate system.
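The keyword gate of claim 4 can be sketched as a simple substring check over the recognized text; the trigger phrases and function name below are illustrative assumptions, not from the patent:

```python
PRESET_KEYWORDS = ("clean here", "sweep there")   # example trigger phrases

def should_capture_hand_coordinates(recognized_text):
    """Gate the hand-coordinate acquisition step on a preset keyword
    appearing in the recognized voice information."""
    text = recognized_text.lower()
    return any(keyword in text for keyword in PRESET_KEYWORDS)

assert should_capture_hand_coordinates("Please clean here")
assert not should_capture_hand_coordinates("What time is it")
```

In practice the recognition result would come from a speech-to-text engine, and matching might allow fuzzier phrasing than exact substrings.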
5. The cleaning method according to claim 1, wherein the step of obtaining spatial position coordinates of a plurality of parts of the user's hand in the reference coordinate system comprises: acquiring the spatial position coordinates of the user's wrist and the spatial position coordinates of the user's finger in the reference coordinate system;
the step of determining the gesture vector of the user according to the acquired spatial position coordinates specifically includes:
the gesture vector is a connecting line pointing from the spatial position coordinate of the wrist to the spatial position coordinate of the finger.
6. The cleaning method according to claim 1, wherein the step of building a house model of the house to be cleaned specifically comprises:
acquiring photos of the house from different angles by using a camera in the house to be cleaned;
and obtaining the house model according to the photos from the different angles.
7. The cleaning method according to claim 1, wherein the step of building a house model of the house to be cleaned specifically comprises:
acquiring a panoramic photo in the house by using a mobile terminal;
and obtaining the house model according to the panoramic photo.
8. A controller, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the cleaning method of any one of claims 1 to 7.
9. An intelligent cleaning device, characterized in that it comprises the controller of claim 8.
CN201810642601.0A 2018-06-21 2018-06-21 A kind of cleaning method, controller and intelligent cleaning equipment Pending CN108903816A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810642601.0A CN108903816A (en) 2018-06-21 2018-06-21 A kind of cleaning method, controller and intelligent cleaning equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810642601.0A CN108903816A (en) 2018-06-21 2018-06-21 A kind of cleaning method, controller and intelligent cleaning equipment

Publications (1)

Publication Number Publication Date
CN108903816A true CN108903816A (en) 2018-11-30

Family

ID=64420355

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810642601.0A Pending CN108903816A (en) 2018-06-21 2018-06-21 A kind of cleaning method, controller and intelligent cleaning equipment

Country Status (1)

Country Link
CN (1) CN108903816A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113116224A (en) * 2020-01-15 2021-07-16 科沃斯机器人股份有限公司 Robot and control method thereof
CN113126632A (en) * 2020-01-15 2021-07-16 科沃斯机器人股份有限公司 Virtual wall defining and operating method, equipment and storage medium
CN113116224B (en) * 2020-01-15 2022-07-05 科沃斯机器人股份有限公司 Robot and control method thereof
CN113126632B (en) * 2020-01-15 2022-08-16 科沃斯机器人股份有限公司 Virtual wall defining and operating method, equipment and storage medium
CN111830998A (en) * 2020-06-05 2020-10-27 科沃斯机器人股份有限公司 Operation method, virtual wall adding method, equipment and storage medium
CN114680739A (en) * 2020-12-29 2022-07-01 美的集团股份有限公司 Cleaning control method and device, intelligent equipment, mobile terminal and server
CN114680739B (en) * 2020-12-29 2023-08-04 美的集团股份有限公司 Cleaning control method and device, intelligent equipment, mobile terminal and server
WO2023123457A1 (en) * 2021-12-31 2023-07-06 深圳市闪至科技有限公司 Robot control method and apparatus, method and apparatus for controlling robot to return to base, and robot

Similar Documents

Publication Publication Date Title
CN108903816A (en) A kind of cleaning method, controller and intelligent cleaning equipment
US11027425B1 (en) Space extrapolation for robot task performance
CN111528732B (en) Cleaning robot operation control method, device and system and storage medium
CN113284240A (en) Map construction method and device, electronic equipment and storage medium
US20230057965A1 (en) Robot and control method therefor
CN112075879A (en) Information processing method, device and storage medium
CN108803586B (en) Working method of sweeping robot
US10938912B2 (en) Sweeper, server, sweeper control method and sweeper control system
CN113080768A (en) Sweeper control method, sweeper control equipment and computer readable storage medium
CN115205470B (en) Continuous scanning repositioning method, device, equipment, storage medium and three-dimensional continuous scanning method
WO2022028110A1 (en) Map creation method and apparatus for self-moving device, and device and storage medium
US11132002B2 (en) Method and device for displaying motion path of robot and robot
CN113116229A (en) Robot control method and device, sweeping robot and storage medium
CN114359692A (en) Room identification method and device, electronic equipment and storage medium
CN112333439B (en) Face cleaning equipment control method and device and electronic equipment
CN110962132B (en) Robot system
CN110919644B (en) Method and system for positioning interaction by using camera equipment and robot
CN114680740A (en) Cleaning control method and device, intelligent equipment, mobile equipment and server
CN112655021A (en) Image processing method, image processing device, electronic equipment and storage medium
WO2022089548A1 (en) Service robot and control method therefor, and mobile robot and control method therefor
CN115670308A (en) Sweeping method, device, equipment and storage medium of sweeping robot
CN111419117B (en) Returning control method of visual floor sweeping robot and visual floor sweeping robot
US20170024635A1 (en) Image processing system and method for identifying content within image data
CN111475018B (en) Control method and device of sweeping robot, sweeping robot and electronic equipment
CN109986577A (en) Diet nursing method and apparatus based on robot operating system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination