CN113907652B - Cleaning robot control method, chip and cleaning robot - Google Patents

Cleaning robot control method, chip and cleaning robot

Info

Publication number
CN113907652B
CN113907652B (application CN202111229047.1A)
Authority
CN
China
Prior art keywords
cleaning robot
area
user
cleaning
users
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111229047.1A
Other languages
Chinese (zh)
Other versions
CN113907652A (en)
Inventor
许登科
唐以廷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202111229047.1A
Publication of CN113907652A
Application granted
Publication of CN113907652B
Legal status: Active

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/28 Floor-scrubbing machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4005 Arrangements of batteries or cells; Electric power supply arrangements
    • A47L11/4008 Arrangements of switches, indicators or the like
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation

Abstract

The invention discloses a cleaning robot control method, a chip, and a cleaning robot. When a human voice is detected, the method distinguishes user categories through face recognition, then detects the users' motion state and switches to an appropriate cleaning gear to lower the cleaning robot's noise. This reduces the degree to which the working robot disturbs specific categories of users while still guaranteeing the cleaning effect and improving the user experience.

Description

Cleaning robot control method, chip and cleaning robot
Technical Field
The invention relates to the field of intelligent robots, and in particular to a cleaning robot control method, a chip, and a cleaning robot.
Background
With the rapid development of science and technology, more and more smart household appliances are entering ordinary homes, greatly improving the comfort and convenience of daily life. As a household device, the sweeping robot has become widely popular because it reduces the intensity of housework and improves its efficiency. In typical use, a conventional sweeping robot receives the user's start or stop instruction and then moves automatically through the current scene, for example through the current room or a public area within a set range, along a set path, performing the sweeping operation until every space in the scene has been cleaned.
An existing sweeping robot collects dust by spinning a suction fan at high speed, and that high-speed rotation produces considerable noise. Although sweeping robots are becoming ever more intelligent, the noise cannot be eliminated completely, so in certain scenes it seriously disturbs the user and degrades the experience. Most prior solutions reduce noise by optimizing the robot's structure, for example fitting the suction fan or air duct with noise-damping measures such as soundproof cotton. However, this removes only part of the noise, and the effect is limited. Some other techniques reduce noise by controlling the robot's behavior: the document published as CN106137043B adjusts the fan speed according to the amount of dust, and the document published as CN110946510A switches the working mode by time period. No published solution so far achieves noise reduction by distinguishing user categories.
Disclosure of Invention
To solve the above problems, the present invention provides a cleaning robot control method, a chip, and a cleaning robot that can reduce the degree to which the cleaning robot disturbs users. The specific technical scheme of the invention is as follows:
A cleaning robot control method comprising the following steps. Step S1: during cleaning, the cleaning robot detects whether a human voice is present; if no voice is detected, it cleans according to the preset plan, and if a voice is detected, step S2 is entered. Step S2: the cleaning robot determines the direction of the user, then activates a vision sensor to further determine the area where the user is located and performs user recognition; if only first-class users are recognized, the cleaning robot cleans the area in the mute gear, and if second-class users are recognized, step S3 is entered; the cleaning robot stores a global map containing the divided-area information. Step S3: the cleaning robot judges the motion state of the second-class users; if they are moving within the area, it cleans the area in the mute gear; if they are stationary within the area, it skips cleaning that area, cleans the directly adjacent areas in the mute gear, and then enters step S4. Step S4: after finishing all areas other than that area, the cleaning robot detects whether second-class users are still present there; if not, it cleans the area and then returns to its base; if so, it returns to the base directly and then revisits the area at preset intervals for detection until the area has been cleaned. Compared with the prior art, this scheme classifies users, reducing the degree to which the working cleaning robot disturbs specific categories of users while still guaranteeing the cleaning effect and improving the user experience.
Further, the cleaning robot has three cleaning gears: a mute gear, a standard gear, and a strong gear, where the mute gear is the quietest, the standard gear is next, and the strong gear is the loudest. The different cleaning gears suit different usage scenes.
Further, in the step S2, the cleaning robot determines the user orientation using a sound source localization technique.
Further, in step S2, the cleaning robot performs user recognition by capturing the user's face with the vision sensor and running face recognition: if a matching user exists in the database, the person is judged to be a first-class user; if there is no match, a second-class user. Distinguishing user categories allows a more appropriate noise reduction strategy to be chosen.
Further, in step S3, the cleaning robot judges the motion state of a second-class user by determining the relative position between that user and other objects or users in the area: if the relative position changes, the user is in a moving state within the area; if it does not change, the user is in a static state. Recognizing different user behaviors leads to a more appropriate noise reduction strategy.
Further, in step S4, when no second-class user remains in the area and the cleaning robot cleans it, the robot uses the standard gear if a human voice is detected in the area and the strong gear if not. Using the strong gear allows deep cleaning of residual debris once the users have left.
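The branching of steps S1 to S3 described above can be sketched as a small decision function. The function and action names, the `Gear` enum, and the `None` gear for the preset plan are illustrative assumptions; only the branching itself mirrors the claimed method.

```python
from enum import Enum

class Gear(Enum):
    """The three cleaning gears, ordered by noise level."""
    MUTE = 1      # quietest
    STANDARD = 2
    STRONG = 3    # loudest

def plan_action(voice_detected, user_classes, second_class_moving):
    """Decide what to do with the area where a voice was heard (steps S1-S3).

    user_classes is the set of categories found by the vision sensor,
    e.g. {"first"} or {"first", "second"}.
    """
    if not voice_detected:
        return ("clean_per_preset_plan", None)           # S1: nobody talking
    if "second" not in user_classes:
        return ("clean_area", Gear.MUTE)                 # S2: only first-class users
    if second_class_moving:
        return ("clean_area", Gear.MUTE)                 # S3: guest moving, clean quietly
    return ("skip_area_clean_neighbors", Gear.MUTE)      # S3: guest stationary, defer area
```

A caller would feed this function the outputs of the voice detection, user recognition, and user state judgment modules described below.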
A cleaning robot provided with a vision sensor and implementing the cleaning robot control method, the cleaning robot comprising: a human voice detection module for detecting whether a voice is present and performing sound source localization of the user; a user recognition module for distinguishing first-class users from second-class users; a cleaning gear switching module for switching the robot's cleaning gear; and a user state judgment module for judging the motion state of second-class users. The cleaning robot stores a global map containing the divided-area information. Compared with the prior art, when a voice is detected this scheme distinguishes the user category with the user recognition module, detects the user's motion state with the user state judgment module, and switches to an appropriate cleaning gear to lower the robot's noise, reducing the disturbance to specific categories of users during operation while guaranteeing the cleaning effect and improving the user experience.
Further, the human voice detection module comprises a plurality of microphones arranged at preset angles and preset distances, satisfying the requirements of the sound source localization technique.
A chip storing a computer program which, when executed, implements the cleaning robot control method. Compared with the prior art, this scheme reduces the disturbance the working cleaning robot causes to specific categories of users while guaranteeing the cleaning effect and improving the user experience.
Drawings
Fig. 1 is a flowchart illustrating a method for controlling a cleaning robot according to an embodiment of the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings. It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 1, a cleaning robot controlling method includes the steps of:
Step S1: during cleaning, the cleaning robot detects whether a human voice is present; if no voice is detected, it cleans according to the preset plan, and if a voice is detected, step S2 is entered.
While executing step S1, the cleaning robot detects human voices in order to determine whether people are talking in the current scene. If people are chatting while the cleaning robot works nearby at high power, the resulting noise disturbs them and degrades the user experience. The voice can be detected with speech recognition techniques, which are prior art and not described in detail here. Note that the cleaning robot has three cleaning gears: a mute gear, a standard gear, and a strong gear, where the mute gear is the quietest, the standard gear is next, and the strong gear is the loudest. The different cleaning gears suit different usage scenes.
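As a placeholder for the prior-art voice check of step S1, a crude energy gate over one frame of microphone samples can mark where the check sits in the cleaning loop. The function name and the 0.02 RMS threshold are illustrative assumptions; a production robot would use a trained voice-activity or speech-recognition model instead.

```python
import math

def voice_present(frame, threshold=0.02):
    """Flag one frame of microphone samples (floats in [-1, 1]) as containing
    sound when its root-mean-square energy exceeds a threshold. Stand-in only:
    the patent relies on existing speech recognition techniques for step S1."""
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    return rms > threshold
```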
Step S2: the cleaning robot determines the direction of the user, then activates its vision sensor to further determine the area where the user is located and performs user recognition; if only first-class users are recognized, the cleaning robot cleans the area in the mute gear, and if second-class users are recognized, step S3 is entered. The cleaning robot stores a global map containing the divided-area information.
While performing step S2, the cleaning robot determines the user's direction with a sound source localization technique. Preferably, the user's position is confirmed with a method based on Time Difference of Arrival (TDOA) estimation. The principle can be summarized as follows: sound travels through air at a known speed, so the same sound arrives at microphones mounted at different positions with different phases, and the phase differences between the recordings of that sound yield its arrival-time differences. Given a time difference, the sound source must lie on a hyperboloid whose foci are the two microphone positions and whose parameter is the path-length difference corresponding to that time difference. Using multiple microphones yields multiple hyperboloids, and the sound source lies at their intersection.
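The intersection-of-hyperboloids idea above can be sketched in 2D with a brute-force grid search. The microphone layout, grid span, and step size are illustrative assumptions, and the arrival-time differences are given directly here, whereas a real robot would estimate them by cross-correlating the microphone signals.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def tdoa(source, mic_a, mic_b):
    """Arrival-time difference (seconds) of the same sound at two microphones."""
    return (dist(source, mic_a) - dist(source, mic_b)) / SPEED_OF_SOUND

def locate(mics, measured, span=3.0, step=0.05):
    """Grid-search the point whose TDOAs relative to microphone 0 best match
    the measured ones. Each measured difference constrains the source to one
    hyperbola with a microphone pair as foci; the best grid point sits where
    those curves intersect."""
    best, best_err = None, float("inf")
    n = int(span / step)
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            p = (i * step, j * step)
            err = sum((tdoa(p, m, mics[0]) - d) ** 2
                      for m, d in zip(mics[1:], measured))
            if err < best_err:
                best, best_err = p, err
    return best

# Four microphones a few decimetres apart; ideal TDOAs from a known source.
mics = [(0.0, 0.0), (0.3, 0.0), (0.0, 0.3), (0.3, 0.3)]
true_source = (1.0, 0.5)
measured = [tdoa(true_source, m, mics[0]) for m in mics[1:]]
estimate = locate(mics, measured)
```

With noise-free differences the search recovers the source to within one grid step; real measurements would need a finer or iterative solver.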
After the user's direction is determined, the robot further identifies the area the user is in, a known area divided in advance. The cleaning robot then recognizes the user so that it can form an appropriate noise reduction strategy for each user category. To recognize a user, the cleaning robot captures the user's face with the vision sensor and runs face recognition: if a matching user exists in the database, the person is judged to be a first-class user; if there is no match, a second-class user. First-class users are mainly the household's residents, who, as the robot's owners, tolerate its noise more readily; their identity information is stored in the database and configured by the users themselves. Second-class users are mainly guests and need not be stored in the database. When guests are present in the home, the robot's noise may interrupt conversation, so different users call for different cleaning strategies. When cleaning an area occupied only by first-class users, the mute gear is used to reduce interference.
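The database match described above can be sketched as a nearest-embedding lookup. The embedding representation, the Euclidean metric, and the 0.6 distance threshold are illustrative assumptions; the patent does not prescribe a particular face-recognition model.

```python
import math

def classify_user(face_embedding, resident_db, threshold=0.6):
    """Classify a detected person as a first-class user (matches a stored
    resident) or a second-class user (no match, i.e. a guest).

    resident_db maps a resident's name to a stored face embedding; both the
    names and the vectors here are hypothetical.
    """
    def euclidean(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    for name, stored in resident_db.items():
        if euclidean(face_embedding, stored) < threshold:
            return ("first_class", name)   # matched a stored resident
    return ("second_class", None)          # unknown face, treat as guest
```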
Step S3: the cleaning robot judges the motion state of the second-class users; if they are moving within the area, it cleans the area in the mute gear; if they are stationary within the area, it skips cleaning that area, cleans the directly adjacent areas in the mute gear, and then enters step S4.
If second-class users are recognized in the area, a corresponding noise reduction strategy is then formed from their behavior. If such a user is moving, the area is still cleaned in order to guarantee the cleaning effect, since the user's movement is uncertain, but the mute gear is used to reduce the interference. If the user is stationary, for example sitting on a sofa chatting, the user's position is determined, so the cleaning robot temporarily skips cleaning that area and cleans the adjacent areas in the mute gear to minimize the interference. To judge the motion state of a second-class user, the cleaning robot determines the relative position between that user and other objects or users in the area: if the relative position changes, the user is in a moving state within the area; if it does not change, the user is in a static state. Recognizing different user behaviors allows a more appropriate noise reduction strategy.
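The relative-position check can be sketched as a drift test over a short history of positions measured against a fixed reference object (for example a sofa on the stored map). The sampling scheme and the 0.2 m tolerance are illustrative assumptions; the patent only requires deciding whether the relative position changes.

```python
import math

def second_class_state(positions, epsilon=0.2):
    """Judge a second-class user's motion state from positions (metres)
    recorded relative to a fixed object in the area. Returns "moving" if the
    user drifted more than epsilon from the first sample, else "static"."""
    ref = positions[0]
    drift = max(math.hypot(p[0] - ref[0], p[1] - ref[1]) for p in positions)
    return "moving" if drift > epsilon else "static"
```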
Step S4: after finishing all areas other than that area, the cleaning robot detects whether second-class users are still present there; if not, it cleans the area and then returns to its base; if so, it returns to the base directly and then revisits the area at preset time intervals for detection until the area has been cleaned.
While executing step S4, once no second-class user remains in the area and the cleaning robot cleans it, the robot uses the standard gear if a human voice is detected in the area and the strong gear if not. Using the strong gear allows deep cleaning of residual debris once the users have left.
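The step S4 branching just described can be sketched as one more decision function. The action labels and gear strings are illustrative assumptions; only the branching follows the text: retry later while guests remain, otherwise clean in the standard gear if someone is still talking nearby, or the strong gear if the room is empty.

```python
def step_s4(second_class_present, voice_in_area):
    """Decide how to handle the deferred area once all other areas are done.
    Returns an (action, gear) pair; gear is None when cleaning is postponed."""
    if second_class_present:
        return ("return_to_base_and_retry_later", None)
    gear = "standard" if voice_in_area else "strong"
    return ("clean_area_then_return_to_base", gear)
```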
The present invention also discloses a cleaning robot provided with a vision sensor, the cleaning robot comprising: a human voice detection module for detecting whether a voice is present and performing sound source localization of the user; a user recognition module for distinguishing first-class users from second-class users; a cleaning gear switching module for switching the robot's cleaning gear; and a user state judgment module for judging the motion state of second-class users. The cleaning robot stores a global map containing the divided-area information. To satisfy the requirements of the sound source localization technique, the human voice detection module comprises a plurality of microphones arranged at preset angles and preset distances. Compared with the prior art, when a voice is detected this scheme distinguishes the user category with the user recognition module, detects the user's motion state with the user state judgment module, and switches to an appropriate cleaning gear to lower the robot's noise, reducing the disturbance to specific categories of users during operation while guaranteeing the cleaning effect and improving the user experience.
The invention also discloses a chip that stores computer program code and can be installed in the mobile robot; when executed, the code implements the steps of the cleaning robot control method, or implements the functions of the modules in the mobile robot embodiment. Illustratively, the computer program code may be partitioned into one or more modules/units stored in and executed by the chip to realize the present application; such modules/units may be series of computer program instruction segments, each performing a specific function, that describe how the code executes in the mobile robot. For example, the code may be partitioned into the human voice detection module, user recognition module, cleaning gear switching module, and user state judgment module of the above embodiment. The chip reduces the disturbance the working cleaning robot causes to specific categories of users while guaranteeing the cleaning effect and improving the user experience.
Obviously, the above-mentioned embodiments are only a part of embodiments of the present invention, not all embodiments, and the technical solutions of the embodiments may be combined with each other. Furthermore, if terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., appear in the embodiments, their indicated orientations or positional relationships are based on those shown in the drawings only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred devices or elements must have a specific orientation or be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. If the terms "first", "second", "third", etc. appear in the embodiments, they are for convenience of distinguishing between related features, and they are not to be construed as indicating or implying any relative importance, order or number of features.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (8)

1. A cleaning robot control method, characterized by comprising the steps of:
step S1, during cleaning, the cleaning robot detects whether a human voice is present; if no voice is detected, cleaning proceeds according to a preset plan, and if a voice is detected, step S2 is entered;
step S2, the cleaning robot determines the direction of a user, then activates a vision sensor to further determine the area where the user is located and performs user recognition; if only first-class users are recognized, the cleaning robot cleans the area in a mute gear, and if second-class users are recognized, step S3 is entered; the cleaning robot stores a global map containing divided-area information; the cleaning robot captures the user's face with the vision sensor and performs face recognition, and if a matching user exists in a database the person is judged to be a first-class user, while if there is no match the person is judged to be a second-class user;
step S3, the cleaning robot judges the motion state of the second-class users; if they are in a moving state within the area, the cleaning robot cleans the area in the mute gear; if they are in a static state within the area, the cleaning robot skips cleaning the area, cleans the directly adjacent areas in the mute gear, and then enters step S4;
and step S4, after the cleaning robot finishes cleaning all areas other than the area, it detects whether second-class users are present in the area; if not, it cleans the area and then returns to its base, and if so, it returns to the base directly and then revisits the area at preset time intervals for detection until the cleaning of the area is finished.
2. The cleaning robot control method of claim 1, wherein the cleaning robot includes three cleaning gears, a mute gear, a standard gear and a strong gear, wherein the mute gear has the lowest noise, the standard gear the next lowest, and the strong gear the highest.
3. The cleaning robot controlling method of claim 1, wherein in the step S2, the cleaning robot determines the user orientation using a sound source localization technique.
4. The method as claimed in claim 1, wherein in step S3 the cleaning robot determines the relative position between a second-class user and other objects or users in the area; if the relative position changes, the second-class user is in a moving state in the area, and if the relative position does not change, the second-class user is in a static state in the area.
5. The method as claimed in claim 1, wherein in step S4, when the cleaning robot cleans the area with no second-class user present, it performs cleaning using the standard gear if a human voice is detected in the area, and using the strong gear if no human voice is detected.
6. A cleaning robot provided with a vision sensor for implementing the cleaning robot control method of any one of claims 1 to 5, characterized by comprising:
the human voice detection module is used for detecting whether a human voice is present and performing sound source localization of the user;
the user recognition module is used for distinguishing first-class users from second-class users;
the cleaning gear switching module is used for switching the cleaning gear of the cleaning robot;
the user state judgment module is used for judging the motion state of second-class users;
wherein the cleaning robot stores a global map containing the divided-area information.
7. The cleaning robot as claimed in claim 6, wherein the human voice detecting module includes a plurality of microphones, and the microphones are arranged at a predetermined angle and a predetermined distance.
8. A chip storing a computer program, wherein the computer program is executed to implement the cleaning robot control method according to any one of claims 1 to 5.
CN202111229047.1A 2021-10-21 2021-10-21 Cleaning robot control method, chip and cleaning robot Active CN113907652B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111229047.1A CN113907652B (en) 2021-10-21 2021-10-21 Cleaning robot control method, chip and cleaning robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111229047.1A CN113907652B (en) 2021-10-21 2021-10-21 Cleaning robot control method, chip and cleaning robot

Publications (2)

Publication Number Publication Date
CN113907652A (en) 2022-01-11
CN113907652B (en) 2023-03-10

Family

ID=79242249

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111229047.1A Active CN113907652B (en) 2021-10-21 2021-10-21 Cleaning robot control method, chip and cleaning robot

Country Status (1)

Country Link
CN (1) CN113907652B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114711693A (en) * 2022-05-09 2022-07-08 珠海格力电器股份有限公司 Path optimization method, nonvolatile storage medium, and cleaning system


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105640443A (en) * 2014-12-03 2016-06-08 小米科技有限责任公司 Muting work method and apparatus for automatic cleaning equipment, and electronic equipment
WO2016086586A1 (en) * 2014-12-03 2016-06-09 小米科技有限责任公司 Automatic cleaning device quiet operation method and device, and electronic apparatus
CN106155050A (en) * 2015-04-15 2016-11-23 小米科技有限责任公司 The mode of operation method of adjustment of intelligent cleaning equipment and device, electronic equipment
CN106338992A (en) * 2016-09-26 2017-01-18 海尔优家智能科技(北京)有限公司 Noise processing method and device
CN108628572A (en) * 2018-04-10 2018-10-09 平安科技(深圳)有限公司 Robot adjusts method, apparatus, computer equipment and the storage medium of volume
CN112506073A (en) * 2021-02-04 2021-03-16 南京乐羽智能科技有限公司 Intelligent home control method and system based on big data analysis

Also Published As

Publication number Publication date
CN113907652A (en) 2022-01-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant