WO2018145275A1 - Intelligent cleaning method and device - Google Patents

Intelligent cleaning method and device

Info

Publication number
WO2018145275A1
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning
person
cleaning area
module
area
Prior art date
Application number
PCT/CN2017/073127
Other languages
English (en)
French (fr)
Inventor
潘瑞
Original Assignee
格兰比圣(深圳)科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 格兰比圣(深圳)科技有限公司 filed Critical 格兰比圣(深圳)科技有限公司
Priority to PCT/CN2017/073127 priority Critical patent/WO2018145275A1/zh
Publication of WO2018145275A1 publication Critical patent/WO2018145275A1/zh

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means

Definitions

  • The present invention relates to the field of smart devices, and in particular to a smart cleaning method and device.
  • Cleaning robots are already in widespread use. Using sensors they detect obstacles; when they bump into a wall or another obstacle they turn by themselves and, following different preset rules, travel along different routes so as to cover the entire area.
  • By automatic electromagnetic-pulse scanning, ultrasonic reflection or similar means, the robot automatically builds up key information about the room, such as its area and structure, then automatically draws up a travel plan according to its own position and the layout of the house, and follows the control path set for the machine body.
  • It repeatedly travels indoors in modes such as edge-following, concentrated, random or straight-line walking.
  • Most cleaning robots complete the cleaning process entirely on their own. Because they cannot judge obstacles effectively, cleaning efficiency is low and the behaviour is not intelligent.
  • The technical problem to be solved by the present invention is that prior-art cleaning robots cannot achieve comprehensive cleaning; in view of this, a smart cleaning method and device are provided in which human and machine jointly judge obstacles through human-machine interaction, so that comprehensive and effective cleaning is achieved.
  • A smart cleaning method comprising the following steps:
  • a cleaning area is acquired and it is detected whether anyone is present in the current cleaning area; when a person is detected in the cleaning area, the person in the field of view is tracked, the voice, face and/or limb movements of the tracked person are monitored in real time, and the command operations corresponding to the tracked person's voice, face and/or limb movements are determined;
  • movement and/or cleaning is performed in accordance with the command operations.
  • Acquiring the cleaning area includes: taking a preset point as the origin of the cleaning area, acquiring boundary data of walls or obstacles, generating a boundary of an initial cleaning area, and, after the initial cleaning area has been sent to the control terminal for confirmation, generating the cleaning area.
  • Tracking the person in the field of view when a person is detected in the cleaning area includes:
  • when it is detected that there is only one person in the cleaning area, that person is regarded as the tracked person;
  • when it is detected that there are several people in the cleaning area, the person closest to the preset point is the tracked person.
  • The method further includes generating a work log from the movement and/or cleaning data and transmitting the work log to the control terminal.
  • The method further includes performing question-and-answer interaction according to the voice information of the tracked person.
  • The invention also provides an intelligent cleaning device comprising a camera module, a detection module and a real-time positioning module, together with the following modules connected to the camera module, real-time positioning module and detection module:
  • a cleaning area generating module for acquiring a cleaning area;
  • a tracking module configured to track a person in the field of view when a person is detected in the cleaning area;
  • a monitoring and recognition module for monitoring and recognizing in real time the command operations corresponding to the voice, face and/or limb movements of the tracked person;
  • a moving and cleaning module for moving and/or cleaning according to the command operations.
  • A real-time positioning module is further included;
  • the cleaning area generating module is specifically configured to take the preset point acquired by the real-time positioning module as the origin of the cleaning area, to obtain boundary data of walls or obstacles through the detection module, to generate a boundary of an initial cleaning area, and, after the initial cleaning area has been sent to the control terminal of the tracked person for confirmation, to generate the cleaning area.
  • The tracking module is specifically configured to take the person as the tracked person when it is detected that there is only one person in the cleaning area;
  • when it is detected that there are several people in the cleaning area, the person closest to the preset point is the tracked person.
  • A log generating module is further included for generating a log from the movement and/or cleaning data and transmitting it to the control terminal.
  • An interaction module is further included for performing question-and-answer interaction according to the voice information of the tracked person.
  • By realizing intelligent human-machine interaction in which the human and the machine jointly determine obstacles, the invention determines the cleaning area; at the same time, cleaning is performed according to the command operations corresponding to voice, facial and limb movements, so as to achieve a comprehensive and effective cleaning result.
  • FIG. 1 is a schematic flow chart of a smart cleaning method provided by the present invention.
  • FIG. 2 is a schematic view showing the structure of a module of a smart cleaning device according to the present invention.
  • As shown in FIG. 1, the smart cleaning method provided by the present invention comprises the following steps:
  • S100: acquiring a cleaning area and detecting whether anyone is present in the current cleaning area.
  • Acquiring the cleaning area includes: taking the preset point as the origin of the cleaning area, acquiring boundary data of walls or obstacles, generating a boundary of an initial cleaning area, and, after the cleaning area has been sent to the control terminal for confirmation, generating the cleaning area.
  • The preset point is the starting cleaning point, that is, the origin of the cleaning area. Walls or obstacles can be detected with a detector or an infrared sensor and their boundaries obtained, or they can be identified from preset parameter data such as the size and position of walls and the number, material and position of obstacles such as furniture.
  • To ensure that the cleaning area is acquired correctly and to improve cleaning efficiency, the initial cleaning area is sent to the control terminal for confirmation; through the control terminal, area units to be cleaned can be added and/or removed, so that missed or mistaken cleaning is avoided.
  • Whether anyone is present in the cleaning area is detected with an infrared sensor or the like, so as to deal with newly generated refuse and achieve effective cleaning.
  • When it is detected that there is only one person in the cleaning area, that person is the tracked person; when it is detected that there are several people in the cleaning area, the person closest to the preset point is the tracked person.
  • The tracked person is tracked in real time so that the new refuse they bring can be removed and the command operations corresponding to their voice, face and/or limb movements can be recognized; for example, when the tracked person gives the instruction "Come here to clean" and guides the device with gestures such as forward, backward, left and right, the cleaning purpose is achieved.
  • The commands corresponding to the voice, face and/or limb movements of the tracked person can be preset through the control terminal.
  • Interaction for cleaning purposes can also be achieved through voice interaction with the tracked person; for example, if an unrecognizable object is encountered during cleaning, the tracked person can be asked to judge it through voice interaction, or the question can be sent to the control terminal for judgment.
  • At the same time, the control terminal and/or the tracked person can be informed of the cleanliness level of a particular area, as a reminder of areas that will need attention in the future.
  • Cleaning includes modes such as sweeping, water spraying and mopping; the cleaning mode is selected according to the command operation and the degree of soiling of the cleaning area.
  • S400: a log is generated from the movement and/or cleaning data, saved, and sent to the control terminal, where the cleaning route and the objects cleaned can be reviewed so that mistaken or missed cleaning is avoided.
  • The smart cleaning method provided by the invention generates and confirms the cleaning area so that it can be cleaned; at the same time, by tracking the tracked person within the cleaning area, it interacts with that person, removes the dynamic refuse they produce, and achieves thorough cleaning of the whole cleaning area.
  • As shown in FIG. 2, the present invention provides an intelligent cleaning device including a dust suction port, a camera module 10, a detection module 20 and a real-time positioning module 30, together with a cleaning area generating module 40, a tracking module 50, a monitoring and recognition module 60, a moving and cleaning module 70, an interaction module 80 and a log generating module 90 connected to the camera module, real-time positioning module and detection module.
  • The cleaning area generating module 40 is configured to acquire the cleaning area. Specifically, it takes the preset point acquired by the real-time positioning module 30 as the origin of the cleaning area (this preset point can be understood as the position at which the cleaning device is currently located); walls or obstacles are recognized by the camera module 10 and the detection module 20, their boundary data are acquired, a boundary of an initial cleaning area is generated, and, after the initial cleaning area has been sent to the control terminal for confirmation, the cleaning area is generated.
  • Optionally, walls or obstacles may also be identified from preset parameter data, including the size and position of walls and the number, material and position of obstacles such as furniture.
  • The smart cleaning device can judge whether an object is an obstacle from the size of its dust suction port: if the object is larger than the suction port, it is judged to be an obstacle that does not need to be cleaned, and its boundary is recognized and used as a boundary of the cleaning area; if the object is smaller than the suction port, that fact serves only as one auxiliary condition for judging whether it is an obstacle.
  • Sending the initial cleaning area to the control terminal may be implemented by means of an app or similar; through the cleaning area received on the control terminal, the user further adds and/or removes area units to be cleaned, so as to generate the final cleaning area and avoid missed or mistaken cleaning.
  • The tracking module 50 is configured to track a person in the field of view when a person is detected in the cleaning area;
  • when it is detected that there is only one person in the cleaning area, that person is regarded as the tracked person; when it is detected that there are several people in the cleaning area, the person closest to the smart cleaning device is the tracked person.
  • The monitoring and recognition module 60 is configured to monitor and recognize in real time the command operations corresponding to the voice, face and/or limb movements of the tracked person;
  • the commands corresponding to the voice, face and/or limb movements of the tracked person can be preset through the control terminal.
  • The moving and cleaning module 70 is configured to move and/or clean according to the command operations.
  • Cleaning includes modes such as sweeping, water spraying and mopping; the cleaning mode is selected according to the command operation and the degree of soiling of the cleaning area.
  • The interaction module 80 is configured to perform question-and-answer interaction according to the voice information of the tracked person.
  • If an unrecognizable object is encountered, the tracked person can judge it through voice interaction or the question can be sent to the control terminal for judgment; at the same time, the control terminal and/or the tracked person can be informed of the cleanliness level of a particular area, as a reminder of areas that will need attention in the future, for example "The space under the bed is very dirty; please keep it clean". In addition, this module can make suggestions about the choice of cleaning mode: if the tracked person selects the sweeping mode but the smart cleaning device judges that sweeping alone cannot achieve the required cleanliness, the interaction module suggests adding the mopping mode, and the tracked person can choose according to their needs at the time.
  • The log generation module 90 is configured to generate a log from the movement and/or cleaning data, save it, and send it to the control terminal, where the cleaning route and the objects cleaned can be reviewed so that mistaken or missed cleaning is avoided.
  • The above description is only a preferred embodiment of the present invention and is not intended to limit it.
  • A person skilled in the art may, according to the needs of the smart cleaning device itself, appropriately adjust the general positions of the main board and the heat dissipation module, or may modify the positions and performance of the heat dissipation module and the temperature-equalizing material according to the actual heat-source conditions of the product, so as to reach a balanced state.
  • The invention admits of various modifications and variations. Any modification, equivalent substitution, improvement or the like made within the spirit and scope of the invention is intended to be included within the scope of the appended claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

An intelligent cleaning method and device. The method comprises the following steps: acquiring a cleaning area and detecting whether anyone is present in the current cleaning area; when a person is detected in the cleaning area, tracking the person in the field of view, monitoring the tracked person's voice, face and/or limb movements in real time, determining the command operations corresponding to the tracked person's voice, face and/or limb movements, and moving and/or cleaning according to the command operations. An intelligent cleaning device is also provided. By realizing intelligent human-machine interaction in which the human and the machine jointly determine obstacles, the cleaning area is determined; at the same time, cleaning is performed according to the command operations corresponding to voice, facial and limb movements, so as to achieve a comprehensive and effective cleaning result.

Description

Intelligent cleaning method and device
Technical Field
The present invention relates to the field of smart devices, and in particular to an intelligent cleaning method and device.
Background Art
At present, cleaning robots are in widespread use. Using sensors, they detect obstacles; when they bump into a wall or another obstacle, they turn by themselves and, following different preset rules, travel along different routes so as to cover the whole area. By automatic electromagnetic-pulse scanning, ultrasonic reflection or similar means, the robot automatically builds up key information about the room, such as its area and structure, then automatically draws up a travel plan according to its own position and the layout of the house, follows the control path set for the machine body, and travels back and forth indoors in modes such as edge-following, concentrated, random or straight-line walking. Most cleaning robots complete the cleaning process entirely on their own; because they cannot judge obstacles effectively, cleaning efficiency is low and the behaviour is not intelligent.
Summary of the Invention
The technical problem to be solved by the present invention is that cleaning robots in the prior art cannot achieve comprehensive cleaning; in view of this, an intelligent cleaning method and device are provided in which human and machine jointly judge obstacles through human-machine interaction, so that comprehensive and effective cleaning is achieved.
The technical solution adopted by the present invention to solve this technical problem is an intelligent cleaning method comprising the following steps:
acquiring a cleaning area, and detecting whether anyone is present in the current cleaning area;
when a person is detected in the cleaning area, tracking the person in the field of view, monitoring the tracked person's voice, face and/or limb movements in real time, and determining the command operations corresponding to the tracked person's voice, face and/or limb movements;
moving and/or cleaning according to the command operations.
In the intelligent cleaning method provided by the present invention, acquiring the cleaning area comprises: taking a preset point as the origin of the cleaning area, acquiring boundary data of walls or obstacles, generating a boundary of an initial cleaning area, sending the cleaning area to a control terminal for confirmation, and then generating the cleaning area.
In the intelligent cleaning method provided by the present invention, tracking the person in the field of view when a person is detected in the cleaning area comprises:
when only one person is detected in the cleaning area, taking that person as the tracked person;
when several people are detected in the cleaning area, taking the person closest to the preset point as the tracked person.
The intelligent cleaning method provided by the present invention further comprises generating a work log from the movement and/or cleaning data and sending it to the control terminal.
The intelligent cleaning method provided by the present invention further comprises carrying out question-and-answer interaction according to the tracked person's voice information.
The present invention also provides an intelligent cleaning device comprising a camera module, a detection module and a real-time positioning module, together with the following modules connected to the camera module, real-time positioning module and detection module:
a cleaning area generating module, configured to acquire a cleaning area;
a tracking module, configured to track a person in the field of view when a person is detected in the cleaning area;
a monitoring and recognition module, configured to monitor and recognize in real time the command operations corresponding to the tracked person's voice, face and/or limb movements;
a moving and cleaning module, configured to move and/or clean according to the command operations.
The intelligent cleaning device provided by the present invention further comprises a real-time positioning module;
the cleaning area generating module is specifically configured to take the preset point acquired by the real-time positioning module as the origin of the cleaning area, to acquire boundary data of walls or obstacles through the detection module, to generate a boundary of an initial cleaning area, and, after the initial cleaning area has been sent to the tracked person's control terminal for confirmation, to generate the cleaning area.
In the intelligent cleaning device provided by the present invention, the tracking module is specifically configured to take the person as the tracked person when only one person is detected in the cleaning area,
and, when several people are detected in the cleaning area, to take the person closest to the preset point as the tracked person.
The intelligent cleaning device provided by the present invention further comprises a log generating module, configured to generate a log from the movement and/or cleaning data and send it to the control terminal.
The intelligent cleaning device provided by the present invention further comprises an interaction module, configured to carry out question-and-answer interaction according to the tracked person's voice information.
Implementing the intelligent cleaning method and device provided by the present invention yields the following beneficial effect: by realizing intelligent human-machine interaction in which the human and the machine jointly determine obstacles, the invention determines the cleaning area; at the same time, cleaning is carried out according to the command operations corresponding to voice, facial and limb movements, so that a comprehensive and effective cleaning result is achieved.
Brief Description of the Drawings
FIG. 1 is a schematic flow chart of an intelligent cleaning method provided by the present invention;
FIG. 2 is a schematic diagram of the module structure of an intelligent cleaning device according to the present invention.
Detailed Description of the Embodiments
To make the objectives, technical solutions and advantages of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and are not intended to limit it.
Embodiment 1
As shown in FIG. 1, an intelligent cleaning method provided by the present invention comprises the following steps.
S100: acquiring a cleaning area, and detecting whether anyone is present in the current cleaning area.
Acquiring the cleaning area specifically includes: taking a preset point as the origin of the cleaning area, acquiring boundary data of walls or obstacles, generating a boundary of an initial cleaning area, sending the cleaning area to a control terminal for confirmation, and then generating the cleaning area.
The preset point is the starting cleaning point, that is, the origin of the cleaning area. Walls or obstacles can be detected with a detector or an infrared sensor and their boundaries obtained, or they can be identified from preset parameter data, such as the size and position of walls and the number, material and position of obstacles such as furniture. To ensure that the cleaning area is acquired correctly and to improve cleaning efficiency, once the initial cleaning area has been obtained it is sent to the control terminal for confirmation; through the control terminal, area units that need cleaning can be added and/or removed, so that missed or mistaken cleaning is avoided.
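The boundary-plus-confirmation flow described above can be pictured with the short sketch below. It is only an illustration of the idea under simple assumptions: the names (CleaningArea, generate_cleaning_area, confirm_on_terminal) and the polygon representation are invented for this sketch and are not part of the patent.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]


@dataclass
class CleaningArea:
    origin: Point            # the preset point: the starting cleaning point
    boundary: List[Point]    # vertices of the area boundary
    confirmed: bool = False


def generate_cleaning_area(preset_point: Point,
                           sensed_boundary: List[Point],
                           confirm_on_terminal: Callable[[List[Point]], List[Point]]) -> CleaningArea:
    """Build an initial cleaning area from wall/obstacle boundary data measured
    from the preset point, then let the control terminal confirm or edit it."""
    area = CleaningArea(origin=preset_point, boundary=list(sensed_boundary))
    # The control terminal (e.g. an app) may add or remove area units before
    # the final cleaning area is generated.
    area.boundary = confirm_on_terminal(area.boundary)
    area.confirmed = True
    return area


if __name__ == "__main__":
    walls = [(0.0, 0.0), (5.0, 0.0), (5.0, 4.0), (0.0, 4.0)]
    area = generate_cleaning_area((0.0, 0.0), walls, lambda b: b)  # terminal accepts as-is
    print(area)
```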
S200: when a person is detected in the cleaning area, tracking the person in the field of view, monitoring the tracked person's voice, face and/or limb movements in real time, and determining the command operations corresponding to the tracked person's voice, face and/or limb movements.
Whether anyone is present in the cleaning area is detected with an infrared sensor or the like, so that newly generated refuse can be dealt with and effective cleaning achieved. When only one person is detected in the cleaning area, that person is the tracked person; when several people are detected in the cleaning area, the person closest to the preset point is the tracked person. The tracked person is tracked in real time so that the new refuse they produce can be removed and the command operations corresponding to their voice, face and/or limb movements can be recognized; for example, when the tracked person gives the instruction "Come here to clean" and guides the device with gestures such as forward, backward, left and right, the cleaning purpose is achieved. The commands corresponding to the tracked person's voice, face and/or limb movements can be preset through the control terminal.
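The selection rule just described (one person: track them; several people: track the one nearest the preset point) reduces to a few lines. This is a minimal sketch under the assumption that detected people are reported as 2-D coordinates; the function name select_tracked_person is ours, not the patent's.

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]


def select_tracked_person(people: List[Point], preset_point: Point) -> Optional[Point]:
    """Pick the person to track: the only person present, or, when several
    people are detected, the one closest to the preset point (the origin of
    the cleaning area). Returns None when nobody is detected."""
    if not people:
        return None
    if len(people) == 1:
        return people[0]
    return min(people, key=lambda p: math.dist(p, preset_point))


# Two people detected: the one at (1.0, 1.0) is nearer the preset point (0, 0).
print(select_tracked_person([(1.0, 1.0), (3.0, 2.5)], (0.0, 0.0)))
```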
In addition, interaction for cleaning purposes can be achieved through voice interaction with the tracked person. For example, if an unrecognizable object is encountered during cleaning, the tracked person can be asked to judge it through voice interaction, or the question can be sent to the control terminal for judgment; at the same time, the control terminal and/or the tracked person can be informed of the cleanliness level of a particular area, as a reminder of areas that will need attention in the future.
S300: moving and/or cleaning according to the command operations.
Cleaning includes modes such as sweeping, water spraying and mopping; the appropriate cleaning mode is selected according to the command operation and the degree of soiling of the cleaning area.
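As a rough illustration of how a mode could be chosen from the command and the degree of soiling, consider the sketch below; the soiling score, its 0.3/0.7 thresholds and the mode names are assumptions made for the example, not values given in the patent.

```python
from typing import List


def choose_cleaning_modes(command: str, soiling: float) -> List[str]:
    """Select cleaning modes from the user's command and a soiling score
    (assumed here to be a sensor estimate in the range 0..1)."""
    modes = ["sweep"]                      # sweeping is always performed
    if command == "mop here" or soiling > 0.7:
        modes += ["spray", "mop"]          # heavy soiling: spray water, then mop
    elif soiling > 0.3:
        modes.append("mop")                # moderate soiling: add mopping only
    return modes


print(choose_cleaning_modes("clean here", 0.8))   # ['sweep', 'spray', 'mop']
print(choose_cleaning_modes("clean here", 0.2))   # ['sweep']
```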
S400: generating a log from the movement and/or cleaning data, saving it, and sending it to the control terminal, where the cleaning route and the objects cleaned can be reviewed so that mistaken or missed cleaning is avoided.
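A work log of the kind described could be as simple as the record below, serialized and pushed to the control terminal; the field names and the JSON encoding are illustrative assumptions.

```python
import json
from dataclasses import asdict, dataclass
from datetime import datetime
from typing import List, Tuple


@dataclass
class WorkLogEntry:
    timestamp: str
    position: Tuple[float, float]   # where the device was on its cleaning route
    action: str                     # e.g. "move", "sweep", "mop"
    note: str = ""                  # e.g. the object that was cleaned


def encode_log_for_terminal(entries: List[WorkLogEntry]) -> str:
    """Serialize the log so the control terminal can display the cleaning
    route and the objects cleaned."""
    return json.dumps([asdict(e) for e in entries], ensure_ascii=False)


log = [WorkLogEntry(datetime.now().isoformat(), (1.2, 0.8), "sweep", "paper scrap")]
print(encode_log_for_terminal(log))
```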
With the intelligent cleaning method provided by the invention, the cleaning area is generated and confirmed so that it can be cleaned; at the same time, by tracking the tracked person within the cleaning area, the method interacts with that person, removes the dynamic refuse they produce, and achieves thorough cleaning of the whole cleaning area.
Embodiment 2
As shown in FIG. 2, an intelligent cleaning device provided by the present invention comprises a dust suction port, a camera module 10, a detection module 20 and a real-time positioning module 30, together with a cleaning area generating module 40, a tracking module 50, a monitoring and recognition module 60, a moving and cleaning module 70, an interaction module 80 and a log generating module 90 connected to the camera module, real-time positioning module and detection module.
Specifically:
The cleaning area generating module 40 is configured to acquire the cleaning area. Specifically, it takes the preset point acquired by the real-time positioning module 30 as the origin of the cleaning area; this preset point can be understood as the position at which the cleaning device is currently located. Walls or obstacles are recognized by the camera module 10 and the detection module 20 and their boundary data are acquired, from which a boundary of an initial cleaning area is generated; after the initial cleaning area has been sent to the control terminal for confirmation, the cleaning area is generated. Optionally, walls or obstacles may also be identified from preset parameter data, including the size and position of walls and the number, material and position of obstacles such as furniture. It should be noted that the intelligent cleaning device can judge whether an object is an obstacle from the size of its dust suction port: if the object is larger than the suction port, it is judged to be an obstacle that does not need to be cleaned, and its boundary is recognized and used as a boundary of the cleaning area; if the object is smaller than the suction port, that fact serves only as one auxiliary condition for judging whether it is an obstacle.
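The suction-port size test amounts to a single comparison, sketched below. Reducing it to a boolean is a simplification of the "auxiliary condition" wording in the text, and the millimetre units are an assumption.

```python
def is_obstacle(object_size_mm: float, suction_port_mm: float) -> bool:
    """An object larger than the dust suction port is treated as an obstacle:
    it is not cleaned and its boundary becomes part of the cleaning-area
    boundary. A smaller object is only a hint (an auxiliary condition), so
    this sketch conservatively does not call it an obstacle."""
    return object_size_mm > suction_port_mm


print(is_obstacle(120.0, 80.0))   # True: too large to ingest, treat as obstacle
print(is_obstacle(15.0, 80.0))    # False: small enough to be swept up
```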
It can be understood that sending the initial cleaning area to the control terminal may be implemented by means of an app or similar; through the cleaning area received on the control terminal, the user further adds and/or removes area units to be cleaned, so as to generate the final cleaning area and thereby avoid missed or mistaken cleaning.
The tracking module 50 is configured to track a person in the field of view when a person is detected in the cleaning area.
Specifically, when only one person is detected in the cleaning area, that person is taken as the tracked person; when several people are detected in the cleaning area, the person closest to the intelligent cleaning device is the tracked person.
The monitoring and recognition module 60 is configured to monitor and recognize in real time the command operations corresponding to the tracked person's voice, face and/or limb movements.
The tracked person is tracked in real time so that the new refuse they produce can be removed and the command operations corresponding to their voice, face and/or limb movements can be recognized; for example, when the tracked person gives the instruction "Come here to clean" and guides the device with gestures, the cleaning purpose is achieved. The commands corresponding to the tracked person's voice, face and/or limb movements can be preset through the control terminal.
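Once a voice or gesture has been recognized as a label, mapping it to an action can be a plain lookup table that the control terminal presets and edits. The labels and handlers below are invented for the sketch and are not defined in the patent.

```python
from typing import Callable, Dict


def move_to_speaker() -> str:
    return "moving toward the tracked person"


def clean_indicated_spot() -> str:
    return "cleaning the indicated spot"


def stop() -> str:
    return "stopping"


# Recognized voice/gesture labels mapped to actions; in the device this table
# would be preset (and editable) through the control terminal.
COMMANDS: Dict[str, Callable[[], str]] = {
    "come_here": move_to_speaker,
    "clean_here": clean_indicated_spot,
    "stop": stop,
}


def dispatch(label: str) -> str:
    action = COMMANDS.get(label)
    # Unknown input falls back to the question-and-answer interaction path.
    return action() if action else "unknown command: ask the tracked person via voice"


print(dispatch("come_here"))
```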
If no person is detected in the cleaning area, the device cleans on its own according to the cleaning area and the cleanliness level within it.
The moving and cleaning module 70 is configured to move and/or clean according to the command operations.
Cleaning includes modes such as sweeping, water spraying and mopping; the appropriate cleaning mode is selected according to the command operation and the degree of soiling of the cleaning area.
The interaction module 80 is configured to carry out question-and-answer interaction according to the tracked person's voice information.
Interaction for cleaning purposes is achieved through voice interaction with the tracked person. For example, if an unrecognizable object is encountered during cleaning, the tracked person can be asked to judge it through voice interaction, or the question can be sent to the control terminal for judgment; at the same time, the control terminal and/or the tracked person can be informed of the cleanliness level of a particular area, as a reminder of areas that will need attention in the future, for example "The space under the bed is very dirty; please keep it clean". In addition, this module can make suggestions about the choice of cleaning mode: if the tracked person selects the sweeping mode but the intelligent cleaning device judges that sweeping alone cannot achieve the required cleanliness, the interaction module suggests adding the mopping mode, and the tracked person can then choose according to their needs at the time.
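The suggestion behaviour can be sketched as a check of the selected modes against the estimated soiling of the area; the 0.5 threshold and the wording of the prompt are assumptions for illustration only.

```python
from typing import List, Optional


def suggest_extra_mode(selected_modes: List[str], soiling: float) -> Optional[str]:
    """If the user chose sweeping only but the area seems too dirty for
    sweeping alone, propose adding the mopping mode; the tracked person is
    still free to decline the suggestion."""
    if selected_modes == ["sweep"] and soiling > 0.5:
        return "Sweeping alone may not be enough here. Add the mopping mode?"
    return None


print(suggest_extra_mode(["sweep"], 0.8))          # prints the suggestion
print(suggest_extra_mode(["sweep", "mop"], 0.8))   # None: nothing to suggest
```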
The log generating module 90 is configured to generate a log from the movement and/or cleaning data, save it, and send it to the control terminal, where the cleaning route and the objects cleaned can be reviewed so that mistaken or missed cleaning is avoided.
The above is merely a preferred embodiment of the present invention and is not intended to limit it. A person skilled in the art may, according to the needs of the intelligent cleaning device itself, appropriately adjust the general positions of the main board and the heat dissipation module, or may modify the positions and performance of the heat dissipation module and the temperature-equalizing material according to the actual heat-source conditions of the product, so as to reach a balanced state. The invention admits of various modifications and variations. Any modification, equivalent substitution, improvement or the like made within the spirit and principles of the present invention shall be included within the scope of the claims of the present invention.

Claims (10)

  1. An intelligent cleaning method, characterized by comprising the following steps:
    acquiring a cleaning area, and detecting whether anyone is present in the current cleaning area;
    when a person is detected in the cleaning area, tracking the person in the field of view, monitoring the tracked person's voice, face and/or limb movements in real time, and determining the command operations corresponding to the tracked person's voice, face and/or limb movements;
    moving and/or cleaning according to the command operations.
  2. The method according to claim 1, characterized in that acquiring the cleaning area comprises: taking a preset point as the origin of the cleaning area, acquiring boundary data of walls or obstacles, generating a boundary of an initial cleaning area, sending the cleaning area to a control terminal for confirmation, and then generating the cleaning area.
  3. The method according to claim 1, characterized in that tracking the person in the field of view when a person is detected in the cleaning area comprises:
    when only one person is detected in the cleaning area, taking that person as the tracked person;
    when several people are detected in the cleaning area, taking the person closest to the preset point as the tracked person.
  4. The method according to claim 1, characterized by further comprising generating a work log from the movement and/or cleaning data and sending it to the control terminal.
  5. The method according to claim 1, characterized by further comprising carrying out question-and-answer interaction according to the tracked person's voice information.
  6. An intelligent cleaning device, characterized by comprising a camera module, a detection module and a real-time positioning module, together with the following modules connected to the camera module, real-time positioning module and detection module:
    a cleaning area generating module, configured to acquire a cleaning area;
    a tracking module, configured to track a person in the field of view when a person is detected in the cleaning area;
    a monitoring and recognition module, configured to monitor and recognize in real time the command operations corresponding to the tracked person's voice, face and/or limb movements;
    a moving and cleaning module, configured to move and/or clean according to the command operations.
  7. The intelligent cleaning device according to claim 6, characterized by further comprising a real-time positioning module;
    the cleaning area generating module is specifically configured to take the preset point acquired by the real-time positioning module as the origin of the cleaning area, to acquire boundary data of walls or obstacles through the detection module, to generate a boundary of an initial cleaning area, and, after the initial cleaning area has been sent to the tracked person's control terminal for confirmation, to generate the cleaning area.
  8. The intelligent cleaning device according to claim 6, characterized in that
    the tracking module is specifically configured to take the person as the tracked person when only one person is detected in the cleaning area,
    and, when several people are detected in the cleaning area, to take the person closest to the preset point as the tracked person.
  9. The intelligent cleaning device according to claim 6, characterized by further comprising a log generating module, configured to generate a log from the movement and/or cleaning data and send it to the control terminal.
  10. The intelligent cleaning device according to claim 6, characterized by further comprising an interaction module, configured to carry out question-and-answer interaction according to the tracked person's voice information.
PCT/CN2017/073127 2017-02-08 2017-02-08 Intelligent cleaning method and device WO2018145275A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/073127 WO2018145275A1 (zh) 2017-02-08 2017-02-08 Intelligent cleaning method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/073127 WO2018145275A1 (zh) 2017-02-08 2017-02-08 Intelligent cleaning method and device

Publications (1)

Publication Number Publication Date
WO2018145275A1 true WO2018145275A1 (zh) 2018-08-16

Family

ID=63106979

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/073127 WO2018145275A1 (zh) 2017-02-08 2017-02-08 Intelligent cleaning method and device

Country Status (1)

Country Link
WO (1) WO2018145275A1 (zh)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101084817A (zh) * 2007-04-26 2007-12-12 复旦大学 Household multifunctional small service robot with an open intelligent computing architecture
KR20100056905A (ko) * 2008-11-20 2010-05-28 주식회사 대우일렉트로닉스 Robot cleaner using a home network system and control method thereof
CN105407774A (zh) * 2013-07-29 2016-03-16 三星电子株式会社 Automatic cleaning system, cleaning robot and method for controlling the cleaning robot
CN104605793A (zh) * 2014-09-23 2015-05-13 东莞市万锦电子科技有限公司 Floor cleaning robot system and smart household appliance system

Similar Documents

Publication Publication Date Title
US11709497B2 (en) Method for controlling an autonomous mobile robot
US8892256B2 (en) Methods for real-time and near real-time interactions with robots that service a facility
EP3684563B1 (en) Moving robot and control method thereof
JP6199507B2 (ja) 掃除方法、装置、プログラム及び記録媒体
CN104965552B (zh) 一种基于情感机器人的智能家居环境协同控制方法及系统
WO2019007038A1 (zh) 扫地机器人、扫地机器人系统及其工作方法
CN108231079A (zh) 用于控制电子设备的方法、装置、设备以及计算机可读存储介质
US11806862B2 (en) Robots, methods, computer programs, computer-readable media, arrays of microphones and controllers
TW202038842A (zh) 待吸物收集站、由待吸物收集站與抽吸式清潔設備組成之系統以及相應方法
CN109432466A (zh) 一种便携式智能消毒机器人、消毒路径控制方法及芯片
JP2019034138A (ja) 自律移動する掃除機の動作方法
JP2007147217A (ja) ロボット
TW201947338A (zh) 複數個移動式機器人及其控制方法
WO2022227533A1 (zh) 清扫控制方法、装置和空调机
WO2015100958A1 (zh) 机器人系统及机器人办公、教学、设计、工程、家庭系统
WO2018145275A1 (zh) 一种智能清洁方法及装置
CN110881909A (zh) 一种扫地机控制方法及装置
KR102521849B1 (ko) 로봇 청소기
Volkhardt et al. Finding people in apartments with a mobile robot
US20230221427A1 (en) Navigation device applied to a wireless identification tag
US11631279B2 (en) Smart cleaning system
US20220087498A1 (en) Self-cleaning environment
CN206373913U (zh) 一种智能物联网家庭安全服务机器人
US20220133112A1 (en) Self-propelled cleaning appliance
KR20200106107A (ko) 이동 로봇 및 이동 로봇의 제어방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17896212

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17896212

Country of ref document: EP

Kind code of ref document: A1