WO2017101721A1 - Automatic cleaning device and cleaning method - Google Patents

Automatic cleaning device and cleaning method

Info

Publication number
WO2017101721A1
WO2017101721A1 (PCT/CN2016/108935)
Authority
WO
WIPO (PCT)
Prior art keywords
automatic cleaning
cleaning device
data
processor
preset
Prior art date
Application number
PCT/CN2016/108935
Other languages
English (en)
French (fr)
Inventor
薛英男
夏勇峰
Original Assignee
小米科技有限责任公司
北京石头世纪科技有限公司
Priority date
Filing date
Publication date
Family has litigation
First worldwide family litigation filed
Application filed by 小米科技有限责任公司 and 北京石头世纪科技有限公司
Priority to JP2018528748A priority Critical patent/JP6823794B2/ja
Priority to EP16874778.0A priority patent/EP3391797B1/en
Priority to EA201891263A priority patent/EA039532B1/ru
Priority to AU2016372758A priority patent/AU2016372758B2/en
Priority to KR1020187015795A priority patent/KR20180081546A/ko
Publication of WO2017101721A1 publication Critical patent/WO2017101721A1/zh
Priority to AU2018100726A priority patent/AU2018100726A4/en
Priority to US16/009,977 priority patent/US11013385B2/en
Priority to US17/242,200 priority patent/US20210251450A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0248 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805 Parameters or conditions being sensed
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00 Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2836 Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means characterised by the parts which are controlled
    • A47L9/2852 Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection

Definitions

  • the invention relates to the field of automatic cleaning technology, in particular to an automatic cleaning device and a cleaning method.
  • various types of automatic cleaning operations can be realized by various automatic cleaning devices such as a smart sweeping robot and an intelligent mopping robot, which brings a convenient user experience.
  • the automatic cleaning device needs to generate map information of the surrounding area in real time to realize automatic cleaning operation.
  • the present invention provides an automatic cleaning device to solve the deficiencies in the related art.
  • an automatic cleaning device comprising:
  • a collecting unit configured to collect preset environmental parameters around the automatic cleaning device
  • an application processor (AP), wherein a central processing unit (CPU) included in the AP is electrically connected to the collection unit to obtain the preset environmental parameters collected by the collection unit; the AP further includes a graphics processor (GPU).
  • the GPU is electrically connected to the CPU, and the GPU obtains the preset environment parameter from the CPU, and generates a map around the automatic cleaning device accordingly.
  • the collecting unit includes a laser ranging device (LDS), wherein distance data to surrounding objects collected by the LDS is used as the preset environmental parameters.
  • the LDS includes a point laser emitter that obtains distance data to surrounding objects by generating a point laser.
  • the LDS includes a line laser emitter that obtains distance data to surrounding objects by generating a line laser.
  • the collecting unit includes: an image collecting device; wherein the surrounding object image data collected by the image collecting device is used as the preset environment parameter.
  • the GPU includes:
  • a storage component in which a particle filter based positioning algorithm is stored
  • a computing component connected to the storage component, configured to retrieve the positioning algorithm and perform calculation processing on the preset environment parameter to obtain a map around the automatic cleaning device.
  • it also includes:
  • the pre-processing unit is respectively connected to the collection unit and the CPU, and is configured to pre-process the preset environment parameter, so that the CPU obtains the pre-processed preset environment parameter.
  • the preprocessing unit comprises: a digital signal processor DSP.
  • the automatic cleaning device is a cleaning robot or a mopping robot.
  • a cleaning method of an automatic cleaning device comprising:
  • a data collection step using a collection unit to collect preset environmental parameters around the automatic cleaning device
  • the preprocessing unit is used to preprocess the preset environment parameter, and the preprocessed preset environment parameter is provided to the central processor;
  • the central processor provides the pre-processed preset environment parameters to the graphics processor, and the graphics processor generates map data around the automatic cleaning device accordingly.
  • the data processing step further includes: the graphics processor includes a computing component and a storage component electrically connected to each other; the computing component retrieves a particle-filter-based positioning algorithm stored in the storage component and performs calculation on the pre-processed preset environmental parameters to obtain a map around the automatic cleaning device.
  • the data collecting step includes collecting data with a laser ranging device and using the collected distance data to surrounding objects as the preset environmental parameters.
  • the data collecting step includes collecting data with an image collecting device and using the collected image data of surrounding objects as the preset environmental parameters.
  • a computer control system for an automatic cleaning device includes a central processing unit, a graphics processor, an acquisition unit and a pre-processing unit, all connected by a communication bus; the acquisition unit is configured to collect preset environmental parameters around the automatic cleaning device; the central processing unit is configured to acquire the preset environmental parameters collected by the acquisition unit; and the graphics processor obtains the preset environmental parameters from the central processing unit and generates a map around the automatic cleaning device accordingly.
  • a mobile electronic device includes: a communication connection establishing module configured to establish a communication connection between the mobile electronic device and the automatic cleaning device; a position instruction sending module configured to send a position information request instruction to the automatic cleaning device; a position receiving module configured to receive position information returned by the automatic cleaning device at preset time intervals, the position information including the real-time position of the automatic cleaning device; and a display module configured to display the position information on an interaction interface of the mobile electronic device.
  • control command sending module is configured to send an action request instruction to the automatic cleaning device.
  • the present invention adopts a structure in which a CPU and a GPU cooperate within the AP of the automatic cleaning device, so that the GPU can be dedicated to generating the map around the automatic cleaning device while the CPU remains available for other data processing and process control. Offloading the map generation process to the GPU reduces the data processing load on the CPU, which helps increase the processing capability and response speed of the automatic cleaning device and thus improves its working efficiency.
  • FIG. 1 is a schematic structural view of an automatic cleaning device according to an exemplary embodiment.
  • FIG. 2 is a schematic structural view of another automatic cleaning device according to an exemplary embodiment.
  • FIG. 3 is a schematic structural diagram of a GPU according to an exemplary embodiment.
  • FIGS. 4-7 are schematic structural views of a cleaning robot according to an exemplary embodiment.
  • the automatic cleaning device may include: an acquisition unit 10 and an application processor (AP) 20; the acquisition unit 10 is configured to collect preset environmental parameters around the automatic cleaning device, and the AP 20 can generate a map of the device's surroundings by analyzing and processing those parameters, which the automatic cleaning device uses to travel and to perform automatic cleaning and other operations.
  • the AP 20 further includes a central processing unit CPU 201 and a graphics processor GPU 202. The CPU 201 is electrically connected to the collection unit 10 and obtains the preset environmental parameters it collects; the GPU 202 is electrically connected to the CPU 201, obtains from the CPU 201 the preset environmental parameters acquired from the collection unit 10, and generates a map around the automatic cleaning device from those parameters.
  • the present invention can simultaneously configure the CPU 201 and the GPU 202 in the AP 20, so that the GPU 202 can share the processing pressure of the CPU 201 on the one hand, and can fully utilize the structural characteristics and data processing performance of the GPU 202 on the other hand, thereby speeding up the real-time generation of the map. Improve the efficiency of automatic cleaning equipment.
  • the automatic cleaning device may further include: a pre-processing unit 30.
  • the pre-processing unit 30 is connected to the collection unit 10 and the CPU 201 for pre-processing the preset environment parameters, so that the CPU obtains the pre-processed preset environment parameters.
  • the pre-processing unit 30 may be a digital signal processor (DSP) that pre-processes the preset environmental parameters obtained by the acquisition unit 10, for example by format conversion, integration and cleaning of the data, to facilitate the final processing of those parameters by the GPU 202.
  • when the GPU 202 generates a map from the preset environmental parameters, it can perform the calculation and processing in various ways.
  • the fusion of sensor data can be performed by a sensor fusion algorithm.
  • for example, the GPU 202 can locate the automatic cleaning device within the working area using a particle-filter-based positioning algorithm and obtain a corresponding map, which is formed by fusing data from multiple sensors on a common time base; the positioning method combining particle filtering with GPU parallel computing addresses the positioning accuracy problem, avoids falling into local optima, and achieves real-time performance through parallel computing.
  • a heuristic search algorithm is used for path planning, which in theory guarantees that the optimal path is found while greatly reducing the amount of computation, so that path planning can be solved in real time.
  • the GPU 202 may include: a storage component 202A in which a particle-filter-based positioning algorithm is stored (of course, the GPU and a graphics memory (RAM) may also be provided separately); and a computing component 202B connected to the storage component 202A for retrieving the positioning algorithm from the storage component 202A and performing calculation on the preset environmental parameters according to that algorithm to obtain a map around the automatic cleaning device.
  • in one embodiment, the GPU rasterizes the working area enclosed by the lines connecting the reflective points of the objects around the automatic cleaning device and obtains the coordinate values of the grid intersections; for each intersection, the GPU calculates the second angles between the connecting lines from that intersection to the individual reflective points, groups them into a second angle set for that intersection, and stores each set.
  • when the automatic cleaning device moves in the working area, the laser emission lines from the laser emitter are reflected by the surrounding objects, which reflect the light straight back so that each laser reflection line is parallel to its emission line; the receiving portion can receive multiple laser reflection lines simultaneously, and an angle encoder measures the first angles between the head-orientation line of the automatic cleaning device and those reflection lines; the GPU processes the first angles to obtain a third angle set between the reflection lines, and compares the third angle set with the stored second angle sets to obtain the robot's position within the coordinate system.
  • the location of the automatic cleaning device within the map is thus determined in real time by the GPU.
  • the automatic cleaning device can adopt a plurality of different types of collection units 10, and the corresponding preset environmental parameters and the data processing methods adopted by the GPU 202 may have corresponding differences.
  • the technical solution of the present invention will be described in detail below with reference to the structural schematic diagram of the cleaning robot shown in FIGS. 4-7.
  • the cleaning robot 100 (which may of course also be another type of automatic cleaning device, such as a mopping robot; the invention is not limited in this respect) includes a machine body 110, a sensing system 120, a control system 130, a drive system 140, a cleaning system 150, an energy system 160 and a human-machine interaction system 170, wherein:
  • the machine body 110 includes a forward portion 111 and a rearward portion 112 and has an approximately circular shape (circular at both front and rear); it may also have other shapes, including but not limited to an approximately D shape that is squared at the front and rounded at the rear.
  • the sensing system 120 includes a position determining device 121 located on top of the machine body 110, a buffer 122 located at the forward portion 111 of the machine body 110, a cliff sensor 123, and sensing devices such as an ultrasonic sensor (not shown), an infrared sensor (not shown), a magnetometer (not shown), an accelerometer (not shown), a gyroscope (not shown) and an odometer (not shown), which provide the control system 130 with various position information and motion status information about the machine.
  • the location determining device 121 includes the acquisition unit 10 in the embodiment shown in FIG. 1 or FIG. 2, for example, the acquisition unit 10 may be an image acquisition device, a laser ranging device (LDS), or the like.
  • in one case, when the acquisition unit 10 is an image acquisition device (such as a camera), the preset environmental parameters collected by the image acquisition device are image data of the objects around the cleaning robot, and the GPU 202 generates the corresponding map by analyzing and processing that image data.
  • in another case, when the acquisition unit 10 is a laser ranging device, the distance data to surrounding objects collected by the laser ranging device is used as the preset environmental parameters, and the GPU 202 generates the corresponding map by analyzing and processing that distance data.
  • the laser ranging device of the triangulation method is taken as an example to illustrate how to determine the position.
  • the basic principle of the triangulation method is based on the equivalence relation of similar triangles, and will not be described here.
  • the laser distance measuring device includes a light emitting unit and a light receiving unit.
  • the light emitting unit may include a light source that emits light
  • the light source may include a light emitting element such as an infrared or visible light emitting diode (LED) that emits infrared light or visible light.
  • the light source may be a light emitting element that emits a laser beam.
  • a laser diode (LD) is taken as an example of a light source.
  • because of the monochromatic, directional and collimated nature of a laser beam, a light source using a laser beam can make the measurement more accurate than other light: compared with a laser beam, the infrared or visible light emitted by a light emitting diode (LED) is affected by environmental factors (such as the color or texture of the object) and may give lower measurement accuracy.
  • the laser diode (LD) can be a point laser, which measures two-dimensional position information of the obstacle, or a line laser, which measures three-dimensional position information over a certain extent of the obstacle.
  • the light receiving unit may include an image sensor on which a spot of light reflected or scattered by the obstacle is formed.
  • the image sensor may be a collection of a plurality of unit pixels in a single row or in multiple rows. These light receiving elements can convert an optical signal into an electrical signal.
  • the image sensor may be a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor, which is preferably a complementary metal oxide semiconductor (CMOS) sensor due to cost advantages.
  • the light receiving unit may include a light receiving lens assembly. Light reflected or scattered by the obstacle may travel through the light receiving lens assembly to form an image on the image sensor.
  • the light receiving lens assembly may include a single or multiple lenses.
  • the base may support the light emitting unit and the light receiving unit, and the light emitting unit and the light receiving unit are disposed on the base and spaced apart from each other by a specific distance.
  • the base may be rotatably disposed on the main body 110, or the base itself may be rotated without rotating, and the emitted light and the received light may be rotated by providing the rotating element.
  • the rotational angular velocity of the rotating element can be obtained by setting an optocoupler element and a code wheel.
  • the optocoupler element senses the tooth gaps on the code wheel, and the instantaneous angular velocity can be obtained by dividing the angular distance between adjacent tooth gaps by the time one gap takes to sweep past.
  • a data processing device connected to the light receiving unit, such as a DSP, records the obstacle distance values at all angles relative to the robot's 0° direction and transmits them to a data processing unit in the control system 130, such as an application processor (AP) containing a CPU; the CPU runs a particle-filter-based positioning algorithm to obtain the current position of the robot and builds a map from that position for navigation.
  • the positioning algorithm preferably uses simultaneous localization and mapping (SLAM).
  • the forward portion 111 of the machine body 110 can carry a buffer 122; while the drive wheel module 141 propels the robot across the floor during cleaning, the buffer 122 detects, via a sensor system such as an infrared sensor, one or more events (or objects) in the travel path of the robot 100, such as obstacles or walls, and the robot can control the drive wheel module 141 so that the robot responds to those events (or objects), for example by moving away from the obstacle.
  • the control system 130 is disposed on a circuit board in the machine body 110 and includes a computing processor, such as a central processing unit or an application processor, in communication with a non-transitory memory such as a hard disk, flash memory or random access memory; based on the obstacle information fed back by the laser ranging device, the application processor uses a positioning algorithm, such as SLAM, to draw a real-time map of the environment in which the robot is located.
  • the control system 130 can plan the most efficient and reasonable cleaning path and cleaning mode based on the map information, greatly improving the cleaning efficiency of the robot.
  • Drive system 140 can maneuver robot 100 to travel across the ground based on drive commands having distance and angle information, such as x, y, and ⁇ components.
  • the drive system 140 includes a drive wheel module 141 that can simultaneously control the left and right wheels.
  • the drive wheel modules 141 preferably include a left drive wheel module and a right drive wheel module, respectively.
  • the left and right drive wheel modules are opposed along a transverse axis defined by the body 110.
  • the robot may include one or more driven wheels 142 including, but not limited to, a universal wheel.
  • the drive wheel module includes a traveling wheel and a drive motor and a control circuit for controlling the drive motor.
  • the drive wheel module can also be connected to a circuit for measuring the drive current and an odometer.
  • the drive wheel module 141 can be detachably coupled to the main body 110 for easy assembly and disassembly.
  • the drive wheel can have an offset drop suspension system that is movably fastened, for example rotatably attached to the robot body 110, and receives a spring bias that is biased downward and away from the robot body 110.
  • the spring bias allows the drive wheel to maintain contact and traction with the ground with a certain ground force, while the cleaning elements of the robot 100 also contact the ground with a certain pressure.
  • the cleaning system 150 can be a dry cleaning system and/or a wet cleaning system.
  • as a dry cleaning system, the main cleaning function is derived from the sweeping system 151 consisting of a roller brush, a dust box, a fan, an air outlet, and the connecting parts between the four.
  • the roller brush, which has some interference with the ground, sweeps up the garbage on the ground and carries it to the front of the suction opening between the roller brush and the dust box, where it is sucked into the dust box by the airflow generated by the fan and passing through the dust box.
  • the dust removal capability of the sweeper can be characterized by the dust pick-up efficiency (DPU).
  • the dust pick-up efficiency is affected by the structure and material of the roller brush, by the air utilization of the duct formed by the suction opening, the dust box, the fan, the air outlet and the connecting parts between them, and by the type and power of the fan; it is a complex system design problem.
  • improving dust removal capability matters more for an energy-limited cleaning robot than for an ordinary plug-in vacuum cleaner, because it directly reduces the energy requirement: a machine that could clean 80 square meters of floor on one charge can evolve to clean 100 square meters or more, the service life of the battery increases considerably as the number of charge cycles falls, and the user needs to replace the battery less often.
  • the dry cleaning system can also include an edge brush 152 having a rotating shaft that is angled relative to the ground for moving debris into the roller brushing area of the cleaning system 150.
  • Energy system 160 includes rechargeable batteries, such as nickel metal hydride batteries and lithium batteries.
  • the rechargeable battery can be connected with a charging control circuit, a battery pack charging temperature detection circuit and a battery under-voltage monitoring circuit, which are in turn connected to a single-chip microcontroller control circuit.
  • the main body is charged by connecting the charging electrodes provided on the side or underside of the body to a charging dock; if dust adheres to the exposed charging electrodes, the accumulated charge during charging may melt and deform the plastic body around the electrodes, or even deform the electrodes themselves, so that normal charging cannot continue.
  • the human-computer interaction system 170 includes buttons on the host panel, the buttons are for the user to select functions, and may also include a display screen and/or an indicator light and/or a speaker, the display screen, the indicator light and the speaker display the current state of the machine or Feature selection; also includes a mobile client program.
  • the mobile phone client can display the map of the environment where the device is located and the location of the machine, and can provide the user with richer and more user-friendly functions.
  • the robot 100 can travel on the ground by various combinations of movements of three mutually perpendicular axes defined by the body 110: the transverse axis x, the front and rear axes y, and Center vertical axis z.
  • the forward driving direction along the front and rear axis y is indicated as "forward”
  • the backward driving direction along the front and rear axis y is indicated as "backward”.
  • the transverse axis x extends substantially between the right and left wheels of the robot along an axis defined by the center point of the drive wheel module 141.
  • the robot 100 can rotate about the x-axis: when the forward portion of the robot 100 tilts up and the rearward portion tilts down, this is "pitch up"; when the forward portion tilts down and the rearward portion tilts up, this is "pitch down". Additionally, the robot 100 can rotate about the z-axis: in the robot's forward direction, deviating to the right of the y-axis is a "right turn", and deviating to the left of the y-axis is a "left turn".

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

An automatic cleaning device, comprising: a collection unit (10) configured to collect preset environmental parameters around the automatic cleaning device; and an application processor (20), wherein a central processing unit (201) included in the application processor (20) is electrically connected to the collection unit (10) to acquire the preset environmental parameters collected by the collection unit (10); the application processor (20) further includes a graphics processor (202) electrically connected to the central processing unit (201), and the graphics processor (202) obtains the preset environmental parameters from the central processing unit (201) and generates a map of the surroundings of the automatic cleaning device accordingly. This solution increases the processing capability and response speed of the automatic cleaning device and thus improves its working efficiency.

Description

Automatic cleaning device and cleaning method
This application is based on and claims priority to Chinese Patent Application No. 201521054625.2, filed on December 16, 2015, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of automatic cleaning technology, and in particular to an automatic cleaning device and a cleaning method.
Background
In the related art, various types of automatic cleaning operations can be performed by automatic cleaning devices such as smart sweeping robots and smart mopping robots, giving users a convenient experience. To carry out these automatic cleaning operations, an automatic cleaning device needs to generate map information of its surrounding area in real time.
However, because the processing capability of an automatic cleaning device is limited, slow map generation often reduces its working efficiency.
Summary
The present invention provides an automatic cleaning device to address the deficiencies in the related art.
According to a first aspect of embodiments of the present invention, an automatic cleaning device is provided, comprising:
a collection unit configured to collect preset environmental parameters around the automatic cleaning device; and
an application processor (AP), wherein a central processing unit (CPU) included in the AP is electrically connected to the collection unit to acquire the preset environmental parameters collected by the collection unit; the AP further includes a graphics processor (GPU) electrically connected to the CPU, and the GPU obtains the preset environmental parameters from the CPU and generates a map of the surroundings of the automatic cleaning device accordingly.
Optionally, the collection unit includes a laser ranging device (LDS), and distance data to surrounding objects collected by the LDS is used as the preset environmental parameters.
Optionally, the LDS includes a point laser emitter that obtains distance data to surrounding objects by generating a point laser.
Optionally, the LDS includes a line laser emitter that obtains distance data to surrounding objects by generating a line laser.
Optionally, the collection unit includes an image acquisition device, and image data of surrounding objects collected by the image acquisition device is used as the preset environmental parameters.
Optionally, the GPU includes:
a storage component in which a particle-filter-based positioning algorithm is stored; and
a computing component connected to the storage component and configured to retrieve the positioning algorithm and perform calculation on the preset environmental parameters to obtain a map of the surroundings of the automatic cleaning device.
Optionally, the device further includes:
a pre-processing unit connected to the collection unit and to the CPU, respectively, and configured to pre-process the preset environmental parameters so that the CPU obtains the pre-processed preset environmental parameters.
Optionally, the pre-processing unit includes a digital signal processor (DSP).
Optionally, the automatic cleaning device is a sweeping robot or a mopping robot.
According to a second aspect of embodiments of the present invention, a cleaning method for an automatic cleaning device is provided, comprising:
a data collection step of collecting preset environmental parameters around the automatic cleaning device with a collection unit;
a data pre-processing step of pre-processing the preset environmental parameters with a pre-processing unit and providing the pre-processed preset environmental parameters to a central processing unit; and
a data processing step in which the central processing unit provides the pre-processed preset environmental parameters to a graphics processor, and the graphics processor generates map data of the surroundings of the automatic cleaning device accordingly.
Optionally, the data processing step further includes: the graphics processor includes a computing component and a storage component electrically connected to each other; the computing component retrieves a particle-filter-based positioning algorithm stored in the storage component and performs calculation on the pre-processed preset environmental parameters to obtain a map of the surroundings of the automatic cleaning device.
Optionally, the data collection step includes collecting data with a laser ranging device and using the collected distance data to surrounding objects as the preset environmental parameters.
Optionally, the data collection step includes collecting data with an image acquisition device and using the collected image data of surrounding objects as the preset environmental parameters.
According to a third aspect of embodiments of the present invention, a computer control system for an automatic cleaning device is provided, comprising a central processing unit, a graphics processor, a collection unit and a pre-processing unit, all connected via a communication bus; the collection unit is configured to collect preset environmental parameters around the automatic cleaning device; the central processing unit is configured to acquire the preset environmental parameters collected by the collection unit; and the graphics processor obtains the preset environmental parameters from the central processing unit and generates a map of the surroundings of the automatic cleaning device accordingly.
According to a fourth aspect of embodiments of the present invention, a mobile electronic device is provided, comprising: a communication connection establishing module configured to establish a communication connection between the mobile electronic device and the above automatic cleaning device; a position instruction sending module configured to send a position information request instruction to the automatic cleaning device; a position receiving module configured to receive position information returned by the automatic cleaning device at preset time intervals, the position information including the real-time position of the automatic cleaning device; and a display module configured to display the position information on an interactive interface of the mobile electronic device.
Further, the mobile electronic device includes a control instruction sending module configured to send an action request instruction to the automatic cleaning device.
The technical solutions provided by the embodiments of the present invention may include the following beneficial effects:
As can be seen from the above embodiments, by pairing a CPU with a GPU in the AP of the automatic cleaning device, the GPU can be dedicated to generating the map of the device's surroundings while the CPU remains available for other data processing and process control. Offloading the map generation process to the GPU reduces the data processing load on the CPU, which helps increase the processing capability and response speed of the automatic cleaning device and thus improves its working efficiency.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory and do not limit the present invention.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and, together with the description, serve to explain the principles of the present invention.
FIG. 1 is a schematic structural diagram of an automatic cleaning device according to an exemplary embodiment.
FIG. 2 is a schematic structural diagram of another automatic cleaning device according to an exemplary embodiment.
FIG. 3 is a schematic structural diagram of a GPU according to an exemplary embodiment.
FIGS. 4-7 are schematic structural diagrams of a sweeping robot according to an exemplary embodiment.
Detailed Description
Exemplary embodiments are described in detail herein, with examples shown in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present invention; rather, they are merely examples of devices and methods consistent with some aspects of the present invention as detailed in the appended claims.
FIG. 1 is a schematic structural diagram of an automatic cleaning device according to an exemplary embodiment. As shown in FIG. 1, the automatic cleaning device may include a collection unit 10 and an application processor (AP) 20. The collection unit 10 collects preset environmental parameters around the automatic cleaning device, and the AP 20 analyzes and processes these parameters to generate a map of the device's surroundings, which the device uses to travel and to perform automatic cleaning and other operations.
In the automatic cleaning device of the present invention, the AP 20 further includes a central processing unit CPU 201 and a graphics processor GPU 202. The CPU 201 is electrically connected to the collection unit 10 and obtains the preset environmental parameters it collects; the GPU 202 is electrically connected to the CPU 201, obtains from the CPU 201 the preset environmental parameters acquired from the collection unit 10, and generates the map of the device's surroundings from those parameters.
In this embodiment, the structural characteristics of the GPU 202 make it better suited than the CPU 201 to performing the same computation over large volumes of data, and the preset environmental parameters collected by the collection unit 10 are exactly that kind of data: a large volume of data of the same type. By configuring both the CPU 201 and the GPU 202 in the AP 20, the GPU 202 can share the processing load of the CPU 201 while its structural characteristics and data processing performance are fully exploited, thereby speeding up real-time map generation and improving the working efficiency of the automatic cleaning device.
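To make this division of labour concrete, the following is a minimal, illustrative Python sketch of the hand-off: one producer stands in for the CPU 201 forwarding pre-processed scans, and a consumer loop stands in for the map-building work the GPU 202 takes over. The queue, the 0.05 m grid resolution, the helper names and the fake scan are assumptions made for illustration only; they are not taken from the application, and a real implementation would run the consumer as GPU kernels rather than a Python thread.

```python
import math
import queue
import threading

CELL = 0.05  # occupancy-grid resolution in metres (assumed)

def to_grid_cell(pose, angle, dist):
    """Convert one (angle, distance) reading into a grid-cell index in the world frame."""
    x = pose[0] + dist * math.cos(pose[2] + angle)
    y = pose[1] + dist * math.sin(pose[2] + angle)
    return (int(x / CELL), int(y / CELL))

scans = queue.Queue()   # hand-off buffer: "CPU" producer -> map-building consumer
occupancy = {}          # sparse occupancy grid: cell -> hit count

def map_builder(pose):
    """Consumes scans and updates the map; this loop stands in for the work
    that the GPU 202 takes over so the CPU stays free for process control."""
    while True:
        scan = scans.get()
        if scan is None:        # sentinel: no more scans
            return
        for angle, dist in scan:
            cell = to_grid_cell(pose, angle, dist)
            occupancy[cell] = occupancy.get(cell, 0) + 1

worker = threading.Thread(target=map_builder, args=((0.0, 0.0, 0.0),))
worker.start()
scans.put([(math.radians(a), 1.0) for a in range(0, 360, 2)])  # one fake 360-degree scan
scans.put(None)
worker.join()
print(len(occupancy), "cells marked as obstacle boundary")
```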
1. Data pre-processing
As shown in FIG. 2, the automatic cleaning device may further include a pre-processing unit 30, which is connected to the collection unit 10 and to the CPU 201, respectively, and pre-processes the preset environmental parameters so that the CPU obtains the pre-processed parameters. For example, the pre-processing unit 30 may be a digital signal processor (DSP) that pre-processes the preset environmental parameters obtained by the collection unit 10, for instance by format conversion, integration and cleaning of the data, to facilitate the final processing of those parameters by the GPU 202.
2. Data processing
When generating a map from the preset environmental parameters, the GPU 202 can compute and process them in several ways. Sensor data can be fused by a sensor fusion algorithm: for example, the GPU 202 can locate the automatic cleaning device within the working area using a particle-filter-based positioning algorithm and obtain the corresponding map, which is formed by fusing data from multiple sensors on a common time base. Combining particle filtering with parallel computation on the GPU addresses the positioning accuracy problem, avoids getting trapped in local optima, and meets real-time requirements through parallelism. A heuristic search algorithm is used for path planning, which in theory guarantees that the optimal path is found while greatly reducing the amount of computation, so that path planning can be solved in real time.
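As a purely illustrative sketch of the particle-filter idea referred to above (not the application's algorithm), the following Python fragment runs one-dimensional predict/weight/resample steps against a simulated range reading to a wall; the wall position, noise levels and particle count are all assumed values.

```python
import math
import random

# Minimal 1-D particle-filter step: a robot moves along a corridor and measures
# its distance to a wall assumed to be at x = 10 m.  All numbers are illustrative.
WALL_X = 10.0

def measure(true_x):
    """Simulated noisy range reading from the robot to the wall."""
    return WALL_X - true_x + random.gauss(0.0, 0.05)

def pf_step(particles, control, reading, motion_noise=0.02, sensor_sigma=0.05):
    # 1) predict: apply the odometry control to every particle, with noise
    particles = [p + control + random.gauss(0.0, motion_noise) for p in particles]
    # 2) weight: score each particle by how well its expected range matches the reading
    weights = [math.exp(-((WALL_X - p) - reading) ** 2 / (2 * sensor_sigma ** 2))
               for p in particles]
    total = sum(weights)
    if total == 0.0:          # all weights underflowed: keep the particle set unchanged
        return particles
    # 3) resample: draw a new particle set in proportion to the weights
    return random.choices(particles, weights=[w / total for w in weights], k=len(particles))

particles = [random.uniform(0.0, 10.0) for _ in range(500)]
true_x = 2.0
for _ in range(20):
    true_x += 0.1                                   # the robot advances 0.1 m per step
    particles = pf_step(particles, 0.1, measure(true_x))
print("estimate:", round(sum(particles) / len(particles), 2), "truth:", round(true_x, 2))
```

On the device, the weight and resample steps are exactly the kind of identical, per-particle work that lends itself to the GPU's parallel execution.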
Accordingly, as shown in FIG. 3, the GPU 202 may include a storage component 202A in which the particle-filter-based positioning algorithm is stored (of course, the GPU and a graphics memory (RAM) may also be provided separately), and a computing component 202B connected to the storage component 202A for retrieving the positioning algorithm from the storage component 202A and performing calculation on the preset environmental parameters according to that algorithm to obtain the map of the device's surroundings. In one embodiment, the GPU rasterizes the working area enclosed by the lines connecting the reflective points of objects around the automatic cleaning device and obtains the coordinate values of the grid intersections. For each intersection, the GPU calculates the second angles between the connecting lines from that intersection to the individual reflective points; these second angles form a second angle group corresponding to that intersection, and each second angle group is stored. When the automatic cleaning device moves within the working area, the laser emission lines from the laser emitter are reflected by the surrounding objects, which reflect the light straight back so that each laser reflection line is parallel to its emission line; the receiving portion can receive multiple laser reflection lines simultaneously, and an angle encoder measures the first angles between the head-orientation line of the automatic cleaning device and those reflection lines. The GPU processes the first angles to obtain a third angle group between the reflection lines and compares the third angle group with the stored second angle groups to obtain the robot's position within the coordinate system. In this way, the GPU determines the position of the automatic cleaning device within the map in real time.
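The angle-group matching described in this embodiment can be pictured with the hypothetical sketch below: four assumed reflective-point landmarks, a coarse grid of intersections whose "second angle groups" are pre-computed, and a lookup that compares a measured "third angle group" against them. The landmark coordinates, grid pitch and least-squares comparison are illustrative assumptions; with only four landmarks, mirror-symmetric cells can tie, so a real device would use more reflective points and the encoder-referenced first angles as well.

```python
import math

# Hypothetical reflective-point landmarks on the boundary of the work area (metres).
LANDMARKS = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]

def angle_set(point):
    """Angles between successive sight lines from `point` to the landmarks
    (the 'second angle group' stored for a grid intersection)."""
    bearings = sorted(math.atan2(ly - point[1], lx - point[0]) for lx, ly in LANDMARKS)
    return sorted((b2 - b1) % (2 * math.pi)
                  for b1, b2 in zip(bearings, bearings[1:] + bearings[:1]))

# Offline step: rasterise the work area and store one angle group per intersection.
GRID = [(0.5 * i, 0.5 * j) for i in range(1, 8) for j in range(1, 6)]
STORED = {cell: angle_set(cell) for cell in GRID}

def locate(measured):
    """Compare a measured angle group (the 'third angle group') with the stored
    groups and return the best-matching intersection."""
    measured = sorted(measured)
    return min(GRID, key=lambda cell: sum((a - b) ** 2
                                          for a, b in zip(STORED[cell], measured)))

truth = (2.0, 1.5)
print("estimated intersection:", locate(angle_set(truth)), "truth:", truth)
```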
3. The collection unit 10 and the preset environmental parameters
The automatic cleaning device can use several different types of collection unit 10, and the preset environmental parameters collected and the data processing method used by the GPU 202 may differ accordingly. For ease of understanding, the technical solution of the present invention is described in detail below with reference to the structural diagrams of the sweeping robot shown in FIGS. 4-7.
As shown in FIGS. 4-7, the sweeping robot 100 (which may of course also be another type of automatic cleaning device, such as a mopping robot; the present invention is not limited in this respect) includes a machine body 110, a sensing system 120, a control system 130, a drive system 140, a cleaning system 150, an energy system 160 and a human-machine interaction system 170. Of these:
The machine body 110 includes a forward portion 111 and a rearward portion 112 and has an approximately circular shape (circular at both front and rear); it may also have other shapes, including but not limited to an approximately D-shaped outline that is squared at the front and rounded at the rear.
The sensing system 120 includes a position determining device 121 located on top of the machine body 110, a buffer 122 on the forward portion 111 of the machine body 110, a cliff sensor 123, and sensing devices such as an ultrasonic sensor (not shown), an infrared sensor (not shown), a magnetometer (not shown), an accelerometer (not shown), a gyroscope (not shown) and an odometer (not shown), which provide the control system 130 with various position and motion state information about the machine. The position determining device 121 includes the collection unit 10 of the embodiment shown in FIG. 1 or FIG. 2; for example, the collection unit 10 may be an image acquisition device, a laser ranging device (LDS), or the like.
1) In one case, when the collection unit 10 is an image acquisition device (such as a camera), the preset environmental parameters it collects are image data of the objects around the sweeping robot, and the GPU 202 generates the corresponding map by analyzing and processing that image data.
2) In another case, when the collection unit 10 is a laser ranging device, the distance data to surrounding objects collected by the device is used as the preset environmental parameters, and the GPU 202 generates the corresponding map by analyzing and processing that distance data.
Taking a triangulation-based laser ranging device as an example, position determination works as follows. The basic principle of triangulation is the proportional relationship between similar triangles, which is not repeated here.
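For reference, the similar-triangles relation commonly used in laser triangulation can be written as distance = baseline × focal length ÷ spot offset on the sensor; the short sketch below applies it with assumed, illustrative numbers and is not claimed to be the exact formula used in the application.

```python
def triangulation_distance(baseline_m, focal_length_m, spot_offset_m):
    """Similar-triangles range estimate: the emitter and the receiving lens sit
    `baseline_m` apart, and the reflected spot lands `spot_offset_m` off the
    optical axis on the image sensor."""
    if spot_offset_m <= 0:
        raise ValueError("spot offset must be positive (object out of range)")
    return baseline_m * focal_length_m / spot_offset_m

# Assumed numbers: 30 mm baseline, 4 mm focal length, spot 60 micrometres off axis.
print(triangulation_distance(0.030, 0.004, 60e-6), "m")   # -> 2.0 m
```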
The laser ranging device includes a light emitting unit and a light receiving unit. The light emitting unit may include a light source that emits light; the light source may include a light emitting element, such as an infrared or visible-light light emitting diode (LED) that emits infrared or visible light. Preferably, the light source is a light emitting element that emits a laser beam; in this embodiment a laser diode (LD) is taken as the example. Specifically, because of the monochromatic, directional and collimated nature of a laser beam, a laser light source allows more accurate measurement than other light: compared with a laser beam, the infrared or visible light emitted by an LED is affected by environmental factors (such as the color or texture of the object) and may give lower measurement accuracy. The laser diode (LD) may be a point laser, which measures two-dimensional position information of an obstacle, or a line laser, which measures three-dimensional position information over a certain extent of the obstacle.
The light receiving unit may include an image sensor, on which a light spot reflected or scattered by the obstacle is formed. The image sensor may be a collection of unit pixels arranged in a single row or in multiple rows; these light receiving elements convert the optical signal into an electrical signal. The image sensor may be a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor; a CMOS sensor is preferred for reasons of cost. Moreover, the light receiving unit may include a light receiving lens assembly; light reflected or scattered by the obstacle may travel through the lens assembly to form an image on the image sensor. The lens assembly may comprise a single lens or multiple lenses.
A base may support the light emitting unit and the light receiving unit, which are arranged on the base and spaced apart from each other by a specific distance. To measure obstacles in all 360° around the robot, the base may be rotatably arranged on the main body 110, or the base itself may remain fixed while a rotating element rotates the emitted and received light. The rotational angular velocity of the rotating element can be obtained with an optocoupler element and a code wheel: the optocoupler senses the tooth gaps on the code wheel, and the instantaneous angular velocity is obtained by dividing the angular spacing between tooth gaps by the time one gap takes to sweep past. The denser the tooth gaps on the code wheel, the higher the accuracy and precision of the measurement, but the more precise the structure must be and the larger the amount of computation; conversely, sparser tooth gaps give lower accuracy and precision but allow a relatively simpler structure, less computation and lower cost.
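A small illustrative calculation of that division, with an assumed gap count and sweep time rather than values from the application:

```python
import math

def instantaneous_angular_velocity(gap_count, sweep_time_s):
    """Angular pitch between adjacent tooth gaps divided by the time one gap
    takes to sweep past the optocoupler."""
    return (2 * math.pi / gap_count) / sweep_time_s

# Assumed numbers: 36 gaps on the code wheel, one gap passes every 5 ms.
omega = instantaneous_angular_velocity(36, 0.005)
print(round(omega, 2), "rad/s")   # about 34.91 rad/s, roughly 5.6 revolutions per second
```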
A data processing device connected to the light receiving unit, such as a DSP, records the obstacle distance values at all angles relative to the robot's 0° direction and transmits them to a data processing unit in the control system 130, such as an application processor (AP) containing a CPU. The CPU runs the particle-filter-based positioning algorithm to obtain the robot's current position and builds a map from that position for navigation. The positioning algorithm preferably uses simultaneous localization and mapping (SLAM).
The forward portion 111 of the machine body 110 may carry the buffer 122. While the drive wheel module 141 propels the robot across the floor during cleaning, the buffer 122 detects, via a sensor system such as an infrared sensor, one or more events (or objects) in the travel path of the robot 100, such as obstacles or walls; the robot can then control the drive wheel module 141 so that it responds to those events (or objects), for example by moving away from the obstacle.
The control system 130 is arranged on a circuit board inside the machine body 110 and includes a computing processor, such as a central processing unit or an application processor, in communication with non-transitory memory such as a hard disk, flash memory or random access memory. Based on the obstacle information fed back by the laser ranging device, the application processor uses a positioning algorithm such as SLAM to draw a real-time map of the environment in which the robot is located. Combining this with the distance and speed information fed back by the buffer 122, the cliff sensor 123 and sensing devices such as the ultrasonic sensor, infrared sensor, magnetometer, accelerometer, gyroscope and odometer, the control system determines the current working state of the sweeper, for example crossing a threshold, moving onto a carpet, standing at a cliff edge, stuck from above or below, dust box full, or picked up, and it issues a specific next-action strategy for each situation, so that the robot's behavior better matches its owner's expectations and gives a better user experience. Further, the control system 130 can plan the most efficient and reasonable cleaning path and cleaning mode based on the real-time map information drawn by SLAM, greatly improving the robot's cleaning efficiency.
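The state judgements listed above can be pictured with a hypothetical decision sketch such as the one below; the sensor names, thresholds and action strings are invented for illustration and do not come from the application.

```python
def next_action(sensors):
    """Map fused sensor readings to a next-action strategy; the keys, thresholds
    and action names below are invented for illustration."""
    if sensors["picked_up"]:
        return "stop_brushes_and_wheels"
    if sensors["cliff_distance_m"] > 0.05:          # floor has dropped away: cliff ahead
        return "back_off_and_turn"
    if sensors["dust_box_full"]:
        return "return_to_dock"
    if sensors["wheel_current_a"] > 1.5:            # high drive current: stuck or on a threshold
        return "wiggle_and_retry"
    if sensors["gyro_pitch_deg"] > 8:               # nose up: probably climbing onto a carpet
        return "raise_suction_power"
    return "continue_planned_path"

print(next_action({"picked_up": False, "cliff_distance_m": 0.01, "dust_box_full": False,
                   "wheel_current_a": 0.6, "gyro_pitch_deg": 10}))   # -> raise_suction_power
```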
The drive system 140 can maneuver the robot 100 across the floor based on drive commands containing distance and angle information, for example x, y and θ components. The drive system 140 includes a drive wheel module 141 that can control the left and right wheels simultaneously; to control the machine's motion more precisely, the drive wheel module 141 preferably includes a left drive wheel module and a right drive wheel module, arranged opposite each other along a transverse axis defined by the body 110. For more stable motion or greater mobility, the robot may include one or more driven wheels 142, including but not limited to universal wheels. The drive wheel module includes a travel wheel, a drive motor and a control circuit for the motor, and may also be connected to a circuit for measuring the drive current and to an odometer. The drive wheel module 141 can be detachably coupled to the main body 110 for easy assembly, disassembly and maintenance. Each drive wheel may have an offset, drop-type suspension system, movably fastened, for example rotatably attached, to the robot body 110 and receiving a spring bias that urges it downward and away from the robot body 110. The spring bias allows the drive wheel to maintain contact and traction with the floor with a certain ground force, while the cleaning elements of the robot 100 also contact the floor with a certain pressure.
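Drive commands with x, y and θ components map naturally onto standard differential-drive kinematics; the following sketch (wheel base and speeds are assumed values, not taken from the application) splits a body command into left/right wheel speeds and dead-reckons the pose, as one possible illustration of how such commands could be executed.

```python
import math

WHEEL_BASE = 0.23   # assumed distance between the left and right drive wheels, in metres

def wheel_speeds(v, omega):
    """Split a body command (forward speed v, turn rate omega) into left/right wheel speeds."""
    return v - omega * WHEEL_BASE / 2, v + omega * WHEEL_BASE / 2

def integrate_pose(pose, v, omega, dt):
    """Dead-reckon the (x, y, theta) pose that the drive commands refer to."""
    x, y, th = pose
    return (x + v * math.cos(th) * dt, y + v * math.sin(th) * dt, th + omega * dt)

pose = (0.0, 0.0, 0.0)
for _ in range(100):                       # drive a gentle arc for one second
    pose = integrate_pose(pose, 0.3, 0.5, 0.01)
print("wheel speeds:", wheel_speeds(0.3, 0.5))
print("pose after 1 s:", tuple(round(p, 3) for p in pose))
```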
The cleaning system 150 may be a dry cleaning system and/or a wet cleaning system. In a dry cleaning system, the main cleaning function comes from the sweeping system 151 formed by the roller brush, the dust box, the fan, the air outlet and the connecting parts between the four. The roller brush, which interferes with the floor to some extent, sweeps up the debris on the floor and carries it to the front of the suction opening between the roller brush and the dust box, where the airflow generated by the fan and passing through the dust box sucks it into the dust box. The dust removal capability of the sweeper can be characterized by the dust pick-up efficiency (DPU), which is affected by the structure and material of the roller brush, by the air utilization of the duct formed by the suction opening, the dust box, the fan, the air outlet and the connecting parts between them, and by the type and power of the fan; it is a complex system design problem. Compared with ordinary plug-in vacuum cleaners, improving dust removal capability matters more for an energy-limited cleaning robot, because it directly reduces the energy requirement: a machine that could clean 80 square metres of floor on one charge can evolve to clean 100 square metres or more. The service life of the battery also increases considerably as the number of charge cycles falls, so the user needs to replace the battery less often. More intuitively and importantly, dust removal capability is the most obvious and important part of the user experience; users judge directly whether the floor has been swept or wiped clean. The dry cleaning system may also include a side brush 152 with a rotating shaft angled relative to the floor, for moving debris into the roller brush region of the cleaning system 150.
The energy system 160 includes a rechargeable battery, such as a nickel-metal-hydride battery or a lithium battery. The rechargeable battery may be connected with a charging control circuit, a battery pack charging temperature detection circuit and a battery under-voltage monitoring circuit, which are in turn connected to a single-chip microcontroller control circuit. The main unit is charged by connecting charging electrodes arranged on the side or underside of the body to a charging dock. If dust adheres to the exposed charging electrodes, the accumulated charge during charging can melt and deform the plastic body around the electrodes, or even deform the electrodes themselves, so that normal charging can no longer continue.
The human-machine interaction system 170 includes buttons on the host panel with which the user selects functions; it may also include a display screen and/or indicator lights and/or a speaker that show the user the machine's current state or the selected function; and it may further include a mobile phone client application. For a path-navigation cleaning device, the mobile client can show the user a map of the environment in which the device is located and the machine's position, and can offer the user richer and more user-friendly functions.
To describe the robot's behavior more clearly, the following directions are defined: the robot 100 can travel across the floor through various combinations of movements relative to three mutually perpendicular axes defined by the body 110: the transverse axis x, the front-rear axis y and the central vertical axis z. The forward drive direction along the front-rear axis y is denoted "forward", and the rearward drive direction along the front-rear axis y is denoted "backward". The transverse axis x extends substantially between the right and left wheels of the robot along the axis defined by the center points of the drive wheel module 141.
The robot 100 can rotate about the x axis: when the forward portion of the robot 100 tilts up and the rearward portion tilts down, this is "pitch up"; when the forward portion tilts down and the rearward portion tilts up, this is "pitch down". In addition, the robot 100 can rotate about the z axis: in the robot's forward direction, deviating to the right of the y axis is a "right turn", and deviating to the left of the y axis is a "left turn".
Other embodiments of the invention will readily occur to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses or adaptations of the invention that follow its general principles and include common knowledge or customary technical means in the art not disclosed in the present invention. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the invention is limited only by the appended claims.

Claims (16)

  1. An automatic cleaning device, characterized by comprising:
    a collection unit configured to collect preset environmental parameters around the automatic cleaning device; and
    an application processor, the application processor including a central processing unit electrically connected to the collection unit to acquire the preset environmental parameters collected by the collection unit; the application processor further including a graphics processor electrically connected to the central processing unit, the graphics processor obtaining the preset environmental parameters from the central processing unit and generating a map of the surroundings of the automatic cleaning device accordingly.
  2. The automatic cleaning device according to claim 1, wherein the collection unit comprises a laser ranging device (LDS), and distance data to surrounding objects collected by the LDS is used as the preset environmental parameters.
  3. The automatic cleaning device according to claim 2, wherein the LDS comprises a point laser emitter that obtains distance data to surrounding objects by generating a point laser.
  4. The automatic cleaning device according to claim 2, wherein the LDS comprises a line laser emitter that obtains distance data to surrounding objects by generating a line laser.
  5. The automatic cleaning device according to any one of claims 1-4, wherein the collection unit comprises an image acquisition device, and image data of surrounding objects collected by the image acquisition device is used as the preset environmental parameters.
  6. The automatic cleaning device according to claim 1, wherein the graphics processor comprises:
    a storage component in which a particle-filter-based positioning algorithm is stored; and
    a computing component connected to the storage component and configured to retrieve the positioning algorithm and perform calculation on the preset environmental parameters to obtain a map of the surroundings of the automatic cleaning device.
  7. The automatic cleaning device according to claim 1, further comprising:
    a pre-processing unit connected to the collection unit and to the central processing unit, respectively, and configured to pre-process the preset environmental parameters so that the central processing unit obtains the pre-processed preset environmental parameters.
  8. The automatic cleaning device according to claim 7, wherein the pre-processing unit comprises a digital signal processor (DSP).
  9. The automatic cleaning device according to claim 1, wherein the automatic cleaning device is a sweeping robot or a mopping robot.
  10. A cleaning method for an automatic cleaning device, characterized by comprising:
    a data collection step of collecting preset environmental parameters around the automatic cleaning device with a collection unit;
    a data pre-processing step of pre-processing the preset environmental parameters with a pre-processing unit and providing the pre-processed preset environmental parameters to a central processing unit; and
    a data processing step in which the central processing unit provides the pre-processed preset environmental parameters to a graphics processor, and the graphics processor generates map data of the surroundings of the automatic cleaning device accordingly.
  11. The method according to claim 10, wherein:
    the data processing step further includes: the graphics processor includes a computing component and a storage component electrically connected to each other; the computing component retrieves a particle-filter-based positioning algorithm stored in the storage component and performs calculation on the pre-processed preset environmental parameters to obtain a map of the surroundings of the automatic cleaning device.
  12. The cleaning method for an automatic cleaning device according to claim 10 or 11, wherein:
    the data collection step includes collecting data with a laser ranging device and using the collected distance data to surrounding objects as the preset environmental parameters.
  13. The cleaning method for an automatic cleaning device according to any one of claims 10-12, wherein:
    the data collection step includes collecting data with an image acquisition device and using the collected image data of surrounding objects as the preset environmental parameters.
  14. A computer control system for an automatic cleaning device, characterized by comprising a central processing unit, a graphics processor, a collection unit and a pre-processing unit, all connected via a communication bus; the collection unit is configured to collect preset environmental parameters around the automatic cleaning device; the central processing unit is configured to acquire the preset environmental parameters collected by the collection unit; and the graphics processor obtains the preset environmental parameters from the central processing unit and generates a map of the surroundings of the automatic cleaning device accordingly.
  15. A mobile electronic device, characterized by comprising: a communication connection establishing module configured to establish a communication connection between the mobile electronic device and the automatic cleaning device according to any one of claims 1-9; a position instruction sending module configured to send a position information request instruction to the automatic cleaning device; a position receiving module configured to receive position information returned by the automatic cleaning device at preset time intervals, the position information including the real-time position of the automatic cleaning device; and a display module configured to display the position information on an interactive interface of the mobile electronic device.
  16. The mobile electronic device according to claim 15, characterized by comprising a control instruction sending module configured to send an action request instruction to the automatic cleaning device.
PCT/CN2016/108935 2015-12-16 2016-12-07 Automatic cleaning device and cleaning method WO2017101721A1 (zh)

Priority Applications (8)

Application Number Priority Date Filing Date Title
JP2018528748A JP6823794B2 (ja) 2015-12-16 2016-12-07 自動清掃機器及び清掃方法
EP16874778.0A EP3391797B1 (en) 2015-12-16 2016-12-07 Automatic cleaning device and cleaning method
EA201891263A EA039532B1 (ru) 2015-12-16 2016-12-07 Автоматическое устройство для уборки и способ уборки
AU2016372758A AU2016372758B2 (en) 2015-12-16 2016-12-07 Automatic cleaning device and cleaning method
KR1020187015795A KR20180081546A (ko) 2015-12-16 2016-12-07 자동 청소 디바이스 및 청소 방법
AU2018100726A AU2018100726A4 (en) 2015-12-16 2018-05-31 Automatic cleaning device and cleaning method
US16/009,977 US11013385B2 (en) 2015-12-16 2018-06-15 Automatic cleaning device and cleaning method
US17/242,200 US20210251450A1 (en) 2015-12-16 2021-04-27 Automatic cleaning device and cleaning method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201521054625.2 2015-12-16
CN201521054625.2U CN205671994U (zh) 2015-12-16 2015-12-16 自动清洁设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/009,977 Continuation US11013385B2 (en) 2015-12-16 2018-06-15 Automatic cleaning device and cleaning method

Publications (1)

Publication Number Publication Date
WO2017101721A1 true WO2017101721A1 (zh) 2017-06-22

Family

ID=57437726

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/108935 WO2017101721A1 (zh) 2015-12-16 2016-12-07 自动清洁设备及清洁方法

Country Status (8)

Country Link
US (2) US11013385B2 (zh)
EP (1) EP3391797B1 (zh)
JP (1) JP6823794B2 (zh)
KR (1) KR20180081546A (zh)
CN (1) CN205671994U (zh)
AU (2) AU2016372758B2 (zh)
EA (1) EA039532B1 (zh)
WO (1) WO2017101721A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113475976A (zh) * 2020-03-16 2021-10-08 珠海格力电器股份有限公司 机器人可通行区域确定方法、装置、存储介质及机器人
CN114847803A (zh) * 2018-10-29 2022-08-05 北京石头创新科技有限公司 机器人的定位方法及装置、电子设备、存储介质

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106377209B (zh) * 2016-11-11 2022-07-22 北京地平线机器人技术研发有限公司 可移动清洁设备及其控制方法
TWI634403B (zh) * 2017-01-26 2018-09-01 好樣科技有限公司 自動清潔機及其控制方法
CN108873878A (zh) * 2017-06-22 2018-11-23 北京石头世纪科技有限公司 自主机器人及其控制方法、装置、系统和计算机可读介质
JP6984215B2 (ja) * 2017-08-02 2021-12-17 ソニーグループ株式会社 信号処理装置、および信号処理方法、プログラム、並びに移動体
CN107788916B (zh) * 2017-11-08 2018-09-28 安嘉琦 智能家居清洁一体机
CN107822565B (zh) * 2017-11-08 2020-06-26 上海雷盎云智能技术有限公司 基于数据分析实现狭缝清理的智能家居扫地机
CN109602338A (zh) * 2018-11-26 2019-04-12 深圳乐动机器人有限公司 一种清洁地面的方法、扫地机器人及拖地机器人
KR102103291B1 (ko) 2019-02-28 2020-05-27 한국생산기술연구원 내부에 위치하는 라이다 장치를 포함하는 로봇 청소기 및 이를 이용한 청소구역 탐지방법
KR102117868B1 (ko) 2019-02-28 2020-06-04 한국생산기술연구원 높이 조절이 가능한 라이다 구동부 및 이를 이용한 로봇 청소기
KR102224637B1 (ko) 2019-07-05 2021-03-08 엘지전자 주식회사 이동 로봇 및 그 제어방법
KR102275300B1 (ko) 2019-07-05 2021-07-08 엘지전자 주식회사 이동 로봇 및 그 제어방법
KR102297496B1 (ko) 2019-07-11 2021-09-02 엘지전자 주식회사 인공지능을 이용한 이동 로봇 및 이동 로봇의 제어방법
KR102361130B1 (ko) 2019-07-11 2022-02-09 엘지전자 주식회사 이동 로봇 및 그 제어방법
US10947685B1 (en) * 2020-09-10 2021-03-16 Jay Hirshberg Object-gathering apparatus
US11640166B2 (en) 2021-06-29 2023-05-02 Nanning Fulian Fugui Precision Industrial Co., Ltd. Method, mobile device and cleaning robot for specifying cleaning areas
TWI772079B (zh) * 2021-06-29 2022-07-21 新加坡商鴻運科股份有限公司 清掃區域指定方法、移動終端及掃地機器人
USD990802S1 (en) * 2021-08-05 2023-06-27 Shenzhen Haitao Optimization Technology Co., Ltd. Sweeping robot
KR20230102666A (ko) * 2021-12-30 2023-07-07 한국전자기술연구원 딥러닝 기반의 지능 소형 모빌리티 장치 및 시스템
CN116269059B (zh) * 2023-05-19 2023-08-11 杭州涂鸦信息技术有限公司 扫地机器人标定系统及方法


Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6481515B1 (en) * 2000-05-30 2002-11-19 The Procter & Gamble Company Autonomous mobile surface treating apparatus
US7751919B2 (en) 2006-08-19 2010-07-06 Dynamic Micro Systems Method for operating equipment using buffer station having emergency access
KR100877072B1 (ko) * 2007-06-28 2009-01-07 삼성전자주식회사 이동 로봇을 위한 맵 생성 및 청소를 동시에 수행하는 방법및 장치
KR101461185B1 (ko) * 2007-11-09 2014-11-14 삼성전자 주식회사 스트럭쳐드 라이트를 이용한 3차원 맵 생성 장치 및 방법
KR101325145B1 (ko) 2009-10-28 2013-11-06 한국전자통신연구원 이동 로봇의 길 지도를 이용한 경로 탐색 장치 및 방법
KR20110119118A (ko) * 2010-04-26 2011-11-02 엘지전자 주식회사 로봇 청소기, 및 이를 이용한 원격 감시 시스템
TW201305761A (zh) * 2011-07-21 2013-02-01 Ememe Robot Co Ltd 自走機器人及其定位方法
US8798840B2 (en) * 2011-09-30 2014-08-05 Irobot Corporation Adaptive mapping with spatial summaries of sensor data
US9037396B2 (en) * 2013-05-23 2015-05-19 Irobot Corporation Simultaneous localization and mapping for a mobile robot
TWI547623B (zh) 2013-07-16 2016-09-01 許今彥 多感應場智慧馬桶
JP5897517B2 (ja) * 2013-08-21 2016-03-30 シャープ株式会社 自律移動体
US9589359B2 (en) * 2014-04-24 2017-03-07 Intel Corporation Structured stereo
KR102527645B1 (ko) * 2014-08-20 2023-05-03 삼성전자주식회사 청소 로봇 및 그 제어 방법
AU2015322263B2 (en) * 2014-09-24 2018-03-22 Samsung Electronics Co., Ltd. Cleaning robot and method for controlling cleaning robot
KR101620428B1 (ko) * 2014-10-10 2016-05-12 엘지전자 주식회사 로봇 청소기 및 로봇 청소기의 제어방법
KR102388448B1 (ko) * 2015-06-09 2022-04-21 삼성전자주식회사 이동 로봇 및 그 제어 방법

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201573208U (zh) * 2009-06-16 2010-09-08 泰怡凯电器(苏州)有限公司 实现室内服务机器人同时定位和地图创建的装置及机器人
CN102053623A (zh) * 2009-11-10 2011-05-11 德国福维克控股公司 用于控制机器人的方法
CN102283616A (zh) * 2010-10-22 2011-12-21 青岛科技大学 基于机器视觉的家庭智能清理系统
US20120106829A1 (en) * 2010-11-03 2012-05-03 Tae-Kyeong Lee Robot cleaner and controlling method of the same
CN103048996A (zh) * 2012-12-27 2013-04-17 深圳先进技术研究院 基于激光扫描测距仪的自动引导车、系统及导航方法
CN104302218A (zh) * 2012-12-28 2015-01-21 艾罗伯特公司 自主覆盖机器人

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3391797A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114847803A (zh) * 2018-10-29 2022-08-05 北京石头创新科技有限公司 机器人的定位方法及装置、电子设备、存储介质
CN114847803B (zh) * 2018-10-29 2024-04-16 北京石头创新科技有限公司 机器人的定位方法及装置、电子设备、存储介质
CN113475976A (zh) * 2020-03-16 2021-10-08 珠海格力电器股份有限公司 机器人可通行区域确定方法、装置、存储介质及机器人

Also Published As

Publication number Publication date
JP2019505256A (ja) 2019-02-28
KR20180081546A (ko) 2018-07-16
EP3391797B1 (en) 2022-03-23
US20180289228A1 (en) 2018-10-11
AU2016372758A1 (en) 2018-06-21
CN205671994U (zh) 2016-11-09
EP3391797A4 (en) 2019-04-10
EA201891263A1 (ru) 2018-11-30
US20210251450A1 (en) 2021-08-19
JP6823794B2 (ja) 2021-02-03
AU2018100726A4 (en) 2018-07-19
EA039532B1 (ru) 2022-02-08
AU2016372758B2 (en) 2019-04-18
EP3391797A1 (en) 2018-10-24
US11013385B2 (en) 2021-05-25

Similar Documents

Publication Publication Date Title
WO2017101721A1 (zh) 自动清洁设备及清洁方法
WO2020200282A1 (zh) 机器人工作区域地图构建方法、装置、机器人和介质
TWI789625B (zh) 一種清潔機器人及其控制方法
CN114468898B (zh) 机器人语音控制方法、装置、机器人和介质
EP3998007A1 (en) Automatic cleaning device control method and apparatus, device and medium
TW202110379A (zh) 一種清潔機器人及其控制方法
CN109920425B (zh) 机器人语音控制方法、装置、机器人和介质
WO2022048153A1 (zh) 机器人的定位方法及装置、存储介质
CN217792839U (zh) 自动清洁设备
WO2022227876A1 (zh) 一种测距方法、装置、机器人和存储介质
CN106253362B (zh) 电路控制系统及方法、自主清洁设备
CN210673215U (zh) 一种多光源探测机器人
CN211270533U (zh) 一种摄像头装置及清洁机器人
CN210931183U (zh) 一种清洁机器人
CN112244705B (zh) 智能清洁设备、控制方法、计算机存储介质
CN207782431U (zh) 电路控制系统及自主清洁设备
JP7433430B2 (ja) カメラ装置及び清掃ロボット
CN210673216U (zh) 一种滤光式机器人
AU2024201545A1 (en) Camera apparatus and cleaning robot
CN116942017A (zh) 自动清洁设备、控制方法及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16874778

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2018528748

Country of ref document: JP

ENP Entry into the national phase

Ref document number: 20187015795

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 1020187015795

Country of ref document: KR

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2016372758

Country of ref document: AU

Date of ref document: 20161207

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 201891263

Country of ref document: EA

WWE Wipo information: entry into national phase

Ref document number: 2016874778

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2016874778

Country of ref document: EP

Effective date: 20180716