CN111281274A - Visual floor sweeping method and system - Google Patents

Visual floor sweeping method and system

Info

Publication number
CN111281274A
CN111281274A (application CN202010190776.XA)
Authority
CN
China
Prior art keywords
sweeping
garbage
cleaning
platform
scheme
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010190776.XA
Other languages
Chinese (zh)
Inventor
张伟伟
付丽红
韩璐
王小琦
锁立亚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suning Intelligent Terminal Co ltd
Original Assignee
Suning Intelligent Terminal Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suning Intelligent Terminal Co ltd
Priority to CN202010190776.XA
Publication of CN111281274A
Current legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 - Floor-sweeping machines, motor-driven
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 - Installations of electric equipment
    • A47L11/4008 - Arrangements of switches, indicators or the like
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 - Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The invention discloses a visual sweeping method and system. In the method, the floor and garbage scanned by an AI camera are identified automatically and reported to a server, and the places where garbage appears are recorded together with a map. At the same time, an AI platform analyzes the factors that influence the cleaning effect, such as where the home tends to get dirty, the garbage types and the activity patterns of family members, avoids these factors in an optimal way, and formulates the sweeping robot's autonomous cleaning time, autonomous cleaning area and the cleaning mode for each area, so that the interior can finally be cleaned autonomously without human operation. The invention saves the time a user spends managing the sweeping robot, raises the robot's level of intelligence, improves its cleaning effect and accuracy, reduces repeated sweeping, prolongs the service life of the sweeper, and achieves accurate cleaning.

Description

Visual floor sweeping method and system
Technical Field
The invention belongs to the field of intelligent home furnishing, and particularly relates to a visual floor sweeping method and system.
Background
With existing visual sweeping robots, the user must set the sweeping mode in advance; the robot then sweeps according to the preset sweeping or mopping mode, and after a single mode finishes, the other mode must be started again manually.
Such a method cannot clean autonomously, cannot decide on its own whether to mop or to sweep, and cannot judge the cleaning result automatically; it wastes the user's time and energy, wastes electricity, and leads to excessive and repeated cleaning.
Disclosure of Invention
The invention aims to provide a visual sweeping method and system that can automatically determine where the home tends to get dirty and where cleaning is needed, autonomously select the cleaning mode and the key cleaning areas, and automatically judge the cleaning effect and the cleaning time.
The technical scheme for realizing the purpose of the invention is as follows: a visual floor sweeping method comprises the following steps:
scanning the area to be cleaned in real time through an AI camera, and reporting video images in which garbage is identified to an AI platform;
the AI platform analyzes, from the video images acquired by the AI camera, the garbage type, the place where the garbage is generated and the time at which it is generated, and generates a cleaning scheme; the AI platform sends the video images, the data and the generated cleaning scheme to the cloud platform;
the cloud platform pushes the cleaning scheme to the sweeping robot for execution;
the system is trained continuously while the cleaning scheme is executed, and the cleaning scheme is revised repeatedly.
Further, the cloud platform converts the generated cleaning scheme into structured data for storage.
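As an illustration only (the invention does not prescribe any concrete data format), the structured record that the cloud platform stores and later pushes to the sweeping robot might be sketched in Python as follows; every field name here is an assumption.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CellPlan:
    """Plan for one square cell of the divided map (illustrative field names)."""
    cell_id: int
    garbage_type: str   # garbage type most often recorded in this cell
    mode: str           # "sweep" or "mop"
    suction_level: int  # suction strength used when vacuuming

@dataclass
class CleaningScheme:
    """Cleaning scheme converted to structured data by the cloud platform."""
    start_time: str                      # automatic cleaning time, e.g. "14:30"
    cells: List[CellPlan] = field(default_factory=list)

# Example of a two-cell scheme that could be pushed to the sweeping robot.
scheme = CleaningScheme(
    start_time="14:30",
    cells=[
        CellPlan(cell_id=1, garbage_type="dust", mode="sweep", suction_level=2),
        CellPlan(cell_id=2, garbage_type="stain", mode="mop", suction_level=0),
    ],
)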
Further, the cloud platform sends the generated cleaning scheme to the mobile terminal through the pushing platform; the user checks the cleaning scheme and decides whether to adjust it, and if it is not adjusted, the sweeping robot executes the scheme by default.
Further, the AI platform analyzes the garbage type from the video images and data collected by the AI camera; the specific method is as follows:
collecting training samples as input, establishing a knowledge base for garbage classification, segmenting the pictures and performing feature processing, and outputting the image recognition result.
Further, the AI platform analyzes, from the video images and data acquired by the AI camera, the place where the garbage is generated; the specific method is as follows:
after the map has been drawn over multiple sweeps, the sweeper divides the area to be cleaned, according to the drawn map, into square cells whose size is 1.5 times the sweeper's diameter; for each divided cell the type of garbage most likely to appear and its extent are recorded, and the place where garbage is generated is thereby determined.
Further, the AI platform analyzes, from the video images and data collected by the AI camera, the time at which garbage is generated; the time is determined as follows: the time, place and area of each cleaning by the user are recorded, and the period between two cleanings in which garbage appears is taken as the time at which the garbage is generated.
Further, the cleaning scheme comprises the automatic cleaning time and the cleaning mode of each divided square cell.
Further, the mobile terminal displays the cleaning scheme as text notes on a map.
A visual floor sweeping system, comprising:
the AI camera is carried on the sweeping robot and used for recording video data during motion and transmitting the video data to the AI platform in real time;
the AI platform is used for analyzing the video transmitted by the AI camera, identifying ground garbage, reporting it to the server, recording the places where garbage appears together with the map, and formulating the sweeper's cleaning time, cleaning area and the cleaning mode for different areas;
the cloud platform is used for converting the video images and data sent by the AI platform into structured data for storage; meanwhile, the generated cleaning scheme is sent to the mobile terminal through a pushing platform;
the mobile terminal is used for acquiring the cleaning scheme generated by the AI platform; and sending the confirmed cleaning scheme to the cloud platform, and pushing the cleaning scheme to the sweeping robot by the cloud platform for execution.
Furthermore, the mobile terminal supports real-time adjustment of the cleaning scheme through its map-plus-text display.
Compared with the prior art, the invention has the following remarkable advantages: (1) the sweeping robot needs no manual management; the sweeper determines the cleaning time and cleaning mode on its own; (2) the sweeping robot collects video images after cleaning and performs AI analysis, so it can judge on its own whether the cleaning is truly complete and switch to a new cleaning mode; (3) cleaning can be carried out automatically while avoiding factors that affect the cleaning effect, such as periods of heavy human activity, seasons and weather changes; (4) electricity is saved, repeated cleaning is avoided, and the service life of the sweeping robot is prolonged.
Drawings
Fig. 1 is a diagram of the visual floor sweeping system architecture.
Fig. 2 is a flow chart of a visual sweeping implementation.
Fig. 3 is a picture analysis flow chart.
Fig. 4 is a ConvNet work division diagram.
Fig. 5 is an AI object recognition architecture diagram.
Detailed Description
As shown in fig. 1 and 2, the present invention provides a visual floor sweeping method, which comprises the following steps:
scanning the area to be cleaned in real time through an AI camera, and reporting video images in which garbage is identified to an AI platform;
the AI platform analyzes, from the video images acquired by the AI camera, the garbage type, the place where the garbage is generated and the time at which it is generated, and generates a cleaning scheme; the AI platform sends the video images, the data and the generated cleaning scheme to the cloud platform; the cloud platform converts the video images and data into structured data for storage;
the cloud platform pushes the cleaning scheme to the sweeping robot for execution;
the system is trained continuously while the cleaning scheme is executed, and the cleaning scheme is revised repeatedly.
Further, the cloud platform sends the generated cleaning scheme to the mobile terminal through the pushing platform; the user checks the cleaning scheme and decides whether to adjust it, and if it is not adjusted, the sweeping robot executes the scheme by default.
Further, the AI platform analyzes the garbage type from the video images and data collected by the AI camera; the specific method is as follows:
collecting training samples as input, establishing a knowledge base for garbage classification, segmenting the pictures and performing feature processing, and outputting the image recognition result.
Further, the AI platform analyzes, from the video images and data acquired by the AI camera, the place where the garbage is generated; the specific method is as follows:
after the map has been drawn over multiple sweeps, the sweeper divides the area to be cleaned, according to the drawn map, into square cells whose size is 1.5 times the sweeper's diameter; for each divided cell the type of garbage most likely to appear and its extent are recorded, and the place where garbage is generated is thereby determined.
Further, the AI platform analyzes, from the video images and data collected by the AI camera, the time at which garbage is generated; the time is determined as follows: the time, place and area of each cleaning by the user are recorded, and the period between two cleanings in which garbage appears is taken as the time at which the garbage is generated.
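A minimal sketch of this time determination is given below; it simply takes the interval between two successive recorded cleanings of the same area as the period in which the garbage found was generated. The function and variable names are assumptions for illustration only.

from datetime import datetime
from typing import List, Tuple

def garbage_generation_windows(
    cleaning_times: List[datetime],
) -> List[Tuple[datetime, datetime]]:
    """Return the intervals between successive cleanings of one area.

    Garbage found at a cleaning is assumed to have been generated during the
    interval since the previous cleaning of the same area.
    """
    ordered = sorted(cleaning_times)
    return list(zip(ordered[:-1], ordered[1:]))

# Example: garbage found at the 09:00 cleaning was generated between 20:00 and 09:00.
windows = garbage_generation_windows([
    datetime(2020, 3, 17, 20, 0),
    datetime(2020, 3, 18, 9, 0),
])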
Further, the cleaning scheme comprises the automatic cleaning time and the cleaning mode of each divided square cell.
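The division of the drawn map into square cells 1.5 times the sweeper's diameter, as described above, can be sketched as follows; the rectangular-map assumption and all identifiers are illustrative only and not part of the invention.

import math
from typing import List, Tuple

def divide_map_into_cells(
    map_width_m: float,
    map_height_m: float,
    sweeper_diameter_m: float,
) -> List[Tuple[int, int]]:
    """Divide a rectangular map into square cells 1.5 times the sweeper diameter.

    Returns the (row, column) index of every cell; the per-cell garbage record
    and the sweep/mop decision would be attached to these indices.
    """
    cell_size = 1.5 * sweeper_diameter_m
    rows = math.ceil(map_height_m / cell_size)
    cols = math.ceil(map_width_m / cell_size)
    return [(r, c) for r in range(rows) for c in range(cols)]

# Example: a 6 m x 4 m room and a 0.33 m sweeper give 9 rows x 13 columns of cells.
cells = divide_map_into_cells(6.0, 4.0, 0.33)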
Further, the cleaning scheme is generated as follows:
after the sweeper has swept multiple times and the map of the house has been drawn, the area to be cleaned is divided, according to the drawn map, into square cells whose size is 1.5 times the sweeper's diameter;
for each divided cell the type of garbage most likely to appear and its extent are recorded, and it is determined whether mopping or vacuuming is needed and what mopping or suction strength to use;
combining this with the times at which the user usually cleans, a cleaning scheme is formed that comprises the automatic cleaning time and the cleaning mode of each divided cell; the cleaning mode includes sweeping and mopping.
Further, the mobile terminal displays the cleaning scheme as text notes on a map.
A visual floor sweeping system, comprising:
the AI camera is used for recording video data during motion and transmitting the video data to the AI platform in real time;
the AI platform is used for analyzing the video transmitted by the AI camera, identifying ground garbage, reporting it to the server, and recording the places where garbage appears together with the map; it analyzes the factors that influence the cleaning effect, such as where the home tends to get dirty, the garbage types, the family's activities, the season and weather changes, avoids these factors in an optimal way, selects the best cleaning time and cleaning mode, and then formulates the sweeper's autonomous cleaning time, cleaning areas, key areas and the cleaning mode for each area;
the cloud platform is used for converting the video images and data sent by the AI platform into structured data for storage; meanwhile, the generated cleaning scheme is sent to the mobile terminal through a pushing platform;
and the mobile terminal is used for acquiring the cleaning scheme generated by the AI platform, displaying the cleaning scheme in a map and character mode and supporting real-time adjustment of the cleaning scheme.
The visual floor sweeping robot body is used for carrying the AI camera and cleans according to the instructions of the AI platform.
Further, avoiding the factors that influence the cleaning effect in an optimal way specifically means the following:
after the sweeper has swept multiple times and the map of the house has been drawn, the area to be cleaned is divided, according to the drawn map, into square cells whose size is 1.5 times the sweeper's diameter; the type of garbage most likely to appear and its extent are recorded for each divided cell to determine where garbage is generated; the dirtiness rule is therefore the type of garbage that tends to appear in each partitioned area of the home, plus the period in which it appears;
the garbage type is determined by comparison with the AI database;
the family activity rule: the camera captures the family's activity areas and times, and when cleaning the sweeper avoids these obstacles and the factors that affect cleaning; the cleaning mode and time are then determined according to the local weather and the season.
According to the invention, the time for managing the sweeping robot by a user can be saved, the intelligent level of the sweeping robot is improved, the cleaning effect and accuracy of the sweeping robot are improved, repeated sweeping is reduced, the service life of the sweeping machine is prolonged, and accurate sweeping is realized.
The technical solution of the present invention will be described in detail below with reference to the embodiments and the accompanying drawings.
Examples
With reference to fig. 1 and 2, the invention provides a visual floor sweeping method and system that can automatically determine where the home tends to get dirty and where cleaning is needed, and autonomously select the cleaning mode, the key cleaning areas and the cleaning time. This saves the time the user spends managing the sweeping robot, raises the robot's level of intelligence, improves its cleaning effect and accuracy, reduces repeated sweeping, prolongs the service life of the sweeper, and achieves accurate cleaning; the cleaning effect is judged automatically, and when the effect is poor the robot switches mode and sweeps again. The method specifically comprises the following steps:
(1) the sweeping robot with the AI camera records video scanning data in the motion process and transmits the data to the AI platform in real time.
(2) The AI platform analyzes the video transmitted by the camera so as to automatically identify the household floor and garbage and report them to the server, and then records the dirty places together with the map. At the same time, the sweeper's AI platform analyzes the factors that influence the cleaning effect, such as where the home tends to get dirty, the garbage types and the family's activities, avoids these factors in an optimal way, selects the best cleaning time and cleaning mode, and then sets the sweeper's autonomous cleaning time, areas, key areas and the cleaning mode for each area.
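As an illustrative sketch only, choosing a cleaning time that avoids the family's observed activity periods could look like the following; the hour-based representation and all names are assumptions, not something the invention specifies.

from typing import List, Optional, Tuple

Hour = int  # hour of day, 0-23

def pick_cleaning_hour(
    candidate_hours: List[Hour],
    family_activity: List[Tuple[Hour, Hour]],
) -> Optional[Hour]:
    """Return the first candidate hour that does not overlap family activity.

    family_activity holds (start_hour, end_hour) intervals during which the
    camera has observed people in the area to be cleaned.
    """
    def busy(hour: Hour) -> bool:
        return any(start <= hour < end for start, end in family_activity)

    for hour in candidate_hours:
        if not busy(hour):
            return hour
    return None

# Example: with activity from 07:00-09:00 and 18:00-22:00, the 10:00 slot is chosen.
chosen = pick_cleaning_hour([8, 10, 19], [(7, 9), (18, 22)])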
The AI platform analyzes, from the video images and data of the AI camera, the garbage types in the user's home, the places where the corresponding garbage tends to appear and the times at which it tends to appear. AI image recognition mainly comprises classification, picture segmentation and feature processing. Before user data are analyzed, a picture classification rule base is established; as shown in fig. 3, building the system's picture classification rule base mainly involves the following operations:
Firstly, a knowledge base for garbage classification is established and enough training samples are collected as input; the garbage samples need to reach the million level and the total amount needs to reach the hundred-million level, so that a sufficiently large knowledge base is formed.
After the knowledge base is established, the pictures are segmented and subjected to feature processing; segmentation and feature processing interact repeatedly with the knowledge base, correcting and supplementing its contents so that the knowledge base is continuously enriched and optimized.
Finally, the knowledge base and the feature processing together output the image recognition result and interpret it. Pictures whose recognition is unsatisfactory are fed back to the knowledge base for further training and optimization.
The knowledge base is built mainly through secondary development based on TensorFlow, using a convolutional neural network algorithm.
A convolutional neural network (ConvNet) is a feedforward neural network, i.e. one without cycles; the back-propagation (BP) algorithm of an ordinary neural network is merely a convenient way of computing gradients, and such a network is likewise feedforward. ConvNet training on the data is an iterative process that continues until an image recognition accuracy of 99.9% is reached. The layers of the convolutional neural network include: an input layer; convolution layers; activation layers; pooling layers; a flatten layer; fully-connected layers; a normalization layer; a slicing layer; a fusion layer; and an output layer. The specific division of the layers is shown in fig. 4. There can be multiple convolution, activation and pooling layers, each performing a different task, until the practical standard for object recognition is reached.
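For illustration, a small TensorFlow/Keras model containing the kinds of layers listed above (input, convolution, normalization, activation, pooling, flatten, fully-connected, output) could be sketched as follows; the layer sizes and the assumed number of garbage classes are not prescribed by the invention.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_GARBAGE_CLASSES = 10  # assumed number of garbage categories

def build_garbage_classifier(input_shape=(128, 128, 3)) -> tf.keras.Model:
    """Small convolutional classifier in the spirit of the layer list above."""
    model = models.Sequential([
        layers.Input(shape=input_shape),                          # input layer
        layers.Conv2D(32, 3, padding="same"),                     # convolution layer
        layers.BatchNormalization(),                              # normalization layer
        layers.Activation("relu"),                                # activation layer
        layers.MaxPooling2D(),                                    # pooling layer
        layers.Conv2D(64, 3, padding="same", activation="relu"),  # further convolution
        layers.MaxPooling2D(),
        layers.Flatten(),                                         # flatten layer
        layers.Dense(128, activation="relu"),                     # fully-connected layer
        layers.Dense(NUM_GARBAGE_CLASSES, activation="softmax"),  # output layer
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_garbage_classifier()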
After repeated training on a large amount of data is completed, the successfully trained data are stored in the picture database.
As shown in fig. 5, in actual use the user's video is input and converted into pictures; the pictures are preprocessed, features are processed and extracted, and the picture is modeled in three dimensions; finally the user's picture is compared against the database. If the comparison succeeds, the picture data are sent to the cloud platform; if it fails, the content is used as training data so that the training samples keep growing and the picture database is continuously enriched. At this point the picture analysis process is essentially complete. Picture preprocessing comprises image input, image graying, image enhancement, image filtering, image binarization and image dial positioning.
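A minimal OpenCV sketch of the preprocessing chain named above (graying, enhancement, filtering, binarization) is shown below; the concrete operators chosen here (histogram equalization, Gaussian blur, Otsu thresholding) are assumptions for illustration, since the invention does not fix them.

import cv2
import numpy as np

def preprocess_frame(frame: np.ndarray) -> np.ndarray:
    """Grayscale -> enhance -> filter -> binarize a single video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)      # image graying
    enhanced = cv2.equalizeHist(gray)                    # image enhancement (assumed: histogram equalization)
    filtered = cv2.GaussianBlur(enhanced, (5, 5), 0)     # image filtering (assumed: Gaussian blur)
    _, binary = cv2.threshold(                           # image binarization (assumed: Otsu threshold)
        filtered, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary

# Example usage on one frame extracted from the camera video:
# frame = cv2.imread("frame_0001.jpg")
# mask = preprocess_frame(frame)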
(3) The cloud platform sends the generated cleaning scheme to the mobile terminal through the pushing platform; the user can check the cleaning scheme in a combined map-and-text form and adjust it if anything seems wrong, and if there is no problem the sweeping robot executes the scheme by default.
(4) The cloud platform forms the final autonomous cleaning scheme on the basis of the user's suggested modifications and pushes it to the sweeping robot for execution.
(5) The system keeps training while the scheme is executed and revises the cleaning scheme repeatedly, so as to achieve the best automatically customized cleaning effect.
The above-mentioned embodiments express only several embodiments of the present application, and their description is relatively specific and detailed, but this should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A visual floor sweeping method is characterized by comprising the following steps:
scanning an area to be cleaned in real time through an AI camera, and reporting the identified video image containing garbage to an AI platform;
the AI platform analyzes the garbage type, the place where the corresponding garbage is generated and the time for generating the garbage according to the video image acquired by the AI camera to generate a cleaning scheme; the AI platform sends the video image, the data and the generated cleaning scheme to the cloud platform;
the cloud platform pushes the cleaning scheme to the sweeping robot to be executed;
and continuously training the system in the process of executing the cleaning scheme, and repeatedly correcting the cleaning scheme.
2. The visual sweeping method of claim 1, wherein the generated sweeping solution is processed by a cloud platform, and the cloud platform converts video images and data into structured data for storage.
3. The visual sweeping method according to claim 1, wherein the cloud platform sends the generated cleaning scheme to the mobile terminal through the pushing platform; the user checks the cleaning scheme and decides whether to adjust it, and if it is not adjusted, the sweeping robot executes the scheme by default.
4. The visual floor sweeping method according to claim 1, wherein the AI platform analyzes the garbage type from the video images and data acquired by the AI camera, the specific method being as follows:
collecting training samples as input, establishing a knowledge base for garbage classification, segmenting the pictures and performing feature processing, and outputting the image recognition result.
5. The visual floor sweeping method according to claim 1, wherein the AI platform analyzes, from the video images and data acquired by the AI camera, the place where the garbage is generated, the specific method being as follows:
after the map has been drawn over multiple sweeps, the sweeper divides the area to be cleaned, according to the drawn map, into square cells whose size is 1.5 times the sweeper's diameter; for each divided cell the type of garbage most likely to appear and its extent are recorded, and the place where garbage is generated is thereby determined.
6. The visual floor sweeping method of claim 5, wherein the AI platform analyzes, from the video images and data collected by the AI camera, the time at which garbage is generated, the time being determined as follows: the time, place and area of each cleaning by the user are recorded, and the period between two cleanings in which garbage appears is taken as the time at which the garbage is generated.
7. The visual sweeping method of claim 5, wherein the sweeping scheme comprises automatic sweeping time and sweeping mode for each square area after division.
8. The visual sweeping method according to claim 1 or 7, wherein the mobile terminal displays the sweeping scheme in the form of a text note displayed on a map.
9. A visual floor sweeping system, comprising:
the AI camera is carried on the sweeping robot and used for recording video data in the motion process and transmitting the video data to the AI platform in real time;
the AI platform is used for analyzing the video transmitted by the AI camera, identifying the ground garbage, reporting to the server, recording the place and the map where the garbage appears, and formulating the cleaning time, the cleaning area and the cleaning mode of different areas of the sweeper;
the cloud platform is used for converting the video images and data sent by the AI platform into structured data for storage; meanwhile, the generated cleaning scheme is sent to the mobile terminal through a pushing platform;
the mobile terminal is used for acquiring the cleaning scheme generated by the AI platform; and sending the confirmed cleaning scheme to the cloud platform, and pushing the cleaning scheme to the sweeping robot by the cloud platform for execution.
10. The visual sweeping system of claim 9, wherein the mobile terminal supports real-time adjustment of the sweeping regimen by displaying in a map plus text format.
CN202010190776.XA 2020-03-18 2020-03-18 Visual floor sweeping method and system Pending CN111281274A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010190776.XA CN111281274A (en) 2020-03-18 2020-03-18 Visual floor sweeping method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010190776.XA CN111281274A (en) 2020-03-18 2020-03-18 Visual floor sweeping method and system

Publications (1)

Publication Number Publication Date
CN111281274A (en) 2020-06-16

Family

ID=71021688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010190776.XA Pending CN111281274A (en) 2020-03-18 2020-03-18 Visual floor sweeping method and system

Country Status (1)

Country Link
CN (1) CN111281274A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112690704A (en) * 2020-12-22 2021-04-23 珠海市一微半导体有限公司 Robot control method, control system and chip based on vision and laser fusion
CN114451816A (en) * 2021-12-23 2022-05-10 杭州华橙软件技术有限公司 Cleaning strategy generation method and device, computer equipment and storage medium
CN114569001A (en) * 2022-03-16 2022-06-03 北京石头世纪科技股份有限公司 Intelligent mobile device
CN115281558A (en) * 2022-07-14 2022-11-04 珠海格力电器股份有限公司 Working method and device of vision detection auxiliary sweeping robot and air conditioning equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105395144A (en) * 2015-12-21 2016-03-16 美的集团股份有限公司 Control method, system and cloud server of sweeping robot and sweeping robot
CN105892321A (en) * 2016-04-28 2016-08-24 京东方科技集团股份有限公司 Dispatching method and device for cleaning robot
CN107233051A (en) * 2017-07-03 2017-10-10 北京小米移动软件有限公司 The control method and device of sweeping robot
CN109330503A (en) * 2018-12-20 2019-02-15 江苏美的清洁电器股份有限公司 Cleaning household electrical appliance and its control method and system
CN110490688A (en) * 2019-07-12 2019-11-22 苏宁智能终端有限公司 Method of Commodity Recommendation and device
US20200008639A1 (en) * 2019-08-28 2020-01-09 Lg Electronics Inc. Artificial intelligence monitoring device and method of operating the same
US20200016764A1 (en) * 2019-08-26 2020-01-16 Lg Electronics Inc. Artificial intelligence robot for performing cleaning using pollution log and method for same
TW202008086A (en) * 2018-07-24 2020-02-16 美商高通公司 Managing cleaning robot behavior

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105395144A (en) * 2015-12-21 2016-03-16 美的集团股份有限公司 Control method, system and cloud server of sweeping robot and sweeping robot
CN105892321A (en) * 2016-04-28 2016-08-24 京东方科技集团股份有限公司 Dispatching method and device for cleaning robot
CN107233051A (en) * 2017-07-03 2017-10-10 北京小米移动软件有限公司 The control method and device of sweeping robot
TW202008086A (en) * 2018-07-24 2020-02-16 美商高通公司 Managing cleaning robot behavior
CN109330503A (en) * 2018-12-20 2019-02-15 江苏美的清洁电器股份有限公司 Cleaning household electrical appliance and its control method and system
CN110490688A (en) * 2019-07-12 2019-11-22 苏宁智能终端有限公司 Method of Commodity Recommendation and device
US20200016764A1 (en) * 2019-08-26 2020-01-16 Lg Electronics Inc. Artificial intelligence robot for performing cleaning using pollution log and method for same
US20200008639A1 (en) * 2019-08-28 2020-01-09 Lg Electronics Inc. Artificial intelligence monitoring device and method of operating the same

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112690704A (en) * 2020-12-22 2021-04-23 珠海市一微半导体有限公司 Robot control method, control system and chip based on vision and laser fusion
CN114451816A (en) * 2021-12-23 2022-05-10 杭州华橙软件技术有限公司 Cleaning strategy generation method and device, computer equipment and storage medium
CN114451816B (en) * 2021-12-23 2024-02-09 杭州华橙软件技术有限公司 Cleaning policy generation method, cleaning policy generation device, computer device and storage medium
CN114569001A (en) * 2022-03-16 2022-06-03 北京石头世纪科技股份有限公司 Intelligent mobile device
CN114569001B (en) * 2022-03-16 2023-10-20 北京石头世纪科技股份有限公司 Intelligent mobile device
CN115281558A (en) * 2022-07-14 2022-11-04 珠海格力电器股份有限公司 Working method and device of vision detection auxiliary sweeping robot and air conditioning equipment
CN115281558B (en) * 2022-07-14 2024-05-31 珠海格力电器股份有限公司 Visual detection assisted floor sweeping robot work method and device and air conditioning equipment

Similar Documents

Publication Publication Date Title
CN111281274A (en) Visual floor sweeping method and system
CN111543902B (en) Floor cleaning method and device, intelligent cleaning equipment and storage medium
CN110490688B (en) Commodity recommendation method and device
CN111643010A (en) Cleaning robot control method and device, cleaning robot and storage medium
CN111643014A (en) Intelligent cleaning method and device, intelligent cleaning equipment and storage medium
CN111643017B (en) Cleaning robot control method and device based on schedule information and cleaning robot
CN104899556A (en) System for counting persons in classroom based on image recognition
CN107480643A (en) A kind of robot of Intelligent refuse classification processing
CN103020590B (en) A kind of vehicle identification system based on three-dimensional model and images match and method thereof
CN110921146A (en) Household garbage classification method and system based on internet big data and image processing technology
CN108320236A (en) A kind of interior space surveying and mapping data management system
CN116797944A (en) Detection method and system for identifying cleanliness of photovoltaic panel based on unmanned aerial vehicle image
CN114065837A (en) Linen use supervision method, device, system, equipment and storage medium
CN113988930A (en) Artificial intelligent valuation system for commercial real estate
CN115644739B (en) Commercial cleaning robot control method and system based on Internet of things
CN111610928A (en) Rapid and universal buried point data acquisition method
CN115147703B (en) Garbage segmentation method and system based on GinTrans network
CN113057529B (en) Garbage classification control system based on stair cleaning robot
CN112450807A (en) Obstacle removing control method, device and system for sweeping robot
CN202374374U (en) Shopping mall crowd flow dynamic management system based on remote images
CN111007496B (en) Through-wall perspective method based on neural network associated radar
CN201453273U (en) Automatic personnel identity information acquisition and comparison device
CN111814581A (en) Student behavior identification method and system based on classroom scene
CN113556431A (en) AI (Artificial intelligence) director assistant system capable of implementing intelligent interaction
CN109545362B (en) Visual recognition analysis system for controlling result data by using target data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200616