WO2022111539A1 - Sweeping control method and apparatus, sweeping robot, and computer-readable medium - Google Patents
Sweeping control method and apparatus, sweeping robot, and computer-readable medium
- Publication number
- WO2022111539A1 (PCT/CN2021/132880)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cleaning
- area
- room
- robot
- sweeping
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/24—Floor-sweeping machines, motor-driven
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the present application relates to the technical field of sweeping robots, and in particular, to a sweeping control method, a device, a sweeping robot, and a computer-readable medium.
- the present application aims to solve one of the technical problems in the related art at least to a certain extent.
- an object of the present application is to provide a cleaning control method, device, cleaning robot and computer-readable medium.
- the sweeping control method includes: obtaining the current environment information of the sweeping robot; determining the type of the target object according to the environment information; and determining the corresponding cleaning mode according to the type of the target object and controlling the sweeping robot to perform cleaning in the determined cleaning mode.
- during real-time cleaning and movement, the sweeping robot photographs its surroundings to collect environmental information, such as image information of the area to be cleaned, furniture information, and room area information. Before performing cleaning, the sweeping robot first determines the category of the target object contained in the environmental information and then determines the corresponding cleaning mode according to that category, ensuring that the robot cleans in a targeted and accurate manner and improving cleaning efficiency.
- the sweeping control method includes: acquiring image information of an area to be cleaned, performing deep learning recognition on the image information, and acquiring room area information in the image information; determining, according to the room area information, the cleanable area in the area to be cleaned, the cleanable area being the passable room area that the sweeping robot can enter; and determining a target area according to the sweeping motion trajectory and controlling the sweeping robot to supplementally sweep the target area, the target area being an unswept area within the cleanable area.
- before the step of acquiring image information of the area to be cleaned, performing deep learning recognition on the image information, and acquiring room area information in the image information, the method further includes: controlling the sweeping robot to traverse the area to be cleaned; calling a laser sensor to collect detection data while the sweeping robot moves in the area to be cleaned; and establishing a grid map of the area to be cleaned according to the detection data.
- the step of acquiring image information of an area to be cleaned, performing deep learning recognition on the image information, and acquiring room area information in the image information includes: controlling the sweeping robot to traverse the area to be cleaned; calling the camera module to photograph the area to be cleaned and obtain its image information; and inputting the image information into a room door recognition network and calling the trained room door recognition model to identify the room door information of the area to be cleaned in the image information.
- the step of establishing a grid map of the area to be cleaned further includes: inputting the image information into a room door recognition network and invoking a trained room door recognition model to recognize, in the image information, the room door information of the area to be cleaned in the grid map; and decomposing the grid map according to the obtained room door information to obtain the room-type environment information of the room area corresponding to each room door in the grid map.
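The door-based grid-map decomposition above can be sketched as connected-component labelling in which recognised door cells temporarily act as walls, so each remaining free region becomes one room area. This is an illustrative reconstruction under assumed data structures (a 2D occupancy grid and a set of door cells); the patent does not specify the decomposition algorithm.

```python
from collections import deque

def decompose_rooms(grid, doors):
    """Label connected free regions of an occupancy grid as rooms.

    grid: 2D list, 0 = free, 1 = obstacle.
    doors: set of (row, col) cells recognised as room doors; treating
    them as temporary walls separates the free space into rooms.
    All names here are illustrative assumptions.
    """
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    room_id = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0 and (r, c) not in doors and labels[r][c] == 0:
                room_id += 1                     # start a new room region
                q = deque([(r, c)])
                labels[r][c] = room_id
                while q:                          # BFS flood fill
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 0
                                and (ny, nx) not in doors
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = room_id
                            q.append((ny, nx))
    return labels, room_id
```

With a wall down the middle and a single door cell, the free space splits into two labelled rooms; the door cell itself stays unlabelled and can later be assigned to either side.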
- the step of determining a cleanable area in the to-be-cleaned area according to the room area information includes: controlling the sweeping robot to initially explore the to-be-cleaned area based on the room area information; obtaining the exploration trajectory of the sweeping robot and determining whether it can enter the room area corresponding to the room door information; and, if the sweeping robot can enter the room area corresponding to the room door information, marking that passable room area as a cleanable area.
- the step of determining the target area according to the sweeping motion trajectory and controlling the sweeping robot to supplementally sweep the target area includes: in response to a sweeping control instruction, controlling the sweeping robot to clean the area to be cleaned; calling the grid map, comparing it with the sweeping motion trajectory of the robot, and determining that a room area was missed in this cleaning; calling the navigation module to navigate, controlling the robot to move to the missed room area, and calling a sensor to detect whether the room door of the missed area can be passed; and, if the room door can be passed, determining the missed room area as the target area and controlling the sweeping robot to enter it and perform supplementary sweeping.
- the step of determining the target area according to the sweeping motion trajectory and controlling the sweeping robot to supplementally sweep the target area further includes: in response to the sweeping control instruction, controlling the sweeping robot to clean the area to be cleaned; calling the grid map, comparing it with the sweeping motion trajectory of the robot, and determining that a room area was missed in this cleaning; calling the navigation module to navigate, controlling the robot to move to the missed room area, and calling a sensor to detect whether the room door of the missed area can be passed; and, if the room door cannot be passed, marking the missed room area as an impassable area, reporting the room area information of the impassable area to the server, and stopping missed-sweep detection for that impassable area.
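Comparing the grid map with the sweeping motion trajectory to find missed rooms can be sketched as a per-room coverage check. The `coverage_threshold` and the cell-set representation are illustrative assumptions; the patent only states that the map and the trajectory are compared.

```python
def find_missed_rooms(room_cells, swept_cells, coverage_threshold=0.2):
    """Return room ids that were (almost) not swept.

    room_cells: dict room_id -> set of free grid cells in that room,
    e.g. from a door-based decomposition of the grid map.
    swept_cells: set of grid cells the cleaning trajectory covered.
    coverage_threshold: assumed cut-off; rooms whose swept fraction
    falls below it are treated as missed.
    """
    missed = []
    for room_id, cells in room_cells.items():
        covered = len(cells & swept_cells) / len(cells)  # swept fraction
        if covered < coverage_threshold:
            missed.append(room_id)
    return missed
```

Each missed room would then be navigated to, its door checked for passability, and either supplementally swept or reported as impassable.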
- the present application also proposes a sweeping control device, which includes: a room area identification module for acquiring image information of the area to be cleaned, performing deep learning recognition on the image information, and obtaining the room area information in the image information; a cleaning area determination module for determining the cleanable area in the to-be-cleaned area according to the room area information; and a target area supplementary cleaning module for determining a target area according to the sweeping motion trajectory and controlling the sweeping robot to supplementally sweep the target area, where the target area is an unswept area in the cleanable area.
- the sweeping control method proposed in the embodiments of the present application controls the sweeping robot to effectively identify the room information of the area to be cleaned, performs supplementary-sweep detection on rooms not cleaned during the cleaning process, and determines whether each such room can be reached; uncleaned rooms the robot can reach are supplementally swept, avoiding missed sweeps and improving the cleaning efficiency of the sweeping robot.
- the sweeping control method includes: obtaining furniture information of the current environment of the sweeping robot; determining the room type of the current environment according to the furniture information; determining a cleaning mode according to the room type; and obtaining the cleaning situation of the current environment, determining the cleaning parameters of the sweeping robot in the cleaning mode according to the cleaning situation, and controlling the sweeping robot to perform cleaning according to the cleaning parameters.
- the step of determining a cleaning mode according to the room type includes: when the room type is the kitchen type, determining that the current cleaning mode is the kitchen cleaning mode; or, when the room type is the bedroom type, determining that the current cleaning mode is the bedroom cleaning mode; or, when the room type is the living room type, determining that the current cleaning mode is the living room cleaning mode; or, when the room type is the bathroom type, determining that the current cleaning mode is the bathroom cleaning mode.
- the step of determining the cleaning parameters of the cleaning robot in the cleaning mode according to the cleaning situation includes: comparing the cleaning situation of the current environment with the preset cleaning situation in the cleaning mode; and determining the cleaning parameters of the cleaning robot in the cleaning mode according to the comparison result.
- the step of comparing the cleaning situation of the current environment with the preset cleaning situation in the cleaning mode includes: acquiring the cleaning situation of different areas in the current environment, and comparing the cleaning situation of each area with the preset cleaning situation of the corresponding preset area in the cleaning mode.
- the step of determining the cleaning parameters of the cleaning robot in the cleaning mode according to the comparison result includes: adjusting, according to the comparison result of each area, the cleaning parameter of each area in the cleaning mode; and determining the adjusted cleaning parameter as the cleaning parameter of each area.
- the furniture information includes the type information of the furniture and the position information of the furniture.
- the step of obtaining the furniture information of the current environment of the cleaning robot includes: Step S1, controlling the cleaning robot to obtain image data of the current environment; Step S2, determining, with a first deep-learning-based target detection algorithm, the type information of each piece of furniture in the image data, together with its position and its width and height values; Step S3, combining the pixels of each piece of furniture in the image data with the data obtained by the ranging sensor to obtain the depth value of each piece of furniture in the image data; and Step S4, acquiring multiple frames of image data at different times and performing steps S1 to S3 to obtain the position information of the furniture.
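Steps S1–S4 can be sketched as fusing a detection bounding box with ranging data and the SLAM pose to place a piece of furniture in the world frame. The camera parameters (`hfov_deg`, `image_width`) and the pinhole-style bearing approximation are assumptions not given in the patent, and `depth_at` stands in for the pixel/ranging-sensor fusion of step S3.

```python
import math

def furniture_world_position(bbox, depth_at, robot_pose,
                             hfov_deg=60.0, image_width=640):
    """Estimate a detected furniture item's position in the world frame.

    bbox: (x, y, w, h) bounding box from the detection network, in pixels.
    depth_at: function (u, v) -> depth in metres at a pixel (assumed
    fusion of image pixels with ranging-sensor data).
    robot_pose: (x, y, heading_rad) from SLAM.
    """
    x, y, w, h = bbox
    cu, cv = x + w / 2, y + h / 2          # bounding-box centre pixel
    depth = depth_at(cu, cv)               # range to the furniture
    # Horizontal pixel offset from the image centre -> bearing offset.
    bearing = math.radians(hfov_deg) * (cu - image_width / 2) / image_width
    rx, ry, heading = robot_pose
    return (rx + depth * math.cos(heading + bearing),
            ry + depth * math.sin(heading + bearing))
```

Repeating this over multiple frames, as step S4 describes, lets the estimates be averaged or tracked as the robot moves.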
- the step of acquiring the cleaning status of the current environment includes: extracting dirty data from the image data with a second deep-learning-based target detection algorithm; counting the value and area of the dirty data; and determining the cleanliness of the current environment according to the value and the area.
- the method further includes: acquiring a terminal account connected to the cleaning robot; and displaying the stored cleaning process, furniture information, and room type on the display interface where the terminal account is logged in.
- another aspect of the present application also provides a cleaning robot, which includes a memory, a processor, and a control program of the cleaning robot stored in the memory and runnable on the processor; when the processor executes the control program of the cleaning robot, the sweeping control method described in the above embodiments is implemented.
- the above-mentioned sweeping control method of the present application controls the sweeping robot, when performing the cleaning task, to collect furniture information of the current environment, determine the room type of the current environment according to the furniture information, determine the cleaning mode according to the room type, obtain the cleaning situation of the current environment, determine the cleaning parameters of the cleaning robot in the cleaning mode according to the cleaning situation, and perform cleaning on the current cleaning area according to the cleaning parameters.
- the cleaning robot is controlled to determine the cleaning mode corresponding to the room type for different room types, to further determine the cleaning parameters in that mode according to the cleaning situation of the cleaning area, and to complete the cleaning of the cleaning area according to the determined cleaning parameters.
- when the sweeping robot cleans different types of rooms, it automatically switches to the cleaning mode corresponding to the room type of the cleaning area, so that different cleaning modes are used for different types of rooms. This achieves refined cleaning without requiring manual switching by the user and improves the intelligence of the sweeping robot.
- another aspect of the present application further provides a computer-readable storage medium on which a control program of a cleaning robot is stored; when the control program is executed by a processor, the sweeping control method described in the above embodiments is implemented.
- FIG. 1 is a schematic flowchart of a sweeping control method according to an embodiment of the present application.
- FIG. 2 is a schematic structural diagram of a sweeping robot of a hardware operating environment involved in the solution of an embodiment of the present application
- FIG. 3 is a schematic flowchart of the first embodiment of the sweeping control method of the present application.
- FIG. 6 is a schematic flowchart of a third embodiment of the sweeping control method of the present application.
- FIG. 7 is a schematic diagram of a sweeping control device module of the hardware operating environment involved in the solution of the embodiment of the present application.
- the present application discloses a sweeping control method. As shown in FIG. 1 , the sweeping control method includes:
- while moving, the sweeping robot can use collection devices arranged on it, such as cameras and distance sensors, to collect surrounding environment information; the environmental information may be image information of the area to be cleaned, furniture information, room area information, etc.
- the target object can be understood as the area to be cleaned by the cleaning robot, the specific room location, the specific walking route, etc.;
- FIG. 2 is a schematic structural diagram of a cleaning robot of a hardware operating environment involved in the solution of the embodiment of the present application.
- the cleaning robot may include: a processor 1001 , such as a CPU, a network interface 1004 , a user interface 1003 , a memory 1005 , and a communication bus 1002 .
- the communication bus 1002 is used to realize the connection and communication between these components.
- the user interface 1003 may include a display screen (Display), an input unit such as a keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
- the network interface 1004 may include a standard wired interface, a wireless interface (eg, a Wi-Fi interface) according to some embodiments of the present application.
- the memory 1005 may be high-speed RAM memory, or may be non-volatile memory, such as disk memory.
- the memory 1005 may also be a storage device independent of the aforementioned processor 1001 according to some embodiments of the present application.
- the cleaning robot may further include a camera, a ranging sensor (lidar, TOF sensor, binocular vision ranging sensor, etc.), an RF (Radio Frequency, radio frequency) circuit sensor, a remote control, an audio circuit, Wi-Fi modules, detectors, and more.
- the sweeping robot may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, and a temperature sensor, which will not be repeated here.
- the structure of the cleaning robot shown in FIG. 2 does not constitute a limitation on the cleaning robot equipment, which may include more or fewer components than shown, combine some components, or arrange the components differently.
- the memory 1005 which is a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a cleaning control program.
- the following describes the sweeping control method of the present application by taking the target object as furniture and the environment information including furniture information corresponding to the furniture as an example.
- the main solution of this embodiment is: when the sweeping robot performs the cleaning task, obtain the furniture information of its current environment; determine the room type of the current environment according to the furniture information, and determine the cleaning mode according to the room type; and obtain the cleaning situation of the current environment, determine the cleaning parameters of the sweeping robot in the cleaning mode according to the cleaning situation, and control the sweeping robot to perform cleaning according to the cleaning parameters.
- the network interface 1004 is mainly used to connect to the background server and perform data communication with the background server;
- the user interface 1003 is mainly used to connect to the client (client) and perform data communication with the client;
- the processor 1001 can be used to call the sweeping control program in the memory 1005 and perform the following operations:
- the cleaning situation of the current environment is acquired, the cleaning parameters of the cleaning robot in the cleaning mode are determined according to the cleaning situation, and the cleaning robot is controlled to perform cleaning according to the cleaning parameters.
- FIG. 3 is a schematic flowchart of the first embodiment of the sweeping control method of the present application.
- the embodiments of the present application provide embodiments of the sweeping control method. It should be noted that although a logical sequence is shown in the flowchart, in some cases the steps shown or described may be performed in a different order.
- Sweeping control methods include:
- Step S10 obtaining furniture information of the current environment of the sweeping robot
- the furniture information includes information such as type information of the furniture and position information of the furniture.
- the steps of obtaining the furniture information of the current environment of the sweeping robot include:
- Step S1 controlling the sweeping robot to obtain image data of the current environment
- Step S2 the first target detection algorithm based on deep learning determines the type information of each furniture in the image data, and determines the position, width and height values of the furniture in the image data;
- Step S3 combining the pixel of each of the furniture in the image data with the data obtained by the ranging sensor to obtain the depth value of each of the furniture in the image data;
- Step S4 acquiring multiple frames of image data at different times and performing steps S1 to S3 to obtain the position information of the furniture.
- the cleaning robot can obtain image data of the current environment through a monocular camera (e.g., a monocular RGB camera), and then determine the type information of the furniture contained in the image data through the first deep-learning-based target detection algorithm.
- the first deep-learning-based target detection algorithm is an algorithm that extracts furniture information from the image data through a target model trained on a large amount of image data.
- the furniture present in the image data is extracted using bounding boxes, which determine the position of the furniture in the image data and its width and height values.
- the sweeping robot obtains the depth value of each piece of furniture by combining its pixels in the image data with the data obtained by the ranging sensor, and then determines the distance between that furniture and the robot from the depth value (the larger the depth value, the greater the distance). Thus, by acquiring and analysing multiple frames of image data, the distance between the sweeping robot and the furniture in the image data can be obtained. It is understandable that when the sweeping robot performs the cleaning task, it determines its own location through SLAM (simultaneous localization and mapping); once the distance between the robot and the furniture is determined, the position in the physical world of the furniture contained in the image data can be determined during the cleaning process.
- in this way, the position information of the furniture in the image data obtained during cleaning can be determined with a monocular camera and a ranging sensor, which broadens the ways in which the cleaning robot can determine furniture position information.
- Step S20 determining the room type of the current environment according to the furniture information, and determining the cleaning mode according to the room type;
- the room area to be cleaned by the cleaning robot is divided into different room types according to user activities, including kitchen type, bedroom type, living room type and bathroom type.
- the step of determining the room type of the current environment according to the furniture information includes:
- Step S23 determining the type of furniture by using the collected furniture information through a deep learning model
- Step S24 the room type of the current environment is determined by the furniture type.
- the type of furniture is further determined through a deep-learning-based target detection algorithm. For example, if the acquired image data is input into a target detection algorithm capable of identifying beds, and the furniture contained in the image information currently acquired by the sweeping robot is identified as a bed, it can be determined that the sweeping robot is currently in a bedroom.
- before performing cleaning, the cleaning robot marks the type of each room in the area to be cleaned and stores a mapping between the marked room types and the corresponding furniture information, so that after the robot acquires images through the camera, the room type of the current cleaning area can be determined from the collected furniture information, and the cleaning mode of the current cleaning area can then be determined.
- the cleaning mode is a cleaning mode corresponding to a room type, which includes a kitchen cleaning mode, a bedroom cleaning mode, a living room cleaning mode, and a bathroom cleaning mode.
- different room types are stored corresponding to the cleaning modes that the cleaning robot needs to select when cleaning, so that the cleaning robot can be controlled to select the corresponding cleaning mode to perform cleaning when recognizing different room types Task.
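The stored correspondence between furniture types, room types, and cleaning modes can be sketched as simple lookup tables. The furniture names and parameter fields below are illustrative assumptions; the patent only specifies the four room types and their matching cleaning modes.

```python
# Assumed furniture-to-room mapping; the patent names only the bed example.
FURNITURE_TO_ROOM = {
    "bed": "bedroom",
    "stove": "kitchen",
    "sofa": "living_room",
    "toilet": "bathroom",
}

# Assumed default parameters per stored cleaning mode.
CLEANING_MODES = {
    "kitchen":     {"mode": "kitchen_cleaning",     "roller_brush": True},
    "bedroom":     {"mode": "bedroom_cleaning",     "roller_brush": False},
    "living_room": {"mode": "living_room_cleaning", "roller_brush": False},
    "bathroom":    {"mode": "bathroom_cleaning",    "roller_brush": True},
}

def select_cleaning_mode(detected_furniture):
    """Return the stored cleaning mode for the first furniture type
    that identifies a room; None if nothing recognisable was seen."""
    for item in detected_furniture:
        room = FURNITURE_TO_ROOM.get(item)
        if room:
            return CLEANING_MODES[room]
    return None
```

A detection of a bed would thus select the bedroom cleaning mode, with its parameters then refined from the observed cleaning situation.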
- the cleaning parameters used by the cleaning robot differ between modes. For example, in the kitchen cleaning mode, because the kitchen has heavy oil-fume stains, the sweeping robot is controlled to start the roller brush when cleaning the kitchen area; in the bedroom cleaning mode, because a large amount of bedding is placed in the bedroom, mites are easily generated, and the user rests there for a long time every day, functions such as mite removal and air purification are started.
- for example, when the cleaning robot recognizes that the furniture information in the room includes a sofa, it can confirm that the room type of the current cleaning area is the living room type, determine that the cleaning mode is the living room cleaning mode, and control the cleaning robot to perform the cleaning task according to the cleaning parameters of that mode.
- Step S30 the cleaning situation of the current environment is acquired, the cleaning parameters of the cleaning robot in the cleaning mode are determined according to the cleaning situation, and the cleaning robot is controlled to perform cleaning according to the cleaning parameters.
- the cleaning parameters are parameters determined by the sweeping robot according to the cleaning situation of the current cleaning area. For example, when it is determined that there is a large area of water stains in the current environment, the sweeping robot starts the mopping function of the large mop to clean up the water stains in the current cleaning area; when the area with water stains is small, it can start the mopping function of the small mop to clean them up.
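The large-mop/small-mop decision can be sketched as a threshold on the detected water-stain area. The threshold value is an illustrative assumption; the patent only distinguishes large from small stain areas.

```python
def choose_mop(stain_area_m2, small_mop_limit=0.5):
    """Pick the mop size from the detected water-stain area.

    small_mop_limit is an assumed threshold in square metres; stains
    larger than it trigger the large mop's mopping function.
    """
    return "large_mop" if stain_area_m2 > small_mop_limit else "small_mop"
```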
- after the cleaning robot determines the cleaning mode according to the room type, it further obtains the cleaning situation of the current room, determines the cleaning parameters in the cleaning mode in real time according to that situation, and is then controlled to clean the current area according to the determined cleaning parameters.
- after determining the room type of the current cleaning area, the cleaning robot determines the corresponding preset cleaning mode according to the room type, further obtains the cleaning situation of the current environment, and determines the cleaning parameters of the cleaning robot in the cleaning mode according to that situation; the specific method for determining the cleaning parameters can be stored in advance according to the room type.
- the cleaning mode of the cleaning area is determined according to the acquired current furniture information, and the cleaning parameters in the cleaning mode are further determined according to the cleaning situation in the cleaning area. It can be understood that when the cleaning situation in the cleaning area is consistent with the preset cleaning situation in the cleaning mode, the cleaning robot is controlled to clean according to the standard cleaning parameters in the cleaning mode.
- the step of acquiring the cleanliness of the current environment includes:
- Step S31 extracting dirty data from the image data with a second deep-learning-based target detection algorithm;
- Step S32 count the value and/or area of the dirty data
- Step S33 determining the cleanliness of the current environment according to the numerical value and/or the area.
- the second target detection algorithm based on deep learning is an algorithm for performing deep learning based on a large amount of image data to obtain dirty data existing in the image data.
- the dirt data includes data such as water stains, dust, dirt, and garbage.
- the sweeping robot applies the second deep-learning-based target detection algorithm to the obtained image data to extract the dirt data, then counts the value and area of the obtained dirt data and determines the cleanliness of the current environment from that value and area.
- The number of detected dirt items can be compared with a preset value to determine the cleaning situation of the current environment. A mapping table can be established between the number and the cleaning situation: for example, with a preset value of 4, a count less than or equal to 4 corresponds to "clean" and a count greater than 4 corresponds to "dirty". If the number of dirt items is 5, which is greater than the preset value of 4, the cleaning situation of the current environment is determined to be dirty.
- The cleaning situation of the current environment can also be determined from the area of the dirty data: the area of each dirt spot in the dirt data is counted and accumulated, and when the accumulated area exceeds a preset area value, the current environment is confirmed to be dirty.
- The cleaning situation of the current environment may also consider the number and the area of the dirt data at the same time: when the number of dirt items is greater than the preset value and the accumulated dirt area is greater than the preset area value, the current environment is confirmed to be dirty. (For example, for a bedroom of 20 m², when the accumulated dirt area of 7 m² is greater than the preset area value of 5 m², the bedroom is confirmed to be dirty.)
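The combined count-and-area criterion described above can be sketched as follows. The function name, the default thresholds (4 items, 5 m²), and the assumption that the detector reports a list of per-spot areas are illustrative only, not part of the claimed method:

```python
def assess_cleanliness(dirt_areas, count_threshold=4, area_threshold=5.0):
    """Classify the current environment from detected dirt spots.

    dirt_areas: per-spot areas in m^2 reported by the (hypothetical)
    deep-learning dirt detector. The environment is 'dirty' only when
    BOTH the spot count exceeds the preset count threshold AND the
    accumulated area exceeds the preset area threshold.
    """
    count = len(dirt_areas)
    total_area = sum(dirt_areas)
    if count > count_threshold and total_area > area_threshold:
        return "dirty"
    return "clean"
```

With the bedroom example above (5 spots totalling 7 m² against thresholds 4 and 5 m²), the function returns "dirty".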
- When the sweeping robot performs a cleaning task, it is controlled to collect furniture information of the current environment; the room type of the current environment is determined according to the furniture information, the cleaning mode is determined according to the room type, and the cleaning situation of the current environment is obtained.
- The cleaning parameters of the cleaning robot in the cleaning mode are determined according to the cleaning situation, and the cleaning robot is controlled to clean the current cleaning area according to those parameters. In this way, the cleaning robot determines the cleaning mode corresponding to each room type, further determines the cleaning parameters in that mode according to the cleaning situation of the cleaning area, and completes the cleaning of the cleaning area with the determined parameters.
- When the sweeping robot cleans different types of rooms, it automatically switches to the cleaning mode corresponding to the room type of the cleaning area, so that different cleaning modes are used for different room types. This achieves refined cleaning without requiring the user to switch modes manually, improving the intelligence of the sweeping robot.
- The step of determining the cleaning mode according to the room type includes: step S21, when the room type is the kitchen type, determining that the current cleaning mode is the kitchen cleaning mode; or, step S22, when the room type is the bedroom type, determining that the current cleaning mode is the bedroom cleaning mode; or, step S23, when the room type is the living room type, determining that the current cleaning mode is the living room cleaning mode; or, step S24, when the room type is the toilet type, determining that the current cleaning mode is the toilet cleaning mode.
- the cleaning mode of the kitchen type is the cleaning mode determined when the cleaning robot determines that the cleaning area is the kitchen.
- The determination can be made by using a monocular camera installed facing the forward direction of the sweeping robot together with a ranging sensor: when cabinets and a kitchen trash can are captured in the current cleaning area, the room type of the area currently being cleaned is determined to be the kitchen type, and the current cleaning mode is determined to be the kitchen cleaning mode. The functional modules run in the kitchen cleaning mode include wet mopping, adding degreasing detergent to the water tank, and sweeping.
- the cleaning mode of the bedroom type is the cleaning mode determined when the cleaning robot determines that the cleaning area is the bedroom.
- The determination can be made when the camera installed facing the forward direction of the cleaning robot captures a bed and a wardrobe in the current cleaning area, in which case the room type of the current cleaning area is determined to be the bedroom type and the current cleaning mode is determined to be the bedroom cleaning mode. The functional modules run in the bedroom cleaning mode include dry mopping, adding mite-removing air freshener to the water tank, and sweeping.
- the cleaning mode of the living room type is the cleaning mode determined when the cleaning robot determines that the cleaning area is the living room.
- The determination can be made when the camera installed facing the forward direction of the cleaning robot captures furniture such as sofas and shoe cabinets in the current cleaning area, in which case the room type of the current cleaning area is determined to be the living room type and the current cleaning mode is determined to be the living room cleaning mode. The functional modules run in the living room cleaning mode include sweeping, vacuuming, and wet mopping.
- the cleaning mode of the toilet type is the cleaning mode determined when the sweeping robot determines that the cleaning area is the toilet.
- The determination can be made when the camera installed facing the forward direction of the cleaning robot captures furniture such as a toilet and a sink in the current cleaning area, in which case the room type of the current cleaning area is determined to be the toilet type and the current cleaning mode is determined to be the toilet cleaning mode. The functional modules run in the toilet cleaning mode include sweeping and dry mopping.
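The room-type-to-mode mapping of steps S21–S24, together with the functional modules listed for each mode, can be summarized as a lookup table. All identifiers below are illustrative assumptions; the source does not specify an implementation:

```python
# Hypothetical lookup table pairing each recognized room type with its
# cleaning mode and the functional modules enabled in that mode.
CLEANING_MODES = {
    "kitchen": {"mode": "kitchen_cleaning",
                "modules": ["wet_mop", "degreasing_detergent", "sweep"]},
    "bedroom": {"mode": "bedroom_cleaning",
                "modules": ["dry_mop", "mite_removal_freshener", "sweep"]},
    "living_room": {"mode": "living_room_cleaning",
                    "modules": ["sweep", "vacuum", "wet_mop"]},
    "toilet": {"mode": "toilet_cleaning",
               "modules": ["sweep", "dry_mop"]},
}

def select_cleaning_mode(room_type):
    """Return the cleaning mode for a recognized room type (steps S21-S24)."""
    try:
        return CLEANING_MODES[room_type]["mode"]
    except KeyError:
        raise ValueError(f"unknown room type: {room_type}")
```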
- The cleaning robot can thus determine the room type of the cleaning area according to the furniture it detects and then clean with different cleaning parameters, so that different parameters are set for different room types during cleaning, improving cleaning efficiency and the intelligence of the sweeping robot.
- FIG. 4 illustrates another embodiment of the present application.
- the step of determining the cleaning parameters of the cleaning robot in the cleaning mode according to the cleaning situation includes:
- Step S31: comparing the cleaning situation of the current environment with the preset cleaning situation in the cleaning mode;
- Step S32: determining the cleaning parameters of the cleaning robot in the cleaning mode according to the comparison result.
- The cleaning parameters corresponding to different cleaning modes are further determined according to the cleaning situation of the current cleaning area; the cleaning situation of the current environment can be compared with the preset cleaning situation.
- When the cleaning situation of the current environment is cleaner than the preset cleaning situation, the working parameters of the functional modules of the sweeping robot are reduced, for example by lowering the suction force of the fan; when the cleaning situation of the current environment is dirtier than the preset cleaning situation, the working parameters are increased, for example by raising the suction force of the fan, so as to complete the cleaning of the cleaning area.
- the step of comparing the cleaning situation of the current environment with the preset cleaning situation in the cleaning mode includes:
- Step S311: acquiring the cleaning situations of different areas in the current environment, and comparing the cleaning situation of each area with the preset cleaning situation of the corresponding preset area in the cleaning mode.
- The cleaning area in the current environment is divided into different areas, and the cleaning situation of each divided area is compared with the preset cleaning situation of the corresponding preset area in the cleaning mode.
- For example, a bedroom-type cleaning area is divided into 8 areas.
- The cleaning robot compares the cleaning situation of area 1 with the preset cleaning situation of preset area 1, the cleaning situation of area 2 with that of preset area 2, and so on, to obtain the overall cleaning situation of the current cleaning area, which improves the accuracy with which the sweeping robot determines the cleaning situation of the current cleaning area.
- the step of determining the cleaning parameters of the cleaning robot in the cleaning mode according to the comparison result includes:
- Step S321: adjusting the cleaning parameters of each area of the cleaning robot in the cleaning mode according to the comparison result for each area;
- Step S322: using the adjusted cleaning parameters as the cleaning parameters of each area.
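The per-area adjustment of steps S321–S322 can be sketched as follows; the fan-power parameter, the proportional adjustment step, and all names are assumptions for illustration, since the source does not fix how parameters are raised or lowered:

```python
def adjust_area_parameters(area_dirt, preset_dirt, base_power, step=0.2):
    """Per-area parameter adjustment sketch (steps S321-S322).

    For each area, raise the suction power when the area is dirtier
    than the preset level, lower it when cleaner, and keep the base
    value when they match. `step` is an assumed proportional change.
    """
    params = {}
    for area, dirt in area_dirt.items():
        preset = preset_dirt[area]
        if dirt > preset:
            params[area] = base_power * (1 + step)   # dirtier: clean harder
        elif dirt < preset:
            params[area] = base_power * (1 - step)   # cleaner: save energy
        else:
            params[area] = base_power
    return params
```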
- The cleaning situation of each area is compared with the preset cleaning situation corresponding to that area to determine the cleaning parameters in the cleaning mode. For example, the cleaning robot divides the living room into 12 areas, of which area 1 is the entrance area.
- The cleaning robot can adjust the cleaning parameters in the cleaning mode according to the cleaning situation of each divided cleaning area, increasing the cleaning intensity in dirty areas and reducing it in relatively clean areas. By intelligently adjusting the cleaning parameters according to the cleaning situation of the cleaning area, the robot avoids using identical parameters throughout the same cleaning mode, which could leave a dirty area insufficiently cleaned when the cleaning force is too low, or waste energy when the cleaning force is too high.
- User information of the cleaning area can also be obtained, and when the user is recognized to be in a sleep state, the cleaning parameters of the cleaning robot are reduced to prevent the noise made by the cleaning robot during operation from disturbing the user's rest.
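The sleep-state adjustment above amounts to scaling down noise-relevant parameters; a minimal sketch, where the parameter names, the state labels, and the halving factor are all hypothetical:

```python
def quiet_mode_parameters(params, user_state, quiet_factor=0.5):
    """Reduce noise-relevant cleaning parameters when the user is asleep.

    params: dict of parameter name -> value (e.g. fan power).
    user_state: assumed label from a (hypothetical) user-state recognizer.
    """
    if user_state == "sleeping":
        return {name: value * quiet_factor for name, value in params.items()}
    return params
```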
- the present application proposes yet another embodiment.
- The method further includes: step S40, obtaining a terminal account connected to the cleaning robot; step S50, displaying the cleaning process, the furniture information and the room type on the display interface of the logged-in terminal account.
- The cleaning process, the identified furniture information, and the room type are displayed in the terminal APP that has established a connection with the cleaning robot.
- the determined room type is displayed, and then the cleaning process is displayed.
- A sweeping control method according to another embodiment of the present application will be described below, taking the case where the target object is a room door and the environment information includes room area information as an example.
- the network interface 1004 is mainly used to connect to the background server and perform data communication with the background server;
- the user interface 1003 is mainly used to connect the client installed on the user's intelligent terminal and perform data communication with the client;
- the processor 1001 can be used to call the sweeping control program stored in the memory 1005, and perform the following operations:
- According to the room area information, determining a cleanable area in the area to be cleaned, where the cleanable area is a passable room area in the area to be cleaned that the cleaning robot can enter;
- Determining a target area and controlling the cleaning robot to supplementally clean the target area, where the target area is an uncleaned area in the cleanable area.
- the processor 1001 may also call the sweeping control program stored in the memory 1005 to perform the following operations:
- a grid map of the area to be cleaned is established.
- the processor 1001 may also call the sweeping control program stored in the memory 1005 to perform the following operations:
- the image information is input into the room door recognition network, and the trained room door recognition model is invoked to recognize the room door information of the area to be cleaned in the image information.
- the processor 1001 may also call the sweeping control program stored in the memory 1005 to perform the following operations:
- the image information is input into the room door recognition network, and the trained room door recognition model is called to identify the room door information of the area to be cleaned in the image information; the room door information is used to indicate room area information;
- the grid map is decomposed to obtain the room type environment information of the room area corresponding to each room door in the area to be cleaned in the grid map.
- the processor 1001 may also call the sweeping control program stored in the memory 1005 to perform the following operations:
- If the cleaning robot can enter the room area corresponding to the room door information, the passable room area that the cleaning robot can enter is marked as a cleanable area, and the cleanable area is used as the type of the target object.
- the processor 1001 may also call the sweeping control program stored in the memory 1005 to perform the following operations:
- Calling the navigation module for navigation, controlling the sweeping robot to move to the missed room area, and calling a sensor to detect whether the room door of the missed room area can be passed;
- If the room door can be passed, the missed room area is determined as the target area, and the cleaning robot is controlled to enter the target area to perform supplementary cleaning.
- the processor 1001 may also call the sweeping control program stored in the memory 1005 to perform the following operations:
- calling the navigation module for navigation, controlling the sweeping robot to move to the missed room area, and calling a sensor to detect whether the room door of the missed room area can be passed;
- if the room door cannot be passed, the inaccessible area is used as the type of the target object.
- FIG. 5 is a flowchart of the second embodiment of the sweeping control method of the present application.
- the sweeping control method includes the following steps:
- Step S40: acquiring image information of the area to be cleaned, performing deep learning recognition on the image information, and acquiring room area information in the image information;
- The cleaning robot is controlled to explore the area to be cleaned for the first time and traverse the area to be cleaned.
- the area to be cleaned may be a room area range divided by the working environment of the cleaning robot.
- the area to be cleaned may be the entire area in the house, or a part of the area.
- the area to be cleaned is the area in the entire house.
- the cleaning application client assigns a cleaning task to the cleaning robot, wherein the cleaning task is a task of assigning a part or all of the area to be cleaned to the cleaning robot based on the scope of the area to be cleaned.
- The user assigns the cleaning task of cleaning room A and room B in the area to be cleaned to the cleaning robot through the cleaning application client installed on the smart terminal, and drives the cleaning robot to clean room A and room B.
- The sweeping robot is provided with a laser sensor. While traversing the area to be cleaned, the laser sensor emits detection signals and receives the detection data reflected back from the area to be cleaned; by analyzing the detection data, a grid map is established that divides the area to be cleaned into several grids and marks whether a room area exists in each grid.
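The grid-map step can be sketched as a heavily simplified occupancy grid: each reflected laser point marks its cell as occupied. This is an illustrative reduction; a real mapping pipeline would also ray-trace free space, fuse many scans, and handle robot pose, none of which the source specifies:

```python
def build_grid_map(points, cell_size=0.05, width=100, height=100):
    """Minimal occupancy-grid sketch.

    points: (x, y) positions in metres of reflected laser detections.
    Each point marks its grid cell occupied (1); other cells stay 0.
    cell_size and the grid dimensions are assumed values.
    """
    grid = [[0] * width for _ in range(height)]
    for x, y in points:
        col = int(x / cell_size)
        row = int(y / cell_size)
        if 0 <= row < height and 0 <= col < width:
            grid[row][col] = 1
    return grid
```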
- the area to be cleaned traversed by the cleaning robot for establishing the complete grid map is the area in the entire house.
- the sweeping robot preliminarily obtains the apartment type environment information of the area to be cleaned according to the established grid map.
- the sweeping robot is further provided with a camera module.
- The camera module photographs the area to be cleaned in real time and collects image information of the area to be cleaned.
- The image information is a plurality of pictures collected by the sweeping robot through the camera module that reflect the environmental information in the area to be cleaned.
- The image information can also be the environmental information of video frames extracted from video collected by the sweeping robot while traversing the area to be cleaned.
- the photographing module is a depth camera, and the image information obtained based on the depth camera is depth image information.
- A room door recognition network is preset in the cleaning robot and includes a trained room door recognition model, which is a network model established based on a neural network algorithm and, after multiple training sessions, capable of recognizing room door information in images.
- After acquiring the image information of the area to be cleaned, the sweeping robot inputs it into the room door recognition network and uses the deep learning algorithm to call the room door recognition model to perform target detection on the image information. According to the room door features extracted by the model, it determines whether the image information collected while traversing the area to be cleaned contains a room door; if room door features are present, the model outputs the location information of the room door, so that the sweeping robot obtains the room door information in the area to be detected.
- the location information of the room door may include coordinates or other data that can reflect the location relationship.
- Step S50: determining a cleanable area in the area to be cleaned according to the room area information;
- The room door information in the area to be detected is acquired, including the location information of each room door, which is marked at the corresponding location on the grid map; the grid map is then decomposed to obtain the room corresponding to each room door, yielding a grid map that fully reflects the household-type environment information of the area to be cleaned.
- The cleaning robot can identify the room area information in the area to be cleaned according to the grid map after area decomposition and plan the path accordingly, where the room area information includes the room door information and the household-type environment information of the area to be cleaned.
- The sweeping robot obtains the decomposed grid map, reads the room area information carried in the map, explores the room areas in the decomposed area to be cleaned, traverses each room area, and determines whether each decomposed room area is accessible for cleaning.
- the sweeping robot reads the room area information of the grid map and generates an exploration path plan.
- The exploration path plan is calculated by the sweeping robot based on the room area information reflected by the grid map; it plans the robot's route over all or part of the room areas in the grid map, calls the navigation module to explore the rooms in the room area information, and records the exploration trajectory of the sweeping robot.
- the exploration trajectory is the motion trajectory of the sweeping robot during the exploration process.
- The exploration trajectory is compared with the grid map. If the trajectory passes through the area near a room door, the cleaning robot can reach that door; it is thus determined from the exploration trajectory which room doors can be reached, the connection relationship of each room in the area to be cleaned is determined, and the connectivity is stored in the specified memory. The sweeping robot then continues to analyze the exploration trajectory to determine whether it can pass through a room door and enter the corresponding room area: if the trajectory extends into the interior of the room area, the sweeping robot can enter that room area.
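The trajectory-versus-map comparison above can be sketched as two geometric tests: did any trajectory point come close to the door, and did any trajectory point fall inside the room. The reach radius and all names are assumptions for illustration:

```python
import math

def classify_room_access(trajectory, door_pos, room_cells, reach_radius=0.3):
    """Compare an exploration trajectory with the grid map (sketch).

    trajectory: list of (x, y) robot positions in metres.
    door_pos:   (x, y) of the room door from the grid map.
    room_cells: set of (x, y) positions lying inside the room area.
    The door is 'reachable' if some trajectory point lies within
    reach_radius of it; the room is 'entered' if the trajectory
    extends to any position inside the room area.
    """
    reached = any(math.dist(p, door_pos) <= reach_radius for p in trajectory)
    entered = any(p in room_cells for p in trajectory)
    return {"door_reachable": reached, "room_entered": entered}
```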
- the room area that the cleaning robot can enter is marked as a cleanable area, wherein the cleanable area is the room area in the area to be cleaned that the cleaning robot can enter and clean.
- If, while the sweeping robot is exploring the rooms, an obstacle at a room door or another factor prevents it from reaching the door, the camera module is called to take pictures, a room-door-unreachable notification is generated and sent to the sweeping control client installed on the smart terminal bound to the sweeping robot, and the user of the sweeping robot is prompted.
- If the sweeping robot determines during exploration that a room door can be reached but it cannot pass through the door into the connected room area, the door is judged impassable, and a notification that the room cannot be entered is sent to the cleaning control client installed on the smart terminal bound to the cleaning robot, prompting the user.
- Step S60: determining a target area according to the cleaning motion trajectory and controlling the cleaning robot to supplementally clean the target area, where the target area is an uncleaned area in the cleanable area.
- the cleaning robot responds to the cleaning control instruction issued by the user through the cleaning application client installed on the smart terminal, reads the cleaning control instruction, obtains the cleaning task, identifies the area to be cleaned that needs to be cleaned in the cleaning task, and calls the The navigation module plans the cleaning route according to the grid map that has divided the room area, and controls the cleaning robot to clean the area to be cleaned in the cleaning task according to the cleaning route planned by the navigation module.
- the sweeping robot records a sweeping movement trajectory when cleaning, wherein the sweeping movement trajectory is the movement trajectory of the sweeping robot when it moves to the area to be cleaned for cleaning after receiving the sweeping control instruction.
- the recorded sweeping motion trajectory is analyzed to detect whether there is a room that is missed during the sweeping process of the sweeping robot.
- By analyzing the cleaning motion trajectory, it can be determined which rooms the cleaning robot cleaned during the cleaning process, and whether the cleaned rooms match the area to be cleaned specified by the cleaning task in the cleaning control instruction.
- If the cleaning motion trajectory shows that the cleaned area matches the area to be cleaned specified in the control instruction, the cleaning robot has not missed any room in this cleaning task; it has completed the task, which is then written off in the system.
- If the cleaned area learned from the cleaning motion trajectory does not match the area to be cleaned specified in the control instruction, there is a missed room area in the area to be cleaned.
- The sweeping robot calls the camera module to photograph the door of each uncleaned room and analyzes the photos to determine whether the door is blocked. If no blockage is identified, the sweeping robot moves to the door again and judges whether it can pass through the door of the missed room. If it can enter the missed room through the door, the uncleaned area is marked as the target area and the sweeping robot is controlled to re-clean it. After the sweeping robot completes the passability detection of all missed rooms, it performs supplementary cleaning on every target area it can enter.
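The missed-sweep handling above reduces to a set comparison plus a per-room passability check; a minimal sketch, where `door_passable` stands in for the camera/sensor check the source describes and all names are illustrative:

```python
def plan_supplementary_sweep(assigned_rooms, cleaned_rooms, door_passable):
    """Split missed rooms into supplementary-sweep targets and
    unreachable rooms.

    assigned_rooms: rooms the cleaning task asked for.
    cleaned_rooms:  rooms recovered from the cleaning motion trajectory.
    door_passable:  room -> bool, the (assumed) passability check result.
    """
    missed = [room for room in assigned_rooms if room not in cleaned_rooms]
    targets = [room for room in missed if door_passable.get(room, False)]
    unreachable = [room for room in missed if not door_passable.get(room, False)]
    return targets, unreachable
```

Rooms in `targets` are supplementally cleaned; rooms in `unreachable` are reported to the client, as in the third embodiment below.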
- The image information of the area to be cleaned is obtained, the room door information in the image information is extracted, and the generated grid map is divided into areas according to the room door information, so as to obtain the room area corresponding to each room door in the grid map.
- Fig. 6 is a flowchart of the third embodiment of the sweeping control method of the present application. In this embodiment, the sweeping control method comprises the following steps:
- Step S61: responding to the cleaning control instruction, controlling the cleaning robot to clean the area to be cleaned;
- Step S62: calling the grid map to compare with the cleaning motion trajectory of the sweeping robot, and determining that a room area has been missed in this cleaning;
- Step S63: calling the navigation module for navigation, controlling the cleaning robot to move to the missed room area, and calling a sensor to detect whether the room door of the missed room area can be passed;
- Step S64: if the room door cannot be passed, marking the uncleaned room area as an unreachable area, reporting the room area information of the unreachable area to the client, and stopping the missed-sweep detection of the unreachable area.
- the cleaning robot responds to the cleaning control instruction issued by the user through the cleaning application client installed on the smart terminal, reads the cleaning control instruction, obtains the cleaning task, identifies the cleanable area that needs to be cleaned in the cleaning task, and calls the The navigation module plans the cleaning route according to the grid map that has divided the room area, and controls the cleaning robot to clean the cleanable area in the cleaning task according to the cleaning route planned by the navigation module.
- the sweeping robot records a sweeping movement trajectory when cleaning, wherein the sweeping movement trajectory is the movement trajectory of the sweeping robot when it moves to the area to be cleaned for cleaning after receiving the sweeping control instruction.
- After the sweeping robot finishes cleaning, the recorded cleaning motion trajectory is analyzed and compared with the grid map to detect whether the sweeping robot missed any rooms during cleaning. If the comparison shows that the cleaning robot did not clean one or more rooms in the cleaning task, it is determined that the cleaning robot missed cleaning in this task.
- If the sweeping robot's photo analysis shows that an obstacle is present, the sweeping robot cannot reach the room door; the uncleaned room is marked as an unreachable room, the sweeping robot sends a prompt message about the unreachable room in this cleaning task to the mobile terminal bound to it, and the missed-sweep detection for the missed room is stopped.
- The sweeping robot moves to the room door and calls the binocular sensor to judge whether the door can be passed. If the binocular sensor shows that the door is tightly closed, the door is judged impassable and the sweeping robot cannot enter the missed room to clean it; the room is marked as an inaccessible room, the sweeping robot sends a prompt message about the inaccessible room to the bound mobile terminal, and the missed-sweep detection for the missed room is stopped.
- The room doors that the cleaning robot cannot reach and/or the room areas that it cannot enter in the cleaning task are marked, and the corresponding prompt information is sent to remind the user, so that the user can check the areas the sweeping robot has not cleaned. This avoids the situation where the cleaning robot gets stuck during missed-sweep detection on encountering unreachable room doors and/or room areas it cannot enter, thereby improving the efficiency of missed-sweep detection and ensuring that the cleaning robot can effectively perform supplementary cleaning of the missed rooms that can be supplemented.
- FIG. 7 is a schematic block diagram of the sweeping control device of the present application.
- the present application also provides a sweeping control device, the sweeping control device includes:
- the room area identification module 10 is configured to obtain image information of the area to be cleaned, perform deep learning identification on the image information, and obtain room area information in the image information;
- a cleaning area determination module 20 configured to determine a cleanable area in the to-be-cleaned area according to the room area information, where the cleanable area is a passable room area in the to-be-cleaned area that the cleaning robot can reach;
- the target area supplementary cleaning module 30 is configured to determine a target area according to the cleaning motion trajectory, and control the cleaning robot to supplementally scan the target area, where the target area is an uncleaned area in the cleanable area.
- Another aspect of the present application also provides a cleaning robot, which includes a memory, a processor, and a control program of the cleaning robot stored in the memory and executable on the processor; when the processor executes the control program of the cleaning robot, the steps of the above cleaning control method are implemented.
- Another aspect of the present application further provides a computer-readable storage medium on which a control program of a cleaning robot is stored; when the control program of the cleaning robot is executed by a processor, the steps of the sweeping control method described above are implemented.
- the embodiments of the present application may be provided as methods and computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
- These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Abstract
A sweeping control method and apparatus, a sweeping robot, and a computer-readable medium. The sweeping control method includes: acquiring current image information of the sweeping robot; determining room area information according to the image information; and selecting a corresponding cleaning strategy according to the room area information.
Description
Cross-Reference to Related Applications

This application claims priority to Chinese patent application No. 202011369013.8, entitled "Control method for a sweeping robot, sweeping robot, and computer-readable storage medium", filed by Shenzhen Shanchuan Zhixing Technology Co., Ltd. on November 27, 2020;

and to Chinese patent application No. 202011374059.9, entitled "Sweeping control method, apparatus, sweeping robot, and computer-readable storage medium", filed by Shenzhen Shanchuan Zhixing Technology Co., Ltd. on November 30, 2020.

The present application relates to the technical field of sweeping robots, and in particular to a sweeping control method and apparatus, a sweeping robot, and a computer-readable medium.

With the rapid development of information technology, more and more smart home appliances have entered people's lives, such as dishwashers and sweeping robots.

In the prior art, when a sweeping robot performs a cleaning task, it is mostly controlled through human-machine interaction: the user manually adjusts the robot to clean at different intensities and must keep monitoring the robot's progress so as to re-enter cleaning-intensity control parameters after the robot finishes each spot. The control process is cumbersome.
Summary of the Application

The present application aims to solve, at least to a certain extent, one of the technical problems in the related art.

To this end, one object of the present application is to propose a sweeping control method and apparatus, a sweeping robot, and a computer-readable medium.

The sweeping control method according to the present application includes: acquiring current environmental information of the sweeping robot; determining the type of a target object according to the environmental information; and determining a corresponding cleaning mode according to the type of the target object, and controlling the sweeping robot to clean in the determined cleaning mode.

In the above sweeping control method, while the sweeping robot cleans and travels in real time, it photographs its surroundings to collect environmental information such as image information of the area to be cleaned, furniture information, and room area information. Before cleaning, the robot first determines the category of the target object contained in the environmental information, and then determines the corresponding cleaning mode according to that category, thereby ensuring that the robot cleans in a targeted and accurate manner and improving cleaning efficiency.

The sweeping control method according to the present application includes: acquiring image information of an area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information; determining, according to the room area information, a cleanable area in the area to be cleaned, the cleanable area being a passable room area in the area to be cleaned that the sweeping robot can enter; and determining a target area according to the cleaning motion trajectory and controlling the sweeping robot to perform supplementary cleaning of the target area, the target area being an uncleaned area within the cleanable area.
According to some embodiments of the present application, before the step of acquiring image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information, the method further includes: controlling the sweeping robot to traverse the area to be cleaned; invoking a laser sensor to collect detection data while the robot moves within the area to be cleaned; and building a grid map of the area to be cleaned from the detection data.

According to some embodiments of the present application, the step of acquiring image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information includes: controlling the sweeping robot to traverse the area to be cleaned; invoking a camera module to photograph the area to be cleaned and obtain its image information; and inputting the image information into a room-door recognition network and invoking a trained room-door recognition model to identify the room-door information of the area to be cleaned in the image information.

According to some embodiments of the present application, the step of building the grid map of the area to be cleaned further includes: inputting the image information into the room-door recognition network and invoking the trained room-door recognition model to identify the room-door information of the area to be cleaned in the image information; and, according to the obtained room-door information, decomposing the grid map into regions to obtain the floor-plan environment information of the room area corresponding to each room door in the area to be cleaned.

According to some embodiments of the present application, the step of determining the cleanable area in the area to be cleaned according to the room area information includes: controlling the sweeping robot to perform an initial exploration of the area to be cleaned based on the room area information; acquiring the robot's exploration trajectory and judging whether the robot can enter the room area corresponding to the room-door information; and, if the robot can enter that room area, marking the passable room area the robot can enter as a cleanable area.

According to some embodiments of the present application, the step of determining the target area according to the cleaning motion trajectory and controlling the sweeping robot to perform supplementary cleaning of the target area includes: responding to a cleaning control instruction and controlling the robot to clean the area to be cleaned; comparing the grid map with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run; invoking a navigation module to move the robot to the missed room area and invoking a sensor to detect whether the room door of the missed room area can be passed through; and, if the door can be passed through, determining the missed room area as the target area and controlling the robot to enter it for supplementary cleaning.

According to some embodiments of the present application, the step of determining the target area according to the cleaning motion trajectory and controlling the sweeping robot to perform supplementary cleaning of the target area further includes: responding to a cleaning control instruction and controlling the robot to clean the area to be cleaned; comparing the grid map with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run; invoking a navigation module to move the robot to the missed room area and invoking a sensor to detect whether the room door of the missed room area can be passed through; and, if the door cannot be passed through, marking the missed room area as an impassable area, reporting the room area information of the impassable area to a server, and stopping missed-cleaning detection for that area.
In addition, to achieve the above object, the present application further proposes a sweeping control apparatus, including: a room area identification module configured to acquire image information of the area to be cleaned, perform deep-learning recognition on the image information, and obtain room area information in the image information; a cleaning area determination module configured to determine, according to the room area information, the cleanable area in the area to be cleaned; and a target area supplementary cleaning module configured to determine a target area according to the cleaning motion trajectory and control the sweeping robot to perform supplementary cleaning of the target area, the target area being an uncleaned area within the cleanable area.

The sweeping control method proposed in the embodiments of the present application enables the sweeping robot to effectively identify room information of the area to be cleaned, detect uncleaned rooms during the cleaning process, determine whether each such room is reachable, and perform supplementary cleaning of the uncleaned rooms it can reach, thereby avoiding missed areas and improving the robot's cleaning efficiency.
The sweeping control method according to the present application includes: acquiring furniture information of the sweeping robot's current environment; determining the room type of the current environment according to the furniture information; determining a cleaning mode according to the room type; and acquiring the cleanliness of the current environment, determining the robot's cleaning parameters in the cleaning mode according to the cleanliness, and controlling the robot to clean with those cleaning parameters.

According to some embodiments of the present application, the step of determining the cleaning mode according to the room type includes: when the room type is a kitchen type, determining the current cleaning mode as a kitchen cleaning mode; or, when the room type is a bedroom type, determining the current cleaning mode as a bedroom cleaning mode; or, when the room type is a living-room type, determining the current cleaning mode as a living-room cleaning mode; or, when the room type is a bathroom type, determining the current cleaning mode as a bathroom cleaning mode.

According to some embodiments of the present application, the step of determining the robot's cleaning parameters in the cleaning mode according to the cleanliness includes: comparing the cleanliness of the current environment with a preset cleanliness for the cleaning mode; and determining the robot's cleaning parameters in the cleaning mode according to the comparison result.

According to some embodiments of the present application, the step of comparing the cleanliness of the current environment with the preset cleanliness for the cleaning mode includes: acquiring the cleanliness of different regions in the current environment, and comparing the cleanliness of each region with the preset cleanliness of the corresponding preset region in the cleaning mode.

According to some embodiments of the present application, the step of determining the robot's cleaning parameters in the cleaning mode according to the comparison result includes: adjusting the robot's cleaning parameters for each region in the cleaning mode according to that region's comparison result; and taking the adjusted cleaning parameters as the cleaning parameters of that region.

According to some embodiments of the present application, the furniture information includes furniture type information and furniture position information, and the step of acquiring furniture information of the robot's current environment includes: step S1, controlling the robot to acquire image data of the current environment; step S2, determining, with a first deep-learning-based object detection algorithm, the type information of each piece of furniture in the image data, as well as its position, width, and height in the image data; step S3, combining the pixels of each piece of furniture in the image data with data acquired by a ranging sensor to obtain the depth value of each piece of furniture; and step S4, acquiring multiple frames of image data at different times and performing steps S1 to S3 on them to obtain the furniture position information.

According to some embodiments of the present application, the step of acquiring the cleanliness of the current environment includes: extracting dirt data from the image data with a second deep-learning-based object detection algorithm; counting the value and area of the dirt data; and determining the cleanliness of the current environment from the value and the area.

According to some embodiments of the present application, after the step of controlling the robot to clean with the cleaning parameters, the method further includes: acquiring a terminal account connected to the robot; and displaying the cleaning process, the furniture information, and the room type on the display interface logged in to that terminal account.
In addition, to achieve the above object, another aspect of the present application further provides a sweeping robot, including a memory, a processor, and a control program of the sweeping robot stored in the memory and runnable on the processor; when the processor executes the control program, the sweeping control method of the above embodiments is implemented.

In the above sweeping control method of the present application, when the sweeping robot performs a cleaning task, it is controlled to collect furniture information of the current environment, determine the room type of the current environment according to the furniture information, determine the cleaning mode according to the room type, acquire the cleanliness of the current environment, determine the robot's cleaning parameters in that mode according to the cleanliness, and clean the current cleaning area with those parameters. The robot thus determines a cleaning mode matching each room type and further determines the cleaning parameters within that mode according to the cleanliness of the cleaning area, completing the cleaning of the area with the determined parameters. When cleaning rooms of different types, the robot automatically switches to the cleaning mode corresponding to the room type of the cleaning area, so that different room types receive different cleaning modes, achieving refined cleaning without the user having to switch modes manually and improving the robot's intelligence.

In addition, to achieve the above object, another aspect of the present application further provides a computer-readable storage medium on which a control program of a sweeping robot is stored; when the control program is executed by a processor, the sweeping control method of the above embodiments is implemented.

Additional aspects and advantages of the present application will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present application.
FIG. 1 is a schematic flowchart of a sweeping control method according to an embodiment of the present application;

FIG. 2 is a schematic structural diagram of a sweeping robot in the hardware operating environment involved in the embodiments of the present application;

FIG. 3 is a schematic flowchart of a first embodiment of the sweeping control method of the present application;

FIG. 4 is a schematic flowchart of another embodiment of the sweeping control method of the present application;

FIG. 5 is a schematic flowchart of a second embodiment of the present application;

FIG. 6 is a schematic flowchart of a third embodiment of the present application;

FIG. 7 is a schematic module diagram of a sweeping control apparatus in the hardware operating environment involved in the embodiments of the present application.
It should be understood that the specific embodiments described here are only intended to explain the present application and are not intended to limit it.

The present application discloses a sweeping control method. As shown in FIG. 1, the sweeping control method includes:

Acquiring current environmental information of the sweeping robot. While moving, the robot can use collection devices mounted on it, such as a camera and distance sensors, to collect information about the surrounding environment; the environmental information may be image information of the area to be cleaned, furniture information, room area information, and so on.

Determining the type of a target object according to the environmental information. The target object can be understood as the area the robot is to clean, a specific room location, a specific travel route, and so on.

Determining the corresponding cleaning mode according to the type of the target object, and controlling the robot to clean in the determined cleaning mode. Different target objects require different cleaning modes and are in different states when awaiting cleaning, so the target object must be analyzed specifically; the robot cleans in the cleaning mode that matches the target object's type.
As shown in FIG. 2, FIG. 2 is a schematic structural diagram of a sweeping robot in the hardware operating environment involved in the embodiments of the present application.

As shown in FIG. 2, the sweeping robot may include a processor 1001 such as a CPU, a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to implement connection and communication among these components. The user interface 1003 may include a display and an input unit such as a keyboard, and optionally may also include standard wired and wireless interfaces. According to some embodiments of the present application, the network interface 1004 may include standard wired and wireless interfaces (such as a Wi-Fi interface). The memory 1005 may be high-speed RAM or stable non-volatile memory such as disk storage. According to some embodiments of the present application, the memory 1005 may also be a storage device independent of the aforementioned processor 1001.

According to some embodiments of the present application, the sweeping robot may further include a camera, ranging sensors (lidar, a TOF sensor, a binocular-vision ranging sensor, and the like), an RF (Radio Frequency) circuit sensor, a remote control, an audio circuit, a Wi-Fi module, detectors, and so on. Of course, the robot may also be equipped with other sensors such as a gyroscope, a barometer, a hygrometer, and a temperature sensor, which will not be repeated here.

Those skilled in the art will understand that the robot structure shown in FIG. 2 does not limit the sweeping robot device, which may include more or fewer components than shown, combine certain components, or arrange the components differently.

As shown in FIG. 2, the memory 1005, as a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a sweeping control program.
The following describes the sweeping control method of the present application by taking furniture as the target object, with the environmental information including furniture information corresponding to the furniture.

The main solution of this embodiment is: when the sweeping robot performs a cleaning task, acquiring furniture information of the robot's current environment; determining the room type of the current environment according to the furniture information and determining the cleaning mode according to the room type; and acquiring the cleanliness of the current environment, determining the robot's cleaning parameters in the cleaning mode according to the cleanliness, and controlling the robot to clean with those parameters.

In the prior art, when performing a cleaning task, a sweeping robot cleans the areas requiring cleaning uniformly with one set of parameters, or must retrieve parameters preset by the user when cleaning different rooms. In actual use, when the robot cleans different regions in this uniform manner, either the cleaning intensity is insufficient and the rooms are left unclean, or over-cleaning wastes large amounts of energy.

In the sweeping robot shown in FIG. 2, the network interface 1004 is mainly used to connect to a backend server for data communication; the user interface 1003 is mainly used to connect to a client (user side) for data communication; and the processor 1001 may be used to invoke the sweeping control program in the memory 1005 and perform the following operations:

Acquiring furniture information of the sweeping robot's current environment;

Determining the room type of the current environment according to the furniture information, and determining the cleaning mode according to the room type;

Acquiring the cleanliness of the current environment, determining the robot's cleaning parameters in the cleaning mode according to the cleanliness, and controlling the robot to clean with those parameters.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of the first embodiment of the sweeping control method of the present application.

The embodiments of the present application provide embodiments of the sweeping control method. It should be noted that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here.

The sweeping control method includes:

Step S10: acquiring furniture information of the sweeping robot's current environment;

The furniture information includes information such as the type of the furniture and its position.

The step of acquiring furniture information of the sweeping robot's current environment includes:

Step S1: controlling the robot to acquire image data of the current environment;

Step S2: determining, with a first deep-learning-based object detection algorithm, the type information of each piece of furniture in the image data, as well as its position, width, and height in the image data;

Step S3: combining the pixels of each piece of furniture in the image data with data acquired by a ranging sensor to obtain the depth value of each piece of furniture in the image data;

Step S4: acquiring multiple frames of image data at different times and performing steps S1 to S3 to obtain the position information of the furniture.

The robot may acquire image data of the current environment through a monocular camera (for example, a monocular RGB camera), and then determine the types of furniture contained in the image data through the first deep-learning-based object detection algorithm. The first algorithm extracts furniture information from image data using a target model trained on a large body of image data. Bounding boxes are used to extract the furniture images present in the image data and thereby determine the position of each piece of furniture in the image data, as well as its width and height. The robot combines the pixels of each piece of furniture with data from the ranging sensor to obtain each piece's depth value, from which its distance from the robot is determined (the larger the depth value, the greater the distance); analyzing multiple frames of image data thus yields the distance between the robot and the furniture in the images. It can be understood that, while performing a cleaning task, the robot determines its own position through SLAM (simultaneous localization and mapping); once the distance between the robot and the furniture is determined, the position in the physical world of the furniture contained in the image data acquired during cleaning can be determined.

In the above embodiment, the position information of furniture in image data acquired during cleaning can be determined with just a monocular camera combined with a ranging sensor, increasing the variety of ways in which the robot can determine furniture positions.
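The back-projection from a detected bounding box plus a depth reading to a world position, as described above, can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the pinhole intrinsics (`fx`, `fy`, `cx`, `cy`) and the `robot_pose` supplied by SLAM are assumed inputs.

```python
import math

def furniture_world_position(bbox_center_px, depth_m, robot_pose,
                             fx=600.0, cx=320.0):
    """Back-project a furniture bounding-box center (pixels) plus a depth
    reading (meters) into camera-frame coordinates, then transform into the
    world frame using the robot's SLAM pose (x, y, heading in radians)."""
    u, _v = bbox_center_px
    # Pinhole back-projection: camera-frame x is lateral, z is forward.
    x_cam = (u - cx) * depth_m / fx
    z_cam = depth_m
    # Rotate by the robot's heading and translate by its position.
    rx, ry, theta = robot_pose
    wx = rx + z_cam * math.cos(theta) - x_cam * math.sin(theta)
    wy = ry + z_cam * math.sin(theta) + x_cam * math.cos(theta)
    return (wx, wy)
```

For example, with the robot at the origin facing the +x axis, a detection centered in the image at 2 m depth maps to roughly (2, 0) in the world frame.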
Step S20: determining the room type of the current environment according to the furniture information, and determining the cleaning mode according to the room type;

In the present application, the room areas the robot needs to clean are divided into different room types according to user activities, including a kitchen type, a bedroom type, a living-room type, and a bathroom type.

The step of determining the room type of the current environment according to the furniture information includes:

Step S23: determining the type of the furniture from the collected furniture information through a deep-learning model;

Step S24: determining the room type of the current environment from the furniture type.

In this embodiment, after the furniture information present in the robot's current cleaning area is recognized, the furniture type is further determined through a deep-learning-based object detection algorithm. For example, the acquired image data is input into an object detection algorithm that includes bed recognition; if the type of furniture contained in the image information currently acquired by the robot is recognized as a bed, it can be determined that the robot's current environment is a bedroom.

According to some embodiments of the present application, before cleaning, the robot marks the type of each room in the area requiring cleaning and stores a mapping between the marked room types and the corresponding furniture information, so that when the robot acquires the furniture information of the cleaning area through its camera device, it determines the room type of the current cleaning area from the collected furniture information, and hence the cleaning mode for that area.

The cleaning mode corresponds to the room type and includes a kitchen cleaning mode, a bedroom cleaning mode, a living-room cleaning mode, and a bathroom cleaning mode.

According to some embodiments of the present application, the different room types are stored in correspondence with the cleaning modes the robot should select, so that when the robot recognizes a different room type it can select the corresponding cleaning mode to perform the cleaning task. In the present application, the robot uses different cleaning parameters in different cleaning modes. For example, in the kitchen cleaning mode, because large grease stains are present in kitchens, the robot is controlled to start its roller brush when cleaning the kitchen area; in the bedroom cleaning mode, because bedrooms contain large amounts of bedding, which is prone to mites, and users spend long rest periods in the bedroom each day, the robot must, in addition to cleaning the room, also promptly remove mites and purify the air.
According to some embodiments of the present application, when the robot recognizes that the furniture information of a room includes a sweeping robot and a sofa, it can confirm that the room type of its current cleaning area is the living-room type, determine the cleaning mode as the living-room cleaning mode, and control the robot to perform the cleaning task with the cleaning parameters of that mode.

Step S30: acquiring the cleanliness of the current environment, determining the robot's cleaning parameters in the cleaning mode according to the cleanliness, and controlling the robot to clean with those parameters.

The cleaning parameters are parameters the robot determines from the cleanliness of the current cleaning area. For example, when a large area of water stains is found in the current environment, the robot activates the mopping function of its large mop to clean up the water stains in the current cleaning area; when the water-stained area is small, the mopping function of the small mop can be activated instead.

After the robot determines the cleaning mode from the room type, it further acquires the cleanliness of the current room, determines the cleaning parameters in that mode in real time from the room's cleanliness, and then cleans the current area with the determined parameters.

According to some embodiments of the present application, it is preset in this embodiment that after the robot determines the room type of the current cleaning area, it determines the corresponding cleaning mode from the room type, further acquires the cleanliness of the current environment, and determines the cleaning parameters in that mode from the cleanliness. As for how the cleaning parameters are specifically determined, the standard cleaning parameters of the cleaning mode corresponding to each room type may be stored in advance; during a cleaning task, the robot determines the cleaning mode of the cleaning area from the currently acquired furniture information, and then determines the cleaning parameters in that mode from the cleanliness of the cleaning area. It can be understood that when the cleanliness of the cleaning area matches the preset cleanliness for the mode, the robot is controlled to clean with the mode's standard cleaning parameters.
The step of acquiring the cleanliness of the current environment includes:

Step S31: extracting dirt data from the image data with a second deep-learning-based object detection algorithm;

Step S32: counting the value and/or area of the dirt data;

Step S33: determining the cleanliness of the current environment from the value and/or the area.

The second deep-learning-based object detection algorithm performs deep learning on a large body of image data to obtain the dirt data present in image data. The dirt data includes data such as water stains, dust, grime, and trash.

In the present application, the robot extracts dirt data from the acquired image data with the second deep-learning-based object detection algorithm, then counts the value and area of the dirt data, and determines the cleanliness of the current environment from the value and the area.
According to some embodiments of the present application, the value of the acquired dirt data may be compared with a preset value to determine the cleanliness of the current environment. For example, a mapping table may establish the relationship between the cleanliness of the current environment and the preset value: a dirt value less than or equal to 4 corresponds to "clean", while a value greater than 4 corresponds to "dirty". Thus, when the dirt value is 5, which exceeds the preset value of 4, the current environment is determined to be dirty.

According to some embodiments of the present application, the cleanliness of the current environment may also be determined from the area of the dirt data: the area of each dirt spot in the dirt data is counted and accumulated, and when the total exceeds the preset area value for the region, the current environment is confirmed as dirty.

According to some embodiments of the present application, the cleanliness of the current environment may take both the value and the area of the dirt data into account: when the dirt value exceeds the preset value and the accumulated dirt area exceeds the region's preset area value, the current environment is confirmed as dirty. (For example, for a 20 m² bedroom, if the accumulated dirt area of 7 m² exceeds the preset area value of 5 m², the bedroom is confirmed as dirty.)
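The combined value-and-area embodiment above can be sketched as a small classifier. This is an illustrative sketch only; the thresholds come from the examples in the text (preset count 4, preset area 5 m²), and the function name is hypothetical.

```python
def assess_cleanliness(dirt_spots, preset_count=4, preset_area_m2=5.0):
    """Classify an area as 'clean' or 'dirty' from detected dirt spots.

    `dirt_spots` is a list of per-spot areas in m^2 (e.g. from the second
    object-detection pass). Following the combined embodiment, the area is
    'dirty' when the spot count exceeds the preset count AND the accumulated
    dirt area exceeds the preset area value."""
    count = len(dirt_spots)
    total_area = sum(dirt_spots)
    if count > preset_count and total_area > preset_area_m2:
        return "dirty"
    return "clean"
```

With the bedroom example from the text, five spots totalling 7 m² exceed both thresholds and yield "dirty".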
In the present application, when the sweeping robot performs a cleaning task, it is controlled to collect furniture information of the current environment, determine the room type of the current environment according to the furniture information, determine the cleaning mode according to the room type, acquire the cleanliness of the current environment, determine the robot's cleaning parameters in that mode according to the cleanliness, and clean the current cleaning area with those parameters. The robot determines a cleaning mode matching each room type and further determines the cleaning parameters within that mode from the cleanliness of the cleaning area, completing the cleaning of the area with the determined parameters. When cleaning rooms of different types, the robot automatically switches to the cleaning mode corresponding to the room type of the cleaning area, so that different room types receive different cleaning modes, achieving refined cleaning without the user having to switch modes manually and improving the robot's intelligence.

According to some embodiments of the present application, the step of determining the cleaning mode according to the room type includes: step S21, when the room type is a kitchen type, determining the current cleaning mode as a kitchen cleaning mode; or, step S22, when the room type is a bedroom type, determining the current cleaning mode as a bedroom cleaning mode; or, step S23, when the room type is a living-room type, determining the current cleaning mode as a living-room cleaning mode; or, step S24, when the room type is a bathroom type, determining the current cleaning mode as a bathroom cleaning mode.

In the present application, the kitchen cleaning mode is the mode determined when the robot identifies the cleaning area as a kitchen. According to some embodiments, this determination can be made when the monocular camera and ranging sensor mounted in the robot's direction of travel capture a cabinet and a kitchen waste bin in the current cleaning area; the room type of the area the robot is currently cleaning is then the kitchen type and the current cleaning mode is the kitchen cleaning mode. The function modules the robot runs in the kitchen cleaning mode are: wet mopping, adding a degreasing cleaning agent to the water tank, sweeping, and so on.

The bedroom cleaning mode is the mode determined when the robot identifies the cleaning area as a bedroom. According to some embodiments, this determination can be made when the camera mounted in the robot's direction of travel captures a bed and a wardrobe in the current cleaning area; the room type is then the bedroom type and the current cleaning mode is the bedroom cleaning mode. The function modules the robot runs in the bedroom cleaning mode are: dry mopping, adding a mite-removing air freshener to the water tank, sweeping, and so on.

The living-room cleaning mode is the mode determined when the robot identifies the cleaning area as a living room. According to some embodiments, this determination can be made when the camera mounted in the robot's direction of travel captures a sofa, a sweeping robot, and a shoe cabinet in the current cleaning area; the room type is then the living-room type and the current cleaning mode is the living-room cleaning mode. The function modules the robot runs in the living-room cleaning mode are: sweeping, vacuuming, wet mopping, and so on.

The bathroom cleaning mode is the mode determined when the robot identifies the cleaning area as a bathroom. According to some embodiments, this determination can be made when the camera mounted in the robot's direction of travel captures furniture such as a toilet and a washbasin in the current cleaning area; the room type is then the bathroom type and the current cleaning mode is the bathroom cleaning mode. The function modules the robot runs in the bathroom cleaning mode are: sweeping, dry mopping, and so on.

In this embodiment, the robot can determine the room type of the cleaning area from the furniture it captures and then clean with different cleaning parameters, realizing different parameter settings for different room types during cleaning and improving both cleaning efficiency and the robot's intelligence.
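The furniture-to-room-type inference described in the paragraphs above can be sketched as a signature lookup. The furniture labels and room signatures mirror the examples in the text (cabinet/waste bin for kitchen, bed/wardrobe for bedroom, and so on); the label strings and function name are illustrative assumptions, not a defined API.

```python
# Signature furniture per room type, mirroring the examples in the text.
ROOM_SIGNATURES = {
    "kitchen":     {"cabinet", "kitchen_waste_bin"},
    "bedroom":     {"bed", "wardrobe"},
    "living_room": {"sofa", "sweeping_robot", "shoe_cabinet"},
    "bathroom":    {"toilet", "washbasin"},
}

def infer_room_type(detected_furniture):
    """Return the room type whose signature furniture overlaps most with the
    detected furniture labels, or None when nothing matches."""
    detected = set(detected_furniture)
    best, best_hits = None, 0
    for room, signature in ROOM_SIGNATURES.items():
        hits = len(detected & signature)
        if hits > best_hits:
            best, best_hits = room, hits
    return best
```

The cleaning mode would then be looked up from the returned room type, e.g. "bedroom" selecting the bedroom cleaning mode.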
Referring to FIG. 4, FIG. 4 shows yet another embodiment of the present application. The step of determining the robot's cleaning parameters in the cleaning mode according to the cleanliness includes:

Step S31: comparing the cleanliness of the current environment with the preset cleanliness in the cleaning mode;

Step S32: determining the robot's cleaning parameters in the cleaning mode according to the comparison result.

After the robot's current cleaning mode is determined, the cleaning parameters corresponding to each mode are further determined from the cleanliness of the current cleaning area by comparing the current cleanliness with the preset cleanliness. When the current environment is cleaner than the preset cleanliness, the working parameters of the robot's function modules are lowered, for example by reducing the fan's suction power; when the current environment is dirtier than the preset cleanliness, the working parameters are raised, for example by increasing the fan's suction power, so as to clean the area.

The step of comparing the cleanliness of the current environment with the preset cleanliness in the cleaning mode includes:

Step S311: acquiring the cleanliness of different regions in the current environment, and comparing the cleanliness of each region with the preset cleanliness of the corresponding preset region in the cleaning mode.

In this embodiment, the cleaning area in the current environment is divided into different regions; when the robot determines the cleanliness of the current environment, it compares the cleanliness of each divided region with the preset cleanliness of the corresponding preset region in the cleaning mode. For example, a bedroom-type cleaning area is divided into 8 regions; the robot compares the cleanliness of region 1 with the cleanliness corresponding to preset region 1, the cleanliness of region 2 with that of preset region 2, and so on, thereby obtaining the overall cleanliness of the current cleaning area and improving the accuracy with which the robot determines it.
The step of determining the robot's cleaning parameters in the cleaning mode according to the comparison result includes:

Step S321: adjusting the robot's cleaning parameters for each region in the cleaning mode according to that region's comparison result;

Step S322: taking the adjusted cleaning parameters as the cleaning parameters of each region.

In the present application, when the robot acquires the cleanliness of each region in the current environment, it compares each region's cleanliness with that region's preset cleanliness to determine the cleaning parameters in the cleaning mode. For example, the robot divides the living room into 12 regions, of which region 1 is the entryway; when comparing region 1's cleanliness with preset region 1's, it determines that the dust concentration in region 1 is higher than that of preset region 1, and therefore increases the cleaning parameters corresponding to preset region 1 so as to clean region 1 thoroughly.

In this embodiment, the robot can adjust the cleaning parameters of the cleaning mode according to the cleanliness of each divided cleaning region, increasing cleaning intensity in dirtier regions and reducing it in cleaner ones. It can intelligently adjust the cleaning parameters to the cleanliness of the cleaning area, avoiding the problems that arise when uniform parameters are used in one mode: insufficient intensity leaving dirt in the cleaning area, or excessive intensity wasting energy.
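The per-region comparison and adjustment can be sketched as follows. This is a minimal sketch under assumed inputs: the dirt scores, the 20% adjustment step, and the clamping bounds are illustrative, not values specified by the application.

```python
def adjust_region_parameters(region_dirt, preset_dirt, base_suction,
                             step=0.2, min_factor=0.4, max_factor=2.0):
    """Scale each region's suction parameter up when the region is dirtier
    than its preset cleanliness and down when it is cleaner, clamping the
    scale factor to [min_factor, max_factor].

    `region_dirt` and `preset_dirt` map region id -> a dirt score."""
    params = {}
    for region, dirt in region_dirt.items():
        preset = preset_dirt.get(region, dirt)
        if dirt > preset:            # dirtier than preset: raise intensity
            factor = min(max_factor, 1.0 + step)
        elif dirt < preset:          # cleaner than preset: lower intensity
            factor = max(min_factor, 1.0 - step)
        else:                        # matches preset: standard parameters
            factor = 1.0
        params[region] = base_suction * factor
    return params
```

A region matching its preset keeps the mode's standard parameter, as described in the text.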
According to some embodiments of the present application, while cleaning, the robot may also acquire user information for the cleaning area; when it recognizes that the user is asleep, it lowers its cleaning parameters to prevent the noise of its operation from disturbing the user's rest.

Based on the previous embodiment, the present application proposes yet another embodiment. After the step of controlling the robot to clean with the cleaning parameters, the method further includes: step S40, acquiring a terminal account connected to the robot; step S50, displaying the cleaning process, the furniture information, and the room type on the display interface logged in to that terminal account.

In the present application, the robot displays its cleaning process, the recognized furniture information, and the room type in the terminal app with which a connection has been established. The display order may be: first the furniture information recognized during cleaning, then the determined room type, and then the cleaning process. Displaying the cleaning process, furniture information, and room type on the terminal lets the user know the robot's cleaning status and provides a basis for resetting the robot's cleaning parameters.
The following describes the sweeping control method of another embodiment of the present application by taking a room door as the target object, with the environmental information including room area information.

In the sweeping robot shown in FIG. 2, the network interface 1004 is mainly used to connect to a backend server for data communication; the user interface 1003 is mainly used to connect to a client installed on the user's smart terminal for data communication; and the processor 1001 may be used to invoke the sweeping control program stored in the memory 1005 and perform the following operations:

Acquiring image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information;

Determining, according to the room area information, a cleanable area in the area to be cleaned, the cleanable area being a passable room area in the area to be cleaned that the sweeping robot can enter;

Determining a target area according to the cleaning motion trajectory and controlling the sweeping robot to perform supplementary cleaning of the target area, the target area being an uncleaned area within the cleanable area.
According to some embodiments of the present application, the processor 1001 may also invoke the sweeping control program stored in the memory 1005 to perform the following operations:

Controlling the sweeping robot to traverse the area to be cleaned;

Invoking a laser sensor to collect detection data while the robot moves within the area to be cleaned;

Building a grid map of the area to be cleaned from the detection data.

According to some embodiments of the present application, the processor 1001 may also invoke the sweeping control program stored in the memory 1005 to perform the following operations:

Controlling the sweeping robot to traverse the area to be cleaned;

Invoking a camera module to photograph the area to be cleaned and obtain its image information;

Inputting the image information into a room-door recognition network and invoking a trained room-door recognition model to identify the room-door information of the area to be cleaned in the image information.

According to some embodiments of the present application, the processor 1001 may also invoke the sweeping control program stored in the memory 1005 to perform the following operations:

Inputting the image information into a room-door recognition network and invoking a trained room-door recognition model to identify the room-door information of the area to be cleaned in the image information, the room-door information being used to indicate the room area information;

According to the obtained room-door information, decomposing the grid map into regions to obtain the floor-plan environment information of the room area corresponding to each room door in the area to be cleaned.

According to some embodiments of the present application, the processor 1001 may also invoke the sweeping control program stored in the memory 1005 to perform the following operations:

Controlling the sweeping robot to perform an initial exploration of the area to be cleaned based on the room area information;

Acquiring the robot's exploration trajectory and judging whether the robot can enter the room area corresponding to the room-door information;

If the robot can enter the room area corresponding to the room-door information, marking the passable room area the robot can enter as a cleanable area, the cleanable area serving as the type of the target object.

According to some embodiments of the present application, the processor 1001 may also invoke the sweeping control program stored in the memory 1005 to perform the following operations:

Responding to a cleaning control instruction and controlling the robot to clean the cleanable area;

Comparing the grid map with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run;

Invoking a navigation module to navigate, controlling the robot to move to the missed room area, and invoking a sensor to detect whether the room door of the missed room area can be passed through;

If the door can be passed through, determining the missed room area as the target area and controlling the robot to enter the target area for supplementary cleaning.

According to some embodiments of the present application, the processor 1001 may also invoke the sweeping control program stored in the memory 1005 to perform the following operations:

Responding to a cleaning control instruction and controlling the robot to clean the area to be cleaned;

Comparing the grid map with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run;

Invoking a navigation module to navigate, controlling the robot to move to the missed room area, and invoking a sensor to detect whether the room door of the missed room area can be passed through;

If the door cannot be passed through, marking the missed room area as an impassable area, reporting the room area information of the impassable area to a server, and stopping missed-cleaning detection for that area, the impassable area serving as the type of the target object.
Referring to FIG. 5, FIG. 5 is a flowchart of the second embodiment of the sweeping control method of the present application.

In this embodiment, the sweeping control method includes the following steps:

Step S40: acquiring image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information;

In this embodiment, after the sweeping robot powers on and starts, it is controlled to perform a first exploration of the area to be cleaned, traversing it. The area to be cleaned may be the range of room areas into which the robot's working environment is divided. According to some embodiments of the present application, the area to be cleaned may be the whole area within the house or only part of it. In one specific embodiment, the area to be cleaned is the whole area within the house; the user issues a cleaning task to the robot through a sweeping application client installed on a smart terminal, the task being, based on the extent of the area to be cleaned, to clean part or all of that area. In another specific embodiment, the user issues through the client a task to clean rooms A and B within the area to be cleaned, driving the robot to clean rooms A and B.

According to some embodiments of the present application, the robot is provided with a laser sensor. While the robot traverses the area to be cleaned, the laser sensor emits detection signals and receives detection data reflected back from within the area; the detection data is parsed and a grid map is built. The grid map divides the area to be cleaned into a number of grid cells and marks whether each cell contains a room area. According to some embodiments, to ensure the completeness of the grid map, the area the robot traverses to build a complete grid map is the whole area within the house. From the completed grid map, the robot initially obtains the floor-plan environment information of the area to be cleaned.
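The grid map built from laser detection data can be sketched as a toy occupancy grid. This sketch only marks obstacle cells from hit points; a real implementation would also ray-trace free cells and fuse many scans, and the cell size and map dimensions here are illustrative assumptions.

```python
def build_grid_map(laser_hits, cell_size=0.05, width=100, height=100):
    """Build a toy occupancy grid from laser hit points given in meters
    relative to the map origin: 1 marks an obstacle cell, 0 free space.
    Hits falling outside the map bounds are ignored."""
    grid = [[0] * width for _ in range(height)]
    for x, y in laser_hits:
        col = int(x / cell_size)
        row = int(y / cell_size)
        if 0 <= row < height and 0 <= col < width:
            grid[row][col] = 1
    return grid
```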
According to some embodiments of the present application, the robot is also provided with a camera module. While the robot traverses the area to be cleaned, the camera module photographs it in real time, collecting image information of the area, where the image information consists of multiple pictures collected by the camera module that reflect the environmental information within the area to be cleaned. According to some embodiments, the image information may also be image frames extracted from a video of the area's environmental information recorded while the robot traverses the area. According to some embodiments, in one specific embodiment, the camera module is a depth camera, and the image information acquired by it is depth image information.

According to some embodiments of the present application, the robot is preconfigured with a room-door recognition network, which includes a trained room-door recognition model; the model is a network model built on a neural-network algorithm that, after repeated training, can recognize room-door information in images. After acquiring the image information of the area to be cleaned, the robot inputs the image information into the room-door recognition network, uses a deep-learning algorithm to invoke the model and perform object detection on the image information, extracts room-door features from the image information according to the model, and judges whether a room door is present in the image information collected while traversing the area. If room-door features are present in the image information, the model outputs the door's position information, so that the robot can obtain the room-door information of the area under detection. The door position information may include coordinates or other data capable of reflecting positional relationships.
Step S50: determining, according to the room area information, the cleanable area in the area to be cleaned;

In this embodiment, after recognition by the trained room-door recognition model yields the room-door information of the area under detection, the obtained room-door information including the doors' position information, the door information is marked at the corresponding positions on the grid map and the map is decomposed into regions, obtaining the room corresponding to each door and a grid map that fully reflects the floor-plan environment information of the area to be cleaned. From the region-decomposed grid map, the robot can recognize the room area information within the area to be cleaned and plan paths accordingly; the room area information includes the room-door information and the floor-plan environment information of the area to be cleaned.

The robot takes the region-decomposed grid map and the room area information it carries, explores the decomposed room areas within the area to be cleaned, traverses each room area, and judges whether each decomposed room area can be entered for cleaning. The robot reads the room area information of the grid map and generates an exploration path plan, where the exploration path plan is a robot travel-route plan, computed from the room area information reflected by the grid map, that can explore all or some of the room areas in the grid map; it invokes the navigation module to explore the rooms in the room area information and records its exploration trajectory, i.e. the robot's motion trajectory during exploration. The cleanable areas in the area to be cleaned are judged from the exploration trajectory. According to some embodiments, the trajectory is compared with the grid map to identify the robot's trajectory in the region near each room door; if the trajectory shows the robot can reach a door, the door is determined to be reachable from that trajectory, and the connectivity of the rooms in the area to be cleaned is determined and stored in a designated memory. The robot is controlled to continue analyzing the trajectory and judge whether it can pass through the door into the room area corresponding to that door; if the trajectory extends into the room's interior, the room area can be entered by the robot, and it is marked as a cleanable area, where the cleanable area is a room area in the area to be cleaned that the robot can enter and clean.
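The trajectory-based marking of cleanable rooms can be sketched as a point-in-region test. This is a simplified illustration: rooms are represented by axis-aligned bounding boxes, which is an assumption; the application describes room regions decomposed on the grid map.

```python
def mark_cleanable_rooms(rooms, trajectory):
    """Mark each room as cleanable when the exploration trajectory contains
    at least one point inside the room's bounding box (x0, y0, x1, y1).

    `rooms` maps room id -> bounding box; returns the set of cleanable ids."""
    cleanable = set()
    for room_id, (x0, y0, x1, y1) in rooms.items():
        for (px, py) in trajectory:
            if x0 <= px <= x1 and y0 <= py <= y1:
                cleanable.add(room_id)   # trajectory entered the room
                break
    return cleanable
```

A room whose interior the trajectory never reaches is left unmarked, matching the "cannot enter" branch described below.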
According to some embodiments of the present application, if during room exploration an obstacle or other factor at a room door prevents the robot from reaching it, the camera module is invoked to take a photo and a notification that the door is unreachable is generated and sent to the sweeping control client bound to the robot and installed on the smart terminal, prompting the user of the robot.

According to some embodiments of the present application, if during room exploration the robot determines that a room door is reachable but the robot cannot pass through it into the room area connected to that door, the door is judged impassable, and a notification that the room area cannot be entered is generated and sent to the sweeping control client bound to the robot and installed on the smart terminal, prompting the user of the robot.
Step S60: determining a target area according to the cleaning motion trajectory and controlling the robot to perform supplementary cleaning of the target area, the target area being an uncleaned area within the cleanable area.

In this embodiment, the robot responds to a cleaning control instruction issued by the user through the sweeping application client installed on the smart terminal, reads the instruction, obtains the cleaning task, identifies the areas the task requires cleaning, invokes the navigation module to plan a cleaning route according to the grid map whose room areas have already been divided, and controls the robot to clean the areas in the task along the planned route.

According to some embodiments of the present application, while cleaning, the robot records its cleaning motion trajectory, i.e. its motion from receiving the cleaning control instruction through moving to and cleaning the area. After cleaning finishes, the recorded trajectory is analyzed to detect whether any room was missed during cleaning. According to some embodiments, analyzing the trajectory yields which rooms the robot cleaned; the cleaned rooms are matched against the areas prescribed by the cleaning task in the control instruction. According to some embodiments, if the trajectory shows the areas the robot has cleaned match the areas prescribed in the control instruction, no room was missed in this cleaning task; the robot completes the task and writes it off in the system.

According to some embodiments of the present application, if, when analyzing the robot's cleaning motion trajectory, the areas cleaned according to the trajectory do not match the areas prescribed in the control instruction, i.e. the area to be cleaned contains areas the robot did not clean, the robot is judged to have missed areas in this cleaning task. The grid map with divided room areas is invoked to determine the positions of the missed rooms, the robot is navigated close to the missed rooms and moved to each room's door, and it re-checks whether it can open the door or enter the missed room.

According to some embodiments of the present application, the robot invokes the camera module to photograph the missed room's door and recognizes the photo to judge whether the door is blocked. If the recognition result shows no blockage, the robot moves back to the door and judges whether it can pass through the missed room's door. If the robot can enter the missed room through the door, the uncleaned area is marked as a target area, and the robot is controlled to re-clean the target area. After the robot completes passability checks on all missed rooms, it performs supplementary cleaning of all target areas it can enter.
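The missed-room detection, comparing the cleaning trajectory against the rooms prescribed by the task, can be sketched as a set comparison over grid cells. This is an illustrative sketch under assumed data structures, not the application's implementation.

```python
def find_missed_rooms(task_rooms, cleaned_cells, room_cells):
    """Return the rooms in the cleaning task whose grid cells were never
    visited by the cleaning motion trajectory.

    `room_cells` maps room id -> set of grid cells belonging to the room;
    `cleaned_cells` is the set of cells covered by the trajectory."""
    missed = []
    for room in task_rooms:
        # A room with no overlap between its cells and the cleaned cells
        # was never entered during this cleaning run.
        if not (room_cells[room] & cleaned_cells):
            missed.append(room)
    return missed
```

Each returned room would then be re-checked for door passability before supplementary cleaning, as described above.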
In this embodiment, by acquiring image information of the area to be cleaned, extracting the room-door information from the image information, dividing the generated grid map into regions according to that door information to obtain each room area corresponding to a room door, planning the cleaning route on the grid map with divided room areas, and, after cleaning, comparing the robot's cleaning motion trajectory with the areas prescribed by the cleaning control instruction, it is judged whether the robot missed any rooms. If a missed room exists, the robot is controlled to re-check whether the missed room is reachable, and, if it is, to re-clean it. This ensures the robot cleans the rooms in the cleaning task effectively, avoids missed rooms, and improves the robot's cleaning efficiency and cleaning effect.
Referring to FIG. 6, FIG. 6 is a flowchart of the third embodiment of the sweeping control method of the present application. In this embodiment, the sweeping control method includes the following steps:

Step S61: responding to a cleaning control instruction and controlling the robot to clean the area to be cleaned;

Step S62: comparing the grid map with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run;

Step S63: invoking a navigation module to navigate, controlling the robot to move to the missed room area, and invoking a sensor to detect whether the room door of the missed room area can be passed through;

Step S64: if the door cannot be passed through, marking the missed room area as an impassable area, reporting the room area information of the impassable area to the client, and stopping missed-cleaning detection for that area.

In this embodiment, the robot responds to a cleaning control instruction issued by the user through the sweeping application client installed on the smart terminal, reads the instruction, obtains the cleaning task, identifies the cleanable areas the task requires cleaning, invokes the navigation module to plan a cleaning route according to the grid map with divided room areas, and controls the robot to clean the cleanable areas in the task along the planned route.

According to some embodiments of the present application, while cleaning, the robot records its cleaning motion trajectory, i.e. its motion from receiving the cleaning control instruction through moving to and cleaning the area. After cleaning finishes, the recorded trajectory is analyzed and compared with the grid map to detect whether the robot missed any rooms during cleaning. If the comparison shows the robot failed to clean one or more rooms in the cleaning task, it is determined that the robot missed rooms in this task. The positions of the missed rooms are determined on the grid map, the robot is navigated to each missed room's door area, the camera module is invoked to photograph it, and obstacle recognition is performed on the photo to judge whether obstacles on the route to the door area prevent the robot from reaching the door.

According to some embodiments of the present application, if the photo recognition finds an obstacle and the robot cannot reach the door, the missed room is marked as an unreachable room; the robot sends the mobile terminal bound to it a prompt that an unreachable room exists in this cleaning task, and missed-cleaning detection for that room is stopped.

According to some embodiments of the present application, if the photo recognition finds no obstacle, the robot moves to the door and invokes the binocular sensor to judge whether the door can be passed through. If the robot determines from the binocular sensor that the door is tightly shut, the door is judged impassable; the robot cannot enter the missed room to clean, the room is marked as an impassable room, the robot sends the bound mobile terminal a prompt that an impassable room exists in this cleaning task, and missed-cleaning detection for that room is stopped.

In this embodiment, by performing missed-cleaning detection on the rooms the robot missed in the cleaning task, marking the doors the robot cannot reach and/or the room areas it cannot enter, and sending corresponding prompts to remind the user, the user can investigate the areas the robot did not clean. This avoids the robot repeatedly running missed-cleaning detection on unreachable doors and/or unenterable room areas and getting stuck, thereby improving the efficiency of missed-cleaning detection and ensuring the robot effectively re-cleans the missed rooms that can be re-cleaned.
In addition, referring to FIG. 7, FIG. 7 is a module diagram of the sweeping control apparatus of the present application.

To implement the above embodiments, the present application also provides a sweeping control apparatus, including:

A room area identification module 10 configured to acquire image information of the area to be cleaned, perform deep-learning recognition on the image information, and obtain room area information in the image information;

A cleaning area determination module 20 configured to determine, according to the room area information, the cleanable area in the area to be cleaned, the cleanable area being a passable room area in the area to be cleaned that the sweeping robot can reach;

A target area supplementary cleaning module 30 configured to determine a target area according to the cleaning motion trajectory and control the sweeping robot to perform supplementary cleaning of the target area, the target area being an uncleaned area within the cleanable area.

In addition, to achieve the above object, another aspect of the present application further provides a sweeping robot, including a memory, a processor, and a control program of the sweeping robot stored in the memory and runnable on the processor; when the processor executes the control program, the steps of the sweeping control method described above are implemented.

In addition, to achieve the above object, another aspect of the present application further provides a computer-readable storage medium on which a control program of a sweeping robot is stored; when the control program is executed by a processor, the steps of the sweeping control method described above are implemented.
Those skilled in the art will understand that the embodiments of the present application may be provided as methods or computer program products. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) containing computer-usable program code.

The present application is described with reference to flowcharts and/or block diagrams of the methods, devices, and computer program products according to its embodiments. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Claims (20)
- A sweeping control method, characterized in that the sweeping control method includes: acquiring current environmental information of the sweeping robot; determining the type of a target object according to the environmental information; and determining a corresponding cleaning mode according to the type of the target object, and controlling the sweeping robot to clean in the determined cleaning mode.
- The sweeping control method according to claim 1, characterized in that the environmental information includes room area information, and acquiring the room area information of the sweeping robot includes: acquiring, from the environmental information, image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information.
- The sweeping control method according to claim 2, characterized in that the step of acquiring image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information includes: controlling the sweeping robot to traverse the area to be cleaned; invoking a camera module to photograph the area to be cleaned and obtain its image information; and inputting the image information into a room-door recognition network and invoking a trained room-door recognition model to identify the room-door information of the area to be cleaned in the image information, the room-door information being used to indicate the room area information.
- The sweeping control method according to claim 2, characterized in that, before the step of acquiring image information of the area to be cleaned, performing deep-learning recognition on the image information, and obtaining room area information in the image information, the method further includes: controlling the sweeping robot to traverse the area to be cleaned; invoking a laser sensor to collect detection data while the robot moves within the area to be cleaned; and building a grid map of the area to be cleaned from the detection data.
- The sweeping control method according to claim 4, characterized in that the step of building the grid map of the area to be cleaned further includes: inputting the image information into a room-door recognition network and invoking a trained room-door recognition model to identify the room-door information of the area to be cleaned in the image information; and, according to the obtained room-door information, decomposing the grid map into regions to obtain the floor-plan environment information of the room area corresponding to each room door in the area to be cleaned.
- The sweeping control method according to claim 2, characterized in that the target object includes a room door, and the step of determining the type corresponding to the room door according to the room area information includes: controlling the sweeping robot to perform an initial exploration of the area to be cleaned based on the room area information; acquiring the robot's exploration trajectory and judging whether the robot can enter the room area corresponding to the room door; and, if the robot can enter the room area corresponding to the room door, marking the passable room area the robot can enter as a cleanable area, to serve as the type corresponding to the room door.
- The sweeping control method according to claim 6, characterized in that determining the corresponding cleaning mode according to the type of the target object and controlling the sweeping robot to clean in the determined cleaning mode includes: determining, according to the cleaning motion trajectory, a target area within the cleanable area, and controlling the sweeping robot to perform supplementary cleaning of the target area.
- The sweeping control method according to claim 7, characterized in that the step of determining, according to the cleaning motion trajectory, a target area within the cleanable area and controlling the sweeping robot to perform supplementary cleaning of the target area includes: responding to a cleaning control instruction and controlling the robot to clean the cleanable area; comparing the grid map of the cleanable area with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run; invoking a navigation module to navigate, controlling the robot to move to the missed room area, and invoking a sensor to detect whether the room door of the missed room area can be passed through; and, if the door can be passed through, determining the missed room area as the target area and controlling the robot to enter the target area for supplementary cleaning.
- The sweeping control method according to claim 7, characterized in that the step of determining the corresponding cleaning mode according to the type of the target object and controlling the sweeping robot to clean in the determined cleaning mode further includes: responding to a cleaning control instruction and controlling the robot to clean the cleanable area; comparing the grid map of the cleanable area with the robot's cleaning motion trajectory to determine that a room area was missed in the current cleaning run; invoking a navigation module to navigate, controlling the robot to move to the missed room area, and invoking a sensor to detect whether the room door of the missed room area can be passed through; and, if the door cannot be passed through, marking the missed room area as an impassable area, reporting the room area information of the impassable area to a server, and stopping missed-cleaning detection for that area.
- The sweeping control method according to claim 1, characterized in that the environmental information includes furniture information, the furniture information includes furniture type information and furniture position information, and the step of acquiring the current furniture information of the sweeping robot includes: step S1, controlling the robot to acquire image data of the current environment; step S2, determining, with a first deep-learning-based object detection algorithm, the type information of each piece of furniture in the image data, as well as its position, width, and height in the image data; step S3, combining the pixels of each piece of furniture in the image data with data acquired by a ranging sensor to obtain the depth value of each piece of furniture in the image data; and step S4, acquiring multiple frames of image data at different times and performing steps S1 to S3 to obtain the furniture position information.
- The sweeping control method according to claim 10, characterized in that the target object is furniture, and the step of determining the corresponding cleaning mode according to the type of the target object includes: when the room type in which the furniture is located is a kitchen type, determining the current cleaning mode as a kitchen cleaning mode; or, when the room type in which the furniture is located is a bedroom type, determining the current cleaning mode as a bedroom cleaning mode; or, when the room type in which the furniture is located is a living-room type, determining the current cleaning mode as a living-room cleaning mode; or, when the room type in which the furniture is located is a bathroom type, determining the current cleaning mode as a bathroom cleaning mode.
- The sweeping control method according to claim 11, characterized in that controlling the sweeping robot to clean in the determined cleaning mode includes: acquiring the cleanliness of the current environment, determining the robot's cleaning parameters in the cleaning mode according to the cleanliness, and controlling the robot to clean with those cleaning parameters.
- The sweeping control method according to claim 12, characterized in that the step of determining the robot's cleaning parameters in the cleaning mode according to the cleanliness includes: comparing the cleanliness of the current environment with a preset cleanliness for the cleaning mode; and determining the robot's cleaning parameters in the cleaning mode according to the comparison result.
- The sweeping control method according to claim 13, characterized in that the step of comparing the cleanliness of the current environment with the preset cleanliness for the cleaning mode includes: acquiring the cleanliness of different regions in the current environment, and comparing the cleanliness of each region with the preset cleanliness of the corresponding preset region in the cleaning mode.
- The sweeping control method according to claim 13, characterized in that the step of determining the robot's cleaning parameters in the cleaning mode according to the comparison result includes: adjusting the robot's cleaning parameters for each region in the cleaning mode according to that region's comparison result; and taking the adjusted cleaning parameters as the cleaning parameters of that region.
- The sweeping control method according to claim 12, characterized in that the step of acquiring the cleanliness of the current environment includes: extracting dirt data from the image data with a second deep-learning-based object detection algorithm; counting the value and area of the dirt data; and determining the cleanliness of the current environment from the value and the area.
- The sweeping control method according to claim 16, characterized in that, after the step of controlling the robot to clean with the cleaning parameters, the method further includes: acquiring a terminal account connected to the robot; and displaying the cleaning process, the furniture information, and the room type on the display interface logged in to that terminal account.
- A sweeping control apparatus, characterized in that the sweeping control apparatus includes: a room area identification module configured to acquire image information of the area to be cleaned, perform deep-learning recognition on the image information, and obtain room area information in the image information; a cleaning area determination module configured to determine, according to the room area information, the cleanable area in the area to be cleaned, the cleanable area being a passable room area in the area to be cleaned that the sweeping robot can enter; and a target area supplementary cleaning module configured to determine a target area according to the cleaning motion trajectory and control the sweeping robot to perform supplementary cleaning of the target area, the target area being an uncleaned area within the cleanable area.
- A sweeping robot, characterized in that the sweeping robot includes a memory, a processor, and a control program of the sweeping robot stored in the memory and runnable on the processor; when the processor executes the control program, the steps of the method according to any one of claims 1 to 17 are implemented.
- A computer-readable storage medium, characterized in that a sweeping control program is stored on the computer-readable storage medium; when the sweeping control program is executed by a processor, the steps of the sweeping control method according to any one of claims 1 to 17 are implemented.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011369013.8A CN112401763A (zh) | 2020-11-27 | 2020-11-27 | 扫地机器人的控制方法、扫地机器人以及计算机可读存储介质 |
CN202011369013.8 | 2020-11-27 | ||
CN202011374059.9 | 2020-11-30 | ||
CN202011374059.9A CN112462780B (zh) | 2020-11-30 | 2020-11-30 | 扫地控制方法、装置、扫地机器人及计算机可读存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022111539A1 true WO2022111539A1 (zh) | 2022-06-02 |
Family
ID=81754992
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/132880 WO2022111539A1 (zh) | 2020-11-27 | 2021-11-24 | 扫地控制方法、装置、扫地机器人及计算机可读介质 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022111539A1 (zh) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006215860A (ja) * | 2005-02-04 | 2006-08-17 | Matsushita Electric Ind Co Ltd | 自律走行装置 |
CN102138769A (zh) * | 2010-01-28 | 2011-08-03 | 深圳先进技术研究院 | 清洁机器人及其清扫方法 |
CN109662651A (zh) * | 2017-10-13 | 2019-04-23 | 松下家电研究开发(杭州)有限公司 | 一种扫地机器人自动预约清扫的方法 |
DE102018212166A1 (de) * | 2018-07-20 | 2020-01-23 | BSH Hausgeräte GmbH | Automatische Reinigung einer Bodenfläche |
CN110897565A (zh) * | 2019-11-29 | 2020-03-24 | 珠海格力电器股份有限公司 | 一种多功能扫地机器人的控制系统及其方法 |
CN111096714A (zh) * | 2019-12-25 | 2020-05-05 | 江苏美的清洁电器股份有限公司 | 一种扫地机器人的控制系统及方法和扫地机器人 |
CN111202472A (zh) * | 2020-02-18 | 2020-05-29 | 深圳市愚公科技有限公司 | 一种扫地机器人的终端地图构建方法、终端设备及清扫系统 |
CN111449571A (zh) * | 2020-03-09 | 2020-07-28 | 珠海格力电器股份有限公司 | 基于定位系统的清扫方法、装置、设备及计算机可读介质 |
CN112401763A (zh) * | 2020-11-27 | 2021-02-26 | 深圳市杉川致行科技有限公司 | 扫地机器人的控制方法、扫地机器人以及计算机可读存储介质 |
CN112462780A (zh) * | 2020-11-30 | 2021-03-09 | 深圳市杉川致行科技有限公司 | 扫地控制方法、装置、扫地机器人及计算机可读存储介质 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115153350A (zh) * | 2022-07-14 | 2022-10-11 | 深圳拓邦股份有限公司 | 扫地机器人的补扫方法、装置、存储介质及扫地机器人 |
CN115429155A (zh) * | 2022-07-29 | 2022-12-06 | 云鲸智能(深圳)有限公司 | 清洁机器人的控制方法、装置、系统及存储介质 |
CN115429155B (zh) * | 2022-07-29 | 2023-09-29 | 云鲸智能(深圳)有限公司 | 清洁机器人的控制方法、装置、系统及存储介质 |
CN115167455A (zh) * | 2022-08-03 | 2022-10-11 | 科大讯飞股份有限公司 | 一种自主移动设备控制方法及自主移动设备 |
CN115486762A (zh) * | 2022-08-08 | 2022-12-20 | 深圳市景创科技电子股份有限公司 | 基于九轴传感器的扫地设备的控制方法、扫地设备和介质 |
CN115429175A (zh) * | 2022-09-05 | 2022-12-06 | 北京云迹科技股份有限公司 | 清洁机器人控制方法、装置、电子设备和介质 |
CN115900029A (zh) * | 2022-12-21 | 2023-04-04 | 宁波奥克斯电气股份有限公司 | 空调器的联动控制方法、装置、空调器和可读存储介质 |
CN115715651A (zh) * | 2022-12-29 | 2023-02-28 | 科大讯飞股份有限公司 | 扫地机器人控制方法、装置、设备及可读存储介质 |
CN116123657A (zh) * | 2023-01-04 | 2023-05-16 | 青岛海尔空调器有限总公司 | 智能家居系统的控制方法及系统、计算机设备 |
CN116123657B (zh) * | 2023-01-04 | 2024-05-24 | 青岛海尔空调器有限总公司 | 智能家居系统的控制方法及系统、计算机设备 |
CN116418259A (zh) * | 2023-06-08 | 2023-07-11 | 深圳市德壹医疗科技有限公司 | 电机功率调节方法、装置、设备和存储介质 |
CN116418259B (zh) * | 2023-06-08 | 2023-08-25 | 深圳市德壹医疗科技有限公司 | 电机功率调节方法、装置、设备和存储介质 |
CN117193333A (zh) * | 2023-10-27 | 2023-12-08 | 高捷体育股份有限公司 | 一种地坪除尘机器人路径规划方法、清洁系统和处理器 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21897039; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21897039; Country of ref document: EP; Kind code of ref document: A1