WO2023098455A1 - Operation control method and apparatus for a cleaning device, storage medium, and electronic apparatus - Google Patents

Operation control method and apparatus for a cleaning device, storage medium, and electronic apparatus

Info

Publication number
WO2023098455A1
WO2023098455A1 (PCT/CN2022/131571; CN2022131571W)
Authority
WO
WIPO (PCT)
Prior art keywords
target
area
cleaning device
virtual
information
Prior art date
Application number
PCT/CN2022/131571
Other languages
English (en)
French (fr)
Inventor
丘伟楠
Original Assignee
追觅创新科技(苏州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 追觅创新科技(苏州)有限公司
Publication of WO2023098455A1

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Definitions

  • The present application relates to the field of smart homes, and in particular to an operation control method and apparatus for a cleaning device, a storage medium, and an electronic apparatus.
  • An application program matching the cleaning device may run on the user's terminal device. In the application's configuration interface, the user can mark virtual restricted zones on the displayed area map to specify where the cleaning device is and is not allowed to clean. In addition, the cleaning device can establish virtual restricted zones itself based on its history of being trapped.
  • For example, the user can restrict which rooms the cleaning robot is allowed to clean by setting virtual walls between rooms.
  • For another example, the cleaning robot can mark the area where a piece of furniture is located as a virtual restricted zone based on its trapped history.
  • However, the operation control methods for cleaning devices in the related art have the problem that the cleaning device is easily trapped when the virtual forbidden zone is established incorrectly.
  • The purpose of this application is to provide an operation control method and apparatus for a cleaning device, a storage medium, and an electronic apparatus, so as to at least solve the problem in the related art that the cleaning device is easily trapped due to errors in establishing virtual forbidden zones.
  • A method for controlling the operation of a cleaning device is provided, including: acquiring virtual restricted zone information corresponding to a target area map, wherein the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual restricted zone information is used to indicate the virtual restricted zone in the area map; performing target detection through a first sensor on the cleaning device to obtain target object information, wherein the target object information is used to indicate the object information of the target scene object matching the virtual restricted zone in the current area where the cleaning device is located; and, in the case that the cleaning device is trapped while cleaning the area to be cleaned, controlling the cleaning device to perform a target escape operation according to the virtual restricted zone information and the target object information, wherein the cleaning device after escape is outside the virtual restricted zone.
  • Performing target detection through the first sensor on the cleaning device to obtain the target object information includes: performing target recognition on the point cloud data collected by the first sensor to obtain the target object information, wherein the target object information is the object point cloud of the target scene object.
  • Performing target recognition on the point cloud data collected by the first sensor to obtain the target object information includes: performing target recognition on the point cloud data collected by the first sensor to obtain candidate object information, wherein the candidate object information is the object point clouds of the candidate objects contained in the current area; and selecting, from the candidate objects, the target scene object matching the virtual forbidden zone according to the position information of the candidate objects and the virtual restricted zone information, so as to obtain the target object information.
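The selection step described above, matching candidate objects to the virtual forbidden zone by position, can be sketched as follows. The rectangular zone representation, the centroid test, and the margin value are illustrative assumptions made for this sketch, not part of the claims.

```python
# Sketch of matching detected candidate objects to a virtual restricted
# zone by position. The axis-aligned rectangle, the centroid test, and
# the margin value are assumptions for illustration, not claim language.

def centroid(points):
    """Centroid of an object point cloud given as (x, y) tuples."""
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def in_zone(point, zone, margin=0.0):
    """True if a point lies inside zone (x0, y0, x1, y1), optionally
    expanded by a margin on all sides."""
    x, y = point
    x0, y0, x1, y1 = zone
    return (x0 - margin) <= x <= (x1 + margin) and (y0 - margin) <= y <= (y1 + margin)

def select_target_objects(candidates, zone, margin=0.3):
    """Keep the candidate point clouds whose centroid falls in or near
    the virtual restricted zone; these are the 'target scene objects'."""
    return [pc for pc in candidates if in_zone(centroid(pc), zone, margin)]
```

The margin lets an object partially overlapping the zone boundary still count as a match, which is one way to read "matching the virtual forbidden zone".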
  • Controlling the cleaning device to perform the target escape operation according to the virtual restricted zone information and the target object information includes: in the case that it is determined, according to the target object information, that the target scene object is an obstacle the cleaning device is allowed to cross, and the cleaning device has passed through the virtual wall of the virtual restricted zone into an area of the target area map other than the area to be cleaned, controlling the cleaning device to cross the target scene object and re-enter the area to be cleaned.
  • Controlling the cleaning device to perform the target escape operation according to the virtual restricted zone information and the target object information includes: in the case that it is determined, according to the target object information, that the target scene object is a scene object of a target type, and the cleaning device has entered, through the virtual wall of the virtual restricted zone, the target object area where the target scene object is located, controlling the cleaning device to perform the target escape operation matching the target type.
  • Controlling the cleaning device to perform the target escape operation matching the target type includes: in the case that the target type includes a type whose underside does not allow the cleaning device to pass, collecting point cloud data through a second sensor on the cleaning device to obtain target point cloud data; identifying, according to the target point cloud data, an exit matching the movement trajectory along which the cleaning device entered the target object area, wherein the size of the exit allows the cleaning device to pass; and controlling the cleaning device to move out of the target object area from the exit along the movement trajectory. In the case that the target type includes a type whose distance from the wall is less than or equal to a distance threshold, the cleaning device is controlled to move along the target boundary detected by its distance sensor until it moves out of the target object area, wherein the target boundary is at least one of the following: the wall, the boundary of the target scene object.
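For the bottom-blocked case above, one plausible reading is that the matching exit is the point where the device entered, and that escape retraces the entry trajectory in reverse. The sketch below reduces the point cloud gap check to a simple width comparison; `ROBOT_WIDTH` and all names are assumptions for illustration, not the patented implementation.

```python
# Sketch of the "retrace the entry trajectory" escape in the bottom-blocked
# case. Gap detection from point cloud data is reduced to a width check.

ROBOT_WIDTH = 0.35  # metres; assumed platform width, not from the claims

def exit_is_passable(gap_width, robot_width=ROBOT_WIDTH):
    """The claim requires the exit size to allow the device to pass."""
    return gap_width >= robot_width

def escape_path(entry_trajectory):
    """Replay the recorded entry trajectory in reverse to leave the
    target object area through the exit the device came in by."""
    return list(reversed(entry_trajectory))
```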
  • Controlling the cleaning device to perform the target escape operation according to the virtual restricted zone information and the target object information includes: in the case that the cleaning device is trapped because it detects a cliff or detects that its wheels are in a falling state, controlling the cleaning device to perform a first escape operation, wherein the first escape operation is used to control the cleaning device to leave the detected cliff or to free its wheels from the falling state, and the cleaning device ignores the virtual restricted zone while performing the first escape operation; and, in the case that the cleaning device is detected to have passed through the virtual wall of the virtual restricted zone after performing the first escape operation, controlling the cleaning device to perform a second escape operation, wherein the second escape operation is used to control the cleaning device to leave the range of the virtual forbidden zone.
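The two-stage priority in this clause, physical safety first with the virtual zone ignored, then a corrective exit from the zone, can be sketched as follows (the operation names are illustrative, not claim language):

```python
# Sketch of the two-stage escape priority: a cliff/wheel-drop event is
# handled first while the virtual zone is ignored; only afterwards, if
# the device ended up across the virtual wall, a second operation drives
# it back out of the zone.

def escape(cliff_or_wheel_drop, crossed_virtual_wall):
    """Return the ordered list of escape operations to perform."""
    ops = []
    if cliff_or_wheel_drop:
        # Stage 1: leave the cliff / free the wheels, ignoring the zone.
        ops.append("first_escape_ignore_virtual_zone")
        if crossed_virtual_wall:
            # Stage 2: leave the virtual restricted zone afterwards.
            ops.append("second_escape_leave_virtual_zone")
    return ops
```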
  • The method further includes: in the case that the boundary of the target scene object is located within the area to be cleaned, controlling the cleaning device to clean along the boundary of the target scene object; and, in the case that the boundary of the target scene object is outside the area to be cleaned, controlling the cleaning device to clean along the virtual wall of the virtual restricted zone.
  • An operation control apparatus for a cleaning device is also provided, including: a first acquisition unit, configured to acquire the virtual restricted zone information corresponding to the target area map, wherein the target area map is the area map of the area to be cleaned by the cleaning device, and the virtual restricted zone information is used to indicate the virtual restricted zone in the area map; a detection unit, configured to perform target detection through the first sensor on the cleaning device to obtain the target object information, wherein the target object information is used to indicate the object information of the target scene object matching the virtual restricted zone in the current area where the cleaning device is located; and a first control unit, configured to, in the case that the cleaning device is trapped while cleaning the area to be cleaned, control the cleaning device to perform the target escape operation according to the virtual restricted zone information and the target object information, wherein the cleaning device after escape is outside the virtual restricted zone.
  • The detection unit includes: a first recognition module, configured to perform target recognition on the point cloud data collected by the first sensor to obtain the target object information, wherein the target object information is the object point cloud of the target scene object.
  • The first recognition module includes: a recognition submodule, configured to perform target recognition on the point cloud data collected by the first sensor to obtain candidate object information, wherein the candidate object information is the object point clouds of the candidate objects contained in the current area; and a selection submodule, configured to select, from the candidate objects, the target scene object matching the virtual restricted zone according to the position information of the candidate objects and the virtual restricted zone information, so as to obtain the target object information.
  • The first control unit includes: a first control module, configured to, in the case that it is determined according to the target object information that the target scene object is a scene object the cleaning device is allowed to cross, and the cleaning device has entered an area of the target area map other than the area to be cleaned through the virtual wall of the virtual restricted zone, control the cleaning device to cross the target scene object and enter the area to be cleaned.
  • The first control unit includes: a second control module, configured to, in the case that it is determined according to the target object information that the target scene object is a scene object of a target type, and the cleaning device has entered the target object area where the target scene object is located through the virtual wall of the virtual restricted zone, control the cleaning device to perform the target escape operation matching the target type, wherein the target type includes at least one of the following: a type whose underside does not allow the cleaning device to pass, and a type whose distance from the wall is less than or equal to a distance threshold.
  • The first control unit includes: a collection module, configured to, in the case that the target type includes a type whose underside does not allow the cleaning device to pass, collect point cloud data through the second sensor on the cleaning device to obtain target point cloud data; a second identification module, configured to identify, according to the target point cloud data, an exit matching the movement trajectory along which the cleaning device entered the target object area, wherein the size of the exit allows the cleaning device to pass; a third control module, configured to control the cleaning device to move out of the target object area from the exit along the movement trajectory; and a fourth control module, configured to, in the case that the target type includes a type whose distance from the wall is less than or equal to a distance threshold, control the cleaning device to move along the target boundary detected by the distance sensor of the cleaning device until it moves out of the target object area, wherein the target boundary is at least one of the following: the wall, the boundary of the target scene object.
  • The first control unit includes: a fifth control module, configured to, in the case that the cleaning device is trapped because it detects a cliff or detects that its wheels are in a falling state, control the cleaning device to perform a first escape operation, wherein the first escape operation is used to control the cleaning device to leave the detected cliff or to free its wheels from the falling state, and the cleaning device ignores the virtual restricted zone while performing the first escape operation; and a sixth control module, configured to, in the case that the cleaning device is detected to have passed through the virtual wall of the virtual restricted zone after performing the first escape operation, control the cleaning device to perform a second escape operation, wherein the second escape operation is used to control the cleaning device to leave the range of the virtual forbidden zone.
  • The apparatus further includes: a second control unit, configured to, after target detection is performed through the first sensor on the cleaning device, control the cleaning device to clean along the boundary of the target scene object in the case that the boundary of the target scene object is located within the area to be cleaned; and a third control unit, configured to control the cleaning device to clean along the virtual wall of the virtual forbidden zone in the case that the boundary of the target scene object is outside the area to be cleaned.
  • A computer-readable storage medium is also provided, in which a computer program is stored, wherein the computer program is configured to execute the above operation control method for a cleaning device when run.
  • An electronic device is also provided, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor executes the above operation control method for a cleaning device by means of the computer program.
  • In the embodiments of the present application, the real escape scene is combined with the virtual restricted zone information: the virtual restricted zone information corresponding to the target area map is acquired, wherein the target area map is the area map of the area to be cleaned by the cleaning device and the virtual restricted zone information is used to indicate the virtual restricted zone in the area map; target detection is performed through the first sensor on the cleaning device to obtain the target object information, which indicates the target scene object matching the virtual restricted zone in the current area where the cleaning device is located; and, in the case that the cleaning device is trapped while cleaning the area to be cleaned, the cleaning device is controlled to perform the target escape operation according to the virtual restricted zone information and the target object information, so that the cleaning device after escape is outside the virtual restricted zone. This makes escape more intelligent, achieves the technical effect of reducing the trapping rate of the cleaning device and improving the user experience, and thereby solves the problem in the related art that the cleaning device is easily trapped due to errors in establishing the virtual forbidden zone.
  • FIG. 1 is a schematic diagram of a hardware environment of an optional operation control method for a cleaning device according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of an optional operation control method for a cleaning device according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an optional virtual forbidden zone according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of another optional virtual forbidden zone according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of another optional virtual forbidden zone according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an optional slide rail according to an embodiment of the present application.
  • FIG. 7 is a structural block diagram of an optional operation control apparatus for a cleaning device according to an embodiment of the present application.
  • FIG. 8 is a structural block diagram of an optional electronic device according to an embodiment of the present application.
  • an operation control method of a cleaning device is provided.
  • the above cleaning device operation control method may be applied to a hardware environment composed of a terminal device 102 , a cleaning device 104 and a server 106 as shown in FIG. 1 .
  • The terminal device 102 can be connected to the cleaning device 104 and/or the server 106 (for example, an Internet of Things platform or a cloud server) through a network to control the cleaning device 104, for example, to bind with the cleaning device 104 and configure its cleaning functions.
  • The cleaning device 104 may include a host machine and a base station (for example, a sweeper and a base station, or a washing machine and a base station), and the host machine and the base station may be connected through a network to determine the current status information of the peer device (for example, battery status, working status, and location).
  • the foregoing network may include but not limited to at least one of the following: a wired network and a wireless network.
  • The above wired network may include, but is not limited to, at least one of the following: a wide area network, a metropolitan area network, a local area network; the above wireless network may include, but is not limited to, at least one of the following: Wi-Fi (Wireless Fidelity), Bluetooth, infrared.
  • the network used by the terminal device 102 to communicate with the cleaning device 104 and/or the server 106 and the network used by the cleaning device 104 to communicate with the server 106 may be the same or different.
  • The terminal device 102 may be, but is not limited to, a PC, a mobile phone, a tablet computer, etc.
  • The cleaning device 104 may include, but is not limited to, a self-cleaning robot, for example, an automatic mop-washing robot or a sweeping robot.
  • The server 106 may be a server of an Internet of Things platform.
  • the operation control method of the cleaning device in the embodiment of the present application may be executed by the terminal device 102, the cleaning device 104, or the server 106 alone, or jointly executed by at least two of the terminal device 102, the cleaning device 104, and the server 106.
  • the execution of the cleaning device operation control method of the embodiment of the present application by the terminal device 102 or the cleaning device 104 may also be performed by a client installed on it.
  • FIG. 2 is a schematic flowchart of an optional operation control method of the cleaning equipment according to the embodiment of the present application, as shown in FIG. 2 , the flow of the method may include the following steps:
  • Step S202: acquire the virtual restricted zone information corresponding to the target area map, wherein the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual restricted zone information is used to indicate the virtual restricted zone in the area map.
  • The operation control method for the cleaning device in this embodiment can be applied to the following scenario: while the cleaning device cleans the area to be cleaned within the target area, the cleaning device is controlled with reference to both the real scene and the virtual forbidden zone, so as to reduce the probability of the cleaning device being trapped.
  • The above target area can be an indoor area of a home, another area such as a restaurant, office, or factory workshop, or any other area that can be cleaned by a cleaning device.
  • The above cleaning device may be a smart vacuum cleaner, a smart sweeper, a smart sweeper integrating sweeping and mopping, or another robot with cleaning functions; this is not limited in this embodiment.
  • The area that the cleaning device is about to clean is referred to as the area to be cleaned.
  • the cleaning device may acquire a target area map corresponding to the above target area, where the target area map is the area map to which the area to be cleaned belongs, and the area to be cleaned may be all or part of the target area. Based on the set cleaning area information, the cleaning device can determine the area to be cleaned.
  • The above cleaning area information may be default area information, for example, indicating that all or a specific part of the target area is to be cleaned by default, or it may be cleaning area information generated by the terminal device in response to detecting a selection operation performed on the target area map.
  • The above selection operation may be an operation of selecting the sub-area to be cleaned from a plurality of sub-areas contained in the target area, and the resulting cleaning area information may be sent from the server to the cleaning device.
  • The target area map can be a map formed from area images or area point clouds collected by the cleaning device using an image acquisition component (such as a camera or a laser sensor), or a map obtained by other means, for example, a map received from the terminal device, which may itself have been formed from area images or area point clouds collected by image acquisition components on other devices. This is not limited in this embodiment.
  • the target area map may also include virtual restricted area information, and the cleaning device may also obtain the virtual restricted area information in the target area map.
  • the virtual restricted area information is used to indicate the virtual restricted area in the map of the target area, and the virtual restricted area refers to an area marked by a virtual wall, a restricted area line, etc., where cleaning equipment is prohibited from entering.
  • the virtual wall and restricted area line here are virtual area boundaries, and the cleaning equipment can pass through the virtual wall, restricted area line, etc. and enter the virtual restricted area.
  • the virtual restricted area can be a restricted area established by the user through the target area map displayed on his terminal device, or a restricted area established by the cleaning device based on historical trapped records. In this embodiment, there is no limitation on the way of establishing the virtual restricted area.
  • a virtual restricted area can be set at the stairway or furniture on the room area map.
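The virtual restricted zone information carried by the area map might be represented as simple records like the following; the field names and the two origin values ("user" and "trap_history") are assumptions made for illustration, since the description does not fix any data layout.

```python
# Sketch of virtual restricted zone records attached to an area map.
# Field names and origin values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class VirtualZone:
    vertices: list                # (x, y) polygon vertices of the zone boundary
    origin: str = "user"          # "user" (set in the app) or "trap_history"
    kind: str = "no_go"           # e.g. a "no_go" area or a "virtual_wall"

def zones_from_map(area_map):
    """Extract the virtual restricted zone records carried by the map dict."""
    return [z for z in area_map.get("virtual_zones", [])
            if isinstance(z, VirtualZone)]
```

For example, the stairway zone mentioned above would be one such record with `origin="user"`, while a zone the robot added after repeatedly getting stuck under a sofa would carry `origin="trap_history"`.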
  • Step S204: perform target detection through the first sensor on the cleaning device to obtain the target object information, wherein the target object information is used to indicate the object information of the target scene object matching the virtual restricted zone in the current area where the cleaning device is located.
  • The cleaning device can perform target detection through the first sensor on it to determine the scene objects located in the current area where the cleaning device is located; according to the position information of the virtual restricted zone and the detected position information of the scene objects, the target scene object matching the virtual restricted zone can be determined, and the object information of that target scene object, that is, the target object information, can then be obtained.
  • the first sensor can be an image acquisition device, or other devices with object recognition functions.
  • For example, the first sensor can be an LDS (laser distance sensor) ranging sensor, which can collect point cloud data by emitting laser light and determine the target object information from the collected point cloud data.
  • The first sensor can be arranged on a side (for example, the front, left, right, or rear), the top, or the bottom of the cleaning device; its number can be one or more, and different sides of the cleaning device can each be provided with one or more sensors.
  • The current area where the above cleaning device is located is the collection area of the first sensor, that is, the range the sensor can cover. It may belong to the area to be cleaned (in which case the cleaning device can be located in the area to be cleaned), or to areas of the target area other than the area to be cleaned (in which case the cleaning device may have passed into other areas such as the virtual restricted zone), or partly to the area to be cleaned and partly to other areas; this is not limited in this embodiment.
  • the cleaning device may perform target recognition (scene object recognition) on the scene data collected by the first sensor, and determine the detected scene objects.
  • The cleaning device can use the reference object features of various reference objects together with the virtual restricted zone information to match the collected scene data, determine the target scene objects matching the various reference objects, and obtain the object information of all target scene objects in the current area, that is, the target object information.
  • Each of the above reference objects can be a scene object of a predetermined object type, and its corresponding reference object features can be pre-stored on the cleaning device, or stored on the server, so that the cleaning device or the server can perform object recognition.
  • The predetermined object types corresponding to different reference objects may be the same or different; this is not limited in this embodiment.
  • the sweeping machine can detect obstacles through the sensors on it, and obtain obstacles that match the virtual restricted area, such as steps, slide rails, dark carpets, etc.
  • Step S206: in the case that the cleaning device is trapped while cleaning the area to be cleaned, control the cleaning device to perform the target escape operation according to the virtual restricted zone information and the target object information, wherein the cleaning device after escape is outside the virtual restricted zone.
  • The cleaning device may be trapped at a certain position, which can be a position within the area to be cleaned or another position in the target area outside the area to be cleaned; for example, the cleaning device has passed through the virtual wall into another area of the target area, the cleaning device is stuck under furniture, or the cleaning device has triggered its downward-looking (cliff) sensor or a wheel fall. After determining that it is trapped, the cleaning device can escape according to the configured escape strategy.
  • In the related art, the machine (i.e., the cleaning device) cannot understand the real purpose for which the virtual restricted zone was established.
  • The machine treats virtual restricted zones as real walls and thus cannot escape from a narrow area; conversely, escaping in a complex scene may cause the machine to stray into or pass through the restricted zone, causing it to alarm and remain trapped.
  • That is, the cleaning device escape solutions in the related art cannot distinguish the real complex scene from the virtual restricted zone escape scene, and also lack a corresponding escape method and compensation plan.
  • In this embodiment, the cleaning device can combine the real scene with the virtual restricted zone information, comprehensively judge the purpose for which the virtual restricted zone was established, determine the current escape scene, and use the corresponding escape strategy. This improves the escape effect around virtual restricted zones, makes escape more intelligent, reduces the trapping rate of the cleaning device, and thus enhances the user experience.
• the cleaning device can determine the type of the current trapped scene according to the virtual restricted area information and the target object information, and, based on the determined scene type, perform the escape operation that matches it, that is, the target escape operation. After the target escape operation is performed, the cleaning equipment escapes from the trapped scene, and the escaped device lies outside the virtual restricted area on the map.
• the cleaning device can combine the real scene (for example, the items in the scene) with the virtual restricted area information to comprehensively determine the purpose of establishing the virtual restricted area and identify the escape scene corresponding to it. The escape scenes may include, but are not limited to, at least one of the following:
• Escape scene 1: the machine (cleaning equipment) can easily cross an obstacle, and the virtual restricted area was established to block that crossing; that is, the virtual restricted area is used to prevent the machine from crossing an obstacle it could otherwise surmount;
• Escape scene 2: the machine's upper surface is stuck, or it escapes via LDS collision detection; that is, within the virtual restricted area the machine's upper surface may get stuck or it may escape through LDS collisions, and when escaping it must also consider moving away from the virtual wall of the restricted area;
• Escape scene 3: a triggered look-down or wheel-drop is handled first, then the virtual restricted area (for example, its virtual wall) is handled, and finally other obstacles are handled.
• the cleaning device can determine the escape operation to perform: for example, move away from the virtual restricted area; or first ignore the virtual restricted area, break away from the obstacle or pass through the virtual restricted area to re-enter the area to be cleaned, and then move away from the virtual restricted area; or ignore the virtual restricted area, handle the look-down or wheel-drop, and then move away from the virtual restricted area. This is not limited in this embodiment.
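As a hedged illustration (not part of the disclosed embodiments), the mapping from the three escape scenes above to ordered escape steps could be sketched as follows; all scene names and step lists are assumptions introduced for clarity:

```python
# Illustrative sketch only: dispatch from a recognized escape scene to an
# ordered list of escape steps. Names are hypothetical, not the patent's API.
from enum import Enum

class EscapeScene(Enum):
    CROSSABLE_OBSTACLE = 1   # scene 1: virtual zone blocks an easily crossed obstacle
    STUCK_OR_COLLISION = 2   # scene 2: upper surface stuck / LDS collision escape
    CLIFF_OR_FALL = 3        # scene 3: look-down or wheel-drop handled first

def select_strategy(scene):
    """Return an ordered list of escape steps for the given scene."""
    if scene is EscapeScene.CROSSABLE_OBSTACLE:
        return ["ignore_virtual_zone", "cross_obstacle", "move_away_from_virtual_wall"]
    if scene is EscapeScene.STUCK_OR_COLLISION:
        return ["local_navigation", "edge_following", "move_away_from_virtual_wall"]
    return ["handle_cliff_or_fall", "leave_virtual_zone", "handle_other_obstacles"]
```

The ordering inside each list mirrors the priorities stated in the text (ignore the zone first where crossing is intended; cliff/fall handling before the virtual wall in scene 3).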
• the virtual restricted area information corresponding to the target area map is obtained, wherein the target area map is the area map to which the area to be cleaned by the cleaning equipment belongs, and the virtual restricted area information is used to indicate the virtual restricted area in the area map;
• the cleaning equipment is controlled to perform the target escape operation, which solves the problem in the related art that cleaning equipment is easily trapped when a virtual restricted area is established incorrectly; the trapping rate of the cleaning equipment is reduced, and the user experience is improved.
  • the first sensor on the cleaning device performs target detection to obtain target object information, including:
• the first sensor may be a point cloud sensor, that is, a sensor for collecting point cloud data, which operates while the cleaning device cleans the area. The point cloud sensor collects point data to obtain point cloud data, and the collected point cloud data can include point cloud data of the obstacles within the collection range.
• data collection by the first sensor may be performed periodically; for example, the first sensor may collect data in real time, that is, once every target duration (for example, 1 s). Collection may also be triggered by an event; for example, when a collision of the cleaning device is detected and the distance between the cleaning device and the virtual restricted area is less than or equal to a first distance threshold, data collection is triggered.
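The two collection triggers just described (periodic, and collision near the virtual restricted area) could be combined as in the following sketch; the period and distance threshold values are illustrative assumptions:

```python
# Illustrative sketch: decide whether the first sensor should collect now.
# Threshold values are assumptions, not taken from the disclosure.
def should_collect(now_s, last_collect_s, collided, dist_to_zone_m,
                   period_s=1.0, dist_threshold_m=0.3):
    if now_s - last_collect_s >= period_s:                # periodic trigger
        return True
    if collided and dist_to_zone_m <= dist_threshold_m:   # event trigger
        return True
    return False
```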
  • the cleaning equipment can perform object recognition.
• the cleaning device can match the collected point cloud data against the reference object point clouds of various reference objects together with the virtual restricted area information, determine the target scene objects that match those reference objects, and obtain the object point clouds of the scene objects in the current area, thereby obtaining the target object information.
  • using a point cloud sensor for scene object recognition can improve the accuracy and convenience of scene object recognition.
  • target recognition is performed on the point cloud data collected by the first sensor to obtain target object information, including:
  • the cleaning device may first perform target recognition on the collected point cloud data to obtain candidate object information.
  • the candidate object information is object information that identifies the candidate objects included in the current area.
  • the candidate objects may be scene objects of a predetermined object type, and the object information of the candidate objects may be object point clouds of the candidate objects.
• the cleaning device may first determine the objects to be identified through contour detection, that is, detected objects that may be the required scene objects; then, the object information of each object to be identified (such as its object point cloud) is matched against the object information of the reference objects, and the object information of any object to be identified that matches a reference object is determined to be the object information of a candidate object, that is, the candidate object's point cloud.
  • the cleaning device can determine the target scene object matching the virtual forbidden zone according to the position information of the virtual forbidden zone and the object information of the candidate objects, and then obtain target object information, which can be an object point cloud of the target object.
  • the cleaning device may determine a target scene object matching the virtual restricted area according to the position information of the virtual restricted area and the position information of the candidate object, and the position information of the candidate object may be included in the object information of the candidate object.
• the manner of determining the target scene object matching the virtual restricted area may include, but is not limited to, at least one of the following:
  • a candidate object whose shape matches the boundary shape of the virtual forbidden zone is determined as a target scene object matching the virtual forbidden zone.
• for example, for the slide rail or step between the living room and the balcony, calculate whether the virtual restricted area and the step are close to parallel and of similar width. If they are close to parallel and of similar width, it can be determined that the scene object matching the virtual restricted area is the slide rail or step. In this way, the accuracy and efficiency of determining the scene object matching the virtual restricted area can be improved.
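The "close to parallel and of similar width" test above could be sketched as follows, modeling both the virtual restricted area and the candidate step as oriented strips; the angle and width tolerances are illustrative assumptions:

```python
# Illustrative sketch: does a candidate step/slide rail match the virtual
# restricted area? Tolerances are assumptions for demonstration only.
def matches_step(zone_angle_deg, zone_width_m, obj_angle_deg, obj_width_m,
                 angle_tol_deg=10.0, width_tol_m=0.05):
    # fold the angular difference into [0, 90] so 0 and 180 both count as parallel
    diff = abs(zone_angle_deg - obj_angle_deg) % 180.0
    diff = min(diff, 180.0 - diff)
    near_parallel = diff <= angle_tol_deg
    similar_width = abs(zone_width_m - obj_width_m) <= width_tol_m
    return near_parallel and similar_width
```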
  • the cleaning equipment is controlled to perform the target escape operation, including:
• the cleaning device can calculate the distance between itself and the virtual restricted area, and then, according to the target area map and the target object information, determine which side of the restricted area it is located on.
  • the cleaning device may adopt a strategy of staying away from the virtual wall.
  • Cleaning equipment can also sweep along specific boundaries. If the cleaning device passes through the virtual wall of the virtual restricted zone and enters an area other than the area to be cleaned in the target area map, the cleaning device can be controlled to pass through the target scene object and enter the area to be cleaned. After entering the area to be cleaned, the cleaning device can adopt a strategy of staying away from the virtual wall, and can also clean along a specific boundary.
• the specific boundary is whichever of the boundary of the virtual restricted area and the boundary of the target scene object is closer to the area to be cleaned; that is, when the target scene object is located in the virtual restricted area, the specific boundary is the boundary of the virtual restricted area, and when the target scene object is located outside the virtual restricted area, the specific boundary is the boundary of the target scene object.
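Determining which side of the virtual wall the device is on, as described above, reduces to a standard 2-D orientation test. The following sketch uses the sign of a cross product in map coordinates; the representation of the wall as a segment is an assumption:

```python
# Illustrative sketch: which side of a virtual wall (segment a->b) is the
# device on? Returns +1 / -1 for the two half-planes, 0 if on the wall line.
def side_of_wall(wall_a, wall_b, device):
    (ax, ay), (bx, by), (px, py) = wall_a, wall_b, device
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    return (cross > 0) - (cross < 0)
```

A result of +1 on one call and -1 on a later call would indicate the device has crossed the wall, which matches the "has the machine passed through the virtual restricted area" check in the text.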
• the escape scene corresponding to the established virtual restricted area can be identified as escape scene 1: the machine (the sweeping machine, an example of the aforementioned cleaning device) can easily cross the obstacle, and the actual restricted area was established to block that crossing.
  • the sweeping machine can calculate the distance between the machine and the restricted area, and judge which side of the restricted area the machine is on based on the map information and the real scene information of escape (an example of target object information).
• the strategy of staying away from the virtual wall can be adopted; if the machine has already passed through the virtual restricted area, the virtual restricted area can be ignored, and after re-entering the room through the obstacle-crossing strategy over the steps or slide rails, the strategy of staying away from the virtual wall can be adopted.
  • the sweeper when cleaning an area, the sweeper passes through the restricted area line and enters the inside of the hanger.
  • the sweeping machine can recognize that the scene object matching the virtual restricted area is the bottom end of the clothes hanger, and determine that the current escape scene is the escape scene 1, and it has passed through the virtual restricted area.
• the sweeper can use the obstacle-surmounting strategy to cross the bottom end of the hanger and return to the area to be cleaned, and then adopt the strategy of moving away from the virtual wall.
  • the area to be cleaned by the sweeper is the living room. If the sweeper enters the balcony through the virtual restricted area, the sweeper can recognize that the scene object matching the virtual restricted area is the slide rail between the living room and the balcony, and determine that the current escape scene is escape scene 1, and it has passed through the virtual restricted area . At this time, the sweeper can use the obstacle-surmounting strategy to cross the slide rails and enter the living room, and then adopt the strategy of moving away from the virtual wall.
• by controlling the cleaning equipment to ignore the virtual restricted area and return to the area to be cleaned, the convenience and success rate of the cleaning equipment's escape can be improved.
  • the cleaning equipment is controlled to perform the target escape operation, including:
• when it is determined according to the target object information that the target scene object is a scene object of a target type, and the cleaning device has passed through the virtual wall of the virtual restricted area and entered the target object area where the target scene object is located, the cleaning device is controlled to perform the matching target escape operation, wherein the target type includes at least one of the following: a type whose bottom does not allow the cleaning equipment to pass through, and a type whose distance from the wall is less than or equal to a distance threshold.
• the cleaning device can determine the object type of the target scene object (i.e., the target type). The object type of the target scene object can be a type whose bottom does not allow the cleaning equipment to pass through, for example, furniture whose bottom is too low, or a type whose distance from the wall is less than or equal to a distance threshold (which can be the second distance threshold), for example, a bed that is close to the wall.
  • the cleaning device can perform a target escape operation matching the object type of the target scene object, thereby controlling the cleaning device to escape.
• for different object types, the cleaning device can perform different escape operations. The above target escape operations can be pre-configured, that is, a corresponding escape operation is configured for each object type; alternatively, the cleaning device can try multiple escape operations in turn. The multiple escape operations may include, but are not limited to, at least one of the following: obstacle-surmounting operations, local navigation operations, and edge-following operations. This is not limited in this embodiment.
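The "try multiple escape operations in turn" behavior could be sketched as follows; the operation callables and the `is_free` predicate are placeholders for real motion primitives and trapped-state detection, which the disclosure does not specify:

```python
# Illustrative sketch: try each candidate escape operation until the device
# reports it is free. `operations` is a list of (name, callable) pairs and
# `is_free` is a zero-argument predicate; both are hypothetical interfaces.
def try_escape(operations, is_free):
    """Run each operation in order; return the name of the one that freed
    the device, or None if none succeeded."""
    for name, op in operations:
        op()
        if is_free():
            return name
    return None
```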
• the sweeper can process the point cloud obstacle positions and restricted area positions recognized by its sensors and identify the current escape scene as: upper surface easily stuck or LDS-collision escape, while also considering moving away from the virtual wall. From the point cloud outline, the sweeper can judge the type of furniture and, combined with the extent of the restricted area, escape through the cooperation of the escape strategy with the local navigation, edge-following, and other modules.
  • the corresponding escape operation is used to perform the escape, which can improve the success rate of the cleaning equipment escape.
  • controlling the cleaning device to perform a target escape operation matching the target type includes at least one of the following:
• when the target type includes a type whose bottom does not allow the cleaning equipment to pass through, point cloud data are collected through the second sensor on the cleaning equipment to obtain target point cloud data; from the target point cloud data, an exit matching the movement trajectory by which the cleaning equipment entered the target object area is identified, wherein the size of the exit allows the cleaning equipment to pass through; and the cleaning equipment is controlled to move out of the target object area from the exit along the movement trajectory;
• when the target type includes a type whose distance from the wall is less than or equal to the distance threshold, the cleaning device is controlled to move along the target boundary detected by its distance sensor until it moves out of the target object area, wherein the target boundary is at least one of the following: a wall, the boundary of the target scene object.
• the cleaning equipment can perform local navigation, following its historical movement trajectory or re-planning a new trajectory to return to the area to be cleaned.
  • the cleaning device can collect point cloud data through the second sensor to obtain target point cloud data.
  • the second sensor and the first sensor may be the same sensor (for example, a point cloud sensor), or different sensors.
  • the cleaning device can identify the target point cloud data, determine a moving path whose size allows the cleaning device to pass, and move out of the target object area along the determined moving track.
• the cleaning equipment can return to the area to be cleaned along the historical movement trajectory: identify the target point cloud data, and identify the exit that matches the movement trajectory (i.e., the historical movement trajectory) by which the cleaning equipment entered the target object area, where the size of the identified exit allows the cleaning equipment to pass through; then move out of the target object area from the identified exit along the historical movement trajectory.
  • the sweeper can identify based on the point cloud data collected by the point cloud sensor, determine the exits or moving passages that allow it to pass through in its local area, and use the determined exits or moving passages Move out the bottom of the furniture.
• the space for the cleaning device to move in is small, similar to a narrow passage (that is, a narrow area), and the cleaning device can be controlled by edge detection to move along the wall until it moves out of the target object area (into the area to be cleaned).
• a distance sensor (for example, an LDS laser ranging sensor) can be arranged on the cleaning device. It measures distance by emitting detection signals and thereby detects obstacles, such as walls, and the cleaning device can move along the wall detected by the distance sensor.
• the boundary of the target scene object can also be detected by the distance sensor, and the cleaning device can move along that detected boundary until it moves out of the target object area.
  • the aforementioned distance sensor may be the same sensor as the aforementioned first sensor and/or the second sensor, or may be a different sensor.
  • the cleaning equipment is controlled to perform the target escape operation, including:
  • the cleaning device may be equipped with a down-view sensor and/or a drop sensor, the down-view sensor is used for cliff detection, and the drop sensor is used for drop detection, that is, to detect the falling state of the wheels of the cleaning device.
• the cleaning device can perform look-down detection and/or drop detection through the down-view sensor and/or drop sensor, and determine whether a look-down is triggered (the down-view sensor fires) or a fall is triggered (the drop sensor fires, the wheels of the cleaning device then being in a falling state).
• the cleaning device can prioritize handling the look-down or fall, and at this time it ignores the virtual restricted area.
  • the cleaning device may perform a first escape operation, and the first escape operation is used to control the cleaning device to leave the detected cliff or control the wheels to escape from the falling state.
• the first escape operation may be an escape operation matched with the look-down or fall. If a look-down is triggered, the cleaning device can perform forward, backward, rotation (left, right, or in place, etc.) and other escape operations. If a fall is triggered, the cleaning device can perform backward, rotation, and other escape operations. This embodiment does not limit the first escape operation.
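The choice of candidate motions for the first escape operation, as described above, could be sketched like this; the motion names and their ordering are illustrative assumptions (the disclosure does not fix either):

```python
# Illustrative sketch: candidate motions for the first escape operation,
# depending on whether a look-down or a wheel drop was triggered.
# Motion names and ordering are assumptions for demonstration.
def first_escape_ops(trigger):
    if trigger == "look_down":
        return ["backward", "rotate_left", "rotate_right", "forward"]
    if trigger == "fall":
        return ["backward", "rotate_left", "rotate_right"]
    raise ValueError("unknown trigger: " + trigger)
```

Note that forward motion is only a candidate for the look-down case; after a wheel drop, driving forward could worsen the situation, which matches the text's narrower list for falls.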
• a look-down or fall may be falsely triggered. For some dark objects (such as dark carpets), a look-down may be misjudged as triggered; for some objects with narrow grooves (such as slide rails), a fall may be misjudged as triggered. The cleaning device can therefore determine whether the look-down or fall was falsely triggered. If it determines that a look-down or fall was falsely triggered, the cleaning device may ignore it; if it was not falsely triggered, the cleaning device can perform the first escape operation.
• the amplitudes of reflected ultrasonic waves differ between surfaces and cliffs. The cleaning device can perform object recognition by emitting ultrasonic waves and, based on the reflected waves, determine whether the look-down was falsely triggered. If a fall is triggered, the cleaning device can determine the outline and shape of the scene object that triggered it from the point cloud data collected by the point cloud sensor and judge whether the fall was falsely triggered: if the triggering scene object is an obstacle that the cleaning device is allowed to cross, the fall is determined to have been falsely triggered; otherwise, it is determined not to have been.
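The two false-trigger checks above (echo amplitude for a look-down, crossability of the sensed object for a fall) could be combined as in the following sketch; the amplitude threshold and the boolean crossability input are illustrative assumptions:

```python
# Illustrative sketch: decide whether a look-down or fall was falsely
# triggered. A strong ultrasonic echo suggests a solid dark surface (e.g. a
# dark carpet) rather than a cliff; a crossable sensed object (e.g. a
# slide-rail groove) suggests a false fall. Threshold is an assumption.
def is_false_trigger(trigger, echo_amplitude=0.0, object_crossable=False,
                     amplitude_threshold=0.5):
    if trigger == "look_down":
        return echo_amplitude >= amplitude_threshold  # strong echo -> surface
    if trigger == "fall":
        return object_crossable
    return False
```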
• when cleaning the bathroom and moving to the position shown in Figure 5, the sweeper triggers a look-down. At this time, the sweeper can ignore the virtual restricted area and prioritize look-down processing.
• the depth of the groove in the middle of the slide rail may trigger a look-down or a wheel drop, and the sweeper can determine a falsely triggered look-down or fall through ultrasonic waves or sensed point clouds.
• the cleaning device can determine its current location information and, based on the current location information and the location information of the virtual restricted area, determine its positional relationship to the virtual restricted area. If it is determined that the cleaning equipment has passed through the virtual wall of the virtual restricted area (possibly entering the virtual restricted area), a second escape operation can be performed, which is used to control the cleaning equipment to leave the scope of the virtual restricted area.
  • the cleaning device may determine the current escape scene (for example, escape scene 1, escape scene 2), and perform a corresponding escape operation based on the determined escape scene, that is, the second escape operation.
• the machine is prone to mistakenly triggering a look-down or fall.
• the sweeper can perform escape processing in the priority order of look-down/fall first, then virtual walls, and finally other escape scenes.
• the sweeping machine can first judge whether the look-down/fall was falsely triggered. If so, it ignores the look-down/fall; if not, it prioritizes handling the look-down/fall.
• by prioritizing the look-down/fall, then processing the virtual wall, and finally processing other escape scenarios, the safety of device operation and the efficiency of device escape can be improved.
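The fixed priority order stated above (look-down/fall, then the virtual wall, then everything else) amounts to sorting pending events by a rank. A minimal sketch, with illustrative event labels:

```python
# Illustrative sketch: order pending escape events by the priority stated in
# the text. Smaller rank = handled earlier; labels are assumptions.
PRIORITY = {"look_down": 0, "fall": 0, "virtual_wall": 1, "other": 2}

def order_events(events):
    """Return events sorted so cliff/fall events come first, then the
    virtual wall, then all remaining escape scenes."""
    return sorted(events, key=lambda e: PRIORITY.get(e, 2))
```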
  • the above method further includes:
  • the cleaning device may clean the area to be cleaned based on the positional relationship between the area to be cleaned, the virtual restricted area, and the target scene object.
  • the cleaning device can adopt a strategy of staying away from the virtual wall and cleaning along a specific boundary.
• the specific boundary is whichever of the boundary of the virtual restricted area and the boundary of the target scene object is closer to the area to be cleaned. If the boundary of the target scene object is located in the area to be cleaned, the specific boundary is the boundary of the target scene object, and the cleaning device can clean along it; if the boundary of the target scene object is outside the area to be cleaned, the specific boundary is the boundary of the virtual restricted area, which may be a virtual wall of the virtual restricted area, and the cleaning equipment can clean along that virtual wall.
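The boundary choice above reduces to a single condition; a minimal sketch, with the boolean input and string labels introduced purely for illustration:

```python
# Minimal sketch of the edge-cleaning boundary choice: follow the scene
# object's boundary when it lies inside the area to be cleaned, otherwise
# follow the virtual wall of the restricted area. Labels are assumptions.
def cleaning_boundary(object_boundary_in_area):
    return "object_boundary" if object_boundary_in_area else "virtual_wall"
```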
• cleaning the area to be cleaned in this way can improve the rationality of area cleaning and reduce the probability of the equipment being trapped.
  • FIG. 7 is a structural block diagram of an optional operation control device for cleaning equipment according to an embodiment of the present application. As shown in Fig. 7, the device may include:
• the first acquiring unit 702 is configured to acquire virtual restricted area information corresponding to the target area map, wherein the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual restricted area information is used to indicate the virtual restricted area in the area map;
• the detection unit 704 is connected to the first acquisition unit 702 and is used to perform target detection through the first sensor on the cleaning device to obtain target object information, wherein the target object information is used to indicate the object information of the target scene object that matches the virtual restricted area in the current area where the cleaning device is located;
• the first control unit 706, connected to the detection unit 704, is used to control the cleaning device to perform the target escape operation according to the virtual restricted area information and the target object information when the cleaning device is trapped during the cleaning of the area to be cleaned, wherein the cleaning equipment after escaping is outside the virtual restricted area.
• the first acquisition unit 702 in this embodiment can be used to perform the above step S202, the detection unit 704 can be used to perform the above step S204, and the first control unit 706 can be used to perform the above step S206.
• the cleaning equipment obtains the virtual restricted area information corresponding to the target area map, wherein the target area map is the area map of the area to be cleaned by the cleaning equipment, and the virtual restricted area information is used to indicate the virtual restricted area in the area map; target detection is performed through the first sensor on the cleaning device to obtain target object information, wherein the target object information is used to indicate the target scene object that matches the virtual restricted area in the current area where the cleaning device is located; when the cleaning device is trapped during the cleaning of the area to be cleaned, the cleaning device is controlled, according to the virtual restricted area information and the target object information, to perform the target escape operation. This solves the problem in the related art that cleaning equipment is easily trapped when a virtual restricted area is established incorrectly; the trapping rate of the cleaning equipment is reduced, and the user experience is improved.
  • the detection unit includes:
  • the first recognition module is configured to perform target recognition on the point cloud data collected by the first sensor to obtain target object information, wherein the target object information is an object point cloud of a target scene object.
  • the first identification module includes:
  • the recognition sub-module is used to perform target recognition on the point cloud data collected by the first sensor to obtain candidate object information, wherein the candidate object information is the object point cloud of the candidate object contained in the current area;
  • the selection sub-module is used to select the target scene object matching the virtual forbidden zone from the candidate objects according to the position information of the candidate object and the virtual forbidden zone information, and obtain the target object information.
  • the first control unit includes:
• the first control module is used to, when it is determined according to the target object information that the target scene object is a scene object that the cleaning device is allowed to cross, and the cleaning device has passed through the virtual wall of the virtual restricted area and entered an area of the target area map other than the area to be cleaned, control the cleaning equipment to cross the target scene object and enter the area to be cleaned.
  • the first control unit includes:
• the second control module is used to, when it is determined according to the target object information that the target scene object is a scene object of a target type, and the cleaning equipment has passed through the virtual wall of the virtual restricted area and entered the target object area where the target scene object is located, control the cleaning device to perform a target escape operation matching the target type, wherein the target type includes at least one of the following: a type whose bottom does not allow the cleaning device to pass through, and a type whose distance from the wall is less than or equal to a distance threshold.
  • the first control unit includes:
• the collection module is used to collect point cloud data through the second sensor on the cleaning device to obtain target point cloud data when the target type includes a type whose bottom does not allow the cleaning device to pass through; from the target point cloud data, an exit that matches the movement trajectory of the cleaning equipment entering the target object area is identified, wherein the size of the exit allows the cleaning equipment to pass through;
• the third control module is used to control the cleaning equipment to move out of the target object area from the exit along the movement trajectory;
• the fourth control module is configured to, when the target type includes a type whose distance from the wall is less than or equal to the distance threshold, control the cleaning device to move along the target boundary detected by its distance sensor until it moves out of the target object area, wherein the target boundary is at least one of the following: a wall, the boundary of the target scene object.
  • the first control unit includes:
• the fifth control module is configured to control the cleaning equipment to perform a first escape operation when the cleaning equipment is trapped because a cliff or a falling state of its wheels is detected, wherein the first escape operation is used to control the cleaning device to leave the detected cliff or to control the wheels to escape from the falling state; during the first escape operation, the cleaning device ignores the virtual restricted area;
• the sixth control module is configured to control the cleaning equipment to perform a second escape operation when, after the first escape operation is performed, it is detected that the cleaning equipment has passed through the virtual wall of the virtual restricted area, wherein the second escape operation is used to control the cleaning device to leave the range of the virtual restricted area.
  • the above-mentioned device also includes:
  • the second control unit is used to control the cleaning device to clean along the boundary of the target scene object when the boundary of the target scene object is located in the area to be cleaned after the target is detected by the first sensor on the cleaning device;
  • the third control unit is configured to control the cleaning device to clean along the virtual wall of the virtual restricted area when the boundary of the target scene object is outside the area to be cleaned.
  • the above modules can run in the hardware environment shown in FIG. 1 , and can be implemented by software or by hardware, wherein the hardware environment includes a network environment.
  • a storage medium is also provided.
  • the above-mentioned storage medium may be used to execute the program code of any one of the above-mentioned cleaning device operation control methods in the embodiments of the present application.
  • the foregoing storage medium may be located on at least one network device among the plurality of network devices in the network shown in the foregoing embodiments.
  • the storage medium is configured to store program codes for performing the following steps:
  • the above-mentioned storage medium may include, but not limited to, various media capable of storing program codes such as USB flash drive, ROM, RAM, removable hard disk, magnetic disk, or optical disk.
  • an electronic device for implementing the above cleaning device operation control method, where the electronic device may be a server, a terminal, or a combination thereof.
  • Fig. 8 is a structural block diagram of an optional electronic device according to an embodiment of the present application. As shown in Fig. 8, the electronic device includes a processor 802, a communication interface 804, and a memory 806; the processor 802, the communication interface 804, and the memory 806 communicate with one another through a communication bus 808, wherein,
  • the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like.
  • the communication bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is used in FIG. 8 , but it does not mean that there is only one bus or one type of bus.
  • the communication interface is used for communication between the electronic device and other devices.
  • the above-mentioned memory may include RAM, and may also include non-volatile memory, for example, at least one disk storage. Optionally, the memory may also be at least one storage device located away from the aforementioned processor.
  • the memory 806 may include, but is not limited to, the first acquisition unit 702, the detection unit 704, and the first control unit 706 of the above-mentioned operation control device. In addition, it may also include, but is not limited to, other module units of the operation control device, which will not be described again in this example.
  • the processor may be a general-purpose processor, which may include, but is not limited to, a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application-Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • FIG. 8 is only illustrative, and the device implementing the operation control method of the above-mentioned cleaning device may be a terminal device, such as a smartphone (e.g., an Android phone or an iOS phone), a tablet PC, a palmtop computer, a mobile Internet device (MID), a PAD, or other terminal equipment.
  • FIG. 8 does not limit the structure of the above-mentioned electronic device.
  • the electronic device may also include more or less components than those shown in FIG. 8 (such as a network interface, a display device, etc.), or have a different configuration from that shown in FIG. 8 .
  • if the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium.
  • based on such an understanding, the technical solution of the present application, in essence, or the part of it contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the disclosed client can be implemented in other ways.
  • the device embodiments described above are only illustrative; for example, the division into units is only a division by logical function, and other division methods are possible in an actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between units or modules may be electrical or take other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution provided in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Inking, Control Or Cleaning Of Printing Machines (AREA)

Abstract

Provided are an operation control method and apparatus for a cleaning device, a storage medium, and an electronic apparatus. The method includes: acquiring virtual restricted-area information corresponding to a target area map, where the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual restricted-area information indicates a virtual restricted area in the area map (S202); performing target detection by means of a first sensor on the cleaning device to obtain target object information, where the target object information represents object information of a target scene object, in the current area where the cleaning device is located, that matches the virtual restricted area (S204); and when the cleaning device is trapped while cleaning the area to be cleaned, controlling the cleaning device, according to the virtual restricted-area information and the target object information, to perform a target escape operation, after which the cleaning device is outside the virtual restricted area (S206). This solves the problem in the related art that operation control methods for cleaning devices leave the device prone to being trapped due to errors in establishing the virtual restricted area.

Description

清洁设备的运行控制方法及装置、存储介质及电子装置
本申请要求如下专利申请的优先权:于2021年12月02日提交中国专利局、申请号为202111464917.3、发明名称为“清洁设备的运行控制方法及装置、存储介质及电子装置”的中国专利申请,上述专利申请的全部内容通过引用结合在本申请中。
【技术领域】
本申请涉及智能家居领域,具体而言,涉及一种清洁设备的运行控制方法及装置、存储介质及电子装置。
【背景技术】
目前,用户的终端设备上可以运行有与清洁设备(例如,清洁机器人)匹配的应用程序。用户可以通过在应用程序的配置界面中显示的区域地图上配置虚拟禁区,以设置允许清洁设备进行清洁的区域和不允许清洁设备进行清洁的区域。此外,清洁设备也可以基于被困历史记录建立虚拟禁区。
例如,用户可以通过在房间之间设置虚拟墙,以限制允许清洁机器人进行区域清洁的房间和不允许清洁设备进行区域清洁的房间。当清洁机器人由于上表面卡死被困在家具底部,清洁机器人可以基于被困历史记录将家具所在的区域设置为虚拟禁区。
然而,终端设备上显示的区域地图与实际的房间场景存在较大的比例差异,所建立的虚拟禁区存在误差,而清洁设备自身建立的虚拟禁区也会由于获取到的信息有限导致存在误差。由于虚拟禁区建立的误差,导致清洁设备容易被困,从而减低了用户的使用体验。
由此可见,相关技术中的清洁设备的运行控制方法,存在由于虚拟禁区建立误差导致的清洁设备易被困的问题。
【发明内容】
本申请的目的在于提供一种清洁设备的运行控制方法及装置、存储介质及电子装置,以至少解决相关技术中的清洁设备的运行控制方法存在由于虚拟禁区建立误差导致的清洁设备易被困的问题。
本申请的目的是通过以下技术方案实现:
根据本申请实施例的一个方面,提供了一种清洁设备的运行控制方法,包括:获取与目标区域地图对应的虚拟禁区信息,其中,所述目标区域地图为所述清洁设备待清洁的待清洁区域所属的区域地图,所述虚拟禁区信息用于指示所述区域地图中的虚拟禁区;通过所述清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,所述目标对象信息用于表示所述清洁设备所在的当前区域内与所述虚拟禁区匹配的目标场景对象的对象信息;在对所述待清洁区域进行清洁的过程中所述清洁设备被困的情况下,根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,其中,脱困后的所述清洁设备处于所述虚拟禁区以外。
在一个示例性实施例中,所述通过所述清洁设备上的第一传感器进行目标检测,得到目标对象信息,包括:对所述第一传感器采集到的点云数据进行目标识别,得到所述目标对象信息,其中,所述目标对象信息为所述目标场景对象的对象点云。
在一个示例性实施例中,所述对所述第一传感器采集到的点云数据进行目标识别,得到所述目标对象信息,包括:对所述第一传感器采集到的点云数据进行目标识别,得到候选对象信息,其中,所述候选对象信息为所述当前区域内包含的候选对象的对象点云;根据所述候选对象的位置信息和所述虚拟禁区信息,从所述候选对象中选取出与所述虚拟禁区匹配的所述目标场景对象,得到所述目标对象信息。
在一个示例性实施例中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:在根据所述目标对象信息确定所述目标场景对象为允许所述清洁设备越过的场景对象、且所述清洁设备穿过所述虚拟禁区的虚拟墙进入到所述目标区域地图中除了所述待清洁区域以外的区域的情况下,控制所述清洁设备越过所述目标场景对象进入到所述待清洁区域内。
在一个示例性实施例中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:在根据所述目标对象信息确定所述目标场景对象为目标类型的场景对象、且所述清洁设备已穿过所述虚拟禁区的虚拟墙进入到所述目标场景对象所在的目标对象区域的 情况下,控制所述清洁设备执行与所述目标类型匹配的所述目标脱困操作,其中,所述目标类型包括以下至少之一:底部不允许所述清洁设备通过的类型,与墙体之间的距离小于或者等于距离阈值的类型。
在一个示例性实施例中,所述控制所述清洁设备执行与所述目标类型匹配的所述目标脱困操作,包括:在所述目标类型包括底部不允许所述清洁设备通过的类型的情况下,通过所述清洁设备上的第二传感器进行点云数据采集,得到目标点云数据;根据所述目标点云数据,识别出与所述清洁设备进入到所述目标对象区域的移动轨迹匹配的出口,其中,所述出口的大小允许所述清洁设备通过;控制所述清洁设备沿着所述移动轨迹从所述出口处移动出所述目标对象区域;在所述目标类型包括与墙体之间的距离小于或者等于距离阈值的类型的情况下,控制所述清洁设备沿着通过所述清洁设备的距离传感器所检测到的目标边界进行移动,直到移动出所述目标对象区域,其中,所述目标边界为以下至少之一:所述墙体,所述目标场景对象的边界。
在一个示例性实施例中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:在所述清洁设备由于检测到悬崖或者检测到所述清洁设备的车轮处于跌落状态导致被困的情况下,控制所述清洁设备执行第一脱困操作,其中,所述第一脱困操作用于控制所述清洁设备离开检测到的悬崖或者控制所述车轮脱离所述跌落状态,在执行所述第一脱困操作的过程中,所述清洁设备无视虚拟禁区;在执行完所述第一脱困操作之后,检测到所述清洁设备穿过所述虚拟禁区的虚拟墙的情况下,控制所述清洁设备执行第二脱困操作,其中,所述第二脱困操作用于控制所述清洁设备离开所述虚拟禁区的范围。
在一个示例性实施例中,在所述通过所述清洁设备上的第一传感器进行目标检测之后,所述方法还包括:在所述目标场景对象的边界位于所述待清洁区域内的情况下,控制所述清洁设备沿着所述目标场景对象的边界进行清洁;在所述目标场景对象的边界位于所述待清洁区域以外的情况下,控制所述清洁设备沿着所述虚拟禁区的虚拟墙进行清洁。
根据本申请实施例的另一个方面,还提供了一种清洁设备的运行控制 装置,包括:第一获取单元,用于获取与目标区域地图对应的虚拟禁区信息,其中,所述目标区域地图为所述清洁设备待清洁的待清洁区域所属的区域地图,所述虚拟禁区信息用于指示所述区域地图中的虚拟禁区;检测单元,用于通过所述清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,所述目标对象信息用于表示所述清洁设备所在的当前区域内与所述虚拟禁区匹配的目标场景对象的对象信息;第一控制单元,用于在对所述待清洁区域进行清洁的过程中所述清洁设备被困的情况下,根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,其中,脱困后的所述清洁设备处于所述虚拟禁区以外。
在一个示例性实施例中,所述检测单元包括:第一识别模块,用于对所述第一传感器采集到的点云数据进行目标识别,得到所述目标对象信息,其中,所述目标对象信息为所述目标场景对象的对象点云。
在一个示例性实施例中,所述第一识别模块包括:识别子模块,用于对所述第一传感器采集到的点云数据进行目标识别,得到候选对象信息,其中,所述候选对象信息为所述当前区域内包含的候选对象的对象点云;选取子模块,用于根据所述候选对象的位置信息和所述虚拟禁区信息,从所述候选对象中选取出与所述虚拟禁区匹配的所述目标场景对象,得到所述目标对象信息。
在一个示例性实施例中,所述第一控制单元包括:第一控制模块,用于在根据所述目标对象信息确定所述目标场景对象为允许所述清洁设备越过的场景对象、且所述清洁设备穿过所述虚拟禁区的虚拟墙进入到所述目标区域地图中除了所述待清洁区域以外的区域的情况下,控制所述清洁设备越过所述目标场景对象进入到所述待清洁区域内。
在一个示例性实施例中,所述第一控制单元包括:第二控制模块,用于在根据所述目标对象信息确定所述目标场景对象为目标类型的场景对象、且所述清洁设备已穿过所述虚拟禁区的虚拟墙进入到所述目标场景对象所在的目标对象区域的情况下,控制所述清洁设备执行与所述目标类型匹配的所述目标脱困操作,其中,所述目标类型包括以下至少之一:底部不允许所述清洁设备通过的类型,与墙体之间的距离小于或者等于距离阈 值的类型。
在一个示例性实施例中,所述第一控制单元包括:采集模块,用于在所述目标类型包括底部不允许所述清洁设备通过的类型的情况下,通过所述清洁设备上的第二传感器进行点云数据采集,得到目标点云数据;第二识别模块,用于根据所述目标点云数据,识别出与所述清洁设备进入到所述目标对象区域的移动轨迹匹配的出口,其中,所述出口的大小允许所述清洁设备通过;第三控制模块,用于控制所述清洁设备沿着所述移动轨迹从所述出口处移动出所述目标对象区域;第四控制模块,用于在所述目标类型包括与墙体之间的距离小于或者等于距离阈值的类型的情况下,控制所述清洁设备沿着通过所述清洁设备的距离传感器所检测到的目标边界进行移动,直到移动出所述目标对象区域,其中,所述目标边界为以下至少之一:所述墙体,所述目标场景对象的边界。
在一个示例性实施例中,所述第一控制单元包括:第五控制模块,用于在所述清洁设备由于检测到悬崖或者检测到所述清洁设备的车轮处于跌落状态导致被困的情况下,控制所述清洁设备执行第一脱困操作,其中,所述第一脱困操作用于控制所述清洁设备离开检测到的悬崖或者控制所述车轮脱离所述跌落状态,在执行所述第一脱困操作的过程中,所述清洁设备无视虚拟禁区;第六控制模块,用于在执行完所述第一脱困操作之后,检测到所述清洁设备穿过所述虚拟禁区的虚拟墙的情况下,控制所述清洁设备执行第二脱困操作,其中,所述第二脱困操作用于控制所述清洁设备离开所述虚拟禁区的范围。
在一个示例性实施例中,所述装置还包括:第二控制单元,用于在所述通过所述清洁设备上的第一传感器进行目标检测之后,在所述目标场景对象的边界位于所述待清洁区域内的情况下,控制所述清洁设备沿着所述目标场景对象的边界进行清洁;第三控制单元,用于在所述目标场景对象的边界位于所述待清洁区域以外的情况下,控制所述清洁设备沿着所述虚拟禁区的虚拟墙进行清洁。
根据本申请实施例的又一方面,还提供了一种计算机可读的存储介质,该计算机可读的存储介质中存储有计算机程序,其中,该计算机程序被设 置为运行时执行上述接口的测试方法。
根据本申请实施例的又一方面,还提供了一种电子装置,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,其中,上述处理器通过计算机程序执行上述的接口的测试方法。
在本申请实施例中,采用真实脱困场景结合虚拟禁区信息的方式,通过获取与目标区域地图对应的虚拟禁区信息,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区;通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象;在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,脱困后的清洁设备处于虚拟禁区以外,由于结合虚拟禁区和真实场景进行脱困,可以使脱困更智能化,达到降低清洁设备的被困率、提高用户使用体验的技术效果,进而解决了相关技术中的清洁设备的运行控制方法存在由于虚拟禁区建立误差导致的清洁设备易被困的问题。
【附图说明】
此处的附图被并入说明书中并构成本说明书的一部分,示出了符合本申请的实施例,并与说明书一起用于解释本申请的原理。
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,对于本领域普通技术人员而言,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1是根据本申请实施例的一种可选的清洁设备的运行控制方法的硬件环境的示意图;
图2是根据本申请实施例的一种可选的清洁设备的运行控制方法的流程示意图;
图3是根据本申请实施例的一种可选的虚拟禁区的示意图;
图4是根据本申请实施例的另一种可选的虚拟禁区的示意图;
图5是根据本申请实施例的又一种可选的虚拟禁区的示意图;
图6是根据本申请实施例的一种可选的滑轨的示意图;
图7是根据本申请实施例的一种可选的清洁设备的运行控制装置的结构框图;
图8是根据本申请实施例的一种可选的电子装置的结构框图。
【具体实施方式】
下文中将参考附图并结合实施例来详细说明本申请。需要说明的是,在不冲突的情况下,本申请中的实施例及实施例中的特征可以相互组合。
需要说明的是,本申请的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。
根据本申请实施例的一个方面,提供了一种清洁设备的运行控制方法。可选地,在本实施例中,上述清洁设备的运行控制方法可以应用于如图1所示的由终端设备102、清洁设备104和服务器106所构成的硬件环境中。如图1所示,终端设备102可以通过网络与清洁设备104和/或服务器106(例如,物联网平台或者云端服务器)进行连接,以对清洁设备104的进行控制,例如,与清洁设备104进行绑定、配置清洁设备104的清洁功能。清洁设备104可以包括主机和基站(例如,扫地机和基站,清洗机和基座),主机和基站之间可以通过网络进行连接,以确定对端的当前状态(例如,电量状态、工作状态、位置信息等)。
上述网络可以包括但不限于以下至少之一:有线网络,无线网络。上述有线网络可以包括但不限于以下至少之一:广域网,城域网,局域网,上述无线网络可以包括但不限于以下至少之一:WIFI(Wireless Fidelity,无线保真),蓝牙,红外。终端设备102与清洁设备104和/或服务器106进行通信所使用的网络与清洁设备104与服务器106进行通信所使用的网络可以是相同的,也可以是不同的。终端设备102可以并不限定于为PC、手机、平板电脑等,清洁设备104可以包括但不限于:自清洁机器人,例如,自动洗拖布机器人、扫地机器人等,服务器106可以是物联网平台的服务器。
本申请实施例的清洁设备的运行控制方法可以由终端设备102、清洁设备104或者服务器106单独来执行,也可以由终端设备102、清洁设备104和 服务器106中的至少两个共同执行。其中,终端设备102或者清洁设备104执行本申请实施例的清洁设备的运行控制方法也可以是由安装在其上的客户端来执行。
以由清洁设备104来执行本实施例中的清洁设备的运行控制方法为例,图2是根据本申请实施例的一种可选的清洁设备的运行控制方法的流程示意图,如图2所示,该方法的流程可以包括以下步骤:
步骤S202,获取与目标区域地图对应的虚拟禁区信息,其中,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区。
本实施例中的清洁设备的运行控制方法可以应用到以下场景中:在清洁设备对目标区域中的待清洁区域进行清洁的过程中,结合真实场景和虚拟禁区对清洁设备进行运行控制,以降低清洁设备被困的概率。上述目标区域可以是家庭中的室内区域,也可以是其他如餐厅、办公室、工厂车间区域,还可以是其他可以通过清洁设备进行清洁的区域。上述清洁设备可以是智能吸尘器、智能扫地机、集扫、拖于一体的智能清扫机,还可以是其他具有清洁功能的机器人,本实施例中对此不做限定。
清洁设备当前待清洁的区域为待清洁区域。在开始进行区域清洁之前、或者之后,清洁设备可以获取与上述目标区域对应的目标区域地图,目标区域地图为待清洁区域所属的区域地图,待清洁区域可以是目标区域的全部或者部分。基于设置的清洁区域信息,清洁设备可以确定出待清洁区域。上述清洁区域信息可以是默认的区域信息,例如,默认清洁目标区域的全部或者特定部分,也可以是终端设备响应检测到对目标区域地图执行的选取操作所生成的清洁区域信息,清洁区域信息可以通过服务器端发送给清洁设备,上述选取操作可以是从目标区域所包含的多个子区域中选取出待清洁的子区域的操作。
目标区域地图可以是清洁设备使用图像采集部件(比如,摄像头、激光传感器)采集到的区域图像或者区域点云形成的地图,还可以是通过其他方式获得的地图,例如,从终端设备侧接收的地图,该地图可以是通过其他设备上的图像采集部件采集到的区域图像或者区域点云形成的地图。 本实施例中对此不做限定。
在目标区域地图中还可以包含有虚拟禁区信息,清洁设备还可以获取目标区域地图中的虚拟禁区信息。该虚拟禁区信息用于指示目标区域地图中的虚拟禁区,虚拟禁区是指通过虚拟墙、禁区线等所标识的禁止清洁设备进入的区域。这里的虚拟墙和禁区线为虚拟的区域边界,清洁设备可以穿过虚拟墙、禁区线等进入到虚拟禁区之内。虚拟禁区可以是用户通过在其终端设备上显示的目标区域地图中所建立的禁区,也可以是清洁设备基于历史被困记录所建立的禁区。本实施例中对于虚拟禁区的建立方式不做限定。
例如,为了避免由于发生跌落或者被困在家具底部等情况导致的清洁设备损坏或者无法继续工作,可以在房间区域地图的楼梯口处或者家具等地方设置虚拟禁区。
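As an illustration of the embodiment above, a virtual restricted area such as the one placed at a staircase opening can be modeled as a polygon drawn on the area map, with membership queried by a standard ray-casting test. The following Python sketch is illustrative only; the polygon format and function names are assumptions, not the application's actual data structures.

```python
def point_in_zone(point, zone):
    """Ray-casting test: is (x, y) inside the polygon 'zone'?

    'zone' is a list of (x, y) vertices in map coordinates (an assumed,
    simplified representation of a virtual restricted area).
    """
    x, y = point
    inside = False
    n = len(zone)
    for i in range(n):
        x1, y1 = zone[i]
        x2, y2 = zone[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A rectangular no-go zone drawn on the map, e.g. around a staircase opening.
stair_zone = [(2.0, 2.0), (4.0, 2.0), (4.0, 3.0), (2.0, 3.0)]
```

Whether the device's current pose falls inside such a polygon is the basic query behind both "is the device in the restricted area" and "did escape leave it outside the restricted area".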
步骤S204,通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象的对象信息。
为了避免由于建立的虚拟禁区与真实场景存在误差导致的清洁设备被困的情况,清洁设备可以通过其上的第一传感器进行目标检测,确定位于清洁设备所在的当前区域内的场景对象(即,场景物体),根据虚拟禁区的位置信息以及检测到的场景对象的位置信息,可以确定与虚拟禁区匹配的目标场景对象,进而得到目标场景对象的对象信息,即,目标对象信息。
第一传感器可以是图像采集设备,也可以是其他具有对象识别功能的设备,例如,第一传感器可以为LDS(Laser Direct Structuring,激光直接成型技术)激光测距传感器,其可以通过发射激光进行点云数据采集,根据采集到的点云数据确定目标对象信息。第一传感器可以布设在清洁设备的外侧(例如,前侧、左侧、右侧、后侧等)、顶部或者底部,其数量可以为一个或多个,清洁设备的不同侧可以布设有一个或多个。
上述清洁设备所在的当前区域为第一传感器的采集区域,表示其能够采集到的区域范围,其可以属于待清洁区域(此时,清洁设备可以位于待清洁区域),也可以属于目标区域中除了待清洁区域以外的其他区域(此时, 清洁设备可以穿过了虚拟禁区等进入到的其他区域),也可以是部分属于待清洁区域、部分属于其他区域,本实施例中对此不做限定。
在本实施例中,清洁设备可以对第一传感器采集到的场景数据进行目标识别(场景对象识别),确定检测到的场景对象。清洁设备可以分别使用各种参考对象的参考对象特征和虚拟禁区信息对采集到的场景数据进行匹配,确定出与各种参考对象匹配的目标场景对象,得到当前区域内的所有目标场景对象的对象信息,即,目标对象信息。
需要说明的是,可以预先配置所需识别的多种参考对象(即,进行目标识别所参考的对象)以及每种参考对象的参考对象特征,上述每种参考对象可以是预定对象类型的场景对象,其对应的参考对象特征可以预先存储到清洁设备中,也可以存储到服务器端,以便由清洁设备或者服务器端进行目标识别。对于不同的区域类型,其对应的预定对象类型可以是相同的,也可以是不同的。本实施例中对此不做限定。
例如,扫地机可以通过其上的传感器进行障碍物检测,得到与虚拟禁区所匹配的障碍物,比如,台阶、滑轨、深色地毯等。
步骤S206,在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,其中,脱困后的清洁设备处于虚拟禁区以外。
在对待清洁区域进行清洁的过程中,由于进行避障(即,躲避障碍物)等原因,清洁设备可能会被困到某个位置,该位置可以是待清洁区域内的位置,也可以是目标区域中除了待清洁区域以外的其他位置,例如,清洁设备穿过虚拟墙进入到目标区域中除了待清洁区域以外的其他区域、清洁设备被困到家具底部、清洁设备触发下视或者跌落等。在确定被困之后,清洁设备可以按照配置的脱困策略进行脱困。
由于缺少虚拟禁区与真实场景的融合修正,机器(即,清洁设备)无法领会建立虚拟禁区的真实目的。并且,在由真实的脱困场景(越障、台阶、狭窄区域等)结合虚拟禁区所组成的复杂场景中,一方面,机器将虚拟禁区视为真实墙壁而无法从狭窄区域逃离,另一方面,由于在复杂场景中脱困可能会导致机器误入禁区或穿过禁区,造成机器报警被困。可见, 相关技术中的清洁设备的脱困方案无法针对真实复杂场景和虚拟禁区的脱困场景识别,也缺少相对应的脱困方法和弥补方案。
可见,在真实脱困场景和虚拟禁区组成的复杂脱困场景,清洁设备的脱困策略不完善,被困后的弥补方案也不完善,总体表现为不够智能化,且存在脱困场景误识别(例如,误触发下视/跌落)的问题。在本实施例中,在进行脱困时,清洁设备可以将真实场景与虚拟禁区信息结合,综合判断出建立虚拟禁区的目的,确定当前的脱困场景,并采用对应的脱困策略进行脱困,可以提高针对虚拟禁区的脱困效果,使脱困更智能化,减少清洁设备的被困率,进而增强用户的体验感。
在本实施例中,清洁设备可以根据虚拟禁区信息和目标对象信息,确定当前被困的场景类型,并基于确定的场景类型控制清洁设备执行与当前被困的场景类型匹配的脱困操作,即,目标脱困操作。在执行目标脱困操作之后,清洁设备可以脱离被困在的场景,同时,脱困后的清洁设备处于地图上的虚拟禁区以外。
可选地,清洁设备可以结合真实场景(比如,场景中的物品)和虚拟禁区信息,综合判断出建立虚拟禁区的目的,识别出与建立的虚拟禁区所对应的脱困场景,上述脱困场景可以包括但不限于以下至少之一:
脱困场景1:机器(清洁设备)易识别越障、实际禁区阻拦越障的场景,即,机器容易越过障碍物、通过虚拟禁区阻拦机器越过障碍物的场景;
脱困场景2:上表面卡死或LDS碰撞脱困、同时考虑远离虚拟墙的场景,即,在虚拟禁区中机器的上表面会被卡死或者可以通过LDS碰撞进行脱困、同时脱困时需要考虑远离虚拟禁区的场景;
脱困场景3:优先处理触发的下视或者跌落、再处理虚拟禁区(比如,虚拟禁区的虚拟墙)、最后处理其他障碍物的场景。
可选地,结合所识别出的脱困场景,清洁设备可以确定待执行的脱困操作,例如,远离虚拟禁区,又例如,先无视虚拟禁区、在从障碍物中脱离或者穿过虚拟禁区进入到待清洁区域之后再远离虚拟禁区,再例如,先无视虚拟禁区处理下视或者跌落之后,再远离虚拟禁区等等。本实施例中对此不做限定。
通过上述步骤S202至步骤S206,获取与目标区域地图对应的虚拟禁区信息,其中,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区;通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象;在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,解决了相关技术中的清洁设备的运行控制方法存在由于虚拟禁区建立误差导致的清洁设备易被困的问题,降低了清洁设备的被困率,提高了用户使用体验。
在一个示例性实施例中,通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,包括:
S11,对第一传感器采集到的点云数据进行目标识别,得到目标对象信息,其中,目标对象信息为目标场景对象的对象点云。
在本实施例中,第一传感器可以是点云传感器,即,用于采集点云数据的传感器。清洁设在进行区域清洁的过程中,可以使用点云传感器进行点数据采集,得到采集到的点云数据,采集到的点云数据中可以包含采集范围内的障碍物的点云数据。
可选地,第一传感器(例如,点云传感器)进行数据采集可以是周期执行的,例如,第一传感器可以实时进行数据采集,即,每隔目标时长(例如,1s)进行一次数据采集,也可以是事件触发执行的,例如,在检测到清洁设备发生碰撞、清洁设备与虚拟禁区的距离小于或者等于第一距离阈值时,触发进行数据采集。
在采集到的点云数据之后,清洁设备可以进行目标识别。比如,清洁设备可以分别使用各种参考对象的参考对象点云和虚拟禁区信息对采集到的点云数据进行匹配,确定出与各种参考对象匹配的目标场景对象,得到当前区域内的所有目标场景对象的对象点云,从而得到目标对象信息。
通过本实施例,使用点云传感器进行场景对象识别,可以提高场景对象识别的准确性和便捷性。
在一个示例性实施例中,对第一传感器采集到的点云数据进行目标识 别,得到目标对象信息,包括:
S21,对第一传感器采集到的点云数据进行目标识别,得到候选对象信息,其中,候选对象信息为当前区域内包含的候选对象的对象点云;
S22,根据候选对象的位置信息和虚拟禁区信息,从候选对象中选取出与虚拟禁区匹配的目标场景对象,得到目标对象信息。
在本实施例中,在进行目标识别时,清洁设备可以首先对采集到的点云数据进行目标识别,得到候选对象信息。候选对象信息为识别到当前区域内包含的候选对象的对象信息,这里,候选对象可以为预定对象类型的场景对象,候选对象的对象信息可以是候选对象的对象点云。
可选地,为了识别出候选对象,清洁设备可以首先通过轮廓检测确定出待识别对象,即,检测到的、可能为所需的场景对象的对象;然后,使用待识别对象的对象信息(例如,对象点云)与参考对象的对象信息进行匹配,将与参考对象的对象信息匹配的待识别对象的对象信息,确定为候选对象的对象信息,即,候选对象的对象点云。
清洁设备可以根据虚拟禁区的位置信息以及候选对象的对象信息,确定与虚拟禁区匹配的目标场景对象,进而得到目标对象信息,其可以是目标对象的对象点云。
可选地,清洁设备可以根据虚拟禁区的位置信息和候选对象的位置信息,确定与虚拟禁区匹配的目标场景对象,上述候选对象的位置信息可以包含在候选对象的对象信息中。确定与虚拟禁区匹配的目标场景对象的方式可以包括但不限于以下至少之一:
将位于虚拟禁区内的候选对象,确定为与虚拟禁区匹配的目标场景对象;
将与虚拟禁区的边界之间的距离小于或者等于第一距离阈值的候选对象,确定为与虚拟禁区匹配的目标场景对象;
将形状与虚拟禁区的边界形状匹配的候选对象,确定为与虚拟禁区匹配的目标场景对象。
例如,在客厅与阳台中间的滑轨或台阶处,计算虚拟禁区与台阶是否接近平行、且宽度近似,如果接近平行、且宽度近似,可以确定与虚拟禁 区匹配的场景对象为滑轨或台阶。
通过本实施例,通过先识别场景中的候选对象,再根据识别出的候选对象的对象信息和虚拟禁区的位置信息确定与虚拟禁区匹配的场景对象,可以提高与虚拟禁区匹配的场景对象确定的效率。
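The shape-matching rule above, e.g. checking whether the virtual restricted area runs near-parallel to a detected step or rail and has a similar width, can be sketched as follows. This is a minimal illustration under assumed inputs (segments as coordinate pairs, widths in metres); it is not the application's actual matching logic.

```python
import math

def angle_of(seg):
    """Orientation of a 2D segment given as ((x1, y1), (x2, y2))."""
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1)

def nearly_parallel(seg_a, seg_b, tol_deg=10.0):
    """True when the two segments differ in orientation by at most tol_deg,
    treating opposite directions as parallel."""
    d = abs(angle_of(seg_a) - angle_of(seg_b)) % math.pi
    d = min(d, math.pi - d)
    return math.degrees(d) <= tol_deg

def matches_zone(candidate_seg, zone_edge, cand_width, zone_width, width_tol=0.05):
    # Shape rule from the text: the step/rail matches the restricted area if
    # they are near-parallel and of similar width.
    return nearly_parallel(candidate_seg, zone_edge) and \
        abs(cand_width - zone_width) <= width_tol
```

In the living-room/balcony example, a rail segment lying almost parallel to the drawn zone edge and of comparable width would be selected as the target scene object matched to the virtual restricted area.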
在一个示例性实施例中,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,包括:
S31,在根据目标对象信息确定目标场景对象为允许清洁设备越过的场景对象、且清洁设备穿过虚拟禁区的虚拟墙进入到目标区域地图中除了待清洁区域以外的区域的情况下,控制清洁设备越过目标场景对象进入到待清洁区域内。
在本实施例中,如果根据目标对象信息确定出当前的脱困场景为前述脱困场景1,即,机器易识别越障、实际禁区阻拦越障的场景,清洁设备可以计算清洁设备与虚拟禁区之间的距离,然后根据目标区域地图和目标对象信息,确定出清洁设备位于禁区的哪一侧。
如果清洁设备位于待清洁区域内,且未穿过虚拟禁区的虚拟墙,则清洁设备可以采取远离虚拟墙的策略。清洁设备还可以沿着特定边界进行清扫。如果清洁设备穿过虚拟禁区的虚拟墙进入到目标区域地图中除了待清洁区域以外的区域,则可以控制清洁设备越过目标场景对象,进入到待清洁区域内。在进入到待清洁区域内之后,清洁设备可以采取远离虚拟墙的策略,还可以沿着特定边界进行清扫。
需要说明的是,特定边界为虚拟禁区的边界和目标场景对象的边界中,更靠近待清洁区域的一个,即,在目标场景对象位于虚拟禁区中的情况下,特定边界为虚拟禁区的边界,在目标场景对象位于虚拟禁区外的情况下,特定边界为目标场景对象的边界。
例如,在客厅与阳台中间的滑轨或台阶处,如果计算出虚拟禁区与台阶接近平行、且宽度近似,此时可以将与建立的虚拟禁区对应的脱困场景识别为机器(扫地机,前述清洁设备的一种示例)易识别越障、实际禁区阻拦越障的场景。扫地机可以计算机器与禁区之间的距离,并根据地图信息与脱困真实场景信息(目标对象信息的一种示例),判断机器在禁区哪一 侧。如果机器在房间内侧,可以采取远离虚拟墙策略;如果机器已经穿过虚拟禁区,则可以忽略虚拟禁区,通过越障策略越过台阶或者滑轨进入房间内后,再采取远离虚拟墙策略。
作为一个示例,如图3所示,在进行区域清扫时,扫地机穿过禁区线进入到衣架内测。扫地机可以识别出与虚拟禁区匹配的场景对象为衣架的底端,并确定当前的脱困场景为脱困场景1、且自身已经穿过虚拟禁区。此时,扫地机可以通过越障策略越过滑轨进入客厅,再采取远离虚拟墙策略。
又例如,如图4所示,扫地机待清扫的区域为客厅。如果扫地机穿过虚拟禁区进入到阳台,扫地机可以识别出与虚拟禁区匹配的场景对象为客厅与阳台中间的滑轨,并确定当前的脱困场景为脱困场景1、且自身已经穿过虚拟禁区。此时,扫地机可以通过越障策略越过滑轨进入客厅,再采取远离虚拟墙策略。
通过本实施例,对于机器易识别越障、实际禁区阻拦越障的脱困场景,在清洁设备已经穿过虚拟禁区时,控制清洁设备先忽略虚拟禁区、越过目标场景对象返回到待清洁区域,可以提高清洁设备脱困的便捷性和成功率。
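The side-of-the-virtual-wall judgment described in this escape scenario can be sketched with a 2D cross product against the wall segment. The function names and the returned action lists are illustrative assumptions, not the application's actual control interface.

```python
def side_of_wall(point, wall_start, wall_end):
    """Return +1 / -1 / 0 for left of / right of / on the directed virtual
    wall segment, from the sign of the 2D cross product."""
    (x, y), (x1, y1), (x2, y2) = point, wall_start, wall_end
    cross = (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)
    return (cross > 0) - (cross < 0)

def escape_plan(robot_side, room_side):
    # Scenario 1 from the text: if the robot has crossed to the far side of
    # the virtual wall, ignore the wall, climb back over the step or rail,
    # then move away from the wall; otherwise just move away from it.
    if robot_side != room_side:
        return ["ignore_virtual_wall", "cross_obstacle", "move_away_from_wall"]
    return ["move_away_from_wall"]
```

Comparing the robot's side with the side on which the area to be cleaned lies decides whether the obstacle-crossing step is needed at all.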
在一个示例性实施例中,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,包括:
S41,在根据目标对象信息确定目标场景对象为目标类型的场景对象、且清洁设备已穿过虚拟禁区的虚拟墙进入到目标场景对象所在的目标对象区域的情况下,控制清洁设备执行与目标类型匹配的目标脱困操作,其中,目标类型包括以下至少之一:底部不允许清洁设备通过的类型,与墙体之间的距离小于或者等于距离阈值的类型。
在本实施例中,如果根据目标对象信息确定出当前的脱困场景为前述脱困场景2,即,上表面卡死或LDS碰撞脱困、同时考虑远离虚拟墙的场景,清洁设备可以确定目标场景对象的对象类型(即,目标类型)。目标场景对象的对象类型可以是底部不允许清洁设备通过的类型,例如,底部过低的家具,也可以是与墙体之间的距离小于或等于距离阈值(可以是第二距离阈值)的类型,例如,距离墙体较近的床。
如果已穿过虚拟禁区的虚拟墙进入到目标场景对象所在的目标对象区 域,清洁设备可以执行与目标场景对象的对象类型匹配的目标脱困操作,从而控制清洁设备脱困。对于不同的对象类型,清洁设备可以执行不同的脱困操作,上述目标脱困操作可以是预先配置的,即,为不同对象类型配置对应的脱困操作,也可以是清洁设备分别尝试多个脱困操作,并基于尝试结果确定出的脱困操作,上述多个脱困操作可以包括但不限于以下至少之一:越障操作,局部导航操作,沿边操作。本实施例中对此不做限定。
例如,对于各种家具底部外轮廓边缘,扫地机可以计算其上的传感器识别到的点云障碍位置和禁区位置,识别出当前的脱困场景为:易上表面卡死或LDS碰撞脱困、同时考虑远离虚拟墙的场景。根据点云轮廓,扫地机可以判断家具类型,结合禁区范围,通过脱困策略与局部导航、沿边等模块配合,进行扫地机的脱困。
通过本实施例,按照与虚拟禁区匹配的场景对象的对象类型,结合虚拟禁区的禁区范围,采用对应的脱困操作进行脱困,可以提高清洁设备脱困的成功率。
在一个示例性实施例中,控制清洁设备执行与目标类型匹配的目标脱困操作,包括以下至少之一:
S51,在目标类型包括底部不允许清洁设备通过的类型的情况下,通过清洁设备上的第二传感器进行点云数据采集,得到目标点云数据;根据目标点云数据,识别出与清洁设备进入到目标对象区域的移动轨迹匹配的出口,其中,出口的大小允许清洁设备通过;控制清洁设备沿着移动轨迹从出口处移动出目标对象区域;
S52,在目标类型包括与墙体之间的距离小于或者等于距离阈值的类型的情况下,控制清洁设备沿着通过清洁设备的距离传感器所检测到的目标边界进行移动,直到移动出目标对象区域,其中,目标边界为以下至少之一:墙体,目标场景对象的边界。
如果目标类型包括底部不允许清洁设备通过的类型(例如,衣柜的底部四周高、中间低,其底部不允许扫地机通过),清洁设备可以进行局部导航,其可以沿着历史移动轨迹或者重新规划新的移动轨迹返回到待清洁区域内。在进行移动轨迹规划时,清洁设备可以通过第二传感器进行点云数 据采集,得到目标点云数据。第二传感器与第一传感器可以是相同的传感器(例如,点云传感器),也可以是不同的传感器。
对于目标点云数据,清洁设备可以对目标点云数据进行识别,确定出大小允许清洁设备通过的移动路径,并沿着确定出的移动轨迹移动出目标对象区域。可选地,清洁设备可以沿着历史移动轨迹返回到待清洁区域内:对目标点云数据进行识别,识别出与清洁设备进入到目标对象区域的移动轨迹(即,历史移动轨迹)匹配的出口,识别出的出口的大小允许清洁设备通过;沿着历史移动轨迹从识别出的出口处移动出目标对象区域。
例如,扫地机在进入到家具底部之后,其可以基于点云传感器采集到的点云数据进行识别,确定其局部范围内允许其通过的出口、或者移动通道,并从确定出的出口或者移动通道移动出家具底部。
如果目标类型包括与墙体之间的距离小于或者等于距离阈值的类型,此时允许清洁设备移动的空间较小,类似于狭窄通道(即,狭窄区域),可以通过沿边检测控制清洁设备沿着墙体进行移动,直到移动出目标对象区域(移动进待清洁区域)。可选地,清洁设备上可以布设有距离传感器(例如,LDS激光测距传感器),其可以通过发射检测信号进行测距,从而检测到障碍物,例如,墙体,清洁设备可以沿着通过距离传感器所检测到的墙体进行移动。
可选地,如果目标场景对象的底部较低,通过距离传感器也可以检测到目标场景对象的边界,清洁设备也可以沿着通过距离传感器所检测到的目标场景对象的边界进行移动,直到移动出目标对象区域。上述距离传感器与前述的第一传感器和/或第二传感器可以是相同的传感器,也可以是不同的传感器。
通过本实施例,基于场景对象的类型采用局部导航和/或沿边移动的方式控制清洁设备脱困,可以提高清洁设备脱困的成功率。
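The local-navigation escape above, retracing the entry trajectory and leaving through an opening wide enough for the device, can be sketched as follows. The `gaps` input, mapping trajectory indices to opening widths measured from the second sensor's point cloud, and the chassis width are simplifying assumptions for illustration.

```python
ROBOT_WIDTH = 0.35  # assumed chassis width in metres

def find_exit(entry_track, gaps):
    """Walk the recorded entry trajectory backwards and return the first
    recorded pose where the measured opening is wide enough for the robot
    to pass, or None when no usable exit was seen.

    entry_track: list of (x, y) poses recorded while entering the region.
    gaps: dict mapping a trajectory index to the opening width (metres)
          measured there from the point cloud (assumed, simplified input).
    """
    for i in range(len(entry_track) - 1, -1, -1):
        if gaps.get(i, 0.0) >= ROBOT_WIDTH:
            return entry_track[i]
    return None
```

When no exit along the history track qualifies, the device would fall back to planning a new path or to edge-following, as the text describes for the narrow-gap case.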
在一个示例性实施例中,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,包括:
S61,在清洁设备由于检测到悬崖或者检测到清洁设备的车轮处于跌落状态导致被困的情况下,控制清洁设备执行第一脱困操作,其中,第一脱 困操作用于控制清洁设备离开检测到的悬崖或者控制车轮脱离跌落状态,在执行第一脱困操作的过程中,清洁设备无视虚拟禁区;
S62,在执行完第一脱困操作之后,检测到清洁设备穿过虚拟禁区的虚拟墙的情况下,控制清洁设备执行第二脱困操作,其中,第二脱困操作用于控制清洁设备离开虚拟禁区的范围。
清洁设备上可以配置有下视传感器和/或跌落传感器,下视传感器用于进行悬崖检测,跌落传感器用于进行跌落检测,即,检测清洁设备的车轮的跌落状态。清洁设备可以通过下视传感器和/或跌落传感器进行下视检测和/或跌落检测,确定是否触发下视(下视传感器被触发)或跌落(跌落传感器被触发,此时清洁设备的车轮处于跌落状态)。
如果触发下视或跌落,可以确定出当前的脱困场景为前述脱困场景3,即,优先处理触发的下视或者跌落、再处理虚拟禁区、最后处理其他障碍物的场景。清洁设备可以优先处理下视或者跌落,此时清洁设备无视虚拟禁区。在处理下视或者跌落时,清洁设备可以执行第一脱困操作,第一脱困操作用于控制清洁设备离开检测到的悬崖或者控制车轮脱离跌落状态。
第一脱困操作可以是与下视或跌落匹配的脱困操作。如果触发下视,清洁设备可以执行前进操作、后退操作、旋转操作(向左旋转、向右旋转、或者原地旋转等)等脱困操作。如果触发跌落,清洁设备可以执行后退操作、旋转操作等脱困操作。本实施例中对于第一脱困操作不做限定。
在一些场景下,下视或者跌落可能会误触发,例如,对于一些深色物体(比如,深色地毯),由于其表面对于光的反射较弱,可能会误判为触发下视,又例如,一些具有较窄凹槽的物体(例如,滑轨)也可能会误触发下视或者跌落。为了提高脱困效率,清洁设备可以确定是否误触发下视或者跌落。如果确定误触发下视或者跌落,清洁设备可以忽略触发的下视或者跌落。如果不是误触发下视或者跌落,清洁设备可以执行第一脱困操作。
可选地,对于悬崖和深色物体,超声波的反射波的幅值是不同的。如果触发下视,清洁设备可以通过发送超声波等方式进行对象识别,基于超声波的反射波可以确定出是否误触发下视。如果触发跌落,清洁设备可以基于点云传感器采集到的点云数据确定触发下视的场景对象的轮廓、形状 等,确定是否误触发跌落,如果确定触发跌落的场景对象为允许清洁设备越过的障碍物,则可以确定误触发跌落,否则,确定不是误触发跌落。
例如,在打扫卫生间时,在移动到如图5所示的位置时,扫地机触发下视,此时扫地机可以无视虚拟禁区,优先进行下视处理。
又例如,在扫地机移动到如图6所示的滑轨处时,滑轨中间的凹槽的深度可能会触发下视或者轮子跌落,扫地机可以通过超声波或者感知到的点云确定误触发下视或者跌落。
在执行完第一脱困操作之后,清洁设备可以确定当前所处的位置信息,基于当前所处的位置信息和虚拟禁区的位置信息,确定清洁设备与虚拟禁区的位置关系。如果确定清洁设备穿过虚拟禁区的虚拟墙(可能进入到虚拟禁区内,也可能穿过虚拟禁区进入到其他区域),清洁设备可以执行第二脱困操作,第二脱困操作可以用于控制清洁设备离开虚拟禁区的范围。在执行第二脱困操作时,清洁设备可以判断当前的脱困场景(例如,脱困场景1、脱困场景2),并基于确定的脱困场景执行对应的脱困操作,即,第二脱困操作。
例如,在深色地毯或凹槽滑轨,机器容易误触发下视或跌落,扫地机可以按照优先处理下视/跌落、再处理虚拟墙、最后处理其他脱困场景的优先级顺序,进行脱困处理。扫地机可以首先判断是否误触发下视/跌落,如果是,忽略下视/跌落,如果否,则优先处理下视/跌落,待脱困成功后,再结合禁区范围,设计脱困策略。
通过本实施例,通过优先处理下视/跌落、再处理虚拟墙、最后处理其他脱困场景,可以提高设备运行的安全性,提高设备脱困的效率。
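The priority order described above (handle a triggered cliff or wheel-drop sensor first while ignoring virtual restricted areas, then leave the virtual restricted area, then deal with remaining obstacles) can be sketched as a small dispatcher. The action names are illustrative assumptions for the sketch.

```python
def next_escape_action(cliff, wheel_drop, inside_zone, other_obstacle):
    """Pick the next action by the priority order from the text:
    cliff/drop first (virtual zones are ignored at this stage), then the
    virtual wall, then any remaining obstacle."""
    if cliff or wheel_drop:
        return "first_escape"   # back away from the cliff / free the wheel
    if inside_zone:
        return "second_escape"  # move back out of the virtual restricted area
    if other_obstacle:
        return "avoid_obstacle"
    return "resume_cleaning"
```

A real controller would additionally filter out false triggers (dark carpet, narrow rail grooves) before committing to `first_escape`, as the preceding paragraphs discuss.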
在一个示例性实施例中,在通过清洁设备上的第一传感器进行目标检测之后,上述方法还包括:
S71,在目标场景对象的边界位于待清洁区域内的情况下,控制清洁设备沿着目标场景对象的边界进行清洁;
S72,在目标场景对象的边界位于待清洁区域以外的情况下,控制清洁设备沿着虚拟禁区的虚拟墙进行清洁。
在本实施例中,在通过第一传感器进行目标检测之后,清洁设备可以基于待清洁区域、虚拟禁区和目标场景对象的位置关系,对待清洁区域进 行清洁。在进行区域清洁时,清洁设备可以采取远离虚拟墙的策略,沿着特定边界进行清扫。
特定边界为虚拟禁区的边界和目标场景对象的边界中,更靠近待清洁区域的一个。如果目标场景对象的边界位于待清洁区域内,特定边界为目标场景对象的边界,清洁设备可以沿着目标场景对象的边界进行区域清洁;如果目标场景对象的边界位于待清洁区域以外,特定边界为虚拟禁区的边界,可以是虚拟禁区的虚拟墙,清洁设备可以沿着虚拟禁区的虚拟墙进行区域清洁。
通过本实施例,基于待清洁区域、虚拟禁区和目标场景对象的位置关系,对待清洁区域进行清洁,可以提高区域清洁的合理性,降低设备被困的概率。
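The boundary choice above can be sketched as a simple containment check: follow the scene object's own boundary only when it lies inside the area to be cleaned, and otherwise follow the virtual wall. The rectangular area format is an assumption for illustration.

```python
def in_area(point, area):
    """Axis-aligned containment check; 'area' is ((xmin, ymin), (xmax, ymax))."""
    (xmin, ymin), (xmax, ymax) = area
    x, y = point
    return xmin <= x <= xmax and ymin <= y <= ymax

def boundary_to_follow(object_boundary_pts, area):
    # Clean along the object's edge only when that edge lies inside the
    # area to be cleaned; otherwise fall back to the zone's virtual wall.
    if all(in_area(p, area) for p in object_boundary_pts):
        return "object_boundary"
    return "virtual_wall"
```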
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本申请所必须的。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到根据上述实施例的方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM(Read-Only Memory,只读存储器)/RAM(Random Access Memory,随机存取存储器)、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,或者网络设备等)执行本申请各个实施例所述的方法。
根据本申请实施例的又一个方面,还提供了一种用于实施上述清洁设备的运行控制方法的清洁设备的运行控制装置。图7是根据本申请实施例的一种可选的清洁设备的运行控制装置的结构框图,如图7所示,该装置可以 包括:
第一获取单元702,用于获取与目标区域地图对应的虚拟禁区信息,其中,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区;
检测单元704,与第一获取单元702相连,用于通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象的对象信息;
第一控制单元706,与检测单元704相连,用于在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,其中,脱困后的清洁设备处于虚拟禁区以外。
需要说明的是,该实施例中的第一获取单元702可以用于执行上述步骤S202,该实施例中的检测单元704可以用于执行上述步骤S204,该实施例中的第一控制单元706可以用于执行上述步骤S206。
通过上述模块,获取与目标区域地图对应的虚拟禁区信息,其中,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区;通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象;在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,解决了相关技术中的清洁设备的运行控制方法存在由于虚拟禁区建立误差导致的清洁设备易被困的问题,降低了清洁设备的被困率,提高了用户使用体验。
在一个示例性实施例中,检测单元包括:
第一识别模块,用于对第一传感器采集到的点云数据进行目标识别,得到目标对象信息,其中,目标对象信息为目标场景对象的对象点云。
在一个示例性实施例中,第一识别模块包括:
识别子模块,用于对第一传感器采集到的点云数据进行目标识别,得到候选对象信息,其中,候选对象信息为当前区域内包含的候选对象的对 象点云;
选取子模块,用于根据候选对象的位置信息和虚拟禁区信息,从候选对象中选取出与虚拟禁区匹配的目标场景对象,得到目标对象信息。
在一个示例性实施例中,第一控制单元包括:
第一控制模块,用于在根据目标对象信息确定目标场景对象为允许清洁设备越过的场景对象、且清洁设备穿过虚拟禁区的虚拟墙进入到目标区域地图中除了待清洁区域以外的区域的情况下,控制清洁设备越过目标场景对象进入到待清洁区域内。
在一个示例性实施例中,第一控制单元包括:
第二控制模块,用于在根据目标对象信息确定目标场景对象为目标类型的场景对象、且清洁设备已穿过虚拟禁区的虚拟墙进入到目标场景对象所在的目标对象区域的情况下,控制清洁设备执行与目标类型匹配的目标脱困操作,其中,目标类型包括以下至少之一:底部不允许清洁设备通过的类型,与墙体之间的距离小于或者等于距离阈值的类型。
在一个示例性实施例中,第一控制单元包括:
采集模块,用于在目标类型包括底部不允许清洁设备通过的类型的情况下,通过清洁设备上的第二传感器进行点云数据采集,得到目标点云数据;第二识别模块,用于根据目标点云数据,识别出与清洁设备进入到目标对象区域的移动轨迹匹配的出口,其中,出口的大小允许清洁设备通过;第三控制模块,用于控制清洁设备沿着移动轨迹从出口处移动出目标对象区域;
第四控制模块,用于在目标类型包括与墙体之间的距离小于或者等于距离阈值的类型的情况下,控制清洁设备沿着通过清洁设备的距离传感器所检测到的目标边界进行移动,直到移动出目标对象区域,其中,目标边界为以下至少之一:墙体,目标场景对象的边界。
在一个示例性实施例中,第一控制单元包括:
第五控制模块,用于在清洁设备由于检测到悬崖或者检测到清洁设备的车轮处于跌落状态导致被困的情况下,控制清洁设备执行第一脱困操作,其中,第一脱困操作用于控制清洁设备离开检测到的悬崖或者控制车轮脱 离跌落状态,在执行第一脱困操作的过程中,清洁设备无视虚拟禁区;
第六控制模块,用于在执行完第一脱困操作之后,检测到清洁设备穿过虚拟禁区的虚拟墙的情况下,控制清洁设备执行第二脱困操作,其中,第二脱困操作用于控制清洁设备离开虚拟禁区的范围。
在一个示例性实施例中,上述装置还包括:
第二控制单元,用于在通过清洁设备上的第一传感器进行目标检测之后,在目标场景对象的边界位于待清洁区域内的情况下,控制清洁设备沿着目标场景对象的边界进行清洁;
第三控制单元,用于在目标场景对象的边界位于待清洁区域以外的情况下,控制清洁设备沿着虚拟禁区的虚拟墙进行清洁。
此处需要说明的是,上述模块与对应的步骤所实现的示例和应用场景相同,但不限于上述实施例所公开的内容。
需要说明的是,上述模块作为装置的一部分可以运行在如图1所示的硬件环境中,可以通过软件实现,也可以通过硬件实现,其中,硬件环境包括网络环境。
根据本申请实施例的又一个方面,还提供了一种存储介质。可选地,在本实施例中,上述存储介质可以用于执行本申请实施例中上述任一项清洁设备的运行控制方法的程序代码。
可选地,在本实施例中,上述存储介质可以位于上述实施例所示的网络中的多个网络设备中的至少一个网络设备上。
可选地,在本实施例中,存储介质被设置为存储用于执行以下步骤的程序代码:
S1,获取与目标区域地图对应的虚拟禁区信息,其中,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区;
S2,通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象的对象信息;
S3,在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,其中,脱困后的清洁设备处于虚拟禁区以外。
可选地,本实施例中的具体示例可以参考上述实施例中所描述的示例,本实施例中对此不再赘述。
可选地,在本实施例中,上述存储介质可以包括但不限于:U盘、ROM、RAM、移动硬盘、磁碟或者光盘等各种可以存储程序代码的介质。
根据本申请实施例的又一个方面,还提供了一种用于实施上述清洁设备的运行控制方法的电子装置,该电子装置可以是服务器、终端、或者其组合。
图8是根据本申请实施例的一种可选的电子装置的结构框图,如图8所示,包括处理器802、通信接口804、存储器806和通信总线808,其中,处理器802、通信接口804和存储器806通过通信总线808完成相互间的通信,其中,
存储器806,用于存储计算机程序;
处理器802,用于执行存储器806上所存放的计算机程序时,实现如下步骤:
S1,获取与目标区域地图对应的虚拟禁区信息,其中,目标区域地图为清洁设备待清洁的待清洁区域所属的区域地图,虚拟禁区信息用于指示区域地图中的虚拟禁区;
S2,通过清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,目标对象信息用于表示清洁设备所在的当前区域内与虚拟禁区匹配的目标场景对象的对象信息;
S3,在对待清洁区域进行清洁的过程中清洁设备被困的情况下,根据虚拟禁区信息和目标对象信息,控制清洁设备执行目标脱困操作,其中,脱困后的清洁设备处于虚拟禁区以外。
可选地,在本实施例中,通信总线可以是PCI(Peripheral Component Interconnect,外设部件互连标准)总线、或EISA(Extended Industry Standard  Architecture,扩展工业标准结构)总线等。该通信总线可以分为地址总线、数据总线、控制总线等。为便于表示,图8中仅用一条粗线表示,但并不表示仅有一根总线或一种类型的总线。通信接口用于上述电子装置与其他设备之间的通信。
上述的存储器可以包括RAM,也可以包括非易失性存储器(non-volatile memory),例如,至少一个磁盘存储器。可选地,存储器还可以是至少一个位于远离前述处理器的存储装置。
作为一种示例,上述存储器806中可以但不限于包括上述设备的控制装置中的第一获取单元702、检测单元704、以及第一控制单元706。此外,还可以包括但不限于上述设备的控制装置中的其他模块单元,本示例中不再赘述。
上述处理器可以是通用处理器,可以包含但不限于:CPU(Central Processing Unit,中央处理器)、NP(Network Processor,网络处理器)等;还可以是DSP(Digital Signal Processing,数字信号处理器)、ASIC(Application Specific Integrated Circuit,专用集成电路)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。
可选地,本实施例中的具体示例可以参考上述实施例中所描述的示例,本实施例在此不再赘述。
本领域普通技术人员可以理解,图8所示的结构仅为示意,实施上述清洁设备的运行控制方法的设备可以是终端设备,该终端设备可以是智能手机(如Android手机、iOS手机等)、平板电脑、掌上电脑以及移动互联网设备(Mobile Internet Devices,MID)、PAD等终端设备。图8其并不对上述电子装置的结构造成限定。例如,电子装置还可包括比图8中所示更多或者更少的组件(如网络接口、显示装置等),或者具有与图8所示的不同的配置。
本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令终端设备相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,存储介质可以包括:闪存盘、ROM、RAM、 磁盘或光盘等。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
上述实施例中的集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在上述计算机可读取的存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在存储介质中,包括若干指令用以使得一台或多台计算机设备(可为个人计算机、服务器或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。
在本申请的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的几个实施例中,应该理解到,所揭露的客户端,可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例中所提供的方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
以上所述仅是本申请的优选实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本申请原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本申请的保护范围。

Claims (15)

  1. 一种清洁设备的运行控制方法,其特征在于,包括:
    获取与目标区域地图对应的虚拟禁区信息,其中,所述目标区域地图为所述清洁设备待清洁的待清洁区域所属的区域地图,所述虚拟禁区信息用于指示所述区域地图中的虚拟禁区;
    通过所述清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,所述目标对象信息用于表示所述清洁设备所在的当前区域内与所述虚拟禁区匹配的目标场景对象的对象信息;
    在对所述待清洁区域进行清洁的过程中所述清洁设备被困的情况下,根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,其中,脱困后的所述清洁设备处于所述虚拟禁区以外。
  2. 根据权利要求1所述的方法,其中,所述通过所述清洁设备上的第一传感器进行目标检测,得到目标对象信息,包括:
    对所述第一传感器采集到的点云数据进行目标识别,得到所述目标对象信息,其中,所述目标对象信息为所述目标场景对象的对象点云。
  3. 根据权利要求2所述的方法,其中,所述对所述第一传感器采集到的点云数据进行目标识别,得到所述目标对象信息,包括:
    对所述第一传感器采集到的点云数据进行目标识别,得到候选对象信息,其中,所述候选对象信息为所述当前区域内包含的候选对象的对象点云;
    根据所述候选对象的位置信息和所述虚拟禁区信息,从所述候选对象中选取出与所述虚拟禁区匹配的所述目标场景对象,得到所述目标对象信息。
  4. 根据权利要求2所述的方法,其中,所述对所述第一传感器采集到的点云数据进行目标识别,得到所述目标对象信息,包括:
    使用参考对象的参考对象特征和虚拟禁区信息,对所述第一传感器采集到的点云数据进行匹配,确定出所述清洁设备所在的当前区域内与所述参考对象以及所述虚拟禁区匹配的目标场景对象,得到所述目标对象信息。
  5. 根据权利要求4所述的方法,其中,所述使用参考对象的参考对象特征和虚拟禁区信息,对所述第一传感器采集到的点云数据进行匹配,包 括:
    通过轮廓检测确定出清洁设备所在的当前区域内的待识别对象;
    将待识别对象的对象信息与参考对象的对象信息进行匹配,将与参考对象的对象信息匹配的待识别对象,确定为候选对象;
    根据虚拟禁区的位置信息和候选对象的位置信息,确定与虚拟禁区匹配的目标场景对象。
  6. 根据权利要求3或5所述的方法,其中,确定与虚拟禁区匹配的目标场景对象的方式包括但不限于以下至少之一:
    将位于虚拟禁区内的候选对象,确定为与虚拟禁区匹配的目标场景对象;
    将与虚拟禁区的边界之间的距离小于或者等于第一距离阈值的候选对象,确定为与虚拟禁区匹配的目标场景对象;
    将形状与虚拟禁区的边界形状匹配的候选对象,确定为与虚拟禁区匹配的目标场景对象。
  7. 根据权利要求1所述的方法,其中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:
    根据虚拟禁区信息和目标对象信息,确定出所述清洁设备当前被困的场景类型;
    基于确定的所述场景类型控制所述清洁设备执行与当前被困的场景类型匹配的脱困操作。
  8. 根据权利要求1所述的方法,其中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:
    在根据所述目标对象信息确定所述目标场景对象为允许所述清洁设备越过的场景对象、且所述清洁设备穿过所述虚拟禁区的虚拟墙进入到所述目标区域地图中除了所述待清洁区域以外的区域的情况下,控制所述清洁设备越过所述目标场景对象进入到所述待清洁区域内。
  9. 根据权利要求1所述的方法,其中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:
    在根据所述目标对象信息确定所述目标场景对象为目标类型的场景对 象、且所述清洁设备已穿过所述虚拟禁区的虚拟墙进入到所述目标场景对象所在的目标对象区域的情况下,控制所述清洁设备执行与所述目标类型匹配的所述目标脱困操作,其中,所述目标类型包括以下至少之一:底部不允许所述清洁设备通过的类型,与墙体之间的距离小于或者等于距离阈值的类型。
  10. 根据权利要求5所述的方法,其中,所述控制所述清洁设备执行与所述目标类型匹配的所述目标脱困操作,包括:
    在所述目标类型包括底部不允许所述清洁设备通过的类型的情况下,通过所述清洁设备上的第二传感器进行点云数据采集,得到目标点云数据;根据所述目标点云数据,识别出与所述清洁设备进入到所述目标对象区域的移动轨迹匹配的出口,其中,所述出口的大小允许所述清洁设备通过;控制所述清洁设备沿着所述移动轨迹从所述出口处移动出所述目标对象区域;
    在所述目标类型包括与墙体之间的距离小于或者等于距离阈值的类型的情况下,控制所述清洁设备沿着通过所述清洁设备的距离传感器所检测到的目标边界进行移动,直到移动出所述目标对象区域,其中,所述目标边界为以下至少之一:所述墙体,所述目标场景对象的边界。
  11. 根据权利要求1所述的方法,其中,所述根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,包括:
    在所述清洁设备由于检测到悬崖或者检测到所述清洁设备的车轮处于跌落状态导致被困的情况下,控制所述清洁设备执行第一脱困操作,其中,所述第一脱困操作用于控制所述清洁设备离开检测到的悬崖或者控制所述车轮脱离所述跌落状态,在执行所述第一脱困操作的过程中,所述清洁设备无视虚拟禁区;
    在执行完所述第一脱困操作之后,检测到所述清洁设备穿过所述虚拟禁区的虚拟墙的情况下,控制所述清洁设备执行第二脱困操作,其中,所述第二脱困操作用于控制所述清洁设备离开所述虚拟禁区的范围。
  12. 根据权利要求1至5、7至11中任一项所述的方法,其中,在所述通过所述清洁设备上的第一传感器进行目标检测之后,所述方法还包括:
    在所述目标场景对象的边界位于所述待清洁区域内的情况下,控制所述清洁设备沿着所述目标场景对象的边界进行清洁;
    在所述目标场景对象的边界位于所述待清洁区域以外的情况下,控制所述清洁设备沿着所述虚拟禁区的虚拟墙进行清洁。
  13. 一种清洁设备的运行控制装置,其特征在于,包括:
    第一获取单元,用于获取与目标区域地图对应的虚拟禁区信息,其中,所述目标区域地图为所述清洁设备待清洁的待清洁区域所属的区域地图,所述虚拟禁区信息用于指示所述区域地图中的虚拟禁区;
    检测单元,用于通过所述清洁设备上的第一传感器进行目标检测,得到目标对象信息,其中,所述目标对象信息用于表示所述清洁设备所在的当前区域内与所述虚拟禁区匹配的目标场景对象的对象信息;
    第一控制单元,用于在对所述待清洁区域进行清洁的过程中所述清洁设备被困的情况下,根据所述虚拟禁区信息和所述目标对象信息,控制所述清洁设备执行目标脱困操作,其中,脱困后的所述清洁设备处于所述虚拟禁区以外。
  14. 一种计算机可读的存储介质,其特征在于,所述计算机可读的存储介质包括存储的程序,其中,所述程序运行时执行权利要求1至12中任一项所述的方法。
  15. 一种电子装置,包括存储器和处理器,其特征在于,所述存储器中存储有计算机程序,所述处理器被设置为通过所述计算机程序执行权利要求1至12中任一项所述的方法。
PCT/CN2022/131571 2021-12-02 2022-11-12 清洁设备的运行控制方法及装置、存储介质及电子装置 WO2023098455A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111464917.3 2021-12-02
CN202111464917.3A CN116211168A (zh) 2021-12-02 2021-12-02 清洁设备的运行控制方法及装置、存储介质及电子装置

Publications (1)

Publication Number Publication Date
WO2023098455A1 true WO2023098455A1 (zh) 2023-06-08

Family

ID=86568311

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/131571 WO2023098455A1 (zh) 2021-12-02 2022-11-12 清洁设备的运行控制方法及装置、存储介质及电子装置

Country Status (2)

Country Link
CN (1) CN116211168A (zh)
WO (1) WO2023098455A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117478714B (zh) * 2023-11-09 2024-03-08 南京特沃斯清洁设备有限公司 基于物联网的保洁设备控制方法及装置
CN117452955B (zh) * 2023-12-22 2024-04-02 珠海格力电器股份有限公司 清扫设备的控制方法、控制装置和清扫系统

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108829115A (zh) * 2018-10-09 2018-11-16 上海岚豹智能科技有限公司 Motion control method for a robot and computing device thereof
CN109394086A (zh) * 2018-11-19 2019-03-01 珠海市一微半导体有限公司 Method, apparatus and chip for a trapped cleaning robot to continue walking
US20190365176A1 (en) * 2019-07-11 2019-12-05 Lg Electronics Inc. Robot cleaner for cleaning in consideration of floor state through artificial intelligence and operating method for the same
US20200233433A1 (en) * 2019-01-23 2020-07-23 Jason Yan Virtual wall device and robot and control method thereof
CN111568306A (zh) * 2019-02-19 2020-08-25 北京奇虎科技有限公司 Cleaning method and apparatus based on cleaning robot, electronic device and storage medium
CN111714028A (zh) * 2019-03-18 2020-09-29 北京奇虎科技有限公司 Method, apparatus and device for a cleaning device to escape a forbidden zone, and readable storage medium
CN112137509A (zh) * 2020-09-24 2020-12-29 江苏美的清洁电器股份有限公司 Method and apparatus for setting a virtual forbidden zone, and cleaning robot
CN112890692A (zh) * 2021-02-08 2021-06-04 美智纵横科技有限责任公司 Method and apparatus for setting a cleaning forbidden zone, cleaning device and storage medium
CN113469000A (zh) * 2021-06-23 2021-10-01 追觅创新科技(苏州)有限公司 Area map processing method and apparatus, storage medium and electronic apparatus

Also Published As

Publication number Publication date
CN116211168A (zh) 2023-06-06

Similar Documents

Publication Publication Date Title
WO2023098455A1 (zh) Operation control method and apparatus for cleaning device, storage medium and electronic apparatus
US10611023B2 (en) Systems and methods for performing occlusion detection
US10705535B2 (en) Systems and methods for performing simultaneous localization and mapping using machine vision systems
US11226633B2 (en) Mobile robot and method of controlling the same
CN110989631B (zh) Self-moving robot control method and apparatus, self-moving robot, and storage medium
US20200097012A1 (en) Cleaning robot and method for performing task thereof
KR102490996B1 (ko) Robot cleaning device that changes its operating speed based on the environment
US20210247775A1 (en) Method for localizing robot, robot, and storage medium
US11547261B2 (en) Moving robot and control method thereof
US20210370511A1 (en) Cleaning robot and task performing method therefor
CN112739244A (zh) Mobile robot cleaning system
CN112515563B (zh) Obstacle avoidance method, sweeping robot, and readable storage medium
CN112739505A (zh) Exploration of a robot working area by an autonomous mobile robot
US20210213619A1 (en) Robot and control method therefor
CN112716401B (zh) Obstacle-circumventing cleaning method, apparatus, device, and computer-readable storage medium
CN111714028A (zh) Method, apparatus and device for a cleaning device to escape a forbidden zone, and readable storage medium
CN112741562A (zh) Sweeper control method, apparatus, device, and computer-readable storage medium
CN112445215A (zh) Automated guided vehicle travel control method, apparatus, and computer system
CN113786125A (zh) Operation method, self-moving device, and storage medium
CN111343696A (zh) Communication method for self-moving device, self-moving device, and storage medium
KR20230134109A (ko) Cleaning robot and task performing method thereof
US20220147050A1 (en) Methods and devices for operating an intelligent mobile robot
WO2023019922A1 (zh) Navigation method and self-propelled device
WO2023155731A1 (zh) Operation control method and apparatus for cleaning device, storage medium and electronic apparatus
CN117281433A (zh) Mapping method and system for cleaning device, storage medium, and cleaning device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22900260

Country of ref document: EP

Kind code of ref document: A1