CN111202470A - Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment


Info

Publication number: CN111202470A
Application number: CN201811392739.6A
Authority: CN (China)
Prior art keywords: intelligent cleaning, image information, cleaning device, cleaning apparatus
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 刘小禹, 李海宾, 李智军
Current Assignee: Beijing Rockrobo Technology Co Ltd
Original Assignee: Beijing Rockrobo Technology Co Ltd
Application filed by Beijing Rockrobo Technology Co Ltd
Priority to CN201811392739.6A
Publication of CN111202470A

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/28: Floor-scrubbing machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4008: Arrangements of switches, indicators or the like
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to an intelligent cleaning device, a repositioning method and apparatus, a computer-readable storage medium, and an electronic device. The intelligent cleaning device comprises a visual receiving device for acquiring image information of the environment in which the intelligent cleaning device is located; a distance sensing device for acquiring the separation distance between the intelligent cleaning device and obstacles in the environment so as to construct a scene map of that environment; and a control device for re-determining the position information of the intelligent cleaning device in the scene map when the image information indicates that the position of the intelligent cleaning device has changed abruptly. Because the intelligent cleaning device can determine from the acquired image information whether its position has changed abruptly, and re-determine its position in the scene map when it has, mislocalization after an abrupt position change can be avoided and the positioning accuracy of the intelligent cleaning device is improved.

Description

Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an intelligent cleaning device, a repositioning method and apparatus, a computer-readable storage medium, and an electronic device.
Background
With the development of technology, a variety of intelligent cleaning devices, such as automatic floor-sweeping robots and automatic floor-mopping robots, have appeared. Intelligent cleaning devices can perform cleaning operations automatically, which is convenient for users. Taking an automatic sweeping robot as an example, it automatically cleans the area to be cleaned by means of direct brushing, vacuum suction, and other techniques.
However, when the intelligent cleaning device suffers a kidnapping event during operation, or its position is otherwise changed by an external event, i.e. when its position changes abruptly, sensor errors, noise, and the like usually prevent it from knowing where it has been moved to, or even from recognizing that its position has changed at all. As a result, it may repeatedly clean an area that has already been cleaned or abandon the cleaning of areas that have not been cleaned, which gives the user a poor experience.
Therefore, there is a need to provide an intelligent cleaning device, a repositioning method and apparatus, a computer-readable storage medium, and an electronic device to at least partially address the above problems.
Disclosure of Invention
The present disclosure provides an intelligent cleaning device, a repositioning method and apparatus, a storage medium, and an electronic device, to solve the deficiencies in the related art.
According to a first aspect of embodiments of the present disclosure, there is provided an intelligent cleaning device, comprising:
the visual receiving device is used for acquiring image information in the environment where the intelligent cleaning equipment is located;
distance sensing means for obtaining a separation distance between the intelligent cleaning device and an obstacle in the environment to construct a scene map of the environment in which the intelligent cleaning device is located; and
the control device is used for re-determining the position information of the intelligent cleaning device in the scene map when the image information represents that the position of the intelligent cleaning device suddenly changes.
Optionally, the intelligent cleaning device further includes:
a cleaning device, the working strategy of which can be matched with the scene type represented by the image information and the scene map constructed by the distance sensing device.
Optionally, the working policy includes at least one of:
clean area, clean mode.
Optionally, the optical axis of the visual receiver is disposed obliquely with respect to the travel plane of the intelligent cleaning device.
Optionally, the optical axis of the visual receiving device is aimed at a traveling direction of the intelligent cleaning device, and is used for acquiring image information located in front of the intelligent cleaning device along the traveling direction.
Optionally, the optical axis of the visual receiving device is aimed at a direction opposite to a traveling direction of the intelligent cleaning device, and is used for acquiring image information located behind the intelligent cleaning device along the traveling direction.
Optionally, the acute angle between the optical axis of the vision receiving device and the travel plane of the intelligent cleaning device is not less than 40 °;
or the acute angle is not more than 30 °.
Optionally, the angle of view of the visual receiver is not less than 65 ° or the angle of view is less than 45 °.
Optionally, the intelligent cleaning device includes a top cover, the top cover includes a first opening, the visual receiving device includes a lens, and the lens extends out from the first opening to obtain the image information.
Optionally, the intelligent cleaning device includes a top cover and a device main body, and the visual receiving device is located in an accommodating space defined by the top cover and the device main body;
the top cover comprises a second opening, a lens is arranged at the second opening to seal the second opening, and the lens of the visual receiving device acquires the image information through the lens.
Optionally, in the optical axis direction of the visual receiver, the cross-sectional area of the second opening at each position is positively related to the distance between each position and the lens, and at the same position, the cross-sectional area of the visual receiver viewing angle range is not greater than the cross-sectional area of the second opening.
Optionally, the device further comprises a bracket, wherein the bracket comprises a fixed seat, a mounting seat and a connecting part for connecting the fixed seat and the mounting seat;
the fixing seat is fixed on the equipment main body, the mounting seat and the traveling plane of the intelligent cleaning equipment form the acute angle, and the mounting seat is connected with the vision receiving device.
Optionally, the intelligent cleaning device includes a key structure, and centers of the key structure, the visual receiving device, and the distance sensing device are located on the same straight line.
Optionally, the key structure is located between the visual receiving device and the distance sensing device; or, the visual receiving device is positioned between the key structure and the distance sensing device.
According to a second aspect of the embodiments of the present disclosure, there is provided a repositioning method applied to an intelligent cleaning device, the method including:
acquiring image information in an environment where the intelligent cleaning equipment is located;
constructing a scene map about the environment according to the spacing distance between the intelligent cleaning equipment and the obstacles in the environment;
when the image information represents sudden position change of the intelligent cleaning device, the position information of the intelligent cleaning device in the scene map is determined again.
Optionally, the method further includes:
and determining the working strategy of the intelligent cleaning equipment according to the scene map and the scene type represented by the image information.
Optionally, the determining the operating strategy of the intelligent cleaning device according to the scene type and the scene map includes:
determining a cleaning area of the intelligent cleaning device;
and/or determining a cleaning mode of the intelligent cleaning device.
According to a third aspect of the embodiments of the present disclosure, there is provided a repositioning device applied to an intelligent cleaning device, including:
the image acquisition module is used for acquiring image information in the environment where the intelligent cleaning equipment is located;
the map building module is used for building a scene map of the environment according to the spacing distance between the intelligent cleaning equipment and the obstacles in the environment;
the position determining module is used for re-determining the position information of the intelligent cleaning device in the scene map when the image information represents the sudden change of the position of the intelligent cleaning device.
Optionally, the apparatus further includes:
and the working strategy determining module is used for determining the working strategy of the intelligent cleaning equipment according to the scene map and the scene type represented by the image information.
Optionally, the work policy determining module includes:
a cleaning region determination unit that determines a cleaning region of the intelligent cleaning apparatus;
and/or a cleaning mode determination unit determining a cleaning mode of the intelligent cleaning device.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions which, when executed by a control device, implement the steps of the method according to any one of the embodiments described above.
According to a fifth aspect of embodiments of the present disclosure, there is provided an electronic apparatus including:
a control device;
a memory for storing control device executable instructions;
wherein the control device is configured to implement the steps of the method according to any of the embodiments described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the embodiment, the intelligent cleaning equipment can determine whether the position of the intelligent cleaning equipment changes suddenly according to the acquired image information, and re-determine the position information of the intelligent cleaning equipment in the scene map when the position changes suddenly, so that the situation that the intelligent cleaning equipment is positioned wrongly after the position changes suddenly can be avoided, and the positioning accuracy of the intelligent cleaning equipment is enhanced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic structural diagram illustrating an intelligent cleaning device according to an exemplary embodiment.
FIG. 2 is a schematic cross-sectional view of an intelligent cleaning apparatus shown in accordance with an exemplary embodiment.
FIG. 3 is a block diagram illustrating the structure of an intelligent cleaning device according to an exemplary embodiment.
FIG. 4 is a diagram illustrating an application scenario for an intelligent cleaning device, according to an exemplary embodiment.
FIG. 5 is a schematic diagram illustrating another configuration of an intelligent cleaning device according to an exemplary embodiment.
FIG. 6 is a schematic diagram illustrating a further intelligent cleaning device according to an exemplary embodiment.
FIG. 7 is a top view of an intelligent cleaning device, shown according to an exemplary embodiment.
Fig. 8 is a schematic diagram illustrating the mating of a top cover with a visual receiving device, according to an exemplary embodiment.
FIG. 9 is a top view of another intelligent cleaning device shown in accordance with an exemplary embodiment.
FIG. 10 is a partial cross-sectional view of an intelligent cleaning device shown in accordance with an exemplary embodiment.
FIG. 11 is a schematic diagram illustrating a structure of a bracket according to an exemplary embodiment.
FIG. 12 is a flow chart illustrating a method of relocation in accordance with an exemplary embodiment.
Fig. 13-15 are block diagrams of a relocation apparatus shown in accordance with an example embodiment.
FIG. 16 is a block diagram illustrating an apparatus for relocation according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining", depending on the context.
Fig. 1 is a schematic structural diagram illustrating an intelligent cleaning device according to an exemplary embodiment. FIG. 2 is a schematic cross-sectional view of an intelligent cleaning apparatus shown in accordance with an exemplary embodiment. FIG. 3 is a block diagram illustrating the structure of an intelligent cleaning device according to an exemplary embodiment. As shown in fig. 1 to 3, the intelligent cleaning apparatus 100 may include a vision receiving device 1, a cleaning device 2, a distance sensing device 3, and a control device 4. The control device 4 may include a controller or the like connected, in a wired or wireless manner, to the vision receiving device 1, the cleaning device 2, and the distance sensing device 3, respectively, to receive information from and/or send control commands to these devices. The visual receiving device 1 may include a camera or the like for acquiring image information in the environment where the intelligent cleaning apparatus 100 is located. The cleaning device 2 may include a cleaning brush or the like, which implements the cleaning function of the intelligent cleaning apparatus 100. The distance sensing device 3 may include a laser distance measuring device (LDS) or the like, which can be used to acquire the separation distance between the intelligent cleaning device 100 and obstacles in the environment, so as to construct a scene map of the environment in which the intelligent cleaning device 100 is located from the acquired separation distances. According to the constructed scene map, the intelligent cleaning device 100 can plan a walking path that avoids colliding with obstacles.
Further, the intelligent cleaning device 100 may also include a sensor (e.g., a cliff sensor, an ultrasonic sensor, or an infrared sensor), an odometer, or the like. During travel, the intelligent cleaning device 100 can obtain its own walking distance from electronic elements such as the sensor or the odometer, so as to determine its position in the scene map in real time; alternatively, the intelligent cleaning device 100 may determine its own position in the scene map from the separation distance between itself and obstacles, which is acquired by the distance sensing device 3 in real time during travel. Of course, the position of the intelligent cleaning device 100 may also be determined comprehensively from the data of the distance sensing device 3, the odometer, a speed sensor, elapsed time, an acceleration sensor, and other related elements, so as to improve the accuracy of the result.
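By way of illustration only (the disclosure does not prescribe a particular fusion scheme), the following minimal Python sketch shows one way odometry increments and a pose estimate derived from the distance sensing device could be blended. The Pose type, the helper names, and the weighting constant ALPHA are assumptions made for this example.

```python
# Minimal sketch, not the patent's implementation: blend dead-reckoned odometry
# with an LDS-derived pose estimate to track the device in the scene map.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # metres, scene-map coordinates
    y: float      # metres
    theta: float  # heading in radians

ALPHA = 0.7  # assumed trust placed in odometry versus the LDS-derived correction

def advance_by_odometry(pose: Pose, distance: float, d_theta: float) -> Pose:
    """Dead-reckon the next pose from wheel-odometer increments."""
    theta = pose.theta + d_theta
    return Pose(pose.x + distance * math.cos(theta),
                pose.y + distance * math.sin(theta),
                theta)

def fuse_with_lds(odom_pose: Pose, lds_pose: Pose) -> Pose:
    """Blend the dead-reckoned pose with a pose estimated from obstacle distances.
    Note: this naive blend of theta ignores angle wrap-around; fine for a sketch."""
    return Pose(ALPHA * odom_pose.x + (1 - ALPHA) * lds_pose.x,
                ALPHA * odom_pose.y + (1 - ALPHA) * lds_pose.y,
                ALPHA * odom_pose.theta + (1 - ALPHA) * lds_pose.theta)

# Usage: pose = fuse_with_lds(advance_by_odometry(pose, 0.05, 0.01), lds_estimate)
```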
During operation, when the intelligent cleaning device suffers a kidnapping event, or its position is changed by some other event in a way that violates its normal working strategy, i.e. when its position changes abruptly due to an external event, sensor errors, noise, and the like usually prevent it from knowing where it has been moved to, or even from realizing that its position has changed at all. As a result, it may re-clean an area that has already been cleaned, or abandon the cleaning of areas that have not yet been cleaned, which gives the user a poor experience.
Therefore, with the intelligent cleaning device 100 provided in the present disclosure, when the image information acquired by the vision receiving device 1 indicates that the position of the intelligent cleaning device 100 has changed abruptly, the control device 4 may re-determine the position of the intelligent cleaning device 100 in the scene map. Specifically, the control device 4 may compare two temporally adjacent pieces of image information and determine, from the matching degree between them, whether the position of the intelligent cleaning device 100 has changed abruptly; alternatively, the control device 4 may match the image information against preset image information, and if they match, determine that no abrupt position change has occurred; if they do not match, the position of the intelligent cleaning device 100 may be considered to have changed abruptly.
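As a minimal, non-limiting sketch of the frame-comparison idea above, the matching degree between two temporally adjacent frames could be estimated with ORB feature matching from OpenCV; the thresholds below are assumed values, and the frames are assumed to be 8-bit grayscale images.

```python
# Minimal sketch (assumptions: OpenCV ORB features, hand-picked thresholds) of
# judging an abrupt position change from the matching degree of consecutive frames.
# The patent does not prescribe a specific matching algorithm.
import cv2

MATCH_RATIO_THRESHOLD = 0.2   # assumed: below this ratio, frames are treated as unrelated
GOOD_MATCH_DISTANCE = 40      # assumed Hamming-distance cutoff for a "good" match

def position_changed_abruptly(prev_frame, curr_frame) -> bool:
    orb = cv2.ORB_create()
    _, des_prev = orb.detectAndCompute(prev_frame, None)
    _, des_curr = orb.detectAndCompute(curr_frame, None)
    if des_prev is None or des_curr is None:
        return True  # no usable features; treat as a possible kidnapping event
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_prev, des_curr)
    good = [m for m in matches if m.distance < GOOD_MATCH_DISTANCE]
    ratio = len(good) / max(len(des_prev), len(des_curr))
    return ratio < MATCH_RATIO_THRESHOLD
```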
When the control device 4 determines, from the image information acquired by the visual receiving device 1, that the position of the intelligent cleaning device 100 has changed abruptly, it may re-determine the position of the intelligent cleaning device 100 in the scene map from the distance between the intelligent cleaning device 100 and obstacles measured by the distance sensing device 3, and control the cleaning device 2 to perform the cleaning function using the working strategy corresponding to the scene area in the scene map.
In the above-described embodiment, the operation strategy of the cleaning device 2 can be matched with the image information acquired by the visual reception device 1 and the scene map constructed according to the separation distance sensed by the distance sensing device 3. Therefore, the intelligent cleaning equipment 100 can work in different scenes according to proper strategies, and the intellectualization and autonomy of the intelligent cleaning equipment 100 are improved.
In the present embodiment, the viewfinder lens (i.e., the camera) of the vision receiving device 1 is disposed obliquely with respect to the traveling plane of the intelligent cleaning apparatus 100, so as to avoid the vision receiving device 1 from being blocked by other components on the intelligent cleaning apparatus 100. The "operating strategy" of the cleaning device, and the inclination of the vision receiving device 1 will be described in detail below with reference to specific scenarios.
In one embodiment, as shown in fig. 4 and 5, it is assumed that the intelligent cleaning device 100 is in the scene a and the intelligent cleaning device 100 travels in the direction indicated by the arrow F, so that the direction indicated by the arrow F can be considered as the front of the intelligent cleaning device 100, the direction away from the direction indicated by the arrow F is the back of the intelligent cleaning device 100, and the travel plane is parallel to the direction indicated by the arrow F, then the optical axis L of the vision receiving device 1 can be aimed at the travel direction of the intelligent cleaning device 100 for acquiring image information in front of the intelligent cleaning device 100.
For example, the acute angle α formed by the optical axis of the vision receiving device 1 and the direction indicated by the arrow F may be 40°, and the field angle of the vision receiving device 1 itself may be 65°. The vision receiving device 1 may acquire the image information of the scene A once every fixed time interval, or once after the intelligent cleaning apparatus 100 has moved through a fixed displacement. The image content corresponding to the image information is determined by the acute angle formed between the vision receiving device 1 and the traveling plane and by the field angle range of the vision receiving device 1.
The scene type represented by the image information acquired by the visual receiving device 1 is the type of the scene A. For example, when the control device 4 recognizes a sofa, a tea table, an entrance, and the like from the image information acquired by the visual receiving device 1, it can determine that the scene A in which the intelligent cleaning apparatus 100 is located is a living room, and further select the cleaning mode corresponding to the living room. The living room is typically an area with more foot traffic and is therefore more soiled than other areas, so it may require greater cleaning effort. For an intelligent cleaning device 100 such as a sweeping robot, the cleaning mode corresponding to the living room may be a high-power cleaning mode that increases the cleaning intensity.
For another example, when the control device 4 determines that the scene A is a bedroom based on the image information acquired by the visual receiving device 1, the cleaning mode corresponding to the bedroom may be selected. The bedroom is usually an area where the user rests and where there is less foot traffic, so it is less soiled than the living room. The cleaning mode of the sweeping robot can therefore be a low-power cleaning mode, which produces less noise while still ensuring the cleaning effect, so as to avoid disturbing the user's rest.
In another embodiment, as shown in FIG. 6, it is still assumed that the intelligent cleaning device 100 is in the scene A and travels in the direction indicated by the arrow F, i.e., the direction indicated by the arrow F is the front of the intelligent cleaning device 100, the direction away from the arrow F is the back of the intelligent cleaning device 100, and the travel plane is parallel to the direction indicated by the arrow F. Then, the optical axis L of the vision receiving apparatus 1 can be aimed in the direction opposite to the travel direction of the intelligent cleaning device 100 for obtaining image information behind the intelligent cleaning device 100. That the optical axis L is aimed opposite to the travel direction of the intelligent cleaning device 100 can be understood as the optical axis L lying in a plane that passes through the arrow F and is perpendicular to the travel plane, with the projection of the optical axis L on the travel plane being opposite to the direction of the arrow F.
For example, when the intelligent cleaning device 100 is a sweeping robot, the acute angle α between the optical axis L of the visual receiving device 1 and the traveling plane of the sweeping robot may be 30°, and the field angle of the visual receiving device 1 itself may be 45°. The visual receiving device 1 may still acquire image information about the scene A once every fixed time interval, or once after the sweeping robot has moved through a fixed displacement.
For example, assume that the user sets the living room as a cleaning area and the toilet as a non-cleaning area. Then, when the control device 4 identifies a sofa, a tea table, a hallway, and the like from the image information acquired by the visual receiving device 1, it can determine that the scene A where the sweeping robot is located is the living room, and control the cleaning device 2 to perform cleaning. During the cleaning process, the visual receiving device 1 may acquire image information once every fixed time period, such as 2 s or 5 s, and determine the scene type corresponding to the image information, so that the control device 4 determines whether to continue cleaning.
If the identified scene type is still the living room, the sweeping robot continues the cleaning work; if the toilet is identified from the image information, the sweeping robot can stop cleaning. Alternatively, the distance between the sweeping robot and the toilet may be estimated from the gray-scale values corresponding to the features of the toilet in the image information, and the robot may switch to a slower speed and continue cleaning as it gradually approaches the toilet. Or, further, the distance to the toilet can be estimated from subsequently acquired image information until the boundary area between the living room and the toilet is determined, and the sweeping robot then keeps itself from crossing the boundary area and avoids moving into the toilet. This eliminates the step of manually setting a virtual wall and improves the user experience.
It should be noted that identifying the corresponding scene type from the image information may mean that the intelligent cleaning device 100 identifies the scene type from corresponding feature objects in the image information, for example, determining that the scene is a living room from a sofa, or determining that the scene is a toilet from a toilet bowl; alternatively, the image information may be compared with preset scene information prestored in the intelligent cleaning device 100, where the preset scene information may include living room scene information, bathroom scene information, child toy area scene information, kitchen scene information, and the like. The preset scene information may be acquired in advance by the visual receiving device 1 going around each scene once before the intelligent cleaning apparatus 100 operates. Naturally, the scene type represented by the image information may also be determined in other manners, which will not be described further here.
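A minimal sketch of the first approach (identifying the scene type from recognized feature objects) is given below. The object labels, the mapping table, and the voting rule are illustrative assumptions, and the object detector itself is outside the scope of this sketch.

```python
# Minimal sketch of mapping recognized feature objects to a scene type, in the spirit
# of "sofa -> living room, toilet bowl -> toilet" above. `detected_objects` is assumed
# to be the label output of some object detector (not shown here).
SCENE_BY_OBJECT = {          # illustrative assumption, not an exhaustive mapping
    "sofa": "living_room",
    "tea_table": "living_room",
    "toilet_bowl": "toilet",
    "bed": "bedroom",
    "stove": "kitchen",
}

def identify_scene(detected_objects: list[str]) -> str:
    votes: dict[str, int] = {}
    for obj in detected_objects:
        scene = SCENE_BY_OBJECT.get(obj)
        if scene:
            votes[scene] = votes.get(scene, 0) + 1
    # Pick the scene supported by the most recognized objects, or "unknown".
    return max(votes, key=votes.get) if votes else "unknown"

# identify_scene(["sofa", "tea_table"]) -> "living_room"
```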
The acute angle α between the optical axis L of the visual receiving device 1 and the traveling plane may be greater than 40°, for example, 45°, 50°, etc., to obtain image information at a higher position, or less than 30°, for example, 25°, 28°, etc., to obtain image information at a lower position, which is not limited by the present disclosure. The field angle of the visual receiving device 1 may be not less than 65°, for example, 70°, 72°, etc., to enlarge the field range and capture more objects, or may be less than 45°, for example, 40°, 43°, etc., to narrow the field range and improve the accuracy of object recognition in the acquired images, which is likewise not limited by the present disclosure.
Based on the technical solution of the present disclosure, regarding the installation and positioning of the visual receiving device 1 in the intelligent cleaning apparatus 100, as shown in fig. 7, the intelligent cleaning apparatus 100 may further include a top cover 5 and a key 6. The top cover 5 covers the top of the intelligent cleaning apparatus 100 and forms its upper surface. The key 6 is provided on the top cover 5 for the user to make various function selections. The key 6, the distance sensing device 3, and the visual receiving device 1 each have at least one portion protruding from the top cover 5 or flush with the top surface of the top cover 5. The centers of the key structure 6, the distance sensing device 3, and the visual receiving device 1 are located on the same straight line, that is, the holes in the top cover 5 through which the above parts extend are located on the same straight line, which makes the top cover 5 easier to machine and also improves the appearance of the intelligent cleaning device 100.
In an embodiment, as shown in fig. 7 and 8, the key structure 6 is located between the vision receiving device 1 and the distance sensing device 3. The top cover 5 may include a first opening 51, and a lens of the vision receiving device 1 (not shown) may protrude from the first opening 51 to acquire image information. In other words, since the optical axis of the visual receiving device 1 is disposed obliquely with respect to the traveling plane, the lens of the visual receiving device 1 is set to protrude from the top cover 5 in order to prevent the top cover 5 or other related structures from obstructing its view, thereby ensuring that the visual receiving device 1 can acquire the image information.
In another embodiment, as shown in fig. 9 and 10, the vision receiving device 1 may be located between the distance sensing device 3 and the key structure 6. The intelligent cleaning device 100 may further include a device body 7, and the device body 7 and the top cover 5 can enclose an accommodating space for accommodating the visual receiving device 1. Also, the top cover 5 may include a second opening 52, a lens (not shown) is disposed at the second opening 52 to close it, and the lens of the visual receiving device 1 may acquire image information through the lens at the second opening 52. In this way the lens protects the visual receiving device 1 and prevents dust deposition and the like.
In the present embodiment, in the optical axis direction of the visual receiving device 1, the cross-sectional area of the second opening 52 at each position is positively related to the distance between that position and the lens; in addition, at the same position, the cross-sectional area of the viewing angle range of the visual receiving device 1 is not greater than the cross-sectional area of the second opening 52, so that the inner wall surrounding the second opening 52 does not block the visual receiving device 1 and the visual receiving device 1 can acquire the image information. The viewing angle range may be a cone formed by rotating a viewing angle of fixed size about the optical axis of the visual receiving device 1; in other embodiments it may be a four-sided frustum, but the disclosure is not limited thereto.
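By way of illustration only, and not as part of the claimed subject matter, suppose the viewing angle range is a circular cone of full field angle θ centered on the optical axis, and let r(d) be the radius of that cone at distance d from the lens along the optical axis; the condition above then amounts to

```latex
r(d) = d \tan\!\left(\tfrac{\theta}{2}\right),
\qquad
A_{\text{opening}}(d) \;\ge\; \pi\, r(d)^2
\quad \text{for every distance } d \text{ from the lens,}
```

so the required cross-section of the opening grows with the distance from the lens, which is exactly the positive correlation described above.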
In each of the above embodiments, as also shown in fig. 10, the intelligent cleaning apparatus 100 may further include a bracket 8, and the bracket 8 may be used to mount the visual reception device 1 and fix the visual reception device 1 to the apparatus main body 7. Specifically, as shown in fig. 11, the bracket 8 may include a fixing seat 81, a mounting seat 82, and a connecting portion 83 connecting the fixing seat 81 and the mounting seat 82. Here, the fixing base 81 may be fixed to the apparatus body 7, and the mounting base 82 is inclined with respect to a traveling plane of the intelligent cleaning apparatus 100 in a state where the fixing base 81 is fixed to the apparatus body 7. Since the mount 82 is connected to the visual receiver 1, the tilt direction of the mount 82 can be determined according to the tilt direction of the lens of the visual receiver 1.
For the intelligent cleaning device described in any of the above embodiments, the present disclosure also provides a method for repositioning of the intelligent cleaning device. As shown in fig. 12, the method may include the steps of:
in step 1201, image information of an environment in which the intelligent cleaning apparatus is located is acquired.
In the present embodiment, the image information may be acquired by a vision receiving device provided on the intelligent cleaning apparatus, specifically, by the vision receiving device 1 described in any one of the above embodiments, and the vision receiving device 1 may be disposed obliquely with respect to the traveling plane of the intelligent cleaning apparatus 100.
In step 1202, a scene map of the environment is constructed according to the separation distance between the intelligent cleaning device and the obstacles in the environment.
In this embodiment, the intelligent cleaning device 100 may further include the distance sensing device 3 according to any one of the above embodiments, the distance between the intelligent cleaning device 100 and an obstacle in the environment may be acquired by the distance sensing device 3, and a scene map of the environment in which the intelligent cleaning device 100 is located may be constructed according to the distance.
The distance sensing device 3 may include a laser distance measuring device (LDS) to measure a separation distance between an obstacle and itself within the environment in which the intelligent cleaning apparatus 100 is located using a laser.
In step 1203, when the image information represents a sudden change of the position of the intelligent cleaning device, the position of the intelligent cleaning device in the constructed scene map is determined again.
In the embodiment, when the image information represents that the intelligent cleaning device has sudden position change, the position information of the intelligent cleaning device 100 in the scene map can be determined again.
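A minimal sketch of how steps 1201 to 1203 could be orchestrated is given below. The device and map interfaces (capture_image, measure_obstacle_distances, update, relocalize, pose, is_cleaning) are hypothetical placeholders rather than an actual API of the intelligent cleaning device, and frames_differ can be any frame-comparison predicate, such as the ORB sketch given earlier in this description.

```python
# Minimal sketch of steps 1201-1203, not the patent's reference implementation.
# `device` and `scene_map` stand for hypothetical sensor/mapping interfaces.
def repositioning_loop(device, scene_map, frames_differ):
    prev_frame = device.capture_image()                      # step 1201: image information
    while device.is_cleaning():
        distances = device.measure_obstacle_distances()      # LDS separation distances
        scene_map.update(distances, device.pose)             # step 1202: scene map
        curr_frame = device.capture_image()
        if frames_differ(prev_frame, curr_frame):            # step 1203: abrupt change?
            # Re-determine the pose from fresh distance readings against the map,
            # rather than trusting the (now invalid) odometry track.
            device.pose = scene_map.relocalize(distances)
        prev_frame = curr_frame
```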
Further, the operation strategy of the intelligent cleaning device 100 can be determined according to the scene map and the scene type represented by the image information. The scene type is the type of environment in which the intelligent cleaning device 100 is located. For example, it may be determined from the image information whether the intelligent cleaning device 100 is in a living room or a bedroom.
In one embodiment, the corresponding scene type may be identified from corresponding feature objects in the image information, for example, the scene is determined to be a living room from a sofa, and the scene is determined to be a toilet from a toilet bowl.
In another embodiment, the determination may be further performed by comparing the image information with preset scene information pre-stored in the intelligent cleaning device 100, where the preset scene information may include living room scene information, bathroom scene information, or kitchen scene information. The preset scene information may be acquired around a circle in advance in each scene before the intelligent cleaning device 100 operates.
In this embodiment, the working strategy may be to determine whether the scene where the intelligent cleaning device is located is a cleaning area. For example, when the scene identified from the image information is a living room, cleaning may be performed, and for example, when the scene identified from the image information is a toilet, cleaning may be stopped; or the working strategy may be to adjust the cleaning mode of the intelligent cleaning device 100 according to the identified scene, for example, when the scene identified according to the image information is a living room, cleaning in a high-power mode is adopted, and for example, when the scene identified according to the image information is a bedroom, cleaning in a low-power mode is adopted. Of course, this is merely an example, and other scene types, such as kitchen, restaurant, etc., may be recognized according to the image information, but the disclosure is not limited thereto.
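As an illustrative sketch of such a working strategy, the following could be used; the scene names, power levels, and excluded-area list are assumptions for this example, not values prescribed by the disclosure.

```python
# Minimal sketch of choosing a working strategy from the identified scene type,
# in the spirit of the examples above (living room -> high power, bedroom -> low power,
# toilet -> do not clean). All mappings here are illustrative assumptions.
NO_CLEAN_SCENES = {"toilet"}                        # e.g. user-excluded areas
POWER_BY_SCENE = {"living_room": "high", "bedroom": "low"}

def working_strategy(scene_type: str) -> dict:
    if scene_type in NO_CLEAN_SCENES:
        return {"clean": False}
    return {"clean": True, "power_mode": POWER_BY_SCENE.get(scene_type, "normal")}

# working_strategy("living_room") -> {"clean": True, "power_mode": "high"}
# working_strategy("toilet")      -> {"clean": False}
```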
According to the embodiment, the intelligent cleaning equipment can determine whether the position of the intelligent cleaning equipment changes suddenly according to the acquired image information, and re-determine the position information of the intelligent cleaning equipment in the scene map when the position changes suddenly, so that the situation that the intelligent cleaning equipment is positioned wrongly after the position changes suddenly can be avoided, and the positioning accuracy of the intelligent cleaning equipment is enhanced.
Corresponding to the foregoing embodiments of the repositioning method, the present disclosure also provides embodiments of a repositioning apparatus applying the above repositioning method.
FIG. 13 is a block diagram illustrating an apparatus for relocation according to an example embodiment. Referring to fig. 13, the apparatus includes an image acquisition module 1301, a map construction module 1302, and a position determination module 1303; wherein:
an image acquisition module 1301 configured to acquire image information within an environment in which the intelligent cleaning apparatus is located;
a map building module 1302 configured to build a scene map about the environment according to the separation distance between the intelligent cleaning device and the obstacles in the environment;
a position determining module 1303 configured to re-determine the position information of the intelligent cleaning device in the scene map when the image information represents a sudden change in the position of the intelligent cleaning device.
Fig. 14 is a block diagram illustrating another relocation apparatus according to an exemplary embodiment, which may further include an operation policy determining module 1304 on the basis of the foregoing embodiment illustrated in fig. 13, wherein:
an operating strategy determining module 1304 configured to determine an operating strategy of the intelligent cleaning device according to the scene map and the scene type represented by the image information.
Fig. 15 is a block diagram illustrating still another relocation apparatus according to an exemplary embodiment, where on the basis of the foregoing embodiment illustrated in fig. 14, the operation policy determination module 1304 may further include a cleaning area determination unit 13041 and a cleaning mode determination unit 13042; wherein:
a cleaning region determining unit 13041 configured to determine a cleaning region of the intelligent cleaning apparatus;
a cleaning mode determining unit 13042 configured to determine a cleaning mode of the intelligent cleaning apparatus.
In other embodiments, the operation strategy determining module 1304 may also include one of the cleaning region determining unit 13041 and the cleaning mode determining unit 13042, which is not described in detail herein.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. One of ordinary skill in the art can understand and implement it without inventive effort.
Correspondingly, the present disclosure also provides a relocating device, including: a control device and a memory for storing instructions executable by the control device. Wherein the control device is configured to obtain image information within an environment in which the intelligent cleaning apparatus is located; constructing a scene map about the environment according to the spacing distance between the intelligent cleaning equipment and the obstacles in the environment; when the image information represents sudden position change of the intelligent cleaning device, the position information of the intelligent cleaning device in the scene map is determined again. The memory is used to store executable instructions to implement the above-described operations.
Correspondingly, the present disclosure also provides a terminal, which comprises a memory and one or more programs. The one or more programs are stored in the memory and configured to be executed by one or more control devices, and include instructions for: acquiring image information in the environment where the intelligent cleaning device is located; constructing a scene map of the environment according to the separation distance between the intelligent cleaning device and obstacles in the environment; and, when the image information represents an abrupt change in the position of the intelligent cleaning device, re-determining the position information of the intelligent cleaning device in the scene map.
FIG. 16 is a block diagram illustrating an apparatus 1600 for relocation according to an example embodiment. For example, the apparatus 1600 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, personal digital assistant, or the like.
Referring to fig. 16, apparatus 1600 may include one or more of the following components: processing component 1602, memory 1604, power component 1606, multimedia component 1608, audio component 1610, input/output (I/O) interface 1612, sensor component 1614, and communications component 1616.
The processing component 1602 generally controls overall operation of the device 1600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1602 may include one or more control devices 1620 that execute instructions to perform all or some of the steps of the above-described methods. Further, the processing component 1602 can include one or more modules that facilitate interaction between the processing component 1602 and other components. For example, the processing component 1602 can include a multimedia module to facilitate interaction between the multimedia component 1608 and the processing component 1602.
The memory 1604 is configured to store various types of data to support operation of the apparatus 1600. Examples of such data include instructions for any application or method operating on device 1600, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1604 may be implemented by any type of volatile or non-volatile storage device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A power supply component 1606 provides power to the various components of the device 1600. The power supply components 1606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power.
The multimedia component 1608 may include a screen to provide an output interface between the apparatus 1600 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be configured as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1608 comprises a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1600 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1610 is configured to output and/or input an audio signal. For example, audio component 1610 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1600 is in an operational mode, such as a call mode, recording mode, and voice recognition mode. The received audio signal may further be stored in the memory 1604 or transmitted via the communications component 1616. In some embodiments, audio component 1610 further includes a speaker for outputting audio signals.
I/O interface 1612 provides an interface between the processing component 1602 and peripheral interface modules, which can be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
Sensor assembly 1614 includes one or more sensors for providing status assessment of various aspects to device 1600. For example, sensor assembly 1614 can detect an open/closed state of device 1600, the relative positioning of components, such as a display and keypad of device 1600, a change in position of device 1600 or a component of device 1600, the presence or absence of user contact with device 1600, orientation or acceleration/deceleration of device 1600, and a change in temperature of device 1600. The sensor assembly 1614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communications component 1616 is configured to facilitate communications between the apparatus 1600 and other devices in a wired or wireless manner. The device 1600 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1616 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1600 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), control devices, micro-control devices, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 1604 comprising instructions, executable by the control apparatus 1620 of the apparatus 1600 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (22)

1. An intelligent cleaning device, comprising:
the visual receiving device is used for acquiring image information in the environment where the intelligent cleaning equipment is located;
distance sensing means for obtaining a separation distance between the intelligent cleaning device and an obstacle in the environment to construct a scene map of the environment in which the intelligent cleaning device is located; and
the control device is used for re-determining the position information of the intelligent cleaning device in the scene map when the image information represents that the position of the intelligent cleaning device suddenly changes.
2. The intelligent cleaning apparatus of claim 1, further comprising:
a cleaning device, the working strategy of which can be matched with the scene type represented by the image information and the scene map constructed by the distance sensing device.
3. The intelligent cleaning apparatus of claim 2, wherein the operating strategy comprises at least one of:
clean area, clean mode.
4. The intelligent cleaning apparatus according to claim 1, wherein the optical axis of the vision receiving device is disposed obliquely with respect to a travel plane of the intelligent cleaning apparatus.
5. The intelligent cleaning apparatus according to claim 4, wherein the optical axis of the visual receiving device is aimed in a direction of travel of the intelligent cleaning apparatus for acquiring image information located in front of the intelligent cleaning apparatus along the direction of travel.
6. The intelligent cleaning apparatus according to claim 4, wherein the optical axis of the visual receiver is aimed in a direction opposite to a direction of travel of the intelligent cleaning apparatus for acquiring image information located behind the intelligent cleaning apparatus along the direction of travel.
7. The intelligent cleaning apparatus according to claim 4, wherein the optical axis of the vision receiving device makes an acute angle of not less than 40 ° with a travel plane of the intelligent cleaning apparatus;
or the acute angle is not more than 30 °.
8. The smart cleaning apparatus of claim 1 wherein the field of view of the visual receiving device is not less than 65 ° or the field of view is less than 45 °.
9. The intelligent cleaning apparatus according to claim 1, wherein the intelligent cleaning apparatus comprises a top cover, the top cover comprises a first opening, and the visual receiving device comprises a lens, the lens protrudes from the first opening to obtain the image information.
10. The intelligent cleaning device according to claim 1, wherein the intelligent cleaning device comprises a top cover and a device body, and the visual receiving device is located in a containing space enclosed by the top cover and the device body;
the top cover comprises a second opening, a lens is arranged at the second opening to seal the second opening, and the lens of the visual receiving device acquires the image information through the lens.
11. The intelligent cleaning apparatus according to claim 10, wherein in the optical axis direction of the vision receiving device, the cross-sectional area of the second opening at each position is positively correlated to the distance from the lens at each position, and at the same position, the cross-sectional area of the vision receiving device viewing angle range is not greater than the cross-sectional area of the second opening.
12. The intelligent cleaning device of claim 1, further comprising a bracket, wherein the bracket comprises a fixing seat, a mounting seat, and a connecting portion for connecting the fixing seat and the mounting seat;
the fixing seat is fixed to the device body, the mounting seat forms the acute angle with the travel plane of the intelligent cleaning device, and the mounting seat is connected to the vision receiving device.
13. The intelligent cleaning device of claim 1, wherein the intelligent cleaning device comprises a key structure, and centers of the key structure, the vision receiving device, and the distance sensing device are located on the same straight line.
14. The intelligent cleaning device of claim 13, wherein the key structure is located between the vision receiving device and the distance sensing device; or the vision receiving device is located between the key structure and the distance sensing device.
15. A repositioning method applied to an intelligent cleaning device, characterized by comprising:
acquiring image information of an environment in which the intelligent cleaning device is located;
constructing a scene map of the environment according to a separation distance between the intelligent cleaning device and an obstacle in the environment; and
re-determining position information of the intelligent cleaning device in the scene map when the image information indicates that the position of the intelligent cleaning device has changed abruptly.
16. The repositioning method of claim 15, further comprising:
determining a working strategy of the intelligent cleaning device according to a scene type represented by the image information and according to the scene map.
17. The repositioning method of claim 16, wherein the determining the working strategy of the intelligent cleaning device according to the scene type and the scene map comprises:
determining a cleaning area of the intelligent cleaning device;
and/or determining a cleaning mode of the intelligent cleaning device.
18. A repositioning apparatus applied to an intelligent cleaning device, the apparatus comprising:
an image acquisition module configured to acquire image information of an environment in which the intelligent cleaning device is located;
a map construction module configured to construct a scene map of the environment according to a separation distance between the intelligent cleaning device and an obstacle in the environment; and
a position determination module configured to re-determine position information of the intelligent cleaning device in the scene map when the image information indicates that the position of the intelligent cleaning device has changed abruptly.
19. The repositioning apparatus of claim 18, further comprising:
a working strategy determination module configured to determine a working strategy of the intelligent cleaning device according to the scene type and the scene map.
20. The repositioning apparatus of claim 18, wherein the working strategy determination module comprises:
a cleaning area determination unit configured to determine a cleaning area of the intelligent cleaning device;
and/or a cleaning mode determination unit configured to determine a cleaning mode of the intelligent cleaning device.
21. A computer-readable storage medium having computer instructions stored thereon, wherein the instructions, when executed by a control device, implement the steps of the method according to any one of claims 15 to 17.
22. An electronic device, comprising:
a control device; and
a memory for storing instructions executable by the control device;
wherein the control device is configured to implement the steps of the method according to any one of claims 15 to 17.
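
The following is a minimal, illustrative Python sketch (not part of the claims) of the repositioning flow recited in claims 15 to 18: a scene map is accumulated from distance measurements, the image stream is monitored for an abrupt position change (for example, the device being lifted and set down elsewhere), and the pose is then re-estimated against the stored map. All names, the frame-difference threshold and the brute-force candidate scoring are assumptions chosen for illustration; the claims do not prescribe any particular algorithm.

import numpy as np

KIDNAP_THRESHOLD = 0.6  # assumed fraction of changed pixels that signals an abrupt position change

def build_scene_map(scans):
    """Accumulate 2-D obstacle points from (pose, ranges, angles) distance scans."""
    points = []
    for (x, y, theta), ranges, angles in scans:
        for r, a in zip(ranges, angles):
            points.append((x + r * np.cos(theta + a), y + r * np.sin(theta + a)))
    return np.array(points)

def position_changed_abruptly(prev_frame, frame):
    """Heuristic detector: a very large inter-frame difference suggests the device
    was displaced rather than driving normally."""
    if prev_frame is None:
        return False
    changed = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16)) > 30
    return changed.mean() > KIDNAP_THRESHOLD

def relocalize(scene_map, current_scan, candidate_poses):
    """Re-determine the pose by scoring candidate poses against the stored map
    (exhaustive scan matching, for illustration only)."""
    def score(pose):
        x, y, theta = pose
        pts = np.array([(x + r * np.cos(theta + a), y + r * np.sin(theta + a))
                        for r, a in current_scan])
        # mean distance from each re-projected scan point to its nearest map point
        d = np.min(np.linalg.norm(scene_map[None, :, :] - pts[:, None, :], axis=2), axis=1)
        return -d.mean()
    return max(candidate_poses, key=score)

A production system would more plausibly use particle filtering or visual-feature matching for the re-determination step; the structure above only mirrors the acquire / map / re-determine sequence of the method claims.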
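
A second small numeric sketch illustrates claims 8 and 11: with a field of view of at least 65°, the cross-section of the second opening has to grow with distance from the lens so that it never cuts into the view cone, which is why the two quantities are positively correlated. The cone model and the millimetre distances below are hypothetical values used only for illustration.

import math

def min_opening_radius(distance_from_lens_mm: float, fov_deg: float) -> float:
    """Radius of the view cone at a given distance from the lens along the optical axis."""
    return distance_from_lens_mm * math.tan(math.radians(fov_deg) / 2)

FOV_DEG = 65.0  # lower bound on the field of view from claim 8

for d in (2.0, 4.0, 6.0):  # hypothetical distances (mm) through the top-cover opening
    print(f"at {d:.0f} mm from the lens the opening radius must be at least "
          f"{min_opening_radius(d, FOV_DEG):.2f} mm")
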
CN201811392739.6A 2018-11-21 2018-11-21 Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment Pending CN111202470A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811392739.6A CN111202470A (en) 2018-11-21 2018-11-21 Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811392739.6A CN111202470A (en) 2018-11-21 2018-11-21 Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN111202470A true CN111202470A (en) 2020-05-29

Family

ID=70780303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811392739.6A Pending CN111202470A (en) 2018-11-21 2018-11-21 Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111202470A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011059296A2 (en) * 2009-11-16 2011-05-19 엘지전자 주식회사 Robot cleaner and method for controlling same
CN105074600A (en) * 2013-02-27 2015-11-18 夏普株式会社 Surrounding environment recognition device, autonomous mobile system using same, and surrounding environment recognition method
CN107428008A (en) * 2014-12-09 2017-12-01 睿智通机器人有限公司 The touch of robot perceives
CN107569181A (en) * 2016-07-04 2018-01-12 九阳股份有限公司 A kind of Intelligent cleaning robot and cleaning method
CN107969995A (en) * 2017-11-27 2018-05-01 深圳市沃特沃德股份有限公司 Vision sweeping robot and its method for relocating
CN108021884A (en) * 2017-12-04 2018-05-11 深圳市沃特沃德股份有限公司 The sweeper power-off continuous of view-based access control model reorientation sweeps method, apparatus and sweeper
CN108247647A (en) * 2018-01-24 2018-07-06 速感科技(北京)有限公司 A kind of clean robot
CN210541347U (en) * 2018-11-21 2020-05-19 北京石头世纪科技股份有限公司 Intelligent cleaning device and relocating device for same

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114601373A (en) * 2021-10-18 2022-06-10 北京石头世纪科技股份有限公司 Control method and device for cleaning robot, cleaning robot and storage medium
CN114305208A (en) * 2021-12-17 2022-04-12 深圳市倍思科技有限公司 Driving method, device, equipment, program product and system of intelligent cleaning equipment

Similar Documents

Publication Publication Date Title
US9962055B2 (en) Mute operation method and apparatus for automatic cleaning device
CN106797416B (en) Screen control method and device
CN106444786B (en) The control method and device and electronic equipment of sweeping robot
CN107544495B (en) Cleaning method and device
CN107920263B (en) Volume adjusting method and device
CN107169595B (en) Method and device for drawing room layout
EP2930705A1 (en) Method and apparatus for controlling smart terminal
CN106737709B (en) Cleaning method and device
JP2019537777A (en) Terminal, touch recognition method, device, and electronic device
EP3125152B1 (en) Method and device for collecting sounds corresponding to surveillance images
CN111374614A (en) Control method and device of cleaning equipment and storage medium
CN105515952B (en) Method of sending message in multimedia and device
CN107479551B (en) Method and device for controlling movement
CN210541347U (en) Intelligent cleaning device and relocating device for same
US10191708B2 (en) Method, apparatrus and computer-readable medium for displaying image data
CN104243829A (en) Self-shooting method and self-shooting device
CN106210543A (en) imaging apparatus control method and device
CN111202470A (en) Intelligent cleaning equipment, repositioning method and device, storage medium and electronic equipment
CN107485335B (en) Identification method, identification device, electronic equipment and storage medium
CN106886019B (en) Distance measurement method and device
CN105208378B (en) Camera head protecting method, apparatus and terminal
CN112135035B (en) Control method and device of image acquisition assembly and storage medium
CN105988139B Intelligent cleaning device and collision detection assembly thereof, and method and apparatus for getting unstuck
CN104954683B (en) Determine the method and device of photographic device
CN111398970B (en) Event detection method, distance sensor and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination