CN114652227B - Intelligent cleaning robot control method, system, device, electronic equipment and medium

Info

Publication number: CN114652227B
Application number: CN202210278350.9A
Authority: CN (China)
Prior art keywords: image, target, information, target object, matching degree
Legal status: Active (application granted)
Other versions: CN114652227A
Other languages: Chinese (zh)
Inventor: 廖海波
Current Assignee: Dongguan Pinjia Intelligent Technology Co ltd
Original Assignee: Dongguan Pinjia Intelligent Technology Co ltd
Application filed by Dongguan Pinjia Intelligent Technology Co ltd
Priority to CN202210278350.9A
Publication of CN114652227A (application) and CN114652227B (grant)

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/28 Floor-scrubbing machines, motor-driven
    • A47L11/282 Floor-scrubbing machines, motor-driven having rotary tools
    • A47L11/283 Floor-scrubbing machines, motor-driven having rotary tools, the tools being disc brushes
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4036 Parts or details of the surface treating tools
    • A47L11/4038 Disk shaped surface treating tools
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/24 Reminder alarms, e.g. anti-loss alarms
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application relates to the technical field of intelligent equipment, and in particular to an intelligent cleaning robot control method, system, device, electronic equipment and medium. The technical scheme is characterized in that: when the arrival information of the target main body is received in real time, the bottom image of the target main body is acquired and analyzed to obtain a target object image; the target object image is then compared with a preset target object reference image to obtain a comparison result.

Description

Intelligent cleaning robot control method, system, device, electronic equipment and medium
Technical Field
The application relates to the technical field of intelligent cleaning, and in particular to an intelligent cleaning robot control method, system, device, electronic equipment and medium.
Background
Intelligent cleaning robots are commonly used in household cleaning: they can replace manual sweeping of dust, hair and other debris from the floor, and can also replace manual wet mopping.
A current intelligent cleaning robot comprises a base station and a robot main body. A rotary mop for wiping the floor is arranged at the bottom of the robot main body, a mop layer is arranged at the bottom of the rotary mop, and the mop is fastened to the bottom of the rotary mop by a hook-and-loop fastener. A working cavity is formed in the lower side wall of the base station, a cleaning pool is arranged in the working cavity, and a water outlet is arranged at one side of the cleaning pool. While the user wet-mops the floor with the intelligent cleaning robot, if the rotary mop needs washing, the user injects water into the cleaning pool; the robot main body then moves into the working cavity so that the rotary mop soaks in the cleaning pool, and drives the rotary mop to rotate, thereby washing it. After washing, the water in the cleaning pool is discharged from the water outlet, and the robot main body controls the rotary mop to spin the water off so that it can continue mopping the floor.
With regard to the above technology, the inventor found that the mop at the bottom of the rotary mop is prone to breaking after long use. Because the mop of the rotary mop normally faces the ground, the user cannot easily notice that it is broken, or can only discover the damage after lifting the whole robot main body to look, which makes it inconvenient to replace a broken mop in time.
Disclosure of Invention
In order to facilitate the user replacing the mop of the rotary mop in time, the application provides an intelligent cleaning robot control method, system, device, electronic equipment and medium.
The first object of the present invention is achieved by the following technical solutions:
An intelligent cleaning robot control device comprises a sub-controller arranged on a base station and an image acquisition unit for acquiring mop images, wherein the sub-controller is connected with the image acquisition unit. When the sub-controller detects, through a detection unit, that the robot main body is in the working cavity, it starts the image acquisition unit to acquire a mop image of the robot main body. The robot main body is provided with a reminding unit, and the reminding unit is connected with the sub-controller.
By adopting the above technical scheme: after the robot main body completes a cleaning operation, it enters the working cavity over the guide block. When the detection unit detects that the robot main body has entered the working cavity, it sends a detection signal to the sub-controller, which starts the image acquisition unit so that a mop image of the bottom of the robot main body can be acquired. When the mop is damaged, the mop image does not match the preset mop image, so the sub-controller can judge from the comparison between the two whether the mop is damaged. When the mop image and the preset mop image do not match, the sub-controller sends a signal to the reminding unit to control it, and the reminding unit reminds the user to replace the mop.
Preferably, a guide block for the robot main body to enter the working cavity is fixedly arranged at the cavity opening of the working cavity of the base station. The bottom surface of the guide block is fixedly connected with the cavity bottom of the working cavity, and the top surface of the guide block is an inclined surface over which the robot main body moves into the working cavity. The top surface of the guide block is transparent, an installation cavity for installing the image acquisition unit is formed in the side surface of the guide block, and the acquisition end of the image acquisition unit faces the top surface of the guide block.
By adopting the above scheme, the robot main body must pass over the guide block when entering or leaving the working cavity. Because the top surface of the guide block is transparent, the image acquisition unit, once started, can capture images of the bottom of the robot main body as it moves over the top surface, which makes collecting mop images convenient.
The second object of the present invention is achieved by the following technical solutions:
the intelligent cleaning robot control method comprises the following steps:
receiving the inbound information of the target main body in real time;
transmitting a start image acquisition signal based on the inbound information;
receiving a bottom image of the target main body in real time;
analyzing the bottom image and acquiring a target object image in the bottom image;
comparing the target object image with a preset target object reference image to obtain a comparison matching degree;
when the comparison result does not conform to the preset comparison result, determining that the target object is damaged and sending a reminding signal to the terminal.
By adopting this scheme, when the inbound information of the target main body is received in real time, the image acquisition mode is started to acquire the bottom image of the target main body; the bottom image is analyzed to obtain a target object image, which is then compared with a preset target object reference image to obtain a comparison result. This comparison result measures the damage condition of the target object: when the target object is damaged, its image differs from the preset target object image, so the comparison result does not conform to the preset comparison result. The damage can therefore be detected in this way, a reminder is sent to the terminal, and the user can replace the mop in time based on the reminder.
Preferably, after the step of receiving the inbound information of the target main body in real time and before the step of receiving the bottom image of the target main body in real time, the method further comprises:
Continuously generating a light supplementing signal to enhance the brightness in the image acquisition area;
or, alternatively,
acquiring a light intensity signal of a detection environment in real time;
comparing the light intensity with a light intensity threshold;
and when the light intensity is smaller than the light intensity threshold, generating a light supplementing signal to enhance the brightness in the image acquisition area.
By adopting the above technical scheme, continuously generating the light supplementing signal enhances the brightness in the image acquisition area; on the one hand this reminds the user that the device is currently in the image acquisition state, and on the other hand it improves the brightness of the image acquisition environment and hence the image acquisition quality. Alternatively, by detecting the ambient light intensity in real time and deciding whether to generate the light supplementing signal according to it, supplementary lighting is applied to the image acquisition area only when the ambient light intensity is low, which saves power.
Preferably, the step of analyzing the bottom image and acquiring a target object image in the bottom image includes:
extracting a plurality of pieces of contour feature information from the bottom image;
comparing each piece of contour feature information with contour feature reference information to obtain a first contour matching degree corresponding to each piece;
determining the image associated with the contour feature information with the highest first contour matching degree as the target image.
By adopting this scheme, after the bottom image information of the target main body is acquired, the contour feature information of the bottom image is extracted first. Because the bottom of the target main body carries various structures, multiple pieces of contour feature information are extracted, and the target image is determined from the one with the highest first contour matching degree.
Preferably, the step of comparing the target object image with a preset target object reference image to obtain a comparison result includes:
determining a target area image based on the contour feature information associated with the target image;
extracting edge feature information of the target area image;
comparing the edge feature information associated with the target area image with edge feature reference information to obtain an edge feature matching degree;
extracting contour feature information from within the target area image;
comparing the contour feature information associated with the target area image with second contour feature reference information to obtain a second contour feature matching degree;
when the edge feature matching degree is smaller than the edge reference matching degree, and/or the second contour feature matching degree is smaller than the second contour reference matching degree, determining that the comparison result does not conform.
By adopting this scheme, the target area image is determined from the contour feature information associated with the target image. Comparing edge feature information determines whether the edge of the target object is complete, i.e. whether a reminding signal needs to be generated; contour feature information is also extracted from within the target area image to obtain the second contour matching degree. When extra contours appear inside the target area image, for example when the target object is torn, the second contour matching degree is smaller than the second contour reference matching degree. The comparison result is judged not to conform when either or both of the matching degrees fall below their reference values, which improves the accuracy of the image comparison result of the target object.
Preferably, the method further comprises:
receiving outbound information of the target main body in real time;
receiving a target object image in real time based on the outbound information;
determining a target area image of the target object based on the target object image;
extracting color feature information from within the target area image;
comparing the color feature information associated with the interior of the target area image with color reference information to obtain a color feature matching degree;
when the color feature matching degree is smaller than the color reference matching degree, sending a reminding signal to the terminal.
By adopting this scheme, when the outbound information of the target main body is received, the color feature information within the target area image is compared with the color reference information to obtain the color feature matching degree. When this matching degree is smaller than the color reference matching degree, the color within the target area image is abnormal, and a reminding signal is sent to remind the user to check in time.
The third object of the present invention is achieved by the following technical solutions:
An intelligent cleaning robot control system comprises:
a position information receiving module, used for receiving the inbound information of the target main body in real time;
a start image acquisition module, used for sending a start image acquisition signal based on the inbound information;
an image receiving module, used for receiving the bottom image of the target main body in real time;
an analysis module, used for analyzing the bottom image and acquiring a target object image in the bottom image;
a comparison module, used for comparing the target object image with a preset target object reference image to obtain a comparison matching degree;
a signal sending module, used for determining that the target object is damaged and sending a reminding signal to the terminal when the comparison result does not conform to the preset comparison result.
By adopting this scheme, the position information receiving module receives the inbound information of the target main body in real time and a start image acquisition signal is sent; the image acquisition unit then collects the bottom image of the target main body, the analysis module analyzes the bottom image to obtain the target object image, and the comparison module compares the target object image with the preset target object reference image to obtain a comparison result. This comparison result measures the damage condition of the target object: when the target object is damaged, its image differs from the preset target object image, so the comparison result does not conform to the preset comparison result. Damage to the target object can therefore be confirmed in this way, whereupon the signal sending module sends a reminding signal to the terminal so that the user can replace the mop in time based on the reminder.
The fourth object of the present application is achieved by the following technical solutions:
An electronic device comprising a memory and a processor, the memory having stored thereon a computer program capable of being loaded by the processor to execute the above intelligent cleaning robot control method.
The fifth purpose of the present application is achieved by the following technical solutions:
A computer readable storage medium storing a computer program capable of being loaded by a processor to execute any one of the above intelligent cleaning robot control methods.
In summary, the beneficial technical effects of the application are:
when the arrival information of the target main body is received in real time, the bottom image of the target main body can be acquired and analyzed to obtain a target object image; the target object image is then compared with a preset target object reference image to obtain a comparison result, from which damage to the target object can be determined so that the user can be reminded in time.
Drawings
Fig. 1 is a schematic structural view of the intelligent cleaning robot of the present application.
Fig. 2 is a schematic structural view of the robot body of the present application.
Fig. 3 is a schematic flow chart of the intelligent cleaning robot control method.
Fig. 4 is a bottom structural view of the robot body of the present application.
Fig. 5 is a specific flowchart of step S3 in one embodiment of the present application.
Fig. 6 is a specific flowchart of step S4 in one embodiment of the present application.
Fig. 7 is a block diagram of a control system of the intelligent cleaning robot in one embodiment of the present application.
Fig. 8 is a block diagram of an electronic device in one embodiment of the present application.
Reference numerals illustrate:
1. a base station; 101. a working chamber; 102. a cleaning pool; 103. a guide block; 2. a robot main body; 201. a casing; 21. a rotary mop; 211. a mop; 22. a cleaning mechanism; 31. a transmitter; 32. a receiver; 41. a position information receiving module; 42. a start image acquisition module; 43. an image receiving module; 44. an analysis module; 45. a comparison module; 46. a signal sending module.
Detailed Description
The present application is described in further detail below in conjunction with fig. 1-8.
Intelligent cleaning robots came into being to make indoor cleaning more convenient. Referring to fig. 1 and 2, the intelligent cleaning robot includes a base station 1 and a robot main body 2. The working chamber 101 is provided in the side of the base station 1. The robot main body 2 includes a casing 201, and a main controller (not shown) is provided in the casing 201. Two travelling wheels 6 are arranged at the bottom of the casing 201, and a driving motor (not shown) for driving the travelling wheels 6 to rotate is also provided in the casing 201; the driving motor is electrically connected with the main controller, so that the main controller can control the robot main body 2 to move over the ground or the surface of an object. A cleaning mechanism 22 and a rotary mop 21 are provided at the bottom of the casing 201. The rotary mop 21 includes a connecting rod, a connecting disc and a mop 211; the connecting rod is vertical, the connecting disc is horizontal, the lower end of the connecting rod is fixedly connected with the upper surface of the connecting disc, and the lower surface of the connecting disc is detachably connected with the mop 211 by a hook-and-loop fastener. The upper end of the connecting rod extends into the casing 201, where a first motor (not shown) is provided whose output shaft is fixedly connected with the upper end of the connecting rod, so that the rotary mop can be driven to rotate at high speed. The connecting disc is black and the mop 211 is white, so that the user can easily see how dirty the mop 211 is. The main controller can control the cleaning mechanism 22 and the rotary mop 21 to clean and wipe the ground. A cleaning tank 102 is arranged in the working chamber 101 near the chamber wall.
After the robot body 2 completes the cleaning work, the main controller can control the robot body 2 to move back into the working chamber 101 of the base station 1. And the main controller synchronously controls the water supply mechanism to supply water to the cleaning tank 102 in the process that the robot main body 2 moves into the working cavity 101. When the robot body 2 is moved inside the working chamber 101, the rotary mop 21 can enter the cleaning tank 102, at which time cleaning of the mop 211 can be achieved.
Since the mop 211 is easily damaged after long use, the user needs to replace a damaged mop 211 in time so that subsequent mopping can proceed. The application therefore proposes an intelligent cleaning robot control device, applied to the intelligent cleaning robot, that conveniently reminds the user to replace the mop 211.
The intelligent cleaning robot control device comprises a sub-controller, an image acquisition unit and a reminding unit. The sub-controller and the image acquisition unit are both arranged at the base station, and the reminding unit is arranged on the robot main body 2. The sub-controller is connected with the image acquisition unit and the reminding unit. The image acquisition unit is used for acquiring a mop image of the robot main body 2; the reminding unit is used for reminding the user to check and replace the mop 211. The sub-controller detects whether the robot main body 2 has entered the working cavity 101 through a detection unit: when it has, the detection unit sends a detection signal to the sub-controller, which starts the image acquisition unit to acquire a mop image of the robot main body. The image acquisition unit then sends the mop 211 image to the sub-controller, which determines the damage condition of the mop 211 associated with the image by analysis and comparison. When the mop 211 is determined to be damaged, the sub-controller controls the reminding unit to issue a reminder so that the user can check the condition of the mop 211 in time and replace it.
Specifically, in order to facilitate the robot main body 2 entering the working cavity 101, a guide block 103 is fixedly arranged at the cavity opening of the working cavity 101 of the base station. The bottom surface of the guide block 103 is fixedly connected with the cavity bottom of the working cavity 101; the top surface of the guide block 103 is an inclined surface whose inclination rises from the cavity opening of the working cavity 101 in the direction of the cleaning pool 102, so that the robot main body 2 can move into the working cavity 101 and the rotary mop 21 can be positioned at the cleaning pool 102. The travelling wheels 6 of the robot main body move over the top surface of the guide block 103 to enter and leave the working chamber 101.
The inclined top surface of the guide block 103 is transparent, made of transparent glass or transparent plastic. An installation cavity for mounting the image acquisition unit is provided in the side of the guide block 103 so that the unit can capture images of the mop. In this embodiment, the image acquisition unit is a camera; in other embodiments, it may be any other electronic device capable of capturing images. The acquisition end of the image acquisition unit faces the top surface of the guide block 103 so as to photograph the bottom of the robot main body 2 and thereby obtain the mop image.
Specifically, the sub-controller may be a single-chip microcomputer or another control chip capable of realizing signal transmission and control functions.
Specifically, in the present embodiment, the detection unit comprises a photoelectric proximity switch sensor including a transmitter 31 and a receiver 32; the receiver 32 is provided at a position where the base station 1 is close to the cleaning tank 102, and the transmitter 31 is provided on the robot main body 2. The receiver 32 at the base station can receive infrared light emitted by the transmitter 31. When the robot main body 2 has entered the working chamber 101 and thus approaches the cleaning tank 102, the receiver 32 at the base station 1 receives the infrared light emitted by the transmitter 31 and sends a signal to the sub-controller, which starts the image acquisition unit in preparation for acquiring the bottom image of the robot main body 2.
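For illustration, a minimal Python sketch of this detection flow is given below; `ir_received` and `notify_sub_controller` are hypothetical bindings to the receiver 32 and the sub-controller, since the patent specifies the behavior but not a software interface:

    import time

    def detection_loop(ir_received, notify_sub_controller, poll_s=0.1):
        # Poll the photoelectric proximity switch: when the receiver 32 sees
        # the infrared light of the transmitter 31, the robot main body 2 has
        # entered the working chamber 101 and the sub-controller is signalled
        # so that it can start the image acquisition unit.
        while True:
            if ir_received():                      # infrared light detected
                notify_sub_controller("inbound")   # detection signal
            time.sleep(poll_s)                     # check again shortly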
When the sub-controller determines from the analysis and comparison results that the mop 211 of the current robot main body 2 is damaged, it controls the reminding unit to send a reminder to the user. The reminding unit comprises a display lamp and/or a buzzer; the flashing display lamp and the sounding buzzer can work together to remind the user. In one embodiment, the user may be reminded only by the flashing of the display lamp. In another embodiment, the user may be reminded by the buzzer alone. In other embodiments, the reminding unit is connected to the user's terminal (e.g. mobile phone, tablet, etc.), for example wirelessly, to realize information transmission, and can send a short message to the user's terminal to remind the user to check and replace the mop 211 in time.
The application also discloses an intelligent cleaning robot control method, realized on the basis of the intelligent cleaning robot control device. Since the robot main body 2 generally performs cleaning work after leaving the base station 1, the control method starts the image acquisition mode when the robot main body 2 enters the working chamber 101 of the base station 1, and performs image acquisition as the robot main body 2 goes out and passes over the guide block 103, so that whether the mop 211 is damaged can be determined before the robot main body 2 starts cleaning.
Referring to fig. 3, the intelligent cleaning robot control method includes:
S1, receiving the inbound information of the target main body in real time.
Here, the target main body refers to the robot main body 2, and the inbound information refers to the information generated when the robot main body 2 enters the working chamber 101 of the base station 1.
Specifically, when the detection unit detects that the target subject has arrived, the detection unit transmits the inbound information to the sub-controller.
S2, based on the inbound information, sending a start image acquisition signal.
The sub-controller receives and analyzes the inbound information of the target main body in real time and determines that the robot main body 2 has entered the working cavity 101; it then generates a start image acquisition signal and sends it to the image acquisition unit, thereby starting the unit.
S3, receiving the bottom image of the target main body in real time, analyzing the bottom image and acquiring a target object image in the bottom image.
Wherein the target object is a mop 211 rotating the mop bottom.
After the image acquisition unit is activated, it can photograph objects passing over the top surface of the guide block 103 in real time. When the robot main body 2 moves out of the working cavity 101, it passes over the top surface of the guide block 103 at the cavity opening, and since the top surface of the guide block 103 lies below the bottom of the robot main body 2, the image acquisition unit can acquire the bottom image of the robot main body 2 in real time. Once collection is finished, the image acquisition unit sends the bottom image to the sub-controller, which analyzes it and obtains the mop 211 image within it.
Specifically, referring to fig. 3 and 5, step S3 includes:
S31, extracting a plurality of pieces of contour feature information from the bottom image.
Fig. 4 shows a bottom image of the robot body 2, and since the bottom image of the robot body 2 has a large number of structures, it is necessary to extract a plurality of contour features of the bottom image.
S32, comparing the plurality of pieces of contour feature information with the contour feature reference information and respectively obtaining a first contour matching degree corresponding to each piece.
Because the robot main body 2 travels on the top surface of the guide block 103, its bottom surface stays essentially parallel to that top surface, and the acquisition end of the image acquisition unit faces the top surface; the captured images therefore look squarely at the bottom of the robot main body 2, which improves the accuracy of the subsequent image comparison.
Specifically, the mop 211 of the rotary mop is basically circular in shape, so the user presets the contour feature reference information and stores it in the sub-controller. The plurality of pieces of contour feature information extracted in step S31 are then compared with the contour feature reference information, each piece yielding a first contour matching degree.
For example: the bottom image of the robot main body 2 includes a bottom image of the casing 201, a mop 211 image, a road wheel bottom image, and the like, and for convenience of understanding, a plurality of profile feature information are named profile feature information a, profile feature information B, profile feature information C, respectively. And comparing the contour feature information A, the contour feature information B and the contour feature information C with contour feature reference information respectively:
And comparing the contour feature information A with the contour feature reference information to obtain a first contour matching degree A of 80%.
And comparing the contour feature information B with the contour feature reference information to obtain a first contour matching degree B of 45%.
And comparing the contour feature information C with contour feature reference information to obtain a first contour matching degree C of 98%.
S33, determining the image associated with the contour feature information with the highest first contour matching degree as the target image.
In step S32, the first contour matching degree C associated with contour feature information C is highest, so the image formed by contour feature information C is determined as the target image, i.e. the mop 211 image.
In other embodiments, in order to improve the accuracy of acquiring the mop 211 image, step S32 may be performed multiple times; the average of the first contour matching degrees obtained for each piece of contour feature information is then taken, and the final comparison is made using these averages.
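A minimal sketch of steps S31-S33 with OpenCV is given below, assuming the bottom image is available as a grayscale array and a reference contour of the roughly circular mop is pre-stored; mapping the `matchShapes` distance to a 0-1 matching degree via 1/(1+d) is an assumption, since the patent does not fix a formula:

    import cv2

    def select_target_contour(bottom_gray, reference_contour):
        # S31: extract every contour from the bottom image.
        _, binary = cv2.threshold(bottom_gray, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # S32: score each contour against the circular mop reference;
        # matchShapes returns a distance, so 1/(1+d) makes a similarity.
        scored = [(1.0 / (1.0 + cv2.matchShapes(c, reference_contour,
                                                cv2.CONTOURS_MATCH_I1, 0.0)), c)
                  for c in contours]
        # S33: the contour with the highest first contour matching degree
        # delimits the target image (the mop 211 image).
        return max(scored, key=lambda t: t[0], default=(0.0, None))

Averaging over several acquisitions, as suggested above, only requires calling this function repeatedly and averaging the returned scores.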
S4, comparing the target object image with a preset target object reference image to obtain a comparison matching degree.
The user presets a white, round mop image with unbroken surface and edges as the mop 211 reference image, and stores it in the sub-controller.
After step S3 has been performed, the target image is determined, i.e. the image of the mop 211 is determined, and the image of the mop 211 is compared with the preset image of the mop 211 according to step S4. Specifically, referring to fig. 3 and 6, step S4 includes:
S41, determining a target area image based on the contour feature information associated with the target image.
Based on the target image determined in step S3 and its contour feature information, the target area image is determined. For the mop image, the target area image comprises the edge of the mop 211 and the area enclosed by that edge.
S42, extracting edge feature information of the target area image.
The target image is sharpened so that its edge features become clearer.
In this embodiment, the Canny algorithm is adopted to extract the edge feature information of the target area image, so as to improve the accuracy of edge feature extraction of the target area image.
S43, comparing the edge feature information associated with the target area image with the edge feature reference information to obtain the edge feature matching degree.
The user can pre-enter the edge feature reference information and the edge reference matching degree into the sub-controller. In this embodiment, the edge reference matching degree is preset to 98%. In other embodiments, the user may adjust the edge reference matching degree according to the actual condition of the mop 211.
The edge feature information obtained in step S42 is matched with the edge feature reference information: the higher the similarity between them, the higher the resulting edge feature matching degree; conversely, the lower the similarity, the lower the matching degree.
When the edge feature matching degree is greater than the edge reference matching degree, the comparison result is determined to conform.
When the edge feature matching degree is smaller than the edge reference matching degree, the comparison result is determined not to conform.
In other embodiments, after step S41, it may further include:
S4a1, extracting contour feature information from within the target area image.
The contour feature information is extracted from the target area image to detect whether the surface of the mop 211 is damaged or perforated.
S4a2, comparing the contour feature information associated with the target area image with the second contour feature reference information to obtain a second contour feature matching degree.
The second contour feature reference information can be pre-entered into the sub-controller by the user. When the surface of the mop 211 is perforated, torn or otherwise damaged, the contour information associated with the target area image contains additional contours that are not present in the second contour feature reference information. Therefore, after the contour information associated with the target area image has been extracted in step S4a1, it is compared with the second contour feature reference information; when the surface of the mop 211 is damaged, the similarity between the two is lower, and the resulting second contour feature matching degree is smaller.
Assume the second contour feature matching degree is 60% and the second contour reference matching degree is 90%: the former is smaller than the latter, so the comparison result is determined not to conform. Conversely, assume the second contour feature matching degree is 95% against a reference of 90%: it is greater, so the comparison result is determined to conform.
In other embodiments, steps S41-S43 may be performed simultaneously with steps S4a1-S4a2, and the edge feature matching degree obtained in steps S41-S43 may be combined with the second contour feature matching degree obtained in steps S4a1-S4a2 to determine the comparison result. The specific combinations are as follows:
    Edge feature matching degree    Second contour feature matching degree    Comparison result
    < edge reference                < second contour reference                does not conform
    < edge reference                ≥ second contour reference                does not conform
    ≥ edge reference                < second contour reference                does not conform
    ≥ edge reference                ≥ second contour reference                conforms
That is, when either the edge feature matching degree is smaller than the edge reference matching degree, or the second contour feature matching degree is smaller than the second contour reference matching degree, or both, the comparison result does not conform; this improves the accuracy of the image comparison result of the target object.
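Expressed as code, the combined decision of the table above is a single conjunction; the 98% and 90% defaults follow the reference matching degrees used in this embodiment:

    def comparison_conforms(edge_degree, second_contour_degree,
                            edge_ref=0.98, second_contour_ref=0.90):
        # The result conforms only when BOTH matching degrees reach their
        # reference values; if either one (or both) falls below, the mop
        # is treated as damaged and a reminder must be generated.
        return (edge_degree >= edge_ref
                and second_contour_degree >= second_contour_ref)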
S5, when the comparison result does not conform to the preset comparison result, determining that the target object is damaged and sending a reminding signal to the terminal.
Specifically, when the comparison result is determined not to conform, the mop 211 is judged to be damaged. The sub-controller then generates a start reminding signal and sends it to the reminding unit; after receiving it, the reminding unit generates a reminding signal and sends it to the terminal. The reminding unit can deliver the reminding signal in several ways, for example:
In one embodiment, reminding information can be sent to the user's electronic terminal (such as a mobile phone or tablet) to remind the user.
In another embodiment, the reminding unit reminds the user by sound (e.g. a buzzer or bell) to check and replace the mop 211 in time.
In another embodiment, the reminding unit reminds the user by light (e.g. a flashing indicator lamp) to check and replace the mop 211 in time.
In another embodiment, the reminding unit combines the light and sound modes, or selects one of them, to realize the reminding function. Specifically:
A reminder mode selection is displayed, the options including a baby mode and a normal mode. The baby mode covers the case where a baby is in the room and sounding is inappropriate, for example because the baby is asleep; the user can pre-select the baby mode, in which the reminding unit only lights up when issuing a reminder. When the user selects the normal mode, the reminding unit combines the sound and light modes to remind the user.
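A small sketch of this mode selection follows; `light` and `buzzer` are hypothetical driver objects with an `activate()` method, since the patent names the behaviors rather than an API:

    from enum import Enum

    class ReminderMode(Enum):
        BABY = "baby"      # light only, e.g. while a baby is asleep
        NORMAL = "normal"  # light and sound together

    def remind(mode, light, buzzer):
        # Both modes flash the display lamp; sound is suppressed in baby mode.
        light.activate()
        if mode is ReminderMode.NORMAL:
            buzzer.activate()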
In another embodiment, if the rotary mop 21 is still poorly cleaned after washing and the robot main body 2 continues its cleaning work with it, the floor is difficult to mop well. The application can therefore also detect the cleanliness of the mop 211, and the method further includes:
S61, receiving the outbound information of the target main body in real time.
In general, the robot main body 2 returns to the working chamber 101 of the base station to wash the rotary mop 21, leaves the working chamber 101 once the washing is completed, and then performs the floor mopping operation.
In this embodiment, the sub-controller is connected with an infrared ranging sensor, which detects that the robot main body 2 is about to move out of the working cavity 101; the sensor then transmits the outbound information of the robot main body 2 to the sub-controller. After the sub-controller receives the outbound information, the process proceeds to step S62.
S62, receiving the target object image in real time based on the outbound information.
After receiving the outbound information, the sub-controller is triggered to receive the image of the mop 211: it sends a start image acquisition signal to the image acquisition unit, which then acquires the image of the mop 211 in real time and sends it to the sub-controller. The sub-controller receives the mop 211 image and proceeds to the next operation.
S63, determining a target area image of the target object based on the target object image.
In step S63, the target area image of the mop 211 image needs to be determined, and the specific embodiment of this step is the same as the specific embodiment of step S41, and will not be described here again.
S64, extracting color feature information from the target area image. This specifically includes the following steps:
The user enters into the sub-controller the color reference information corresponding to each pixel of the preset target object reference image. The usual color of the mop 211 is white, which corresponds to RGB values of (255, 255, 255). However, considering that the surface color of the mop 211 changes slightly after several uses, the color reference information in this embodiment corresponds to RGB values of (225, 240, 230), i.e. nearly light grey.
The color feature corresponding to each pixel of the target area image is extracted to obtain the target color information, which is represented as RGB values.
S65, comparing the color feature information associated with the interior of the target area image with the color reference information to obtain the color feature matching degree. This specifically includes the following steps:
The target color information is compared with the color reference information channel by channel.
For example: assuming the target color information is RGB (255, 255, 255), every channel is at or above the color reference information, so the color feature matching degree is 100%.
Another example: assuming the target color information is RGB (0, 0, 255), the R and G channels fall below the color reference information, so the color feature matching degree is 0%.
Another example: assuming the target color information is RGB (250, 0, 255), the G channel falls below the color reference information, so the color feature matching degree is 0%.
Another example: assuming the target color information is RGB (200, 255, 100), the R and B channels fall below the color reference information, so the color feature matching degree is 0%.
In summary, when every channel of the target color information is at or above the corresponding channel of the color reference information, the color feature matching degree is 100%; when any channel falls below the reference, the color feature matching degree is 0%.
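A direct transcription of this channel-wise rule is sketched below; scoring a whole region as the fraction of matching pixels is an assumption, since the worked examples only score a single color value:

    def color_feature_matching(pixels_rgb, reference_rgb=(225, 240, 230)):
        # A pixel matches only if every channel is at or above the
        # corresponding channel of the near-light-grey reference.
        matched = sum(1 for r, g, b in pixels_rgb
                      if r >= reference_rgb[0]
                      and g >= reference_rgb[1]
                      and b >= reference_rgb[2])
        return matched / max(len(pixels_rgb), 1)

    # Examples from the description: (255, 255, 255) matches, while
    # (0, 0, 255), (250, 0, 255) and (200, 255, 100) each have at least
    # one channel below the reference and therefore do not match.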
S66, when the color feature matching degree is smaller than the color reference matching degree, sending a reminding signal to the terminal.
The user can enter the color reference matching degree into the sub-controller; in this embodiment it is 50%. When the color feature matching degree is 0%, it is smaller than the color reference matching degree, so a reminding signal is sent to the terminal; when the color feature matching degree is 100%, it is higher than the color reference matching degree, so no reminding signal is sent. The cleanliness of the mop 211 is thus detected through color matching, and whether the user needs to be reminded to check and replace the mop 211 in time is determined accordingly.
The reminding mode of step S66 is the same as the reminding mode of step S5, and will not be described here again.
In an embodiment, after the step of receiving the inbound information of the target main body in real time and before the step of receiving the bottom image of the target main body in real time, a light supplementing method is further provided, the light supplementing method comprising:
The light supplementing signal is continuously generated to enhance the brightness in the image acquisition area (i.e. in the working cavity 101). On the one hand this reminds the user that the device is currently in the image acquisition state; on the other hand it improves the brightness of the image acquisition environment and hence the image acquisition quality.
Alternatively, in another embodiment, the light supplementing method includes:
A1, acquiring a light intensity signal of the detected environment in real time. An illumination intensity sensor is arranged in the installation cavity of the guide block 103, and a light supplementing unit, which may be an LED lamp, is also arranged in the installation cavity of the guide block 103. The illumination intensity sensor is connected with the sub-controller and detects the light intensity of the environment, measured in lux (lx).
A2, comparing the light intensity with a light intensity threshold.
A3, when the light intensity is smaller than the light intensity threshold, generating a light supplementing signal to enhance the brightness in the image acquisition area. Specifically:
Assume the light intensity range required for image acquisition is 30 lx-50 lx and the illumination intensity sensor detects that the current ambient light intensity is 20 lx, which is below this range. The sensor sends the light intensity signal to the sub-controller, which generates a light supplementing control signal and sends it to the light supplementing unit; the light supplementing unit then operates to supplement light for the environment.
By detecting the ambient light intensity in real time and then deciding whether to generate the light supplementing signal according to it, supplementary lighting is applied to the image acquisition area only when the ambient light intensity is low, which saves power.
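Steps A1-A3 reduce to a threshold check; in the sketch below, `read_lux` and `set_fill_light` are hypothetical bindings to the illumination intensity sensor and the LED light supplementing unit, and the 30 lx default follows the 30 lx-50 lx range in the example:

    def fill_light_control(read_lux, set_fill_light, low_threshold=30):
        lux = read_lux()              # A1: ambient light intensity in lux
        if lux < low_threshold:       # A2: compare with the threshold
            set_fill_light(True)      # A3: supplement light for acquisition
        else:
            set_fill_light(False)     # save power when ambient light suffices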
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not limit the implementation of the embodiments of the present application in any way.
The embodiment of the application further provides an intelligent cleaning robot control system, which corresponds one-to-one to the intelligent cleaning robot control method in the above embodiment. As shown in fig. 7, it includes a position information receiving module 41, a start image acquisition module 42, an image receiving module 43, an analysis module 44, a comparison module 45 and a signal sending module 46. The detailed description of each functional module is as follows:
Position information receiving module 41: used for receiving the inbound information of the target main body in real time.
Start image acquisition module 42: used for transmitting a start image acquisition signal based on the inbound information.
Image receiving module 43: used for receiving the bottom image of the target main body in real time.
Analysis module 44: used for analyzing the bottom image and acquiring the target object image in the bottom image.
Comparison module 45: used for comparing the target object image with the preset target object reference image to obtain the comparison matching degree.
Signal sending module 46: used for determining that the target object is damaged and sending a reminding signal to the terminal when the comparison result does not conform to the preset comparison result.
The position information receiving module 41 receives the inbound information of the target main body in real time, and the start image acquisition module 42 transmits a start image acquisition signal; the image acquisition unit then collects the bottom image of the target main body, the analysis module 44 analyzes the bottom image to obtain the target object image, and the comparison module 45 compares the target object image with the preset target object reference image to obtain a comparison result. This comparison result measures the damage condition of the target object: when the target object is damaged, its image differs from the preset target object image, so the comparison result differs from the preset comparison result. Damage to the target object can therefore be confirmed in this way, whereupon the signal sending module 46 sends a reminding signal to the terminal so that the user can replace the mop 211 in time based on the reminder.
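The module layout of fig. 7 can be pictured as the following sketch, where each injected callable stands in for one functional module (41-46); the interfaces are assumptions, since the patent defines the modules only by their responsibilities:

    class IntelligentCleaningRobotControlSystem:
        def __init__(self, receive_position, start_acquisition,
                     receive_image, parse_image, compare, send_signal):
            self.receive_position = receive_position    # module 41
            self.start_acquisition = start_acquisition  # module 42
            self.receive_image = receive_image          # module 43
            self.parse_image = parse_image              # module 44
            self.compare = compare                      # module 45
            self.send_signal = send_signal              # module 46

        def run_once(self):
            # Mirror the S1-S5 flow: on inbound information, acquire and
            # parse the bottom image, compare against the reference, and
            # send a reminder when the result does not conform.
            if self.receive_position():
                self.start_acquisition()
                bottom = self.receive_image()
                target = self.parse_image(bottom)
                if not self.compare(target):
                    self.send_signal("replace the mop")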
For specific limitations of the intelligent cleaning robot control system, reference may be made to the limitations of the intelligent cleaning robot control method above, which are not repeated here. Each module in the above intelligent cleaning robot control system may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in or independent of a processor of the electronic device in hardware form, or stored in a memory of the electronic device in software form, so that the processor can call and execute the operations corresponding to the modules.
The embodiment of the application also provides an electronic device, namely the sub-controller of the robot main body 2. As shown in fig. 8, it comprises a memory and a processor, the memory storing a computer program capable of being loaded by the processor to execute the following intelligent cleaning robot control method:
S1, receiving the inbound information of the target main body in real time.
S2, receiving a bottom image of the target main body in real time based on the inbound information.
S3, analyzing the bottom image and acquiring a target object image in the bottom image.
S4, comparing the target object image with a preset target object reference image to obtain a comparison matching degree.
S5, when the comparison result does not conform to the preset comparison result, determining that the target object is damaged and sending a reminding signal to the terminal.
In addition, the processor in the electronic device, when executing the computer program, performs the steps of all the intelligent cleaning robot control methods described above.
The electronic device may be a server, whose internal structure may be as shown in fig. 8. The electronic device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the electronic device provides computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the electronic device stores the target object reference image, the contour feature reference information, the edge reference matching degree, and the like. The network interface of the electronic device communicates with an external terminal through a network connection. When executed by the processor, the computer program implements the intelligent cleaning robot control method.
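As a concrete illustration of the reference data this database holds, here is a hypothetical Python container grouping the quantities named above; the field names are assumptions made for this sketch, not a schema defined by the application.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class ReferenceStore:
    """Hypothetical container for the reference data kept in the database."""
    target_reference_image: np.ndarray               # preset target object (mop) reference image
    contour_reference_info: np.ndarray               # contour feature reference information
    edge_reference_matching_degree: float            # threshold for the edge feature comparison
    second_contour_reference_matching_degree: float  # threshold for the second contour comparison
    color_reference_matching_degree: float           # threshold for the color comparison (claim 5)
```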
The embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a processor, performs the following steps:
S1, receiving the inbound information of the target subject in real time;
S2, receiving a bottom image of the target subject in real time based on the inbound information;
S3, parsing the bottom image and acquiring a target object image in the bottom image;
S4, comparing the target object image with a preset target object reference image to obtain a comparison result;
S5, when the comparison result does not match the preset comparison result, determining that the target object is damaged and sending a reminder signal to the terminal.
When executing the computer program, the processor can also perform the steps of the intelligent cleaning robot control method in any of the embodiments described above.
Those skilled in the art will appreciate that all or part of the above methods may be implemented by a computer program, which may be stored on a non-transitory computer-readable storage medium and which, when executed, may include the flows of the above method embodiments. Any reference to memory, storage, database, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division into functional units and modules is illustrated by example; in practical applications, the above functions may be allocated to different functional units and modules as needed, i.e. the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included within the scope of the present application.

Claims (8)

1. An intelligent cleaning robot control method, characterized in that: the intelligent cleaning robot comprises a secondary controller and an image acquisition unit, the secondary controller is arranged on a base station (1), the image acquisition unit is used for acquiring images of a mop (211), the secondary controller is connected with the image acquisition unit, the secondary controller detects through a detection unit that a robot main body (2) is in a working cavity (101) and starts the image acquisition unit to acquire a mop image of the robot main body (2), and the robot main body (2) is provided with a reminding unit connected with the secondary controller;
a guide block (103) for the robot main body (2) to enter the working cavity (101) is fixedly arranged at the mouth of the working cavity (101) of the base station (1), the bottom surface of the guide block (103) is fixedly connected with the bottom of the working cavity (101), the top surface of the guide block (103) is an inclined surface along which the robot main body (2) moves into the working cavity (101) and is transparent, a mounting cavity for mounting the image acquisition unit is formed in the side surface of the guide block (103), and the acquisition end of the image acquisition unit faces the top surface of the guide block (103);
the method comprises the following steps:
receiving the inbound information of the target subject in real time;
sending a start image acquisition signal based on the inbound information;
receiving a bottom image of the target subject in real time;
parsing the bottom image and acquiring a target object image in the bottom image;
comparing the target object image with a preset target object reference image to obtain a comparison result;
when the comparison result does not match the preset comparison result, determining that the target object is damaged and sending a reminder signal to the terminal.
2. The intelligent cleaning robot control method according to claim 1, wherein: after the step of receiving the inbound information of the target subject in real time and before the step of receiving the bottom image of the target subject in real time, the method further comprises:
continuously generating a light supplementing signal to enhance the brightness in the image acquisition area;
or, alternatively,
acquiring a light intensity signal of the detection environment in real time;
comparing the light intensity with a light intensity threshold;
and when the light intensity is smaller than the light intensity threshold, generating a light supplementing signal to enhance the brightness in the image acquisition area.
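By way of illustration only (not forming part of the claim language), the second branch of this claim amounts to a simple threshold check. In the following Python sketch, read_light_intensity and set_fill_light are hypothetical hardware callables assumed for the example.

```python
def maybe_supplement_light(read_light_intensity, set_fill_light, light_threshold):
    """Enable the fill light only when the ambient light is too weak (illustrative sketch).

    read_light_intensity -- hypothetical callable returning the sensed light intensity
    set_fill_light       -- hypothetical callable switching the light-supplementing unit on/off
    light_threshold      -- the preset light intensity threshold
    """
    intensity = read_light_intensity()           # acquire the light intensity signal
    set_fill_light(intensity < light_threshold)  # supplement light only below the threshold
```

Compared with the first branch (an always-on fill light), this variant saves power at the cost of a light sensor.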
3. The intelligent cleaning robot control method according to claim 1, wherein: the step of parsing the bottom image and acquiring the target object image in the bottom image comprises:
extracting a plurality of pieces of contour feature information from the bottom image;
comparing each piece of contour feature information with contour feature reference information and respectively obtaining a first contour matching degree corresponding to each piece of contour feature information;
and determining the image associated with the contour feature information having the highest first contour matching degree as the target object image.
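One plausible realization of this step (illustrative only, not mandated by the claim) uses OpenCV contour extraction and shape matching. Since cv2.matchShapes returns a dissimilarity, this sketch converts it to a matching degree in (0, 1] so that higher means more similar; that mapping is an assumption of the example.

```python
import cv2


def find_target_object(bottom_image_gray, reference_contour):
    """Select the contour of the bottom image that best matches the reference (illustrative)."""
    # Binarize the grayscale bottom image so contours can be extracted.
    _, binary = cv2.threshold(bottom_image_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best_contour, best_degree = None, -1.0
    for contour in contours:
        # matchShapes yields 0 for identical shapes; map it to a matching degree.
        dissimilarity = cv2.matchShapes(contour, reference_contour,
                                        cv2.CONTOURS_MATCH_I1, 0.0)
        degree = 1.0 / (1.0 + dissimilarity)
        if degree > best_degree:
            best_contour, best_degree = contour, degree
    # The contour with the highest first contour matching degree locates the target object.
    return best_contour, best_degree
```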
4. The intelligent cleaning robot control method according to claim 1, wherein: the step of comparing the target object image with a preset target object reference image to obtain a comparison result comprises:
determining a target area image based on the contour feature information associated with the target object image;
extracting edge feature information of the target area image;
comparing the edge feature information associated with the target area image with edge feature reference information to obtain an edge feature matching degree;
extracting contour feature information from the target area image;
comparing the contour feature information associated with the target area image with second contour feature reference information to obtain a second contour feature matching degree;
and when the edge feature matching degree is smaller than the edge reference matching degree, and/or the second contour feature matching degree is smaller than the second contour reference matching degree, determining that the comparison result does not match.
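By way of illustration only, the decision rule of this claim can be sketched as follows. The IoU-style comparison of Canny edge maps and the fixed Canny thresholds are assumptions of this example; only the final either-or mismatch test comes from the claim.

```python
import cv2
import numpy as np


def edge_matching_degree(region_gray, reference_gray):
    """Compare Canny edge maps of two equally sized grayscale regions (illustrative)."""
    edges = cv2.Canny(region_gray, 100, 200)
    ref_edges = cv2.Canny(reference_gray, 100, 200)
    overlap = np.count_nonzero(np.logical_and(edges, ref_edges))
    union = np.count_nonzero(np.logical_or(edges, ref_edges))
    return overlap / union if union else 1.0  # 1.0 when both edge maps are empty


def comparison_result_matches(edge_degree, second_contour_degree,
                              edge_reference, second_contour_reference):
    """Claim 4 decision: the result mismatches if either degree falls below its reference."""
    return (edge_degree >= edge_reference
            and second_contour_degree >= second_contour_reference)
```

A False return corresponds to the "comparison result does not match" branch, i.e. the mop is treated as damaged.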
5. The intelligent cleaning robot control method according to any one of claims 1 to 4, wherein the method further comprises:
receiving the outbound information of the target subject in real time;
receiving a target object image in real time based on the outbound information;
determining a target area image of the target object based on the target object image;
extracting color feature information from the target area image;
comparing the color feature information associated with the interior of the target area image with color reference information to obtain a color feature matching degree;
and when the color feature matching degree is smaller than the color reference matching degree, sending a reminder signal to the terminal.
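For illustration only, the color comparison could be realized with histogram correlation; the HSV histogram and the cv2.HISTCMP_CORREL choice below are assumptions of this sketch, not requirements of the claim.

```python
import cv2


def color_matching_degree(region_bgr, reference_bgr):
    """Compare HSV color histograms of the mop region against the reference (illustrative)."""
    hists = []
    for image in (region_bgr, reference_bgr):
        hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
        # 2-D histogram over hue (range 0-180) and saturation (range 0-256).
        hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
        cv2.normalize(hist, hist, 0, 1, cv2.NORM_MINMAX)
        hists.append(hist)
    # Correlation is 1.0 for identical color distributions and drops as they diverge.
    return cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)
```

When this degree falls below the color reference matching degree, the reminder signal of this claim is sent.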
6. An intelligent cleaning robot control system, characterized in that: the control system implements the intelligent cleaning robot control method according to claim 1 and comprises:
position information receiving module (41): used for receiving the inbound information of the target subject in real time;
image acquisition starting module (42): used for sending a start image acquisition signal based on the inbound information;
image receiving module (43): used for receiving a bottom image of the target subject in real time;
parsing module (44): used for parsing the bottom image and acquiring the target object image in the bottom image;
comparison module (45): used for comparing the target object image with a preset target object reference image to obtain a comparison result;
signal transmitting module (46): used for determining that the target object is damaged and sending a reminder signal to the terminal when the comparison result does not match the preset comparison result.
7. An electronic device, characterized in that: it comprises a memory and a processor, the memory storing a computer program that can be loaded by the processor to perform the intelligent cleaning robot control method according to any one of claims 3-5.
8. A computer-readable storage medium, characterized in that: it stores a computer program that can be loaded by a processor to perform the intelligent cleaning robot control method according to any one of claims 3-5.
CN202210278350.9A 2022-03-21 2022-03-21 Intelligent cleaning robot control method, system, device, electronic equipment and medium Active CN114652227B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210278350.9A CN114652227B (en) 2022-03-21 2022-03-21 Intelligent cleaning robot control method, system, device, electronic equipment and medium


Publications (2)

Publication Number Publication Date
CN114652227A (en) 2022-06-24
CN114652227B (en) 2023-06-16

Family

ID=82030575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210278350.9A Active CN114652227B (en) 2022-03-21 2022-03-21 Intelligent cleaning robot control method, system, device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN114652227B (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3807011B2 (en) * 1997-03-13 2006-08-09 松下電器産業株式会社 Floor nozzle for vacuum cleaner
CN207755219U (en) * 2017-06-23 2018-08-24 杭州九阳小家电有限公司 A kind of cleaning robot system
WO2020125489A1 (en) * 2018-12-21 2020-06-25 苏州宝时得电动工具有限公司 Robot cleaner and control method therefor, and floor treating system
CN211093818U (en) * 2019-09-10 2020-07-28 深圳市神州云海智能科技有限公司 Mop device
CN211723013U (en) * 2019-10-25 2020-10-23 苏州宝时得电动工具有限公司 Cleaning robot
CN215457703U (en) * 2021-03-25 2022-01-11 深圳市银星智能科技股份有限公司 Maintenance station and cleaning system
CN113017506B (en) * 2021-03-25 2022-09-13 深圳银星智能集团股份有限公司 Mop cleaning method and maintenance station for a cleaning robot
CN113273933B (en) * 2021-05-24 2022-04-12 美智纵横科技有限责任公司 Cleaning robot, control method and device thereof, and storage medium
CN215874494U (en) * 2021-06-15 2022-02-22 杭州匠龙机器人科技有限公司 Roll and drag module and cleaning robot system thereof

Also Published As

Publication number Publication date
CN114652227A (en) 2022-06-24

Similar Documents

Publication Publication Date Title
CN106377209B (en) Movable cleaning device and control method thereof
JP7007078B2 (en) Vacuum cleaner
KR101840158B1 (en) Electric vacuum cleaner
CN111789538B (en) Method and device for determining degree of soiling of cleaning mechanism, and storage medium
CN112274065B (en) Cleaning robot and control method thereof
TWI726031B (en) Electric sweeper
US20180310872A1 (en) Skin detection device and detection method thereof
CN107358175B (en) Iris collection method and electronic device
US20190227566A1 (en) Self-propelled vacuum cleaner
CN108937726A (en) A kind of clean robot awakening method and device based on cleannes identification
CN110636789A (en) Electric vacuum cleaner
WO2023138365A1 (en) Method for controlling cleaning device, and device and storage medium
CN114652227B (en) Intelligent cleaning robot control method, system, device, electronic equipment and medium
CN110967703A (en) Indoor navigation method and indoor navigation device using laser radar and camera
CN114617478B (en) Cleaning control method and device, sweeping robot and storage medium
CN217365667U (en) Automatic cleaning equipment
CN111945375B (en) Clothes treatment device, control method thereof and readable storage medium
CN115277811A (en) Intelligent water affair operation and maintenance method and system based on Internet of things
KR100831702B1 (en) A position detection method and a position move equipment of robot cleaner
CN114109095A (en) Swimming pool cleaning robot and swimming pool cleaning method
CN112507970A (en) Unmanned car washing image recognition anti-collision system and recognition method based on PointRend algorithm
CN110946514B (en) Monitoring method and monitoring device applied to movable cleaning equipment
CN114532919B (en) Multi-mode target detection method and device, sweeper and storage medium
CN111210657A (en) Intelligent video parking method, parking pile and parking system
JPH07175520A (en) Automatic travel device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: Building 1, No. 78 Qingyu Road, Qingxi Town, Dongguan City, Guangdong Province, 523000

Patentee after: Dongguan Pinjia Intelligent Technology Co.,Ltd.

Address before: 523000 Room 301, building 1, No.11 Zhensheng Road, Qingxi Town, Dongguan City, Guangdong Province

Patentee before: Dongguan Pinjia Intelligent Technology Co.,Ltd.
