CN111199627A - Safety monitoring method and chip for cleaning robot - Google Patents

Safety monitoring method and chip for cleaning robot

Info

Publication number
CN111199627A
CN111199627A
Authority
CN
China
Prior art keywords
cleaning robot
door
preset
determines
cleaning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010095306.5A
Other languages
Chinese (zh)
Other versions
CN111199627B (en)
Inventor
肖刚军
许登科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202010095306.5A priority Critical patent/CN111199627B/en
Publication of CN111199627A publication Critical patent/CN111199627A/en
Application granted granted Critical
Publication of CN111199627B publication Critical patent/CN111199627B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 17/00 - Fire alarms; Alarms responsive to explosion
    • G08B 17/10 - Actuation by presence of smoke or gases, e.g. automatic alarm devices for analysing flowing fluid materials by the use of optical means
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L 11/40 - Parts or details of machines not provided for in groups A47L 11/02 - A47L 11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L 11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 3/00 - Audible signalling systems; Audible personal calling systems
    • G08B 3/10 - Audible signalling systems; Audible personal calling systems using electric transmission; using electromagnetic transmission
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 2201/00 - Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L 2201/04 - Automatic control of the travelling movement; Automatic obstacle detection

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Electromagnetism (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The invention discloses a safety monitoring method and a chip for a cleaning robot. When a fire is detected, the cleaning robot is controlled to repeatedly strike a door, waking a sleeping person in a knocking-like manner so that the fire can be handled or escaped from in time. This avoids the problem of a user failing to hear an alarm sound promptly because the room is well sound-insulated, ensures the safety of the user, and improves the practicality of the cleaning robot.

Description

Safety monitoring method and chip for cleaning robot
Technical Field
The invention belongs to the field of intelligent cleaning robots, and particularly relates to a safety monitoring method and a chip of an intelligent cleaning robot.
Background
Existing cleaning robots can perform not only automatic vacuuming and sweeping but also automatic mopping, waxing and other functions. With the progress of science and technology, cleaning robots have become increasingly intelligent and feature-rich and are no longer limited to cleaning. For example, Chinese utility model patent CN207216832U discloses a fire-proof, theft-proof and gas-leakage-proof floor-sweeping robot equipped with a master controller, an alarm device, a human body sensor and a wireless communication module; it can raise remote fire, smoke and gas-leakage alarms, provide video monitoring for an anti-theft function, and support remote video calls with other people. Although this floor-sweeping robot can raise a fire alarm, it relies mainly on a loudspeaker to alert the user. If the user's room is well sound-insulated, such an audible alarm is not very effective, and it is even harder to hear when the user is in deep sleep at night.
Disclosure of Invention
In order to solve the technical problems, the invention provides the following technical scheme:
a safety monitoring method of a cleaning robot includes the steps of: step S1: the cleaning robot judges whether the preset condition of fire occurrence is met or not according to the detection data of the sensor, if so, the step S2 is carried out, otherwise, the default state is kept; step S2: the cleaning robot judges whether the current time is in a preset time period, if so, the step S3 is carried out, otherwise, the step S4 is carried out; step S3: the cleaning robot searches the stored map data, navigates to the position of the door of the master house according to the map data, repeatedly impacts the door and sends out an alarm prompt tone at the same time; step S4: the cleaning robot detects whether a person appears, if not, the cleaning robot moves and searches the person according to the stored map data and sends out an alarm prompt tone, and if so, the cleaning robot returns to the initial position and keeps the default state.
Further, the cleaning robot in step S1 determines whether the preset condition for fire occurrence is met according to the detection data of the sensor, and specifically includes the following steps: step S11: the cleaning robot judges whether the current brightness is higher than the brightness at the last moment by a preset brightness value according to the detection data of the brightness detection sensor, if so, the cleaning robot enters the step S12, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence and continues to keep detecting; step S12: and the cleaning robot judges whether the current temperature is higher than the temperature at the last moment by a preset temperature value according to the detection data of the temperature detection sensor, if so, the cleaning robot confirms that the cleaning robot accords with the preset condition of fire, otherwise, the cleaning robot confirms that the cleaning robot does not accord with the preset condition of fire, and the cleaning robot continues to maintain the detection.
Further, the cleaning robot in step S1 determines whether the preset condition for fire occurrence is met according to the detection data of the sensor, and specifically includes the following steps: step S11: the cleaning robot judges whether the current brightness is higher than the brightness at the last moment by a preset brightness value according to the detection data of the brightness detection sensor, if so, the cleaning robot enters the step S12, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence and continues to keep detecting; step S12: the cleaning robot judges whether the current temperature is higher than the temperature at the last moment by a preset temperature value according to the detection data of the temperature detection sensor, if so, the step S13 is carried out, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence, and continues to keep detecting; step S13: and the cleaning robot judges whether the current smoke concentration is higher than the smoke concentration at the last moment by a preset smoke concentration value or not according to the detection data of the smoke detection sensor, if so, the cleaning robot determines that the cleaning robot accords with the preset condition of fire, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire, and the cleaning robot continues to maintain the detection.
Further, the method for determining whether the current time is within the preset time period by the cleaning robot in step S2 includes the following steps: the cleaning robot determines the current time; the cleaning robot determines a preset time period; the cleaning robot judges whether the current time is within the range of the preset time period; the preset time period is set by a user or factory set by a manufacturer.
Further, in step S3 the cleaning robot searches the stored map data and navigates to the position of the door of the master house according to the map data, specifically including the following steps: the cleaning robot searches map data stored in a memory that it recorded during the last cleaning work, the map data comprising grid cell coordinates and an environment state corresponding to each grid cell coordinate; the cleaning robot determines its current position according to image data captured by its camera; the cleaning robot determines a navigation path formed by continuous grid cells whose environment state is passable between the current position and the position of the door of the master house; the cleaning robot walks along the navigation path to the position of the door of the master house; wherein the position of the door of the master house is set by the user through an intelligent terminal, and the intelligent terminal can synchronously display, in graphical form, the map data detected by the cleaning robot during the cleaning work.
Further, the cleaning robot in step S3 may issue an alarm sound while repeatedly hitting the door, specifically including the steps of: step S31: the cleaning robot broadcasts voice data prestored in the memory through the voice module; step S32: the cleaning robot retreats to a position away from the door by a preset distance, then directly moves towards the door in an accelerating manner, stops moving after detecting that an obstacle detection sensor at the front end of the cleaning robot is triggered, and proceeds to step S33, and stops moving if the moving distance exceeds the preset distance and the obstacle detection sensor is not triggered, and proceeds to step S4; step S33: the cleaning robot judges whether the number of times of triggering the obstacle detection sensor reaches a preset number of times, if so, the step S34 is carried out, otherwise, the step S32 is returned; step S34: the cleaning robot judges whether the door collided at present is the door of the last room, if so, the cleaning robot returns to the initial position and sends prompt information to the intelligent terminal bound by the user through the network, otherwise, the cleaning robot goes to step S35; step S35: the cleaning robot navigates to a location where a door of another room is located according to the map data, and retreats to a location away from the door by a preset distance, and then moves straight toward the door, and stops moving after detecting that the obstacle detection sensor at the front end of the cleaning robot is triggered, and proceeds to step S33, and stops moving if the moved distance exceeds the preset distance, and does not detect that the obstacle detection sensor is triggered, and proceeds to step S4.
Further, the step S4 of detecting whether a person is present by the cleaning robot includes the following steps: the cleaning robot starts a camera, analyzes whether a moving object exists in a shot image, analyzes whether the shape of the moving object accords with the character characteristics, determines that a person appears if the moving object exists, and otherwise determines that no person appears.
Further, the cleaning robot in step S4 moves and searches for a person according to the stored map data, specifically including the steps of: the cleaning robot searches map data recorded by the cleaning robot in the last cleaning work and stored in a memory, wherein the map data comprises grid unit coordinates and an environment state corresponding to each grid unit coordinate; the cleaning robot determines the current position of the cleaning robot according to image data shot by a camera of the cleaning robot; the cleaning robot determines a navigation path consisting of continuous grid units with a passable environmental state; the cleaning robot walks to different areas along the navigation path, analyzes whether a moving object exists in the shot image and analyzes whether the shape of the object accords with the character characteristics or not in the walking process, and if so, determines that the person is found, otherwise, determines that the person is not found.
Further, the default state is that the cleaning robot only keeps the main control module and the sensor for detecting the fire occurrence in the working state, and other circuit modules keep the dormant state.
A chip is internally provided with a program code, and a processor can control a cleaning robot to execute the safety monitoring method of the cleaning robot according to the program code.
The safety monitoring method of the cleaning robot can control the cleaning robot to repeatedly strike a door when a fire is detected, waking a sleeping person in a knocking-like manner so that the fire can be handled or escaped from in time. This avoids the problem of a user failing to hear an alarm sound promptly because the room is well sound-insulated, ensures the safety of the user, and improves the practicality of the cleaning robot.
Drawings
Fig. 1 is a schematic flow chart of a safety monitoring method of a cleaning robot according to an embodiment of the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the accompanying drawings:
as shown in fig. 1, the cleaning robot includes common intelligent robot products on the market such as floor-sweeping robots and floor-mopping robots. These products carry components such as a camera, an odometer, a gyroscope and infrared detection sensors, and can perform automatic walking, automatic vacuuming and sweeping, floor mopping, automatic path planning, obstacle avoidance and similar functions. These cleaning robots employ SLAM (Simultaneous Localization and Mapping) technology, also known as CML (Concurrent Mapping and Localization), i.e. simultaneous positioning and map building.
The safety monitoring method comprises the following steps:
step S1: the cleaning robot judges, from the detection data of its sensors, whether the preset condition for fire occurrence is met. The type of sensor and the detection data used are configured according to how the preset condition is set, and the preset condition is mainly designed when the robot product is manufactured. If the reference factors of the preset condition are mainly temperature and brightness, only a temperature sensor and a brightness sensor need be used, and the cleaning robot only needs to acquire temperature detection data and brightness detection data. If the reference factors are mainly temperature, brightness and smoke concentration, a temperature sensor, a brightness sensor and a smoke concentration sensor are used, and the robot accordingly acquires temperature, brightness and smoke concentration detection data. Of course, the preset condition for determining the occurrence of a fire may also refer to other factors, such as the light radiation detected by a flame detection sensor. After acquiring the detection data, the cleaning robot analyses it; only when the data meet certain conditions is the preset condition for fire occurrence judged to be met and a fire determined to be occurring. These conditions are also set according to the circumstances, generally by comparison against thresholds, for example whether the temperature reaches a temperature threshold, the brightness reaches a brightness threshold, the smoke concentration reaches a concentration threshold, or the light radiation reaches a radiation value; the thresholds can be obtained by summarising and analysing a large number of experiments. When the cleaning robot judges that the preset condition for fire occurrence is met, it proceeds to step S2 to continue the following operations; otherwise it determines that no fire is occurring and maintains the default state. The default state can also be configured according to different product design requirements, for example: the robot does not walk, while its other functions and components operate normally; or the robot is dormant and only the components capable of detecting a fire keep operating normally; or the robot keeps only the one sensor mainly used for fire detection operating normally, and the other detection sensors are triggered in turn once that sensor's data meets certain conditions.
Step S2: the cleaning robot judges whether the current time falls within a preset time period. The cleaning robot is provided with a clock module and a communication module and can acquire accurate time in real time, including year, month, day, hour and minute, generally Beijing time; standard time of other regions can also be configured depending on where the product is used. The preset time period can be configured according to the user's needs. For example, a user generally sleeps from 11 p.m. to 7 a.m.; during that period the user is usually asleep in a room with the door closed and can hardly notice a fire in the living room, where the cleaning robot is usually placed. Therefore, when the cleaning robot determines within the preset time period that a fire has occurred in the living room, this indicates that the user is sleeping and needs to be reminded in time to handle the fire or escape, so the method proceeds to step S3 for the subsequent operations. If the current time is not within the preset time period, the user is not sleeping and the cleaning robot does not need to perform the door-impact reminder, so the method proceeds to step S4 for subsequent processing.
Step S3: the cleaning robot searches the stored map data, which is map data recorded and stored by the robot during its last cleaning work, or valid map data imported into the cleaning robot directly by the user. The cleaning robot navigates to the position of the door of the master room according to the map data: since the owner should be notified first when a fire occurs, the cleaning robot preferentially navigates to the master room and reminds the owner by striking the door. The position of the door of the master room may be set by the user directly through an intelligent terminal, or obtained by the cleaning robot by analysing the detection data recorded during its cleaning work, in which case the obtained position information needs to be confirmed by the user. After navigating to the door position, the cleaning robot repeatedly strikes the door by moving back and forth while sounding an alarm prompt tone through its voice module. The alarm prompt tone may be set before the product leaves the factory, or set by the user during use, who can record a preferred voice message as the prompt tone.
Step S4: after the cleaning robot strikes the door and wakes the user, the user may open the door. At this point the cleaning robot detects the person through its camera or infrared thermal imager, then returns to the initial position and keeps the default state. The initial position is the position where the cleaning robot first detected the occurrence of the fire. If the door is not opened after being struck, the user may not be in that room; in that case the cleaning robot moves and searches for a person according to the stored map data while sounding the alarm prompt tone, to remind the user as far as possible.
With this method, the cleaning robot can be controlled to repeatedly strike the door when a fire is detected, effectively waking a sleeping person in a knocking-like manner so that the fire can be handled or escaped from in time. This avoids the problem of the user failing to hear the alarm sound promptly because the room is well sound-insulated, ensures the safety of the user, and improves the practicality of the cleaning robot.
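For illustration, the overall flow of steps S1 to S4 can be sketched as follows in Python. Every robot method used here (fire detection, time-window check, navigation and alarm routines) is a hypothetical placeholder introduced for this sketch, not an interface defined by this disclosure.

```python
# Illustrative sketch of the top-level monitoring flow (steps S1-S4);
# all robot methods are hypothetical placeholders.
def safety_monitor_step(robot):
    if not robot.fire_condition_met():        # step S1: sensor-based fire check
        robot.keep_default_state()
        return
    if robot.in_preset_period():              # step S2: preset sleeping time period?
        robot.navigate_to_master_room_door()  # step S3: go to the master-room door
        robot.impact_door_and_alarm()         # "knock" by repeated impacts + voice alarm
    elif not robot.person_detected():         # step S4: look for a person
        robot.search_person_and_alarm()
    else:
        robot.return_to_start()               # person found: fall back to default state
        robot.keep_default_state()
```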
As one embodiment, in step S1 the cleaning robot determines whether the preset condition for fire occurrence is met according to the detection data of the sensors, specifically including the following steps:
step S11: the cleaning robot judges whether the current brightness is higher than the brightness at the previous moment by a preset brightness value according to the detection data of the brightness detection sensor. The time interval between the current time and the previous time for judgment may be set during product design, or may be set by the user autonomously, and is generally set to any value from 1 second to 5 seconds. The preset brightness value is obtained by comparing the brightness value detected when the building is on fire with the normal brightness value of the house at night, and the data can be obtained through experimental tests. The brightness detection sensor can adopt components such as a light sensor or a photoresistor. If the current brightness is higher than the brightness at the previous moment by the preset brightness value, it indicates that a fire may be started, or the user turns on the lamp, and then the process goes to step S12 for further analysis and judgment. Otherwise, indicating that no abnormal condition exists at present, determining that the preset condition of fire occurrence is not met, and continuously keeping the detection of the cleaning robot.
Step S12: the cleaning robot judges, from the detection data of the temperature detection sensor, whether the current temperature exceeds the temperature at the previous moment by a preset temperature value. The moment at which the cleaning robot performs temperature detection is the same as the moment at which it performs brightness detection in step S11; that is, the robot performs brightness detection and temperature detection simultaneously, and only the judgements are made in sequence when determining whether a fire has occurred. The preset temperature value is smaller than the temperature produced by a flame; generally the flame temperature detected within a range of 0.1 to 10 metres from the flame is taken as the parameter, which can be obtained through experimental tests. The temperature detection sensor can be a thermistor sensor, a thermocouple sensor or a similar component. If the current temperature exceeds the temperature at the previous moment by the preset temperature value, the change in the current environment involves not only light but also heat, and the probability of a fire is high, so the cleaning robot determines that the preset condition for fire occurrence is met. Otherwise the robot can conclude that the user has merely turned on a light and no fire has occurred, determines that the preset condition for fire occurrence is not met, and keeps detecting.
With this method of comprehensive judgement combining brightness detection and temperature detection, whether a fire has occurred can be judged accurately, effectively improving the effectiveness and accuracy of the safety monitoring performed by the cleaning robot.
In another embodiment, in step S1 the cleaning robot determines whether the preset condition for fire occurrence is met according to the detection data of the sensors by the following steps:
step S11: the cleaning robot judges whether the current brightness is higher than the brightness at the last moment by a preset brightness value according to the detection data of the brightness detection sensor, if so, the step S12 is carried out, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence, and the detection is continuously kept. This step is the same as step S11 described in the previous embodiment, and is not described here again.
Step S12: and the cleaning robot judges whether the current temperature is higher than the temperature at the last moment by a preset temperature value according to the detection data of the temperature detection sensor, if so, the step S13 is carried out, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence, and the detection is continuously kept. This step is the same as step S12 described in the previous embodiment, and is not described here again.
Step S13: when the cleaning robot detects that both the brightness and the temperature meet the conditions, then in order to further improve the accuracy of the fire judgement, it also judges, from the detection data of the smoke detection sensor, whether the current smoke concentration exceeds the smoke concentration at the previous moment by a preset smoke concentration value. The moment at which the cleaning robot detects the smoke concentration coincides with the moments of the brightness and temperature detection; that is, the robot detects smoke concentration, brightness and temperature simultaneously, and only the judgements are made in sequence when determining whether a fire has occurred. The preset smoke concentration value generally takes the smoke concentration detected within a range of 0.1 to 10 metres from the flame as the parameter, and can be obtained through experimental tests. The smoke detection sensor can be an ionisation smoke sensor or a gas-sensitive resistive smoke sensor. If the current smoke concentration exceeds the smoke concentration at the previous moment by the preset smoke concentration value, the change in the current environment involves not only light and heat but also smoke, and the probability of a fire is very high, so the cleaning robot determines that the preset condition for fire occurrence is met. Otherwise the robot can conclude that the user has merely turned on a light and an electric heater and no fire has occurred, determines that the preset condition for fire occurrence is not met, and keeps detecting.
With this method of comprehensive judgement combining brightness detection with temperature detection and smoke concentration detection, whether a fire has occurred can be judged even more accurately, further improving the effectiveness and accuracy of the safety monitoring performed by the cleaning robot.
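A minimal sketch of this cascaded check (steps S11 to S13) is shown below. The Reading type and the preset rise values are illustrative assumptions made for the example, not figures taken from the experiments mentioned above.

```python
from dataclasses import dataclass

# Minimal sketch of the cascaded fire check in steps S11-S13.
@dataclass
class Reading:
    brightness: float    # e.g. lux
    temperature: float   # e.g. degrees Celsius
    smoke: float         # relative smoke density

def fire_condition_met(prev: Reading, curr: Reading,
                       d_brightness: float = 200.0,   # assumed preset brightness rise
                       d_temperature: float = 8.0,    # assumed preset temperature rise
                       d_smoke: float = 0.05) -> bool:  # assumed preset smoke rise
    if curr.brightness - prev.brightness <= d_brightness:
        return False   # S11: no abnormal rise in brightness
    if curr.temperature - prev.temperature <= d_temperature:
        return False   # S12: light but no heat, e.g. a lamp was switched on
    if curr.smoke - prev.smoke <= d_smoke:
        return False   # S13: light and heat but no smoke, e.g. lamp plus heater
    return True        # brightness, temperature and smoke all rose: condition met
```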
As one embodiment, in step S2 the cleaning robot determines whether the current time is within the preset time period by the following steps. First, the cleaning robot determines the current time, accurate to the hour and minute, according to its built-in clock module. Then the cleaning robot determines the preset time period from its memory; the preset time period is set by the user or factory-set by the manufacturer, and is mainly used so that the robot performs the door-impact reminder at an appropriate time. For example, if the user sleeps from 11 p.m. to 7 a.m. the next morning, that period is set as the preset time period; if the user works night shifts and sleeps from 10 a.m. to 6 p.m., that period is set as the preset time period; and so on. Finally, the cleaning robot analyses whether the current time falls within the range of the preset time period; if so, the current time is determined to be within the preset time period, otherwise it is not. By analysing whether a situation meeting the preset condition for fire occurrence arises within a specific time, this method controls whether the cleaning robot performs the door-impact reminder, so that when door impact is unnecessary only the voice alarm is given, which improves the working efficiency of the cleaning robot.
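A small sketch of this time-period check follows. The 23:00 to 07:00 window is only the example given above, and the handling of a window that crosses midnight is an implementation assumption.

```python
from datetime import time

# Sketch of the preset time-period check used in step S2.
def in_preset_period(now: time,
                     start: time = time(23, 0),
                     end: time = time(7, 0)) -> bool:
    if start <= end:                      # window within one day, e.g. 10:00-18:00
        return start <= now <= end
    return now >= start or now <= end    # window crossing midnight, e.g. 23:00-07:00
```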
As one embodiment, in step S3 the cleaning robot searches the stored map data and navigates to the position of the door of the master room according to the map data, specifically as follows. First, the cleaning robot searches the map data stored in its memory that it recorded during the last cleaning work. The map data comprises grid cell coordinates and an environment state for each grid cell coordinate, where the environment state is information obtained by the cleaning robot from its sensor data during cleaning. For example, when the cleaning robot detects an obstacle at a position, the grid cell corresponding to that position is marked as an obstacle cell; when a cliff is detected at a position, the corresponding grid cell is marked as a cliff cell; and when the robot can walk normally at a position and no abnormality is detected, the corresponding grid cell is marked as a passable cell; and so on. Based on the grid map composed of grid cells marked with these environment states, the cleaning robot can perform path searching, navigation, obstacle avoidance and similar functions. Next, the cleaning robot determines its current position from image data captured by its camera: because the robot photographed various landmarks during the last cleaning work, when it relocalises it can recognise its current position from the landmarks in the images captured there. Then the cleaning robot determines a navigation path composed of continuous passable grid cells between its current position and the position of the door of the master room. The position of the door of the master room is set by the user through an intelligent terminal: the intelligent terminal can synchronously display, in graphical form, the map data detected by the cleaning robot during cleaning work, i.e. the map can be displayed graphically on an intelligent terminal such as a mobile phone or tablet bound to the user. The user can then see a two-dimensional plan of the house directly on the terminal and knows which graphical area is the living room, which is the master room, which is a guest room, and so on. The user can therefore set a mark at the position of the door of the master room through the intelligent terminal, and the corresponding grid cell is marked as the master-room door. By searching the grid cells, the cleaning robot can find a navigation path formed of passable grid cells from its current position to the position of the master-room door. Finally, the cleaning robot walks along the navigation path to the position of the master-room door. Through map searching and navigation, the cleaning robot can reach the master-room door quickly and accurately, ensuring the effectiveness of the subsequent door-impact reminder to the user.
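As an illustration of the path search over such a grid map, the breadth-first sketch below finds a route of passable cells from the robot's current cell to the cell marked as the master-room door. Representing the map as a set of passable (x, y) coordinates is an assumption made for this example, not the storage format described in this disclosure.

```python
from collections import deque

# Breadth-first search for a navigation path of passable grid cells.
def find_path(passable_cells, start, goal):
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:                        # reconstruct the path back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in passable_cells and nxt not in came_from:
                came_from[nxt] = cell           # expand only passable, unvisited cells
                queue.append(nxt)
    return None                                 # no passable route to the door
```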
In one embodiment, the cleaning robot in step S3 sounds the alarm prompt tone while repeatedly striking the door, specifically through the following steps:
step S31: the cleaning robot broadcasts the voice data prestored in the memory through the voice module, such as broadcasting' fire is cheerful! The fire is easy! Fire fighting water mark! Fire fighting water mark! "," danger of fire, please note ", etc. The warning tone is broadcast from the time the cleaning robot hits the door. Preferably, the broadcast voice is automatically adjusted to the maximum volume. The process then proceeds to step S32.
Step S32: the cleaning robot retreats to a position a preset distance away from the door; the preset distance can be set according to product design requirements and is generally 10 to 20 centimetres. The cleaning robot then accelerates straight towards the door. If the obstacle detection sensor at the front of the cleaning robot is detected to be triggered, the robot has struck the door, so it stops moving and proceeds to step S33 for the subsequent operations. If the distance moved exceeds the preset distance and the obstacle detection sensor has not been triggered, the door has been opened and the cleaning robot cannot strike it, so the robot stops moving and proceeds to step S4 for the subsequent operations.
Step S33: the cleaning robot judges whether the number of times the obstacle detection sensor has been triggered reaches a preset number. The preset number can be set according to design requirements and is generally 10 to 30 times, so as to ensure that the user is effectively woken. If the preset number has been reached, the cleaning robot has already struck the door many times and the door has still not been opened, so the user is probably not in that room, and the process proceeds to step S34 for the subsequent operations. Otherwise the process returns to step S32 and the door-striking operation continues, to ensure that the user can be effectively woken.
Step S34: the cleaning robot judges whether the door currently being struck is the door of the last room. If so, the cleaning robot has struck every door including that of the master room and none has been opened, which indicates that nobody is at home at present; the cleaning robot therefore returns to the initial position and sends prompt information over the network to the intelligent terminal bound to the user, in the hope of reaching the user through the terminal. If it is not the door of the last room, the user may be in another room, so the process proceeds to step S35 for the subsequent operations.
Step S35: the cleaning robot navigates to the position of the door of another room according to the map data; the door positions of the other rooms are set in the same way as the master-room door position and are not described again here. The robot then strikes the door in the same way as described above: it retreats to a position a preset distance from the door, accelerates straight towards the door, stops moving after the obstacle detection sensor at its front is detected to be triggered, and proceeds to step S33. If the distance moved exceeds the preset distance without the obstacle detection sensor being triggered, the robot stops moving and proceeds to step S4 for the subsequent operations.
By controlling the cleaning robot to strike the door repeatedly through back-and-forth movement and to strike the doors of different rooms in turn, this method makes full use of the cleaning robot's autonomous mobility and achieves the effect of effectively waking the user.
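For illustration, the door-impact loop of steps S32 and S33 for a single door might look like the sketch below. The back-off distance, the impact count and all robot motion and bumper calls are assumptions introduced purely for this example.

```python
# Sketch of the repeated door-impact loop (steps S32-S33) for one door;
# robot motion and bumper calls are hypothetical placeholders.
def knock_on_door(robot, door_pose, back_off_m=0.15, max_impacts=20):
    impacts = 0
    while impacts < max_impacts:
        robot.back_away_from(door_pose, distance=back_off_m)
        robot.accelerate_towards(door_pose)
        if not robot.wait_for_bumper(max_travel=back_off_m):
            return "door_open"      # travelled past the door line with no bump: door is open
        robot.stop()
        impacts += 1                # bumper triggered: one more impact delivered
    return "no_response"            # preset number of impacts reached, door still closed
```

A caller would then implement steps S34 and S35, moving on to the next room's door or notifying the user's intelligent terminal depending on the returned result.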
As one embodiment, in step S4 the cleaning robot detects whether a person is present by the following steps: the cleaning robot starts its camera, analyses whether there is a moving object in the captured images and whether the shape of the moving object matches the characteristics of a person; if so, it determines that a person is present, otherwise it determines that no person is present. Whether there is a moving object is judged mainly by comparing two consecutive images: if a large area of image content in the earlier image no longer corresponds to the content of the later image, an object captured in the earlier image has moved away and is not captured in the later image, so it can be concluded that a moving object has been photographed. The morphological characteristics of the moving object are then analysed further, for example whether the object is upright in the image, whether the upper part is roughly circular, and whether the lower part is clearly larger than the upper part and roughly rectangular. More specifically, it can be analysed whether the object in the image shows features such as eyes, a nose, a mouth or ears. When the analysed image features all match the characteristics of a person, it can be determined that a person is present. This method of checking whether the user appears by means of camera capture and image analysis has high accuracy and can effectively determine whether the user has been woken.
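The frame-comparison idea described above can be sketched as follows. The grayscale input, the per-pixel difference threshold and the changed-area ratio are illustrative assumptions, and the subsequent person-shape analysis is not shown.

```python
import numpy as np

# Rough sketch of frame differencing between two consecutive grayscale frames.
def moving_object_present(prev_frame: np.ndarray, curr_frame: np.ndarray,
                          pixel_thresh: int = 30, area_ratio: float = 0.05) -> bool:
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = (diff > pixel_thresh).mean()   # fraction of pixels that changed noticeably
    return changed > area_ratio              # a large changed region suggests motion
```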
As one embodiment, in step S4 the cleaning robot moves and searches for a person according to the stored map data, specifically as follows. First, the cleaning robot searches the map data stored in its memory that it recorded during the last cleaning work; the map data comprises grid cell coordinates and an environment state for each grid cell coordinate. Then the cleaning robot determines its current position from the image data captured by its camera. Next, the cleaning robot determines a navigation path composed of continuous grid cells whose environment state is passable. Finally, the cleaning robot walks along the navigation path to different areas, including rooms whose doors are open, the living room, the dining room and so on; while walking it analyses whether there is a moving object in the captured images and whether the object's shape matches the characteristics of a person. If so, it determines that a person has been found, and the alarm-reminder task ends at that point. Otherwise it determines that no person has been found, ends the door-impact reminder and person-search tasks, and can remind the user in other ways, for example by sending prompt information to the user's intelligent terminal or by directly dialling the fire alarm number and reporting the specific location of the fire by voice. By controlling the robot to navigate using the map data so as to find the user and give the alarm reminder, this method further ensures the effectiveness of the fire reminder and the working quality of the cleaning robot.
As one embodiment, the default state is that the cleaning robot keeps only the main control module and the sensors used to detect the occurrence of a fire in the working state, while the other circuit modules remain dormant. By keeping the cleaning robot in this default state and powering only the circuits that are actually needed, the energy consumption of the cleaning robot is reduced and its working time is effectively extended.
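A rough sketch of such a default state follows. The module names and the wake/sleep interface are assumptions introduced purely for illustration.

```python
# Sketch of the default state: only the main control path and the fire-detection
# sensors stay powered; all module names are illustrative assumptions.
def enter_default_state(robot):
    keep_awake = {"main_control", "brightness_sensor", "temperature_sensor", "smoke_sensor"}
    for module in robot.modules:
        if module.name in keep_awake:
            module.wake()      # keep the fire-detection path active
        else:
            module.sleep()     # motors, camera, voice module, etc. go dormant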
A chip may be a main control chip or a memory chip arranged in the cleaning robot, with program code stored in it. The processor of the cleaning robot can, according to the program code, control the cleaning robot to execute the safety monitoring method of the cleaning robot described in any of the above embodiments.
The above embodiments are intended only to fully disclose the invention, not to limit it. The technical solutions of the embodiments can be combined with one another to form different technical embodiments provided they do not conflict, and all equivalent substitutions of technical features that can be obtained without inventive effort on the basis of the gist of the invention shall fall within the scope covered by the invention.

Claims (10)

1. A safety monitoring method of a cleaning robot is characterized by comprising the following steps:
step S1: the cleaning robot judges whether the preset condition of fire occurrence is met or not according to the detection data of the sensor, if so, the step S2 is carried out, otherwise, the default state is kept;
step S2: the cleaning robot judges whether the current time is in a preset time period, if so, the step S3 is carried out, otherwise, the step S4 is carried out;
step S3: the cleaning robot searches the stored map data, navigates to the position of the door of the master house according to the map data, repeatedly impacts the door and sends out an alarm prompt tone at the same time;
step S4: the cleaning robot detects whether a person appears, if not, the cleaning robot moves and searches the person according to the stored map data and sends out an alarm prompt tone, and if so, the cleaning robot returns to the initial position and keeps the default state.
2. The safety monitoring method according to claim 1, wherein the cleaning robot in step S1 determines whether the preset condition for fire occurrence is met according to the detection data of the sensor, and specifically comprises the following steps:
step S11: the cleaning robot judges whether the current brightness is higher than the brightness at the last moment by a preset brightness value according to the detection data of the brightness detection sensor, if so, the cleaning robot enters the step S12, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence and continues to keep detecting;
step S12: and the cleaning robot judges whether the current temperature is higher than the temperature at the last moment by a preset temperature value according to the detection data of the temperature detection sensor, if so, the cleaning robot confirms that the cleaning robot accords with the preset condition of fire, otherwise, the cleaning robot confirms that the cleaning robot does not accord with the preset condition of fire, and the cleaning robot continues to maintain the detection.
3. The safety monitoring method according to claim 1, wherein the cleaning robot in step S1 determines whether the preset condition for fire occurrence is met according to the detection data of the sensor, and specifically comprises the following steps:
step S11: the cleaning robot judges whether the current brightness is higher than the brightness at the last moment by a preset brightness value according to the detection data of the brightness detection sensor, if so, the cleaning robot enters the step S12, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence and continues to keep detecting;
step S12: the cleaning robot judges whether the current temperature is higher than the temperature at the last moment by a preset temperature value according to the detection data of the temperature detection sensor, if so, the step S13 is carried out, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire occurrence, and continues to keep detecting;
step S13: and the cleaning robot judges whether the current smoke concentration is higher than the smoke concentration at the last moment by a preset smoke concentration value or not according to the detection data of the smoke detection sensor, if so, the cleaning robot determines that the cleaning robot accords with the preset condition of fire, otherwise, the cleaning robot determines that the cleaning robot does not accord with the preset condition of fire, and the cleaning robot continues to maintain the detection.
4. The safety monitoring method according to claim 1, wherein the step S2 of determining whether the current time is within a preset time period by the cleaning robot specifically comprises the steps of:
the cleaning robot determines the current time;
the cleaning robot determines a preset time period;
the cleaning robot judges whether the current time is within the range of the preset time period;
the preset time period is set by a user or factory set by a manufacturer.
5. The safety monitoring method according to claim 1, wherein the cleaning robot in step S3 searches the stored map data and navigates to the position of the door of the main house according to the map data, and comprises the following steps:
the cleaning robot searches map data recorded by the cleaning robot in the last cleaning work and stored in a memory, wherein the map data comprises grid unit coordinates and an environment state corresponding to each grid unit coordinate;
the cleaning robot determines the current position of the cleaning robot according to image data shot by a camera of the cleaning robot;
the cleaning robot determines a navigation path formed by continuous grid units with environment states capable of passing between the current position and the position of a door of a master house;
the cleaning robot walks to the position where the door of the master house is located along the navigation path;
wherein the position of the door of the main house is set by the user through an intelligent terminal, and the intelligent terminal can synchronously display, in graphical form, the map data detected by the cleaning robot during the cleaning work.
6. The safety monitoring method according to claim 5, wherein the cleaning robot generates an alarm prompt while repeatedly striking the door in step S3, comprising the following steps:
step S31: the cleaning robot broadcasts voice data prestored in the memory through the voice module;
step S32: the cleaning robot retreats to a position away from the door by a preset distance, then directly moves towards the door in an accelerating manner, stops moving after detecting that an obstacle detection sensor at the front end of the cleaning robot is triggered, and proceeds to step S33, and stops moving if the moving distance exceeds the preset distance and the obstacle detection sensor is not triggered, and proceeds to step S4;
step S33: the cleaning robot judges whether the number of times of triggering the obstacle detection sensor reaches a preset number of times, if so, the step S34 is carried out, otherwise, the step S32 is returned;
step S34: the cleaning robot judges whether the door collided at present is the door of the last room, if so, the cleaning robot returns to the initial position and sends prompt information to the intelligent terminal bound by the user through the network, otherwise, the cleaning robot goes to step S35;
step S35: the cleaning robot navigates to a location where a door of another room is located according to the map data, and retreats to a location away from the door by a preset distance, and then moves straight toward the door, and stops moving after detecting that the obstacle detection sensor at the front end of the cleaning robot is triggered, and proceeds to step S33, and stops moving if the moved distance exceeds the preset distance, and does not detect that the obstacle detection sensor is triggered, and proceeds to step S4.
7. The safety monitoring method according to claim 1, wherein the step of detecting whether a person is present by the cleaning robot in step S4 specifically comprises the steps of:
the cleaning robot starts a camera, analyzes whether a moving object exists in a shot image, analyzes whether the shape of the moving object accords with the character characteristics, determines that a person appears if the moving object exists, and otherwise determines that no person appears.
8. The safety monitoring method according to claim 7, wherein the cleaning robot in step S4 moves and searches for a person according to the stored map data, specifically comprising the steps of:
the cleaning robot searches map data recorded by the cleaning robot in the last cleaning work and stored in a memory, wherein the map data comprises grid unit coordinates and an environment state corresponding to each grid unit coordinate;
the cleaning robot determines the current position of the cleaning robot according to image data shot by a camera of the cleaning robot;
the cleaning robot determines a navigation path consisting of continuous grid units with a passable environmental state;
the cleaning robot walks to different areas along the navigation path, analyzes whether a moving object exists in the shot image and analyzes whether the shape of the object accords with the character characteristics or not in the walking process, and if so, determines that the person is found, otherwise, determines that the person is not found.
9. The security monitoring method according to any one of claims 1 to 8, characterized by:
the default state is that the cleaning robot only keeps the main control module and the sensor for detecting fire in working state, and other circuit modules keep in dormant state.
10. A chip having program code stored therein, characterized in that a processor can, according to the program code, control a cleaning robot to execute the safety monitoring method of the cleaning robot according to any one of claims 1 to 9.
CN202010095306.5A 2020-02-17 2020-02-17 Safety monitoring method and chip for cleaning robot Active CN111199627B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010095306.5A CN111199627B (en) 2020-02-17 2020-02-17 Safety monitoring method and chip for cleaning robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010095306.5A CN111199627B (en) 2020-02-17 2020-02-17 Safety monitoring method and chip for cleaning robot

Publications (2)

Publication Number Publication Date
CN111199627A true CN111199627A (en) 2020-05-26
CN111199627B CN111199627B (en) 2021-09-07

Family

ID=70745272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010095306.5A Active CN111199627B (en) 2020-02-17 2020-02-17 Safety monitoring method and chip for cleaning robot

Country Status (1)

Country Link
CN (1) CN111199627B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112201000A (en) * 2020-10-10 2021-01-08 广东省构建工程建设有限公司 Dynamic fire monitoring system and method applied to construction stage
CN113920671A (en) * 2021-09-15 2022-01-11 青岛经济技术开发区海尔热水器有限公司 Fire linkage control method, device and equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107527466A (en) * 2017-07-31 2017-12-29 努比亚技术有限公司 A kind of fire alarm method, terminal and computer-readable recording medium
CN108090990A (en) * 2018-02-01 2018-05-29 广州微树云信息科技有限公司 Intelligent inspection system and robot
CN108673524A (en) * 2018-05-24 2018-10-19 夏文才 A kind of smart home robot
CN109998429A (en) * 2018-01-05 2019-07-12 艾罗伯特公司 Mobile clean robot artificial intelligence for context aware
KR20190094303A (en) * 2019-04-25 2019-08-13 엘지전자 주식회사 Method of redefining position of robot using artificial intelligence and robot of implementing thereof
KR20190120015A (en) * 2018-04-14 2019-10-23 장은선 fire detector robotic vacuum claner

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107527466A (en) * 2017-07-31 2017-12-29 努比亚技术有限公司 A kind of fire alarm method, terminal and computer-readable recording medium
CN109998429A (en) * 2018-01-05 2019-07-12 艾罗伯特公司 Mobile clean robot artificial intelligence for context aware
CN108090990A (en) * 2018-02-01 2018-05-29 广州微树云信息科技有限公司 Intelligent inspection system and robot
KR20190120015A (en) * 2018-04-14 2019-10-23 장은선 fire detector robotic vacuum claner
CN108673524A (en) * 2018-05-24 2018-10-19 夏文才 A kind of smart home robot
KR20190094303A (en) * 2019-04-25 2019-08-13 엘지전자 주식회사 Method of redefining position of robot using artificial intelligence and robot of implementing thereof

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112201000A (en) * 2020-10-10 2021-01-08 广东省构建工程建设有限公司 Dynamic fire monitoring system and method applied to construction stage
CN113920671A (en) * 2021-09-15 2022-01-11 青岛经济技术开发区海尔热水器有限公司 Fire linkage control method, device and equipment
CN113920671B (en) * 2021-09-15 2024-04-02 青岛经济技术开发区海尔热水器有限公司 Fire linkage control method, device and equipment

Also Published As

Publication number Publication date
CN111199627B (en) 2021-09-07

Similar Documents

Publication Publication Date Title
CN111199627B (en) Safety monitoring method and chip for cleaning robot
US10339388B1 (en) Virtual sensors
CN108354526B (en) Security method and device for sweeping robot
CN205247085U (en) Intelligence household security robot
US20170154517A1 (en) Air quality alert system and method
CN107085380B (en) Intelligent household system user position judgment method and electronic equipment
CN111258357B (en) Environment distribution establishing method, intelligent device, cleaning robot and storage medium
CN204650695U (en) The removable special toy dog of intelligence and safety defense monitoring system
CN102830675A (en) Intelligent home robot system based on GIS (geographic information system)
JP5469496B2 (en) Thermal image monitoring device
CN111610724B (en) Smart home smart device control method, device, equipment and storage medium
CN108427310A (en) Intelligent home furnishing control method, device and computer readable storage medium
CN104376679A (en) Intelligent household pre-warning method
CN108834276A (en) A kind of method and system of the control of intelligent terminal based on multisensor
CN111557482A (en) Electronic cigarette starting control device and method and electronic cigarette
CN105844836A (en) Method and device for detecting fire by means of mobile phone temperature sensor
JP5766087B2 (en) Security system
US10929798B1 (en) Guard tour tracking
CN112327773B (en) Intelligent household terminal equipment control method and device and intelligent household system
CN107289586B (en) Air conditioning system, air conditioner and method for alarming falling through air conditioning system
EP3908987B1 (en) Method and system for reducing carbon monoxide in a building
CN112148105A (en) Access control system awakening method and device
CN107168091A (en) A kind of smart home interactive system based on virtual reality technology
CN104075412A (en) Control method and control system of air-conditioner
CN108592346A (en) Air conditioning system and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Patentee after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Country or region after: China

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Patentee before: AMICRO SEMICONDUCTOR Co.,Ltd.

Country or region before: China

CP03 Change of name, title or address