CN112394717B - Indoor environment warning method and device and sweeping robot - Google Patents

Info

Publication number
CN112394717B
CN112394717B (application CN201910699637.7A)
Authority
CN
China
Prior art keywords
monitored
image
state
information
image information
Prior art date
Legal status
Active
Application number
CN201910699637.7A
Other languages
Chinese (zh)
Other versions
CN112394717A (en)
Inventor
李威
李九翔
金方明
Current Assignee
Midea Robozone Technology Co Ltd
Original Assignee
Midea Robozone Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Midea Robozone Technology Co Ltd
Priority to CN201910699637.7A
Publication of CN112394717A
Application granted
Publication of CN112394717B
Legal status: Active

Classifications

    • G05D 1/0253: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means, using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • A47L 11/24: Floor-sweeping machines, motor-driven
    • A47L 11/40: Parts or details of machines not provided for in groups A47L 11/02 - A47L 11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L 11/4002: Installations of electric equipment
    • G05D 1/0214: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D 1/0221: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D 1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Alarm Systems (AREA)

Abstract

An embodiment of the invention provides an indoor environment warning method, an indoor environment warning device, and a sweeping robot, wherein the method includes: collecting an indoor environment image, the environment image containing image information of an object to be monitored; and judging, according to the image information of the object to be monitored in the environment image, whether the object to be monitored is in a state needing warning. The sweeping robot of this embodiment can thus judge, by means of image acquisition, whether the object to be monitored is in a state needing warning while it carries out its sweeping operation.

Description

Indoor environment warning method and device and sweeping robot
Technical Field
The invention relates to the technical field of intelligent devices, and in particular to an indoor environment warning method and device and a floor-sweeping robot.
Background
In modern society, families pay increasing attention to the safety of the home environment, and many intelligent household appliances have appeared. These appliances can prompt the user when a malfunction or abnormality occurs. However, non-intelligent appliances may also operate abnormally, so monitoring them is especially important when no one is at home. A household sweeping robot generally has autonomous navigation and positioning functions. While executing a cleaning task, it usually plans a path based on an electronic map of its indoor working environment and moves autonomously along the planned path according to its own current position. Therefore, when no one is at home, the sweeping robot could provide a certain monitoring capability in addition to automatic sweeping. However, the camera of an existing household sweeping robot is used only for sensing obstacles and has no function of monitoring the home environment, so the existing household sweeping robot has only a single function.
Disclosure of Invention
The embodiment of the invention provides an indoor environment warning method and device and a sweeping robot, and aims to solve one or more technical problems in the prior art.
In a first aspect, an embodiment of the present invention provides an indoor environment warning method, including:
collecting an indoor environment image, wherein the environment image comprises image information of an object to be monitored;
and judging whether the object to be monitored is in a state needing warning or not according to the image information of the object to be monitored in the environment image.
In one embodiment, an image of an environment within a room is acquired, comprising:
acquiring the position information of the object to be monitored in the moving process of the sweeping robot;
and acquiring an environment image of the target position under the condition that the sweeping robot moves to the target position corresponding to the position information.
In one embodiment, the acquiring the position information of the object to be monitored during the moving process of the sweeping robot includes:
the method comprises the steps that in the process that the sweeping robot moves according to a pre-constructed map, the position information of an object to be monitored in the map is obtained, and the map comprises the movable area information of the sweeping robot and the position information of the object to be monitored.
In one embodiment, the determining whether the object to be monitored is in a state that needs to be warned according to the image information of the object to be monitored in the environment image includes:
matching the image information of the object to be monitored with the standard image information of the object to be monitored;
and responding to the inconsistency of the matching results, and judging that the object to be monitored is in a state needing to be warned.
In one embodiment, the determining whether the object to be monitored is in a state that needs to be warned according to the image information of the object to be monitored in the environment image includes:
identifying whether the object to be monitored is in an open state or not according to the image information of the object to be monitored;
and responding to the opening state of the object to be monitored, and judging whether the object to be monitored is in a state needing warning according to the environmental information around the object to be monitored.
In one embodiment, the object to be monitored includes: household appliances and indoor doors and windows.
In one embodiment, the object to be monitored is a household appliance or an indoor door and window, and the matching of the image information of the object to be monitored in the indoor environment image and the standard image information of the object to be monitored includes:
acquiring standard image information of the household appliance or an indoor door and window, wherein the standard image information comprises an image of the household appliance or the indoor door and window in a closed state;
and matching the image information of the household appliance or the indoor door and window in the indoor environment image with the standard image information of the household appliance or the indoor door and window.
In one embodiment, the determining, in response to the object to be monitored being in an open state, whether the object to be monitored is in a state needing warning according to the environmental information around the object to be monitored includes:
determining whether a user exists in a preset range of the household appliance in the starting state or not according to the environmental information around the household appliance in the starting state;
and responding to the situation that the user does not exist in the preset range of the household appliance in the starting state, and judging that the household appliance is in a state needing warning.
In one embodiment, the determining, in response to the object to be monitored being in an open state, whether the object to be monitored is in a state needing warning according to the environmental information around the object to be monitored includes:
determining the environment weather outside the window in the opening state according to the weather information;
and in response to the weather outside the window being preset severe weather, determining that the window is in a state needing warning.
In one embodiment, the method further comprises:
and determining whether to generate prompt information according to the judgment result.
In one embodiment, the method further comprises:
and sending the generated prompt information to a mobile terminal to inform a user of the state of the object to be monitored.
In a second aspect, an embodiment of the present invention provides an indoor environment warning device, including:
the system comprises an acquisition module, a monitoring module and a monitoring module, wherein the acquisition module is used for acquiring an indoor environment image which comprises image information of an object to be monitored;
and the judging module is used for judging whether the object to be monitored is in a state needing warning according to the image information of the object to be monitored in the environment image.
In one embodiment, the determining module comprises:
the matching sub-module is used for matching the image information of the object to be monitored in the indoor environment image with the standard image information of the object to be monitored; and if the matching results are inconsistent, judging that the object to be monitored is in a state needing to be warned.
In one embodiment, the determining module comprises:
the identification submodule is used for identifying whether the object to be monitored is in an open state or not according to the image information of the object to be monitored in the indoor environment image;
and the judging submodule is used for judging whether the object to be monitored is in a state needing warning according to the environmental information around the object to be monitored if the object to be monitored is in an open state.
In one embodiment, the method further comprises:
and the generating module is used for determining whether to generate the prompt information according to the judgment result.
In one embodiment, the method further comprises:
and the sending module is used for sending the generated prompt information to the mobile terminal so as to inform a user of the state of the object to be monitored.
In a third aspect, an embodiment of the present invention provides a sweeping robot, where the functions of the sweeping robot may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above-described functions.
In one possible design, the indoor environment warning terminal structurally includes a processor, a memory, and a camera, where the memory is used to store a program that supports the indoor environment warning terminal to execute the indoor environment warning method, the camera is used to collect an image, and the processor is configured to execute the program stored in the memory. The indoor environment warning terminal may further include a communication interface for communicating with other devices or a communication network.
In a fourth aspect, an embodiment of the present invention provides a floor sweeping robot, including the above indoor environment warning device.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium for storing computer software instructions for an indoor environment warning terminal, which includes a program for executing the indoor environment warning method.
One of the above technical solutions has the following advantages or beneficial effects: the sweeping robot provided by the embodiment of the invention can judge whether the object to be monitored is in a state needing warning or not in an image acquisition mode while carrying out sweeping operation.
The foregoing summary is provided for the purpose of description only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features of the present invention will be readily apparent by reference to the drawings and following detailed description.
Drawings
In the drawings, like reference characters designate like or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are therefore not to be considered limiting of its scope.
Fig. 1 shows a flowchart of an indoor environment alerting method according to an embodiment of the present invention.
Fig. 2 shows a flowchart of an indoor environment alerting method according to another embodiment of the present invention.
Fig. 3 is a detailed flowchart illustrating step S100 of an indoor environment warning method according to an embodiment of the present invention.
Fig. 4 is a flowchart illustrating a step S100 of an indoor environment warning method according to another embodiment of the present invention.
Fig. 5 shows a flowchart of an indoor environment alerting method according to another embodiment of the present invention.
Fig. 6 is a flowchart illustrating the step S210 of the indoor environment warning method according to an embodiment of the present invention.
Fig. 7 shows a flowchart of an indoor environment alerting method according to another embodiment of the present invention.
Fig. 8 is a flowchart illustrating a specific step S240 of an indoor environment warning method according to an embodiment of the present invention.
Fig. 9 is a detailed flowchart of step S240 of the indoor environment warning method according to an embodiment of the present invention.
Fig. 10 is a block diagram showing the construction of an indoor environment warning device according to an embodiment of the present invention.
Fig. 11 is a block diagram illustrating an indoor environment warning device according to another embodiment of the present invention.
Fig. 12 is a block diagram illustrating a structure of a determination module of an indoor environment warning device according to an embodiment of the present invention.
Fig. 13 is a block diagram illustrating a structure of a determination module of an indoor environment warning device according to another embodiment of the present invention.
Fig. 14 is a block diagram illustrating a sweeping robot according to an embodiment of the present invention.
Fig. 15 is a block diagram illustrating a sweeping robot according to another embodiment of the present invention.
Fig. 16 shows a schematic structural diagram of a sweeping robot according to another embodiment of the present invention.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
Fig. 1 shows a flowchart of an indoor environment alerting method according to an embodiment of the present invention. As shown in fig. 1, the indoor environment warning method applied to a sweeping robot includes the following steps:
s100: the method comprises the steps of collecting indoor environment images, wherein the environment images contain image information of an object to be monitored.
The indoor space can be understood as the space in which the sweeping robot is located. The indoor environment may be that of a home, a public place, or an office. The object to be monitored may be any visible object present in the room, such as household appliances, furniture, doors, and windows. The image information of the object to be monitored may include the image of the object in the environment image, or various information associated with the object in the environment image.
In one example, the object to be monitored may be preset by a user or may be selected by the sweeping robot.
S200: judging, according to the image information of the object to be monitored in the environment image, whether the object to be monitored is in a state needing warning. A state needing warning can be understood as a state in which the object to be monitored is currently working abnormally, poses a potential safety hazard, or affects the indoor environment or the user. The specific criterion for judging whether the object to be monitored is in a state needing warning can be selected and adjusted as required, and different objects to be monitored may use different criteria.
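Steps S100 and S200 together form one monitoring cycle. The sketch below illustrates that cycle with hypothetical stand-ins; `capture`, `classify`, and `notify` are placeholder names, not functions from the patent:

```python
def alert_cycle(capture, classify, notify):
    """One monitoring cycle: collect an environment image (S100), judge
    each monitored object (S200), and notify only when warnings exist."""
    env_image = capture()  # dict: object name -> its image information
    alerts = [obj for obj, info in env_image.items() if classify(obj, info)]
    if alerts:
        notify(alerts)
    return alerts
```

For instance, with a stub `capture` returning `{"fridge": "open", "window": "closed"}` and a `classify` that flags `"open"`, the cycle would report only the fridge.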
In one embodiment, as shown in fig. 2, the indoor environment warning method further includes:
s300: and determining whether to generate prompt information according to the judgment result. The generated prompt message can be used for enabling the sweeping robot to inform the user of the current state of the object to be monitored.
In one embodiment, as shown in fig. 2, the indoor environment warning method further includes:
s400: and sending the generated prompt information to the mobile terminal to inform a user of the state of the object to be monitored. The prompt message received by the mobile terminal can be in the format of voice broadcast, characters or images and the like.
In one example, after the prompt information is sent to the mobile terminal, the method may further include:
and according to the prompt information, marking the corresponding object to be monitored on a map pre-constructed by the sweeping robot.
and/or,
and generating a broadcasting voice according to the prompt information, and broadcasting through the sweeping robot.
In one embodiment, as shown in fig. 3, acquiring an image of an environment within a room comprises:
s110: and in the moving process of the sweeping robot, acquiring the position information of the object to be monitored. The position information of the object to be monitored may include a coordinate position of the object to be monitored in the room, or a relative position with respect to a preset reference object in the room.
S120: and acquiring an environment image of the target position under the condition that the sweeping robot moves to the target position corresponding to the position information. A certain distance can exist between the target position and the actual position of the object to be monitored, so that the environment image information containing the complete object to be monitored can be acquired.
In one example, the captured environmental image may include a plurality of frames of images extracted from a video. The captured environmental image may also include a picture taken at a particular capture location of the object to be monitored.
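The idea above that the capture point keeps a certain distance from the object, so that the complete object fits in view, can be sketched as a simple geometric computation. The half-metre standoff and flat 2D coordinates are illustrative assumptions, not values from the patent:

```python
import math

def capture_position(obj_xy, robot_xy, standoff=0.5):
    """Return a target point 'standoff' metres from the object, on the
    line from the object toward the robot's current position (S120)."""
    dx, dy = robot_xy[0] - obj_xy[0], robot_xy[1] - obj_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= standoff:  # already within the standoff: capture from here
        return robot_xy
    return (obj_xy[0] + dx / dist * standoff,
            obj_xy[1] + dy / dist * standoff)
```

With the object at the origin and the robot at `(2.0, 0.0)`, the capture point is `(0.5, 0.0)`: half a metre short of the object, along the approach line.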
In one embodiment, as shown in fig. 4, in the process of moving the sweeping robot, acquiring the position information of the object to be monitored includes:
s1110: in the process that the sweeping robot moves according to a pre-constructed map, the position information of an object to be monitored in the map is obtained, and the map comprises the movable area information of the sweeping robot and the position information of the object to be monitored.
It should be noted that, in the map building process, any map building manner of the sweeping robot in the prior art may be used. For example, the map construction is accomplished by VSLAM (Visual simultaneous localization and mapping) technology and SLAM (simultaneous localization and mapping) technology.
The technical framework of VSLAM mainly comprises five processes: sensor data preprocessing, the front end, the back end, loop detection, and mapping. The front end is also called the visual odometer (VO for short); it studies how to quantitatively estimate the camera's motion between adjacent frames. Chaining the inter-frame motions together yields the motion trajectory of the camera carrier (such as a robot), which solves the positioning problem. The position of the spatial point corresponding to each pixel is then calculated from the camera pose estimated at each moment, yielding a map.

In VSLAM, the front end mainly involves computer-vision algorithms. A typical practice is: first extract feature points from each frame, coarsely match feature points between adjacent frames, then remove unreasonable matching pairs with the RANSAC (random sample consensus) algorithm, and finally obtain position and attitude information. The back end mainly optimizes the front-end result to obtain the optimal pose estimate. There are two main approaches. One is optimization based on filter theory: linearize the state-estimation model, approximate its noise with a Gaussian distribution, and update the prediction with a Kalman filter. The other is nonlinear optimization (graph optimization): the optimized variables serve as nodes of a graph and the error terms as its edges; given initial values, the solution is updated by iterative optimization. Owing to the sparsity of the graph, graph optimization reduces the computational load while maintaining accuracy. Loop detection mainly enables the robot to recognize places it has visited before, so as to solve the problem of position drift over time.
Visual loop detection is generally accomplished by judging the similarity between images, much as humans use their eyes to recognize that two scenes are the same place. Because image information is abundant, VSLAM has a great advantage in loop detection. When loop detection succeeds, the correspondence between the current image and a previously seen image is established, and the back-end optimization algorithm can readjust the trajectory and the map accordingly, eliminating accumulated error to the greatest extent. SLAM builds different maps according to the sensor type and the application requirements; common forms are 2D grid maps, 2D topological maps, and 3D point cloud maps.
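The front-end practice described above, matching feature points between adjacent frames and then rejecting unreasonable pairs with RANSAC, can be illustrated with a toy version. A real VSLAM front end estimates a full camera pose; the translation-only 2D model below is a deliberate simplification that shows only the consensus idea:

```python
import random

def ransac_translation(pts_a, pts_b, iters=100, tol=0.1, seed=0):
    """Estimate the inter-frame 2D translation from matched feature
    points, keeping the hypothesis with the most inlier matches."""
    rng = random.Random(seed)
    best_t, best_inliers = (0.0, 0.0), -1
    for _ in range(iters):
        i = rng.randrange(len(pts_a))           # sample one match
        tx = pts_b[i][0] - pts_a[i][0]          # hypothesize a translation
        ty = pts_b[i][1] - pts_a[i][1]
        inliers = sum(                           # count agreeing matches
            1 for (ax, ay), (bx, by) in zip(pts_a, pts_b)
            if abs(bx - ax - tx) <= tol and abs(by - ay - ty) <= tol
        )
        if inliers > best_inliers:
            best_inliers, best_t = inliers, (tx, ty)
    return best_t, best_inliers
```

With three matches shifted by (1, 2) and one gross outlier, the outlier's hypothesis gathers only its own vote, so the (1, 2) translation wins.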
In one embodiment, as shown in fig. 5, the determining whether the object to be monitored is in the state needing to be warned according to the image information of the object to be monitored in the environment image includes:
s210: and matching the image information of the object to be monitored with the standard image information of the object to be monitored. The standard image information of the object to be monitored can be understood as the image information of the object to be monitored in a state without early warning.
S220: and if the matching results are inconsistent, judging that the object to be monitored is in a state needing to be warned.
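A minimal pixel-level version of the matching in S210/S220 might look as follows; the flattened-pixel representation and the 5 % mismatch threshold are illustrative assumptions, not details from the patent:

```python
def needs_warning(observed, standard, threshold=0.05):
    """Match the observed image region against the standard (no-warning)
    image; inconsistency beyond the threshold means the object is
    judged to be in a state needing warning."""
    if len(observed) != len(standard) or not standard:
        return True  # cannot match at all: treat as inconsistent
    mismatched = sum(1 for o, s in zip(observed, standard) if o != s)
    return mismatched / len(standard) > threshold
```

A small tolerance keeps lighting noise from triggering false warnings: 3 differing pixels out of 100 stays below the 5 % threshold, while 10 out of 100 exceeds it.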
In one embodiment, as shown in fig. 6, the step of matching the image information of the object to be monitored in the indoor environment image with the standard image information of the object to be monitored includes:
s10: and acquiring standard image information of the household appliance or the indoor door and window, wherein the standard image information comprises images of the household appliance or the indoor door and window in a closed state.
S20: and matching the image information of the household appliance or the indoor door and window in the indoor environment image with the standard image information of the household appliance or the indoor door and window.
In one example, where the object to be monitored is a household appliance or an indoor door or window, determining that the object to be monitored is in a state needing warning when the matching results are inconsistent includes:
and identifying image information of the position where the household appliance or the indoor door and window can be opened and closed in the indoor environment image.
And acquiring image information of the position where the household appliance or the indoor door and window can be opened and closed from the standard image information.
And matching whether the image information of the two openable and closable positions is consistent.
If they are inconsistent, it is determined that the household appliance or the indoor door or window is in an unclosed state that needs warning.
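Restricting the comparison to the openable position, as the steps above describe, can be sketched by cropping a region of interest before matching. The row/column ROI format below is a hypothetical simplification of whatever region the robot would actually identify:

```python
def crop(image, top, left, height, width):
    """Crop a rectangular region (e.g. a door seam) from a 2D image
    given as a list of pixel rows."""
    return [row[left:left + width] for row in image[top:top + height]]

def unclosed(image, standard, roi):
    """Judge an unclosed state by matching only the openable region."""
    return crop(image, *roi) != crop(standard, *roi)
```

Comparing only the seam region keeps unrelated changes elsewhere in the frame (a reflection, a passing pet) from being mistaken for an open door.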
In one embodiment, as shown in fig. 7, the determining whether the object to be monitored is in the state that needs to be warned according to the image information of the object to be monitored in the environment image includes:
s230: and identifying whether the object to be monitored is in an open state or not according to the image information of the object to be monitored.
S240: and if the object to be monitored is in the opening state, judging whether the object to be monitored is in a state needing warning according to the environmental information around the object to be monitored.
In one embodiment, as shown in fig. 8, the step of determining whether the object to be monitored is in a state that needs to be warned according to the environmental information around the object to be monitored if the object to be monitored is in an on state includes:
s30: and determining whether a user exists in the preset range of the household appliance in the starting state or not according to the environmental information around the household appliance in the starting state.
S40: if the user does not exist, the household appliance is judged to be in the state needing warning.
In one example, the object to be monitored is a refrigerator. The process of judging whether the refrigerator is in a state needing early warning comprises the following steps:
and determining the door opening area of the refrigerator according to the image information of the refrigerator in the environment image.
And identifying a door body and a door frame of the refrigerator in the door opening area, and judging whether the door body and the door frame are in a separated state.
If yes, judging that the refrigerator is in an opening state at present.
Acquiring omnidirectional environment information around the refrigerator when it is determined that the refrigerator is currently open, and judging, from this information, whether a user is present around the refrigerator.
If the user exists, the user is considered to be operating the refrigerator and only leaves the refrigerator for a short time.
If the user does not exist, the user is considered to forget to close the refrigerator door, and the refrigerator is judged to be in a state needing warning.
Further, sending the prompt message to the mobile terminal includes:
sending prompt information to the user's mobile terminal, where the prompt information may include content such as "The refrigerator door is open; please close it in time."
In one example, to improve the accuracy of determining whether the refrigerator is in a state needing a warning, the environmental information around the object to be monitored may be collected continuously over a certain period. If no user appears around the object to be monitored within that period, it is determined that the user has forgotten to close the object.
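The continuous collection over a certain period described above can be sketched as a polling loop. This is illustrative only; `sense_user` stands in for whatever person-detection routine the robot uses, and the interval and duration values are assumptions.

```python
import time

def user_absent_for(duration_s: float, sense_user, poll_interval_s: float = 1.0) -> bool:
    """Sample the surroundings repeatedly for `duration_s` seconds.
    Return True only if `sense_user()` never reports a user, i.e. the
    user is presumed to have forgotten to close the monitored object."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        if sense_user():
            return False  # a user appeared: treat it as a brief absence
        time.sleep(poll_interval_s)
    return True
```

With a detector that never sees a user, the loop runs out the full window and reports a forgotten object; a single detection ends it early with no warning.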
In one embodiment, as shown in fig. 9, if the object to be monitored is a window in an open state, the step of determining whether it is in a state needing a warning according to the surrounding environmental information includes:
S50: determining the outdoor weather at the window in the open state according to weather information.
S60: if the outdoor weather at the window is a preset severe weather condition, determining that the window is in a state needing a warning. Severe weather includes weather conditions that affect the indoor environment, such as haze, rain, and snow; when the window is not closed, haze, rain, and snow can enter the room.
Further, sending the prompt message to the mobile terminal includes:
sending prompt information to the user's mobile terminal, where the prompt information may include content such as "It is raining today and the window of bedroom A is open; please close it in time."
In one embodiment, the object to be monitored described in the above embodiments may include: household appliances, indoor doors and windows, and the like.
In one embodiment, the image capturing in the above embodiments may be performed by a camera or another image-capturing device known in the prior art.
Fig. 10 is a block diagram showing the construction of an indoor environment warning device according to an embodiment of the present invention. As shown in fig. 10, the indoor environment warning device includes:
the acquisition module 10 is configured to acquire an indoor environment image, where the environment image includes image information of an object to be monitored.
The judging module 20 is configured to judge whether the object to be monitored is in a state that needs to be warned according to the image information of the object to be monitored in the environment image.
In one embodiment, as shown in fig. 11, the apparatus further includes:
and the generating module 30 is configured to determine whether to generate the prompt message according to the determination result.
In one embodiment, as shown in fig. 11, the apparatus further includes:
the sending module 40, configured to send the generated prompt message to the mobile terminal so as to inform a user of the state of the object to be monitored. The prompt message received by the mobile terminal may be in a format such as a voice broadcast, text, or an image.
In one embodiment, as shown in fig. 12, the determining module 20 includes:
the matching submodule 21 is configured to match image information of an object to be monitored in the indoor environment image with standard image information of the object to be monitored. And if the matching results are inconsistent, judging that the object to be monitored is in a state needing to be warned.
In one embodiment, as shown in fig. 13, the determining module 20 includes:
and the identification submodule 22 is used for identifying whether the object to be monitored is in an open state or not according to the image information of the object to be monitored in the indoor environment image.
And the judging submodule 23 is configured to judge whether the object to be monitored is in a state needing to be warned according to the environmental information around the object to be monitored if the object to be monitored is in an open state.
In one embodiment, the acquisition module 10 comprises:
and the acquisition submodule is used for acquiring the position information of the object to be monitored in the moving process of the sweeping robot.
And the acquisition submodule is used for acquiring an environment image of the target position under the condition that the sweeping robot moves to the target position corresponding to the position information.
In one example, the obtaining submodule is configured to obtain position information of an object to be monitored in a map during a process that the sweeping robot moves according to a pre-constructed map, where the map includes movable area information of the sweeping robot and position information of the object to be monitored.
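The obtaining and collecting submodules together amount to reading the monitored objects' target positions from the pre-constructed map and triggering an image capture once the robot is close enough to one of them. The sketch below is illustrative; the `MapEntry` structure, the coordinate convention, and the tolerance value are assumptions, not from the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class MapEntry:
    """One map entry: the name of an object to be monitored and its
    (x, y) target position in the robot's map coordinates."""
    name: str
    position: tuple

def positions_to_visit(map_entries):
    """Read the positions of all monitored objects out of the
    pre-constructed map so the robot can pass each one in turn."""
    return {entry.name: entry.position for entry in map_entries}

def at_target(robot_pos, target_pos, tolerance_m=0.2):
    """True once the robot is close enough to a target position to
    collect an environment image of the monitored object there."""
    return math.dist(robot_pos, target_pos) <= tolerance_m

# Illustrative map with two monitored objects.
entries = [MapEntry("refrigerator", (3.2, 1.5)),
           MapEntry("bedroom_A_window", (0.4, 4.8))]
targets = positions_to_visit(entries)
print(at_target((3.25, 1.48), targets["refrigerator"]))  # True
```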
Fig. 14 is a block diagram illustrating a sweeping robot according to an embodiment of the present invention. As shown in fig. 14, the sweeping robot includes the indoor environment warning device according to any one of the above embodiments. Specifically, the sweeping robot 100 includes: the acquisition module 10, the judging module 20, and the generating module 30.
In one example, as shown in fig. 15, the sweeping robot 100 further includes a sending module 40.
For the functions of the modules in each apparatus of the embodiments of the present invention, reference may be made to the corresponding description of the above method; details are not repeated here.
Fig. 16 is a block diagram illustrating a sweeping robot according to an embodiment of the present invention. As shown in fig. 16, the sweeping robot includes: a memory 910, a processor 920, and a camera 940, where the memory 910 stores a computer program executable on the processor 920. When executing the computer program, the processor 920 implements the indoor environment warning method of the above embodiments. The camera 940 is used to collect images. There may be one or more of each of the memory 910, the processor 920, and the camera 940.
The sweeping robot further includes:
and a communication interface 930 for communicating with an external device to transmit the indoor environment warning data.
The memory 910 may include a high-speed RAM memory and may also include a non-volatile memory, such as at least one magnetic disk memory.
If the memory 910, the processor 920 and the communication interface 930 are implemented independently, the memory 910, the processor 920 and the communication interface 930 may be connected to each other through a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 16, but this is not intended to represent only one bus or type of bus.
Optionally, in an implementation, if the memory 910, the processor 920 and the communication interface 930 are integrated on a chip, the memory 910, the processor 920 and the communication interface 930 may complete communication with each other through an internal interface.
An embodiment of the present invention provides a computer-readable storage medium, which stores a computer program, and the computer program is executed by a processor to implement the method in any one of the above embodiments.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following technologies, which are well known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried out in the method of implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and the program, when executed, includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer readable storage medium. The storage medium may be a read-only memory, a magnetic or optical disk, or the like.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive various changes or substitutions within the technical scope of the present invention, and these should be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (14)

1. An indoor environment warning method applied to a sweeping robot, characterized in that the method comprises:
acquiring an indoor environment image, wherein the environment image comprises image information of an object to be monitored;
judging whether the object to be monitored is in a state needing warning or not according to the image information of the object to be monitored in the environment image;
wherein, according to the image information of the object to be monitored in the environment image, whether the object to be monitored is in a state needing to be warned or not is judged, including:
identifying whether the object to be monitored is in an open state or not according to the image information of the object to be monitored;
if the object to be monitored is a household appliance, determining whether a user exists in a preset range of the household appliance in the starting state or not according to the environmental information around the household appliance in the starting state;
and responding to the situation that the user does not exist in the preset range of the household appliance in the starting state, and judging that the household appliance is in a state needing warning.
2. The method of claim 1, wherein capturing an image of an environment within a room comprises:
in the moving process of the sweeping robot, acquiring the position information of the object to be monitored;
and acquiring an environment image of the target position under the condition that the sweeping robot moves to the target position corresponding to the position information.
3. The method according to claim 2, wherein the obtaining of the position information of the object to be monitored during the movement of the sweeping robot comprises:
the method comprises the steps that in the process that the sweeping robot moves according to a pre-constructed map, the position information of an object to be monitored in the map is obtained, and the map comprises the movable area information of the sweeping robot and the position information of the object to be monitored.
4. The method according to claim 1, wherein the object to be monitored is a window, and in response to the object to be monitored being in an open state, determining whether the object to be monitored is in a state requiring warning according to environmental information around the object to be monitored comprises:
determining the outdoor environment weather of the window in the opening state according to the weather information;
and responding to the outdoor environment weather of the window as preset severe weather, and judging that the window is in a state needing warning.
5. The method of claim 1, further comprising:
and determining whether to generate prompt information according to the judgment result.
6. The method of claim 5, further comprising:
and sending the generated prompt information to a mobile terminal to inform a user of the state of the object to be monitored.
7. An indoor environment warning method applied to a sweeping robot, characterized in that the method comprises:
acquiring an indoor environment image, wherein the environment image comprises image information of an object to be monitored;
judging whether the object to be monitored is in a state needing warning or not according to the image information of the object to be monitored in the environment image;
wherein the object to be monitored is a household appliance or an indoor door or window, and determining, according to the image information of the object to be monitored in the environment image, whether the object to be monitored is in a state needing warning comprises:
acquiring standard image information of the household appliance or an indoor door and window, wherein the standard image information comprises an image of the household appliance or the indoor door and window in a closed state;
matching the image information of the household appliance or indoor door and window in the indoor environment image with the standard image information of the household appliance or indoor door and window, and in response to an inconsistent matching result, determining that the object to be monitored is in a state needing warning.
8. An indoor environment warning device, comprising:
the system comprises an acquisition module, a monitoring module and a monitoring module, wherein the acquisition module is used for acquiring an indoor environment image which comprises image information of an object to be monitored;
the judging module is used for judging whether the object to be monitored is in a state needing warning or not according to the image information of the object to be monitored in the environment image;
wherein, the judging module comprises:
the identification submodule is used for identifying whether the object to be monitored is in an open state or not according to the image information of the object to be monitored in the indoor environment image;
and the judging submodule is used for judging whether the object to be monitored is in a state needing warning according to the environmental information around the object to be monitored if the object to be monitored is in an open state.
9. The apparatus of claim 8, further comprising:
and the generating module is used for determining whether to generate the prompt information according to the judgment result.
10. The apparatus of claim 9, further comprising:
and the sending module is used for sending the generated prompt information to the mobile terminal so as to inform a user of the state of the object to be monitored.
11. An indoor environment warning device, characterized by comprising:
the system comprises an acquisition module, a monitoring module and a monitoring module, wherein the acquisition module is used for acquiring an indoor environment image which comprises image information of an object to be monitored;
the judging module is used for judging whether the object to be monitored is in a state needing warning or not according to the image information of the object to be monitored in the environment image;
wherein, the judging module comprises:
the matching sub-module is used for matching the image information of the object to be monitored in the indoor environment image with the standard image information of the object to be monitored; and if the matching results are inconsistent, judging that the object to be monitored is in a state needing to be warned.
12. A sweeping robot, characterized by comprising:
one or more processors;
a memory for storing one or more programs;
the camera is used for collecting images;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-7.
13. A sweeping robot comprising the indoor environment warning device as claimed in any one of claims 8 to 11.
14. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN201910699637.7A 2019-07-31 2019-07-31 Indoor environment warning method and device and sweeping robot Active CN112394717B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910699637.7A CN112394717B (en) 2019-07-31 2019-07-31 Indoor environment warning method and device and sweeping robot


Publications (2)

Publication Number Publication Date
CN112394717A CN112394717A (en) 2021-02-23
CN112394717B true CN112394717B (en) 2022-07-26





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210305

Address after: No.39 Caohu Avenue, Xiangcheng Economic Development Zone, Suzhou, Jiangsu Province, 215000

Applicant after: Meizhizongheng Technology Co.,Ltd.

Address before: No.39 Caohu Avenue, Xiangcheng Economic Development Zone, Suzhou, Jiangsu Province, 215000

Applicant before: JIANGSU MIDEA CLEANING APPLIANCES Co.,Ltd.

Applicant before: MIDEA GROUP Co.,Ltd.

GR01 Patent grant