WO2023138063A1 - Home inspection method, non-volatile readable storage medium and computer device - Google Patents

Home inspection method, non-volatile readable storage medium and computer device

Info

Publication number
WO2023138063A1
WO2023138063A1 (PCT application PCT/CN2022/116850)
Authority
WO
WIPO (PCT)
Prior art keywords
event
home
inspection
information
data
Prior art date
Application number
PCT/CN2022/116850
Other languages
English (en)
French (fr)
Inventor
梅江元
刘三军
李育胜
区志财
罗富波
唐剑
Original Assignee
美的集团(上海)有限公司 (Midea Group (Shanghai) Co., Ltd.)
美的集团股份有限公司 (Midea Group Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 美的集团(上海)有限公司 (Midea Group (Shanghai) Co., Ltd.) and 美的集团股份有限公司 (Midea Group Co., Ltd.)
Publication of WO2023138063A1 publication Critical patent/WO2023138063A1/zh

Classifications

    • G: PHYSICS › G05: CONTROLLING; REGULATING › G05D: Systems for controlling or regulating non-electric variables › G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots › G05D1/02: Control of position or course in two dimensions › G05D1/021: specially adapted to land vehicles
    • G05D1/0251: using optical position detecting means, namely a video camera in combination with image processing means, extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0214: with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0257: using a radar
    • G05D1/0276: using signals provided by a source external to the vehicle

Definitions

  • the present application relates to the technical field of home inspection, and in particular, relates to a home inspection method, a non-volatile readable storage medium and computer equipment.
  • in the related art, home anomaly detection relies on a single sensor with a fixed setting, such as a smoke sensor; each such sensor can only detect specific events within a fixed range, so versatility is poor.
  • This application aims to solve at least one of the technical problems existing in the prior art or related art.
  • the first aspect of the present application proposes a home inspection method.
  • the second aspect of the present application proposes a home inspection device.
  • a third aspect of the present application proposes a non-volatile readable storage medium.
  • a fourth aspect of the present application proposes a computer device.
  • a fifth aspect of the present application proposes a computer program product.
  • the first aspect of the present application provides a home inspection method, including: determining an inspection path; capturing home image data along the inspection path; determining home inspection events according to the home image data; when the home inspection events include an abnormal event, determining a processing program corresponding to the abnormal event; and executing the processing program.
  • the home space or home equipment can be inspected by an inspection robot, or by arranging a plurality of electrically connected fixed cameras.
  • the inspection robot includes a walking system and a machine vision system.
  • the walking system can include walking wheels, walking tracks, etc., and relies on navigation radar, vision system, infrared sensor and other equipment to realize automatic inspection movement along the inspection path.
  • the machine vision system can detect the environment images around the robot in real time.
  • the machine vision system can be a camera, which can be a panoramic camera or a wide-angle camera installed on a turntable, so as to take a comprehensive shot of the area the robot passes through and obtain home image data on the inspection path.
  • the main control system of the inspection robot is equipped with an algorithm processor and a corresponding image recognition algorithm model.
  • the image algorithm model is used to perform artificial intelligence recognition on the captured home image data, thereby identifying home inspection events in various areas of the inspection path, and judging whether there are abnormal events in these home inspection events.
  • the inspection path passes through various rooms in the user's home, including bedrooms, living rooms, bathrooms, and kitchens.
  • when the inspection robot walks along the inspection path into the bedroom, it captures image information there and, by combining machine vision algorithms with big-data artificial intelligence models, extracts the key points of interest in the bedroom, such as the open/closed state of the bedroom windows and the on/off state of the bedroom lights.
  • similarly, when the robot enters the bathroom, living room, or kitchen, it extracts the key points of interest in that room, such as whether the bathroom faucet is open, whether the kitchen stove is on, and whether the refrigerator door in the living room is closed.
  • the control system of the inspection robot judges whether any abnormal events are included in these home inspection events according to the built-in judgment logic or user-defined judgment logic.
  • the door of the refrigerator in the living room should be closed. If it is judged that the door of the refrigerator in the living room is open, it will be recorded as an abnormal event.
  • the inspection robot searches for the corresponding processing program, and executes the corresponding processing program to eliminate and resolve these abnormal events.
  • the inspection robot is connected to the Internet of Things of the smart home. Taking the inspection robot detecting the abnormal event of "turning on the bedroom lighting during the day" as an example, the inspection robot can control the bedroom lighting to turn off through the Internet of Things by executing the corresponding processing program, so as to realize the processing of the abnormal event.
  • the inspection robot detects home inspection events based on visual sensors and image processing technology, and does not rely on a single sensor. Therefore, it can accurately detect different types of events in a wide range, and has good versatility. At the same time, by executing the corresponding processing program, the detected abnormal events are processed and eliminated, thereby having the ability to handle abnormalities, and realizing flexible and convenient home environment detection and abnormal processing.
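The detect-judge-handle loop described above can be sketched as follows. This is an illustrative assumption, not the patent's actual implementation: `detect_events` stands in for the image recognition model, and `HANDLERS` stands in for the lookup of processing programs.

```python
def detect_events(frame):
    """Placeholder for the image-recognition model's per-frame event output."""
    return frame.get("events", [])

# Hypothetical mapping from abnormal-event types to processing programs.
HANDLERS = {
    "bedroom_light_on_daytime": lambda: "iot: turn off bedroom light",
    "refrigerator_door_open": lambda: "iot: close refrigerator door",
}

def inspect(frames):
    """Walk the captured frames, detect events, and execute the processing
    program for every event that has a registered handler."""
    actions = []
    for frame in frames:
        for event in detect_events(frame):
            handler = HANDLERS.get(event)
            if handler is not None:  # abnormal event with a known handler
                actions.append(handler())
    return actions
```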
  • the home inspection method in the above technical solution provided by the present application may also have the following additional technical features:
  • determining the home inspection event according to the home image data includes: performing image recognition on the home image data to determine the image information included in the home image data; and determining the home inspection event according to the image information.
  • the inspection robot uses an image sensor to capture real-time home image data in the inspection path and the passing area during the inspection work according to the inspection path.
  • the home image data is acquired and stored in the form of video.
  • the home image data, that is, each frame of the captured video, is acquired and image recognition is performed on it.
  • image recognition can be performed on every frame, or similar frames can first be grouped and merged so that recognition runs once per group, which reduces the computational load.
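The idea of merging similar frames before recognition can be illustrated with a toy run-length filter. A single integer here stands in for a frame's similarity signature (for example a perceptual hash); real code would compare image content, so treat this purely as a sketch.

```python
def dedup_frames(signatures, threshold=5):
    """Keep one representative per run of near-identical frames.

    signatures: per-frame similarity values; consecutive frames whose
    signatures differ by at most `threshold` are treated as the same
    scene, and only the first one is passed on to image recognition.
    """
    reps = []
    for s in signatures:
        if not reps or abs(s - reps[-1]) > threshold:
            reps.append(s)
    return reps
```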
  • the specific image information contained in each frame of the home image data is identified.
  • the image information can be home image information, such as images of windows, lighting images, refrigerators and other home appliances, or images of people, such as user images, images of strangers, etc.
  • the home inspection events in the home environment passed by the current inspection path are determined, such as the opening and closing state of the bedroom window, such as the switching state of the bedroom light, such as the opening and closing state of the bathroom faucet, such as the switching state of the stove in the kitchen, such as the opening and closing state of the refrigerator door in the living room, etc.
  • the inspection robot includes a sensor component for acquiring environmental information; determining the home inspection event according to the image information includes: determining the home inspection event according to the image information and the environmental information.
  • the inspection robot is also provided with sensor components.
  • the sensor assembly is arranged on the body of the inspection robot.
  • the sensor components can also be distributed and arranged at fixed positions in the home environment, and perform data command interaction with the inspection robot body through a wireless data network.
  • the corresponding environmental information can be obtained.
  • the environmental information can include oxygen content information, smog content information, temperature information, rainfall information, light information, etc.
  • for example, when the inspection robot detects through the visual sensor and image processing algorithm that a window of the room is open, it further obtains the rainfall information collected by the rain sensor. If the rainfall information shows that it is currently raining, "window open" is regarded as an abnormal event, and the Internet of Things system controls the window to close automatically. If the rainfall information shows that it is not raining, "window open" is not considered an abnormal event.
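The window-and-rain example above combines an image-derived state with sensor-derived environmental information. A minimal sketch of that decision rule (names are illustrative assumptions):

```python
def classify_window_event(window_open, raining):
    """Per the example above: an open window is abnormal only when the
    rain sensor reports rain; otherwise it is a normal inspection event.
    Returns (event status, suggested action)."""
    if window_open and raining:
        return ("abnormal", "close_window")  # IoT system then closes the window
    return ("normal", None)
```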
  • the inspection path passes through at least one target area
  • the home image data includes first data and/or second data, wherein the first data is data of the inspection path, and the second data is data of the target area.
  • the inspection path is a path planned by the user, and may also be a path automatically generated by the system.
  • the inspection path is the walking path of the inspection robot.
  • when the inspection robot performs inspection work according to the inspection path, it passes through various sub-areas of the area to be inspected, such as each room in the user's home, including bedrooms, living rooms, kitchens, and bathrooms.
  • Each of the sub-areas can be regarded as a target area.
  • these target areas will be used as nodes to plan the overall inspection path, so that the inspection path of the inspection robot passes through at least one of the above-mentioned target areas. For example, if the inspection robot sequentially inspects the bedroom, living room, kitchen, and bathroom, the inspection path will first pass through the bedroom, then enter the living room, then enter the kitchen, and finally reach the bathroom.
  • the home image data captured and recorded by the inspection robot specifically includes the first data and the second data according to the area where it is located.
  • the first data specifically refers to all video data captured in real time by the inspection robot while walking along the inspection path.
  • the second data is the video data captured by the inspection robot in the target area after arriving in the target area.
  • the way the inspection robot captures video data may be different.
  • in one mode, the inspection robot captures home image data throughout the entire inspection, that is, both the home image data in the target areas and the home image data on the path before reaching them are all collected as the first data, thereby widening the scope of the inspection.
  • alternatively, the inspection robot can be controlled to capture home image data only in the target areas, and not on the path before reaching them, obtaining the second data. Since the second data only includes the home image data of the target areas, the pressure on data storage and data processing is reduced, thereby improving performance.
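The two capture modes can be expressed as a filter over a stream of (area, frame) pairs. This is a sketch under the assumption that target areas are known by name; nothing here is prescribed by the patent text.

```python
# Hypothetical set of target areas from the semantic map.
TARGET_AREAS = {"bedroom", "living_room", "kitchen", "bathroom"}

def collect_image_data(stream, target_only=False):
    """stream: iterable of (area, frame) pairs.

    With target_only=False every frame is kept (the 'first data');
    with target_only=True only frames captured inside a target area
    are kept (the 'second data'), cutting storage and processing load.
    """
    return [frame for area, frame in stream
            if not target_only or area in TARGET_AREAS]
```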
  • the image information includes person information
  • the abnormal event includes a first event
  • the first event corresponds to the first data and/or the second data
  • determining the home inspection event according to the image information includes: determining the corresponding identity information according to the person information; when the identity information does not match the preset identity, determining that the home inspection event includes the first event
  • determining the processing program corresponding to the abnormal event includes: generating first alarm information corresponding to the first event, and sending the first alarm information to the first terminal.
  • the inspection robot identifies the information of people in the home environment through the home image data obtained by shooting, so as to determine the identity of the people in the home environment. Specifically, the inspection robot can learn the "appearance" of family members by pre-collecting images of family members from various angles, thereby recognizing the identity information of family members and forming a preset identity.
  • the inspection robot extracts the images of people that may be included in the home image data obtained by shooting, and obtains their identity information based on the extracted images of people. If it is determined that the identity information belongs to the category of preset identity information, that is, in the preset information, there is an identity information that matches the current identity information, it can be determined that the person photographed is a family member, and a normal home inspection event is recorded at this time.
  • the inspection robot generates first alarm information, and the first alarm information is used to warn the user that there are strangers in the home.
  • the inspection robot can send the first alarm information to the mobile phone of the user, that is, the first terminal, so as to effectively warn the user.
  • when the inspection robot detects the first event, that is, when there is a stranger at home, it can adjust the current inspection path in real time so as to follow the person corresponding to the person information, that is, the stranger, and capture real-time images of that person, realizing real-time monitoring of the stranger.
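The identity check behind the first event can be sketched as a nearest-match test against the preset family-member identities. Real systems compare face-embedding vectors; the one-dimensional floats and the threshold value here are simplifying assumptions for illustration.

```python
def check_identity(embedding, known_embeddings, threshold=0.6):
    """Compare a face embedding against preset family-member embeddings.

    If no known identity is within `threshold`, the person is classified
    as a stranger, which triggers the first event (first alarm information
    sent to the first terminal).
    """
    for name, known in known_embeddings.items():
        if abs(embedding - known) <= threshold:
            return ("known", name)
    return ("stranger", None)
```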
  • the abnormal event includes a second event, the second event is an abnormal state event of the target home appliance, and the second event corresponds to the second data; determining the processing program corresponding to the abnormal event includes: controlling the inspection robot to generate a control instruction, wherein the control instruction is used to instruct the target home appliance to process the second event; and sending the control instruction to the target home appliance.
  • the target home device may be a smart home appliance or a smart home device, and the inspection robot is connected to the same Internet of Things with these target home devices.
  • when the inspection robot finds that the target home device is in an abnormal state in the target area, it judges that the current home inspection event is an abnormal event, specifically the second event.
  • the second event may be "the light in the bedroom is turned on during the day", “the refrigerator door is left open” and so on.
  • when it is determined that a second event exists, the inspection robot generates a corresponding control instruction according to the target home device corresponding to the second event and the event type of the second event.
  • for example, if the inspection robot finds that the lighting in the bedroom is not turned off during the day, it can generate a control command of "turn off the lighting in the bedroom" and send the command to the smart switch of the bedroom lighting through the smart gateway, or remotely control that smart switch through a cloud server, so as to turn off the corresponding lighting and resolve the second event.
  • likewise, if the inspection robot finds that the refrigerator door is not closed, it generates a control command of "close the refrigerator door" and sends the command to the controller of the refrigerator, which closes the refrigerator door through a correspondingly arranged motor, thereby processing the second event.
  • in this way, control commands based on the Internet of Things are used to control the home equipment to eliminate abnormal events, thereby realizing flexible and convenient home environment detection and abnormality handling.
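The mapping from a second event to an IoT control instruction can be sketched as a lookup table. The device IDs and command names below are hypothetical placeholders, not identifiers from any real smart-home protocol.

```python
def build_control_instruction(event):
    """Map a second-event type to an IoT control instruction that tells
    the target home device how to process the event; returns None for
    events with no registered device action."""
    table = {
        "bedroom_light_on_daytime": {
            "device": "bedroom_light_switch", "command": "off"},
        "refrigerator_door_open": {
            "device": "refrigerator_controller", "command": "close_door"},
    }
    return table.get(event)
```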
  • the abnormal event includes a third event, and the third event corresponds to the first data and/or the second data; determining a processing program corresponding to the abnormal event includes: generating second alarm information corresponding to the third event, and sending the second alarm information to the second terminal.
  • abnormal events also include a third event.
  • the third event includes serious situations such as fire and flooding.
  • when the inspection robot finds situations such as flames, smoke, or water accumulation through image processing, or detects the corresponding environmental information through a temperature sensor, smoke sensor, or water accumulation sensor, it immediately generates second alarm information and sends it to the corresponding second terminal, improving the timeliness and reliability of the alarm.
  • the third event includes N event categories, and N is a positive integer; generating second alarm information corresponding to the third event includes: obtaining scene information; determining the priority order of N event categories corresponding to the third event according to the scene information; generating second alarm information according to the N event categories and priority orders, wherein the second alarm information includes N alarm contents, and the N event categories correspond to the N alarm contents one-to-one.
  • the third event also includes household abnormalities that cannot be processed automatically, or for which no automatic processing rules have been set; these abnormalities constitute multiple event categories of the third event. For example, if the user's window is not a smart window that can open and close automatically, or the user's light does not support a remote-controlled switch, then these two scenarios are two event categories, namely "open window" and "light on".
  • the generated second alarm information correspondingly includes different alarm content.
  • for example, the event category "open window" corresponds to the alarm content "please note that the bedroom window is not closed",
  • and the event category "light on" corresponds to the alarm content "please note that the bedroom light is not turned off".
  • these alarm contents have different priorities.
  • the priority of the alarm content can be dynamically adjusted.
  • for example, the scene information includes daytime and nighttime. During the daytime, the priority of "light on" is higher than that of "open window", and at night, the priority of "open window" is higher than that of "light on".
  • the priority order of different event categories is adjusted, thereby generating alarm content in different orders, which is conducive to improving the pertinence of home inspections.
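Ordering the N alarm contents by a scene-dependent priority can be sketched as follows. The priority tables and message strings are illustrative assumptions based on the daytime/nighttime example above.

```python
def order_alarm_contents(events, scene):
    """Order the event categories of a third event by scene-dependent
    priority (lower number = higher priority), then emit one alarm
    content per category in that order."""
    priority = {
        "daytime":   {"light_on": 0, "window_open": 1},
        "nighttime": {"window_open": 0, "light_on": 1},
    }[scene]
    messages = {
        "window_open": "please note that the bedroom window is not closed",
        "light_on": "please note that the bedroom light is not turned off",
    }
    ordered = sorted(events, key=lambda e: priority[e])
    return [messages[e] for e in ordered]
```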
  • the abnormal event includes a fourth event, the fourth event is a custom event, and the fourth event corresponds to the first data and/or the second data;
  • Determining the home inspection event according to the image information includes: determining the home inspection event according to the image information and a preset image recognition model; before determining the home inspection event based on the home image data, the method further includes: receiving a custom image data set, wherein the custom image data set includes an image corresponding to the fourth event; training the image recognition model through the custom image data set, so that the image recognition model can recognize the fourth event according to the image information.
  • the user can customize the abnormal event, so that targeted inspection can be performed on the user-defined abnormal event. For example, users can set the scene of "clothes falling on the ground" as an abnormal event, so that when clothes are detected falling on the ground during home inspection, a corresponding prompt will be sent to the user, thereby realizing a more intelligent inspection.
  • the image information can be recognized by artificial intelligence through a preset image recognition model, so as to identify the inspection event included in the captured image.
  • the scene of the fourth event can be manually set.
  • take the fourth event including the scene of "clothes falling on the ground" as an example.
  • the user can manually place the clothes at different positions on the ground and take photos from different angles to form a custom image data set. The more pictures taken in the data set, the higher the recognition accuracy of the abnormal inspection.
  • by inputting the custom image data set into the image recognition model, the user performs targeted training on the image recognition model, so that the trained model can recognize the user-defined fourth event. Specifically, when "clothes on the floor" is detected in the real-time home image data, the inspection events output by the image recognition model will include the corresponding fourth event, and corresponding alarm content is generated, prompting the user to pick up the clothes on the ground, realizing intelligent home inspection.
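As a toy stand-in for training the recognition model on a custom data set, the sketch below fits a nearest-centroid classifier over scalar "features". A real system would fine-tune a neural image recognition model; the labels, features, and both function names are assumptions for illustration only.

```python
def train_custom_event(dataset):
    """Compute one feature centroid per user-defined label.

    dataset: dict mapping label -> list of scalar features extracted
    from the user's custom images (e.g. photos of clothes on the floor
    taken from different angles).
    """
    return {label: sum(feats) / len(feats) for label, feats in dataset.items()}

def recognize(centroids, feature):
    """Return the label whose centroid is closest to the given feature."""
    return min(centroids, key=lambda label: abs(centroids[label] - feature))
```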
  • determining the inspection route includes: acquiring map data of the area to be inspected; and determining the inspection route according to the map data.
  • semantic map data of the area to be inspected can be input, and the semantic map data includes the semantic map of the user's family, which indicates the area range and position coordinates of at least one target area.
  • a three-dimensional semantic map of the indoor home scene is established, and an inspection path is automatically planned according to the three-dimensional semantic map. It can be understood that the inspection path can pass through all target areas in the indoor home scene, thereby improving the coverage of the inspection and realizing high-precision and high-reliability indoor inspection.
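Planning a path that visits every target area in the semantic map can be sketched with a greedy nearest-neighbour ordering over room coordinates. The patent does not specify a planning algorithm, so this is one simple assumption; real planners would also route around obstacles in the 3D map.

```python
def plan_inspection_path(rooms, start=(0.0, 0.0)):
    """Greedy nearest-neighbour ordering of target areas.

    rooms: dict mapping room name -> (x, y) position taken from the
    semantic map. Returns the visiting order so the inspection path
    covers all target areas.
    """
    remaining = dict(rooms)
    pos, path = start, []
    while remaining:
        # Pick the closest unvisited room (squared Euclidean distance).
        name = min(remaining,
                   key=lambda r: (remaining[r][0] - pos[0]) ** 2
                               + (remaining[r][1] - pos[1]) ** 2)
        pos = remaining.pop(name)
        path.append(name)
    return path
```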
  • the second aspect of the present application provides a home inspection device, including:
  • a determination module, used to determine the inspection path;
  • a capture module, used to capture the corresponding home image data along the inspection path;
  • the determination module is also used to determine the home inspection event according to the home image data, and, when the home inspection event includes an abnormal event, to determine the processing program corresponding to the abnormal event;
  • an execution module, used to execute the processing program.
  • the third aspect of the present application provides a non-volatile readable storage medium, on which a program or instruction is stored.
  • the program or instruction is executed by a processor, the steps of the home inspection method provided in at least one of the above technical solutions are implemented. Therefore, the non-volatile readable storage medium also includes all the beneficial effects of the home inspection method provided in at least one of the above technical solutions. In order to avoid repetition, details are not repeated here.
  • the fourth aspect of the present application provides a computer device, including: a memory for storing programs or instructions; a processor for implementing the steps of the home inspection method provided in at least one of the above technical solutions when executing the programs or instructions. Therefore, the computer device also includes all the beneficial effects of the home inspection method provided in at least one of the above technical solutions. To avoid repetition, details are not repeated here.
  • the fifth aspect of the present application provides a computer program product, the computer program product is stored in a storage medium, and the computer program product is executed by at least one processor to implement the steps of the home inspection method provided in at least one of the above technical solutions. Therefore, the computer program product also includes all the beneficial effects of the home inspection method provided in the above at least one technical solution. To avoid repetition, details are not repeated here.
  • Fig. 1 shows the flow chart of the home inspection method according to the embodiment of the application
  • Fig. 2 shows a schematic diagram of working logic of an inspection robot according to an embodiment of the present application
  • Fig. 3 shows a structural block diagram of a home inspection device according to an embodiment of the present application.
  • FIG. 1 shows a flow chart of the home inspection method according to an embodiment of the application. As shown in FIG. 1, the method includes:
  • Step 102: determine the inspection path;
  • Step 104: capture home image data along the inspection path;
  • Step 106: determine the home inspection event according to the home image data;
  • Step 108: when the home inspection event includes an abnormal event, determine a processing program corresponding to the abnormal event;
  • Step 110: execute the processing program.
  • the home space or home equipment can be inspected by an inspection robot, or by arranging a plurality of electrically connected fixed cameras.
  • the inspection robot includes a walking system and a machine vision system.
  • the walking system can include walking wheels, walking tracks, etc., and relies on navigation radar, vision system, infrared sensor and other equipment to realize automatic inspection movement along the inspection path.
  • the machine vision system can detect the environment images around the robot in real time.
  • the machine vision system can be a camera, which can be a panoramic camera or a wide-angle camera installed on a turntable, so as to take a comprehensive shot of the area the robot passes through and obtain home image data on the inspection path.
  • the main control system of the inspection robot is equipped with an algorithm processor and a corresponding image recognition algorithm model.
  • the image algorithm model is used to perform artificial intelligence recognition on the captured home image data, thereby identifying home inspection events in various areas of the inspection path, and judging whether there are abnormal events in these home inspection events.
  • the inspection path passes through various rooms in the user's home, including bedrooms, living rooms, bathrooms, and kitchens.
• when the inspection robot walks along the inspection path to the bedroom, it captures image information in the bedroom, and based on that image information, combined with machine vision algorithms and big-data artificial-intelligence models, it extracts the information of key points of interest in the bedroom, such as the opening and closing status of the bedroom windows and the on/off status of the bedroom lights.
• similarly, when the robot enters the bathroom, living room, or kitchen, it extracts the key point-of-interest information there, such as the opening and closing status of the bathroom faucet, the on/off status of the stove in the kitchen, or the opening and closing status of the refrigerator door in the living room.
  • the control system of the inspection robot judges whether any abnormal events are included in these home inspection events according to the built-in judgment logic or user-defined judgment logic.
  • the door of the refrigerator in the living room should be closed. If it is judged that the door of the refrigerator in the living room is open, it will be recorded as an abnormal event.
  • the inspection robot searches for the corresponding processing program, and executes the corresponding processing program to eliminate and resolve these abnormal events.
  • the inspection robot is connected to the Internet of Things of the smart home. Taking the inspection robot detecting the abnormal event of "turning on the bedroom lighting during the day" as an example, the inspection robot can control the bedroom lighting to turn off through the Internet of Things by executing the corresponding processing program, so as to realize the processing of the abnormal event.
  • the inspection robot detects home inspection events based on visual sensors and image processing technology, and does not rely on a single sensor. Therefore, it can accurately detect different types of events in a wide range, and has good versatility. At the same time, by executing the corresponding processing program, the detected abnormal events are processed and eliminated, thereby having the ability to handle abnormalities, and realizing flexible and convenient home environment detection and abnormal processing.
  • determining the home inspection event according to the home image data includes: performing image recognition on the home image data to determine image information included in the home image data; and determining the home inspection event according to the image information.
  • the inspection robot uses the image sensor to capture real-time home image data in the inspection route and the passing area.
  • the home image data is acquired and stored in the form of video.
• each frame of the captured video, that is, the home image data, is acquired and subjected to image recognition.
  • image recognition can be performed for each frame of image, or after frame images are classified, similar frames are integrated and processed, and then image recognition is performed, thereby reducing calculation pressure.
  • the specific image information contained in each frame of the home image data is identified.
  • the image information can be home image information, such as images of windows, lighting images, refrigerators and other home appliances, or images of people, such as user images, images of strangers, etc.
  • the home inspection events in the home environment passed by the current inspection path are determined, such as the opening and closing state of the bedroom window, such as the switching state of the bedroom light, such as the opening and closing state of the bathroom faucet, such as the switching state of the stove in the kitchen, such as the opening and closing state of the refrigerator door in the living room, etc.
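The frame-integration idea above (grouping similar frames so recognition runs once per group rather than per frame) can be sketched as follows; frames are modelled as tiny grayscale grids, and the difference threshold is an illustrative assumption:

```python
# Sketch of the frame-grouping idea: similar consecutive frames are collapsed
# so image recognition runs once per group instead of once per frame.

def frame_diff(a, b):
    """Mean absolute pixel difference between two equally sized frames."""
    pixels = len(a) * len(a[0])
    return sum(abs(pa - pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb)) / pixels

def representative_frames(frames, threshold=10.0):
    """Keep a frame only when it differs enough from the last kept one."""
    kept = []
    for frame in frames:
        if not kept or frame_diff(kept[-1], frame) > threshold:
            kept.append(frame)
    return kept

video = [
    [[0, 0], [0, 0]],      # frame 1
    [[1, 0], [0, 2]],      # near-duplicate of frame 1 (small diff, skipped)
    [[90, 80], [70, 60]],  # scene change (large diff, kept)
]
reps = representative_frames(video)
print(len(reps))  # recognition now runs on 2 frames instead of 3
```

A production system would use a perceptual or histogram-based distance on real frames; the calculation pressure saving is the same idea.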
  • the inspection robot includes a sensor component for acquiring environment information; determining the home inspection event according to the image information includes: determining the home inspection event according to the image information and the environment information.
  • the inspection robot is also provided with a sensor component.
  • the sensor component is arranged on the body of the inspection robot.
  • the sensor component can also be distributed and arranged in a fixed position in the home environment, and perform data command interaction with the inspection robot body through a wireless data network.
• through the sensor component, the corresponding environmental information can be obtained.
  • the environmental information can include oxygen content information, smog content information, temperature information, rainfall information, light information, etc.
• when the inspection robot detects, through the visual sensor and image processing algorithm, that a window of the room is open, it further obtains the rainfall information collected by the rain sensor. If the rainfall information shows that it is currently raining, "window open" is regarded as an abnormal event, and the Internet of Things system controls the window to close automatically. If the rainfall information shows that it is not currently raining, "window open" is not considered an abnormal event.
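The window/rain fusion rule above reduces to a two-input decision; the event labels and the action string below are illustrative:

```python
# Sketch of the fusion rule: "window open" alone is not abnormal; it becomes
# abnormal only when the rain sensor reports rainfall. Labels are illustrative.

def classify_window_event(window_open, raining):
    if window_open and raining:
        return ("abnormal", "close window via IoT")
    return ("normal", None)

print(classify_window_event(window_open=True, raining=True))
print(classify_window_event(window_open=True, raining=False))
```

The same pattern extends to other vision-plus-sensor combinations (e.g. light state plus ambient-light level).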
  • the inspection route passes through at least one target area
  • the household image data includes first data and/or second data, wherein the first data is the data of the inspection route, and the second data is the data of the target area.
  • the inspection route is a route planned by the user, and may also be a route automatically generated by the system.
  • the inspection path is the walking path of the inspection robot.
• when the inspection robot performs inspection work according to the inspection path, it passes through various sub-areas in the area to be inspected, such as each room in the user's home, including bedrooms, living rooms, kitchens, bathrooms, etc.
  • Each of the sub-areas can be regarded as a target area.
  • these target areas will be used as nodes to plan the overall inspection path, so that the inspection path of the inspection robot passes through at least one of the above-mentioned target areas. For example, if the inspection robot sequentially inspects the bedroom, living room, kitchen, and bathroom, the inspection path will first pass through the bedroom, then enter the living room, then enter the kitchen, and finally reach the bathroom.
  • the home image data captured and recorded by the inspection robot specifically includes the first data and the second data according to the area where it is located.
  • the first data specifically refers to all video data captured in real time by the inspection robot while walking along the inspection path.
  • the second data is the video data captured by the inspection robot in the target area after arriving in the target area.
  • the way the inspection robot captures video data may be different.
• in one mode, the inspection robot captures home image data throughout the whole inspection; that is, both the home image data in the target areas and the home image data on the path before reaching the target areas are collected, so that the first data is obtained, thereby broadening the scope of the inspection.
• in another mode, the inspection robot can be controlled to capture home image data only in the target areas, and no home image data is captured on the path before reaching the target areas, so that the second data is obtained. Since the second data only includes home image data in the target areas, the pressure on data storage and data processing can be reduced, thereby improving performance.
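The two capture modes above can be sketched as a simple policy switch; the area names and policy labels are illustrative:

```python
# Sketch of the two capture policies: record everywhere along the route
# (first data) or only inside target areas (second data).

TARGET_AREAS = {"bedroom", "kitchen"}

def frames_to_store(route, policy):
    if policy == "full_path":        # first data: everything on the route
        return list(route)
    if policy == "target_only":      # second data: target areas only
        return [area for area in route if area in TARGET_AREAS]
    raise ValueError(policy)

route = ["hallway", "bedroom", "hallway", "kitchen"]
print(frames_to_store(route, "full_path"))    # 4 capture points
print(frames_to_store(route, "target_only"))  # 2 capture points
```

The trade-off is exactly as stated in the text: broader coverage versus lower storage and processing pressure.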
  • the image information includes person information
  • the abnormal event includes a first event
  • the first event corresponds to the first data and/or the second data
  • determining the home inspection event according to the image information includes: determining the corresponding identity information according to the person information; when the identity information does not match the preset identity, determining that the home inspection event includes the first event
  • determining the processing program corresponding to the abnormal event includes: generating first alarm information corresponding to the first event, and sending the first alarm information to the first terminal.
  • the inspection robot identifies the information of persons in the home environment through photographed home image data, thereby determining the identities of the people in the home environment. Specifically, the inspection robot can learn the "appearance" of family members by pre-collecting images of family members from various angles, thereby recognizing the identity information of family members and forming a preset identity.
• the inspection robot extracts the images of persons that may be included in the captured home image data, and obtains their identity information based on the extracted images. If the identity information matches one of the preset identities, it can be determined that the photographed person is a family member, and a normal home inspection event is recorded.
• otherwise, the inspection robot generates the first alarm information, which is used to warn the user that a stranger is in the home.
  • the inspection robot can send the first alarm information to the user's mobile phone, that is, the first terminal, so as to effectively warn the user.
• when the inspection robot detects the first event, that is, when there is a stranger at home, it can adjust the current inspection route in real time so that it follows the person corresponding to the person information, that is, the stranger, and captures real-time images of that person, realizing real-time monitoring of the stranger.
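A minimal sketch of the identity check above, assuming a toy two-number "face feature" in place of a real face-recognition embedding; all names, features, and the tolerance are illustrative:

```python
# Sketch of the identity check: detected faces are compared with preset
# family identities; any mismatch yields the first event and an alarm for
# the first terminal. The "features" below are toy stand-ins.

PRESET_IDENTITIES = {"alice": (0.1, 0.9), "bob": (0.8, 0.2)}

def matches_preset(feature, tolerance=0.15):
    """True when the detected feature is close to any registered family member."""
    return any(abs(feature[0] - f[0]) + abs(feature[1] - f[1]) <= tolerance
               for f in PRESET_IDENTITIES.values())

def inspect_person(feature):
    if matches_preset(feature):
        return {"event": "normal"}
    # First event: stranger detected -> alarm the user's phone, follow them.
    return {"event": "first_event",
            "alarm": "stranger detected at home",
            "actions": ["send alarm to first terminal", "follow and record"]}

print(inspect_person((0.12, 0.88)))  # close to a registered member -> normal
print(inspect_person((0.5, 0.5)))    # no match -> first event
```

A real system would use a face-embedding model trained on the pre-collected multi-angle family images described above.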
  • the abnormal event includes a second event, the second event is an abnormal state event of the target household device, and the second event corresponds to the second data; determining the processing program corresponding to the abnormal event includes: controlling the inspection robot to generate a control instruction, wherein the control instruction is used to instruct the target household device to process the second event; and sending the control instruction to the target household device.
• during the inspection process, when the inspection robot enters a target area, it conducts targeted inspections of the home appliances or home devices in that area; these appliances or devices to be inspected are the above-mentioned target home devices.
• the target home device may be a smart home appliance or a smart home device, and the inspection robot is connected to the same Internet of Things as these target home devices.
• when the inspection robot finds that a target home device in the target area is in an abnormal state, it judges that the current home inspection event is an abnormal event, specifically the second event.
  • the second event may be "the light in the bedroom is turned on during the day", “the refrigerator door is left open” and so on.
• when it is determined that a second event exists, the inspection robot generates a corresponding control instruction according to the target home device corresponding to the second event and the event type of the second event.
• for example, if the inspection robot finds that the bedroom lighting is not turned off during the day, it can generate a "turn off the bedroom lighting" control command and send the command to the smart switch of the bedroom lighting through the smart gateway, or remotely control that smart switch through a cloud server, so as to turn off the corresponding lighting and resolve the second event.
• similarly, if the inspection robot finds that the refrigerator door is not closed, it generates a "close the refrigerator door" control command and sends the command to the controller of the refrigerator, which closes the refrigerator door through a correspondingly arranged motor, thereby handling the second event.
• in this way, Internet-of-Things-based control commands are used to control the home devices to eliminate abnormal events, thereby realizing flexible and convenient home environment detection and abnormality handling.
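A minimal sketch of the second-event handling above, assuming a hypothetical rule table mapping (device, abnormal state) to a control instruction; the instruction format is illustrative, not the patent's IoT protocol:

```python
# Sketch of second-event handling: an abnormal device state is mapped to a
# control instruction and "sent" over the IoT. The registry is illustrative.

COMMANDS = {
    ("bedroom_light", "on_during_day"): {"target": "bedroom_light_switch", "cmd": "off"},
    ("refrigerator", "door_open"): {"target": "refrigerator_controller", "cmd": "close_door"},
}

def handle_second_event(device, state, send):
    instruction = COMMANDS.get((device, state))
    if instruction is None:
        return False           # no automatic rule: fall back to alarming the user
    send(instruction)          # e.g. via the smart gateway or a cloud server
    return True

sent = []
handled = handle_second_event("refrigerator", "door_open", sent.append)
print(handled, sent)
```

Passing the `send` callback in keeps the rule table separate from the transport (gateway vs. cloud), matching the two delivery paths described above.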
  • the abnormal event includes a third event, and the third event corresponds to the first data and/or the second data; determining the processing program corresponding to the abnormal event includes: generating a second alarm information corresponding to the third event, and sending the second alarm information to the second terminal.
  • the abnormal event also includes a third event.
  • the third event includes serious situations such as fire and flooding.
• when the inspection robot finds situations such as flames, smoke, or water accumulation through image processing, or detects the corresponding environmental information through a temperature sensor, smoke sensor, or water-accumulation sensor, it immediately generates the second alarm information and sends it to the corresponding second terminal, thereby improving the effectiveness and reliability of the alarm.
  • the third event includes N event categories, and N is a positive integer; generating the second alarm information corresponding to the third event includes: acquiring scene information; determining the priority order of the N event categories corresponding to the third event according to the scene information; generating second alarm information according to the N event categories and priority orders, wherein the second alarm information includes N alarm contents, and the N event categories correspond to the N alarm contents one-to-one.
• the third event also includes home abnormalities that cannot be processed automatically, or for which no automatic processing rules are set; these abnormalities constitute multiple event categories of the third event. For example, if the user's window is not a smart window that can be opened and closed automatically, or the user's light does not support a remotely controlled switch, then these two scenarios respectively constitute two event categories, namely "window open" and "light on".
  • the generated second alarm information correspondingly includes different alarm content.
• the event category "window open" corresponds to the alarm content "please note that the bedroom window is not closed";
• the event category "light on" corresponds to the alarm content "please note that the bedroom light is not turned off".
  • these alarm contents have different priorities.
  • the priority of the alarm content can be dynamically adjusted.
• the scene information includes daytime and nighttime. During the daytime, the priority of "light on" is higher than that of "window open"; at night, the priority of "window open" is higher than that of "light on".
  • the priority order of different event categories is adjusted, thereby generating alarm content in different orders, which is conducive to improving the pertinence of home inspections.
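The scene-dependent ordering above can be sketched as a sort over a per-scene priority table; the priority numbers, category labels, and alarm strings below are illustrative (the alarm strings mirror the examples in the text):

```python
# Sketch of scene-dependent alarm ordering: during the day "light on"
# outranks "window open"; at night the order flips.

PRIORITY = {
    "day":   {"light_on": 0, "window_open": 1},   # lower number = higher priority
    "night": {"window_open": 0, "light_on": 1},
}
ALARM_TEXT = {
    "window_open": "please note that the bedroom window is not closed",
    "light_on": "please note that the bedroom light is not turned off",
}

def second_alarm(event_categories, scene):
    """Produce N alarm contents, one per event category, in priority order."""
    ordered = sorted(event_categories, key=lambda c: PRIORITY[scene][c])
    return [ALARM_TEXT[c] for c in ordered]

print(second_alarm(["window_open", "light_on"], scene="day"))
print(second_alarm(["window_open", "light_on"], scene="night"))
```

Extending to more scenes or categories only requires new rows in the two tables.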
  • the abnormal event includes a fourth event, the fourth event is a custom event, and the fourth event corresponds to the first data and/or the second data;
  • Determining the home inspection event according to the image information includes: determining the home inspection event according to the image information and a preset image recognition model; before determining the home inspection event based on the home image data, the method further includes: receiving a custom image data set, wherein the custom image data set includes an image corresponding to the fourth event; training the image recognition model through the custom image data set, so that the image recognition model can recognize the fourth event according to the image information.
  • the user can customize the abnormal events, so that targeted inspection can be performed on the user-defined abnormal events. For example, users can set the scene of "clothes falling on the ground” as an abnormal event, so that when clothes are detected falling on the ground during home inspection, a corresponding prompt will be sent to the user, thereby realizing a more intelligent inspection.
  • the image information can be recognized by artificial intelligence through a preset image recognition model, so as to identify the inspection event included in the captured image.
  • the scene of the fourth event can be manually set.
• taking the fourth-event scene of "clothes falling on the ground" as an example, the user can manually place clothes at different positions on the ground and take photos from different angles to form a custom image data set. The more pictures the data set contains, the higher the recognition accuracy of the abnormality inspection.
• by inputting the custom image data set into the image recognition model, the user performs targeted training on the image recognition model, so that the trained model can recognize the user-defined fourth event. Specifically, when "clothes on the floor" is detected in the real-time home image data, the inspection event output by the image recognition model will include the corresponding fourth event, and corresponding alarm content will be generated, prompting the user to pick up the clothes on the ground, realizing intelligent home inspection.
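As a toy illustration of the custom-event training above, the sketch below substitutes a nearest-centroid classifier over hand-made feature vectors for the image recognition model; a real system would fine-tune an image model on the user's photos. All names and numbers are illustrative:

```python
# Sketch of extending a recognizer with a user-defined fourth event:
# "training" here is adding a class centroid for the new scene.

def centroid(samples):
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

class TinyRecognizer:
    def __init__(self):
        # One built-in class standing in for "nothing abnormal on the floor".
        self.classes = {"normal_floor": centroid([(0.0, 0.1), (0.1, 0.0)])}

    def train_custom_event(self, label, samples):
        """Add a centroid for the user-defined event from the custom data set."""
        self.classes[label] = centroid(samples)

    def recognize(self, feature):
        def dist(c):
            return sum((a - b) ** 2 for a, b in zip(feature, self.classes[c]))
        return min(self.classes, key=dist)

model = TinyRecognizer()
model.train_custom_event("clothes_on_floor", [(0.9, 0.8), (0.8, 0.9)])
print(model.recognize((0.85, 0.85)))  # now detected as the custom event
```

More user photos mean a better-placed centroid, which mirrors the text's point that a larger data set improves recognition accuracy.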
  • determining the inspection route includes: acquiring map data of an area to be inspected; and determining the inspection route according to the map data.
  • the semantic map data of the area to be inspected can be input.
  • the semantic map data includes the semantic map of the user's family, which indicates the area range and position coordinates of at least one target area.
  • a three-dimensional semantic map of the indoor home scene is established, and an inspection path is automatically planned according to the three-dimensional semantic map. It can be understood that the inspection path can pass through all target areas in the indoor home scene, thereby improving the coverage of the inspection and realizing high-precision and high-reliability indoor inspection.
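One simple way to turn the semantic map into an inspection path is to treat each target area as a node with coordinates taken from the map and order the visits greedily. The coordinates and the greedy nearest-neighbour rule below are illustrative assumptions, not the patent's planner:

```python
# Sketch of path planning over a semantic map: each target area has a
# position, and a greedy nearest-neighbour ordering from the robot's start
# point stands in for the planner.
import math

def plan_route(start, areas):
    """Greedy nearest-neighbour visit order covering all target areas."""
    remaining = dict(areas)
    pos, route = start, []
    while remaining:
        name = min(remaining, key=lambda a: math.dist(pos, remaining[a]))
        route.append(name)
        pos = remaining.pop(name)
    return route

areas = {"bedroom": (1, 0), "living_room": (3, 0), "kitchen": (3, 3), "bathroom": (0, 4)}
print(plan_route((0, 0), areas))
```

Because every target area is consumed from `remaining`, the route is guaranteed to pass through all of them, matching the coverage requirement in the text.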
  • determining the inspection path includes: receiving a setting instruction; and determining the inspection path according to the setting instruction.
  • the setting instruction is the instruction for the user to set the inspection path of the inspection robot. After receiving the setting instruction, the inspection robot establishes the corresponding inspection path according to the user's setting instruction, and performs the inspection operation according to the inspection path.
• the setting instruction may set a complete inspection path at one time, or remotely control the inspection robot in real time. After the inspection robot completes an inspection, it saves the path of that inspection; in subsequent inspection work, it can directly call the corresponding path, thereby reducing the computation cost.
  • the inspection robot is a part of the smart home system and is the main body of home abnormality detection. It is interconnected with various smart home appliances and can communicate wirelessly with the mobile phone; the inspection robot can be remotely controlled on the mobile APP (Application, application program), and the home environment can be viewed in real time on the mobile phone.
  • the inspection robot has a voice interaction function.
  • the inspection robot can set two inspection modes.
  • the first mode is automatic inspection, which can be automatically inspected according to a certain trajectory, or it can be inspected according to a user-defined method.
  • the second is manual remote control inspection.
• the inspection robot stores the face identity information of the family members, and the user needs to register on the mobile APP and enter the family members' face identity information; the inspection robot can control various home devices and check their operating status; the inspection robot is embedded with various related sensors to compensate for the blind spots of the machine-vision method.
• Fig. 2 shows a schematic diagram of the working logic of the inspection robot according to an embodiment of the present application.
• the working system of the inspection robot includes a user terminal, the inspection robot body, and a server. The inspection robot carries a visual sensor and non-visual auxiliary sensors: the visual sensor includes a camera, which obtains real-time scene images of the home, while the non-visual sensors include a smoke sensor, a temperature sensor, a rain sensor, etc. The state of the home is detected and analyzed through the visual sensor and the non-visual auxiliary sensors.
  • the user terminal is the user's mobile phone APP.
  • the user establishes an inspection task through the mobile phone APP, and combines with the home exception database to designate the target inspection area or target home equipment.
  • the system automatically plans the inspection route according to the user's designation and sends it to the inspection robot.
  • the inspection robot body includes a main control system.
  • the main control system can control the exception processing module and the home control module.
  • the exception processing module can detect and identify home abnormalities.
  • the home control module can send control instructions to home devices, thereby controlling the home devices to handle abnormalities.
  • Data command interaction is performed between the server and the inspection robot.
• when the inspection robot detects an abnormal situation, it sends the corresponding alarm information to the server, and the server sends an abnormality alarm to the user terminal to prompt the user.
• Step 1: the user can import the pre-stored 3D semantic map of the indoor home scene on the mobile-phone APP and select the inspection mode.
  • the first optional inspection mode is automatic inspection based on the semantic map;
  • the second optional inspection mode is to realize remote manual inspection according to user needs.
• for remote-controlled inspection, the user only needs to control the robot in real time on the mobile-phone APP and manually check whether there is any abnormal information.
• Step 2: for the automatic inspection mode based on the semantic map, the user first selects the device or area to be inspected on the mobile-phone APP, and the inspection robot automatically plans the inspection route according to its current location and starts the inspection along the user-defined route; the system automatically saves the inspection-task route, and the user can select it with one click next time.
• Step 3: after the preparations of Step 1 and Step 2 are complete, the process enters the home anomaly detection stage.
  • each frame in the home abnormality detection process is obtained in real time through the visual sensor of the inspection robot.
• after the inspection robot arrives at the designated area, it detects the home devices or areas that may be abnormal. After detecting a home device of interest, the robot adjusts its pose according to the target position, so that the visual sensor remains aligned with the device or area to be detected.
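The pose adjustment described above reduces to a small geometric computation; the sketch below, with illustrative names and radian units, returns the turn angle that points the camera at the target:

```python
# Sketch of the pose adjustment: the robot turns so its camera heading
# points at the device under inspection. Geometry only; angles in radians.
import math

def heading_correction(robot_xy, robot_heading, target_xy):
    """Signed turn angle that aligns the camera with the target, in (-pi, pi]."""
    desired = math.atan2(target_xy[1] - robot_xy[1], target_xy[0] - robot_xy[0])
    turn = desired - robot_heading
    return math.atan2(math.sin(turn), math.cos(turn))  # normalize the angle

# Robot at the origin facing +x; the device sits straight "above" it on +y.
turn = heading_correction((0.0, 0.0), 0.0, (0.0, 2.0))
print(round(turn, 4))  # a quarter turn to the left
```

Re-running this correction as the robot moves is what keeps the sensor "always aligned" with the target, as the text requires.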
• Step 4: when the inspection robot detects a home abnormality, the system further analyzes it.
  • the home abnormality may include the following situations: the refrigerator is not turned off, the light is not turned off, the window is open, a fire occurs, a stranger is detected, and the like.
• when the inspection robot detects a home abnormality, it transmits the abnormality information to the abnormality processing module inside the robot for further analysis and processing.
  • the main control system of the robot will not only upload the abnormal information records to the cloud, but also send the abnormal information to the client.
  • the user can take corresponding measures according to the abnormal information, and the robot will also solve the abnormal problem by itself through the smart home control module according to the abnormal information.
• for an open window, the weather information is first obtained through the weather sensor. If the weather is rainy or windy, the auxiliary system determines that the home state is abnormal, and the system reports the abnormality information to the user; at the same time, the smart home control system is used to close the smart window.
• for fire detection, all sensors share the same priority; that is, no matter which sensor detects the flame, the user should be alerted immediately.
• when a stranger is detected at any location, the inspection robot tracks the stranger, synchronizes the video to the cloud for backup, and immediately prompts the user.
• when it is detected that a light is not turned off, the inspection robot sends a turn-off command to the smart home control module, and the smart home control module turns off the light after receiving the command.
  • clothes falling on the ground is regarded as an abnormality in the home.
  • Users can input images of abnormal scenes (clothes falling on the ground) for training, so that this scene is also added to the abnormal database to support the detection of this scene.
• the embodiment of the present application uses machine vision plus other hardware sensors to realize the detection of home abnormalities. It not only draws on the background knowledge of the home scene, but also combines the experience and knowledge of other sensors to cover the blind spots of the visual method, effectively improving the abnormality-detection capability of the robot.
• the robot not only has the ability to detect abnormalities, but also has the ability to resolve them, which guarantees home safety to a certain extent. Multiple anomaly detections are integrated inside the robot, different anomaly categories are identified according to different home abnormalities, and the home anomaly database can be updated and expanded as required. Using the inspection robot to realize automatic multi-point inspection is flexible and convenient, offers strong scalability, and is suitable for abnormality detection in most home environments.
  • FIG. 3 shows a structural block diagram of the home inspection device according to the embodiment of the application.
• the home inspection device 300 includes:
• a determination module 302, configured to determine the inspection path;
• a photographing module 304, configured to capture home image data along the inspection path;
• the determination module 302 is further configured to determine the home inspection event according to the home image data and, when the home inspection event includes an abnormal event, to determine the processing program corresponding to the abnormal event;
  • the execution module 306 is configured to execute the processing program.
  • the home space or home equipment can be inspected by an inspection robot, or by arranging a plurality of electrically connected fixed cameras.
  • the inspection robot includes a walking system and a machine vision system.
  • the walking system can include walking wheels, walking tracks, etc., and relies on navigation radar, vision system, infrared sensor and other equipment to realize automatic inspection movement along the inspection path.
  • the machine vision system can detect the environment images around the robot in real time.
  • the machine vision system can be a camera, which can be a panoramic camera or a wide-angle camera installed on a turntable, so as to take a comprehensive shot of the area the robot passes through and obtain home image data on the inspection path.
  • the main control system of the inspection robot is equipped with an algorithm processor and a corresponding image recognition algorithm model.
  • the image algorithm model is used to perform artificial intelligence recognition on the captured home image data, thereby identifying home inspection events in various areas of the inspection path, and judging whether there are abnormal events in these home inspection events.
  • the inspection path passes through various rooms in the user's home, including bedrooms, living rooms, bathrooms, and kitchens.
• when the inspection robot walks along the inspection path to the bedroom, it captures image information in the bedroom, and based on that image information, combined with machine vision algorithms and big-data artificial-intelligence models, it extracts the information of key points of interest in the bedroom, such as the opening and closing status of the bedroom windows and the on/off status of the bedroom lights.
• similarly, when the robot enters the bathroom, living room, or kitchen, it extracts the key point-of-interest information there, such as the opening and closing status of the bathroom faucet, the on/off status of the stove in the kitchen, or the opening and closing status of the refrigerator door in the living room.
  • the control system of the inspection robot judges whether any abnormal events are included in these home inspection events according to the built-in judgment logic or user-defined judgment logic.
  • the door of the refrigerator in the living room should be closed. If it is judged that the door of the refrigerator in the living room is open, it will be recorded as an abnormal event.
  • the inspection robot searches for the corresponding processing program, and executes the corresponding processing program to eliminate and resolve these abnormal events.
  • the inspection robot is connected to the Internet of Things of the smart home. Taking the inspection robot detecting the abnormal event of "turning on the bedroom lighting during the day" as an example, the inspection robot can control the bedroom lighting to turn off through the Internet of Things by executing the corresponding processing program, so as to realize the processing of the abnormal event.
  • the inspection robot detects home inspection events based on visual sensors and image processing technology, and does not rely on a single sensor. Therefore, it can accurately detect different types of events in a wide range, and has good versatility. At the same time, by executing the corresponding processing program, the detected abnormal events are processed and eliminated, thereby having the ability to handle abnormalities, and realizing flexible and convenient home environment detection and abnormal processing.
  • a non-volatile readable storage medium on which programs or instructions are stored.
  • the steps of the home inspection method provided in at least one of the above embodiments are implemented. Therefore, the non-volatile readable storage medium also includes all the beneficial effects of the home inspection method provided in at least one of the above embodiments. To avoid repetition, details are not repeated here.
  • an inspection robot including the home inspection device provided in at least one of the above embodiments, and/or the non-volatile readable storage medium provided in at least one of the above embodiments. The inspection robot therefore also includes all the beneficial effects of that home inspection device and/or storage medium; to avoid repetition, details are not repeated here.
  • a computer device including: a memory for storing programs or instructions; a processor for implementing the steps of the home inspection method provided in at least one of the above embodiments when executing the programs or instructions. Therefore, the computer device also includes all the beneficial effects of the home inspection method provided in at least one of the above embodiments. To avoid repetition, details are not repeated here.
  • the computer device includes an inspection robot.
  • a computer program product is provided.
  • the steps of the home inspection method provided in at least one of the above embodiments are implemented. Therefore, the computer program also includes all the beneficial effects of the home inspection method provided in at least one of the above embodiments. To avoid repetition, details are not repeated here.
  • the term “plurality” refers to two or more. Unless otherwise clearly defined, the orientation or positional relationship indicated by terms such as “upper” and “lower” is based on the orientation or positional relationship shown in the drawings, is only for the convenience of describing the application and simplifying the description, and does not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; it therefore cannot be construed as limiting the application. Terms such as “connected”, “installed”, and “fixed” should be understood broadly; for example, “connected” may be a fixed connection, a detachable connection, or an integral connection, and may be a direct connection or an indirect connection through an intermediary. Those of ordinary skill in the art can understand the specific meanings of the above terms in this application according to the specific situation.
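The built-in judgment logic and processing-program dispatch described in the bullets above can be sketched as a small rule table: each observed home-inspection event maps to a context predicate deciding whether it is abnormal, and each abnormal event maps to a processing program (here modeled as an IoT command string). This is an illustrative sketch under assumed event names and commands, not the patent's implementation.

```python
# Rules: event name -> predicate over the current context (abnormal or not).
RULES = {
    "bedroom_light_on": lambda ctx: ctx.get("daytime", False),  # abnormal only in daytime
    "fridge_door_open": lambda ctx: True,                       # fridge should always be closed
    "window_open": lambda ctx: ctx.get("raining", False),       # abnormal only when raining
}

# Handlers: abnormal event -> processing program (an IoT-style command).
HANDLERS = {
    "bedroom_light_on": lambda: "iot: turn_off(bedroom_light)",
    "fridge_door_open": lambda: "iot: close(fridge_door)",
    "window_open": lambda: "iot: close(window)",
}

def inspect(events, context):
    """Return (abnormal_events, executed_commands) for one inspection pass."""
    abnormal = [e for e in events if RULES.get(e, lambda c: False)(context)]
    commands = [HANDLERS[e]() for e in abnormal if e in HANDLERS]
    return abnormal, commands
```

A pass in daytime with no rain would flag a lit bedroom light but leave an open window as a mere notification candidate.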

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Alarm Systems (AREA)

Abstract

A home inspection method, a non-volatile readable storage medium, and a computer device. The home inspection method includes: determining an inspection path (102); capturing corresponding home image data along the inspection path (104); determining a home inspection event according to the home image data (106); when the home inspection event includes an abnormal event, determining a processing program corresponding to the abnormal event (108); and executing the processing program (110). Home inspection events are detected on the basis of a visual sensor and image processing technology rather than a single sensor, so different event types over a wide range can be accurately detected, giving good versatility. Meanwhile, by executing the corresponding processing program, detected abnormal events are handled and eliminated, providing the capability to handle abnormalities and achieving flexible and convenient home environment detection and abnormality handling.

Description

家居巡检方法、非易失性可读存储介质和计算机设备
本申请要求于2022年01月24日提交中国专利局、申请号为“202210081422.0”、申请名称为“家居巡检方法、非易失性可读存储介质和计算机设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及家居巡检技术领域,具体而言,涉及一种家居巡检方法、非易失性可读存储介质和计算机设备。
背景技术
在相关技术中,家居异常检测依赖于固定设置的单个传感器,如烟雾传感器,而一种传感器只能检测固定范围内的特定事件,泛用性差。
发明内容
本申请旨在至少解决现有技术或相关技术中存在的技术问题之一。
为此,本申请的第一方面提出一种家居巡检方法。
本申请的第二方面提出一种家居巡检装置。
本申请的第三方面提出一种非易失性可读存储介质。
本申请的第四方面提出一种计算机设备。
本申请的第五方面提出一种计算机程序产品。
有鉴于此,本申请的第一方面提供了一种家居巡检方法,包括:确定巡检路径;沿巡检路径拍摄家居图像数据;根据家居图像数据,确定家居巡检事件;在家居巡检事件包括异常事件的情况下,确定异常事件对应的处理程序;执行处理程序。
在该技术方案中，可以通过巡检机器人，对家居空间或家居设备进行巡检，也可以通过设置电连接的多个固定摄像头，来进行巡检。以通过巡检机器人来进行家居巡检为例，该巡检机器人包括走行系统和机器视觉系统，其中，走行系统可以包括走行轮、走行履带等，并依靠导航雷达、视觉系统、红外线传感器等设备，实现沿巡检路径自动进行巡检运动。机器视觉系统能够实时检测机器人四周的环境图像，举例来说，机器视觉系统可以是摄像头，该摄像头可以是全景摄像头，也可以是设置在转台上的广角摄像头，从而对机器人经过区域进行全面拍摄，得到巡检路径上的家居图像数据。
同时,巡检机器人的主控系统中设置有算法处理器,和对应的图像识别算法模型,在拍摄得到家居图像数据后,通过图像算法模型,对拍摄得到的家居图像数据进行人工智能识别,从而识别出巡检路径途径的各个区域中的家居巡检事件,并判断这些家居巡检事件中,是否存在异常事件。
举例来说,巡检路径途径用户家中的各个房间,包括卧室、客厅、卫生间和厨房。当巡检机器人沿巡检路径走行到卧室后,拍摄卧室内的图像信息,并根据该图像信息,结合机器视觉算法和大数据人工智能模型,对卧室内各关键兴趣点的信息进行提取,如卧室窗子的开闭状态,卧室灯光的开关状态等。
同理,当机器人进入卫生间、客厅和厨房等空间后,对卫生间、客厅或厨房内的关键兴趣点信息进行提取,如卫生间水龙头的开闭状态,如厨房内灶台的开关状态,如客厅内冰箱门体的开合状态等。
在识别到这些家居巡检事件后,巡检机器人的控制系统根据内置的判断逻辑,或用户自定义的判断逻辑,判断这些家居巡检事件中,是否包含了异常事件。
举例来说,在白天,日照充足,不需要开启卧室照明灯,此时满足条件“白天”、“卧室照明灯关闭”,则代表卧室灯光状态正常。而如果满足条件“白天”、“卧室内照明灯开启”,则记录为异常事件。
再次举例来说,客厅冰箱门应当处于关闭状态,如果判断客厅的冰箱门处于开启状态,则记录为异常事件。
在确定家居巡检事件中,包含有异常事件时,根据异常事件的对象,和异常事件的类型,巡检机器人查找与之相对应的处理程序,并通过执行对应的处理程序的方式,对这些异常事件进行排除和解决。
举例来说,巡检机器人接入到智能家居的物联网中,以巡检机器人检测到“白天卧室照明灯开启”的异常事件为例,巡检机器人可以通过执行对应的处理程序,来通过物联网控制卧室照明灯关闭,从而实现对异常事件的处理。
本申请实施例中,巡检机器人基于视觉传感器和图像处理技术,来对家居巡检事件进行检测,不依赖于单一的传感器,因此能够针对大范围内的不同事件类型进行准确检测,具有良好的泛用性。同时,通过执行对应的处理程序,来对检测到的异常事件进行处理和排除,从而具有了处理异常的能力,实现了灵活方便的家居环境检测和异常处理。
另外,本申请提供的上述技术方案中的家居巡检方法还可以具有如下附加技术特征:
在上述技术方案中,根据家居图像数据,确定家居巡检事件,包括:对家居图像数据进行图像识别,确定家居图像数据中包括的图像信息;根据图像信息确定家居巡检事件。
在该技术方案中,巡检机器人在按照巡检路径进行巡检工作的过程中,通过图像传感器实时拍摄巡检路径和途径区域内的家居图像数据,具体地,该家居图像数据以视频的形式进行获取和存储。在得到这些家居图像数据后,对家居图像数据,也即拍摄的视频的每一帧图像均进行获取,并进行图像识别,其中,可以针对每一帧图像进行图像识别,也可以对帧图像进行分类后,对同类帧进行整合处理,再进行图像识别,从而降低计算压力。
通过图像识别算法,识别出家居图像数据的每一帧图像中,具体包含的图像信息,举例来说,图像信息可以是家居的图像信息,如窗户的图像、照明灯的图像、冰箱等家电的图像,也可以是人物图像,如用户图像、陌生人图像等。
通过这些图像信息,对当前巡检路径经过的家居环境中的家居巡检事件进行确定,如卧室窗子的开闭状态,如卧室灯光的开关状态、如卫生间水龙头的开闭状态,如厨房内灶台的开关状态,如客厅内冰箱门体的开合状态等。
通过对这些家居巡检事件进行进一步的条件判断，从而确定出其中是否包含有异常事件，并对检测到的异常事件进行处理和排除，实现了灵活方便的家居环境检测和异常处理。
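The frame classification-and-integration idea above (merging similar frames before recognition to reduce computing load) can be sketched as follows. Frames are abstracted as flat pixel lists and the difference threshold is an assumed parameter, so this is only an illustration of the idea.

```python
def dedupe_frames(frames, diff_threshold=10.0):
    """Keep one representative frame per run of near-identical frames,
    so image recognition runs on far fewer frames.

    Difference between frames = mean absolute per-pixel delta.
    """
    kept = []
    for frame in frames:
        if not kept:
            kept.append(frame)
            continue
        prev = kept[-1]
        diff = sum(abs(a - b) for a, b in zip(frame, prev)) / len(frame)
        if diff > diff_threshold:  # scene changed enough: keep this frame
            kept.append(frame)
    return kept
```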
在上述至少一个技术方案中,巡检机器人包括传感器组件,用于获取环境信息;根据图像信息确定家居巡检事件,包括:根据图像信息和环境信息,确定家居巡检事件。
在该技术方案中,巡检机器人还设置有传感器组件,在一些实施方式中,传感器组件就设置在巡检机器人的本体上,在另一些实施方式中,传感器组件还可以分布设置在家居环境中的固定位置上,并通过无线数据网络与巡检机器人本体之间进行数据指令交互。
通过这些传感器组件,能够获取对应的环境信息,具体地,环境信息可以包括氧含量信息、烟雾含量信息、温度信息、雨量信息、光线信息等,通过结合图像信息与这些环境信息,能够对家居环境内的家居巡检事件类型进行更加准确地判断。
举例来说,当巡检机器人通过视觉传感器和图像处理算法,检测到房间窗户处于开启状态时,巡检机器人进一步获取雨量传感器采集到的雨量信息,如果雨量信息显示当前处于下雨状态,则将“窗户开启”视为异常事件,此时通过物联网系统控制窗户自动关闭,如果雨量信息显示当前没有处于下雨状态,则“窗户开启”不被视为异常事件,巡检机器人仅需提示用户窗户未关闭即可,无需执行关窗操作。
通过结合传感器检测到的环境信息,和巡检机器人通过视觉传感器检测到的图像信息,能够对家居巡检事件和异常事件进行更加准确的检测和判断,提高巡检效率。
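The window/rain example above combines the visual detection with rain-sensor data to decide between actuating and merely notifying. A minimal sketch of that decision, with assumed action names:

```python
def handle_open_window(raining: bool, window_is_smart: bool) -> str:
    """Fuse the visual detection ("window open") with rain-sensor data
    and return the action to take."""
    if not raining:
        return "notify_user"           # not abnormal: just remind the user
    if window_is_smart:
        return "close_window_via_iot"  # abnormal: actuate through the IoT
    return "urgent_alert"              # abnormal but cannot actuate: escalate
```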
在上述至少一个技术方案中,巡检路径经过至少一个目标区域,家居图像数据包括第一数据和/或第二数据,其中,第一数据为巡检路径的数据,第二数据为目标区域的数据。
在该技术方案中,巡检路径是用户规划的路径,也可以是系统自动生成的路径。其中,巡检路径是巡检机器人的走行路径,巡检机器人在按照巡检路径进行巡检工作的过程中,巡检机器人会经过待巡检区域内的各个子区域,如用户家庭内的各个房间,包括卧室、客厅、厨房、卫生间等。
其中的每一个子区域,均可视为一个目标区域。在规划巡检路径时,将以这些目标区域作为节点,来对整体巡检路径进行规划,从而使巡检机器人的巡检路径至少途径一个上述目标区域,如巡检机器人依次对卧室、客厅、厨房、卫生间进行巡检,则巡检路径先经过卧室、再进入客厅、再进入厨房,最后到达卫生间。
在巡检过程中,巡检机器人所拍摄并记录的家居图像数据,按照所处区域的不同,具体包括第一数据和第二数据。其中,第一数据具体指的是巡检机器人在沿巡检路径走行过程中,实时拍摄的全部视频数据。而第二数据则是巡检机器人在到达一个目标区域内之后,在目标区域内拍摄得到的视频数据。
具体地,对于不同巡检需求,巡检机器人拍摄视频数据的方式可能不同。其中,对于一些实施方式,巡检机器人在巡检的全程均拍摄家居图像数据,也即无论是目标区域内的家居图像数据,还是到达目标区域之前的路径上的家居图像数据均进行采集,即得到第一数据,从而提高巡检范围。
对于另一些实施方式,可以控制巡检机器人仅拍摄目标区域内的家居图像数据,而对于在到达目标区域之前的路径上,则不拍摄家居图像数据,从而得到第二数据,由于第二数据仅包括目标区域内的家居图像数据,因此能够降低数据存储压力和数据处理压力,从而提高性能。
在上述至少一个技术方案中,图像信息包括人物信息,异常事件包括第一事件,第一事件与第一数据和/或第二数据相对应;根据图像信息确定家居巡检事件,包括:根据人物信息确定对应的身份信息;在身份信息与预设身份不匹配的情况下,确定家居巡检事件包括第一事件;确定异常事件对应的处理程序,包括:生成第一事件对应的第一报警信息,将第一报警信息发送至第一终端。
在该技术方案中,巡检机器人通过拍摄得到的家居图像数据,对家居环境中的人物信息进行识别,从而确定家居环境中的人物身份。具体地,巡检机器人可以通过预先采集家庭成员在各个角度下的人物图像,从而学习家庭成员的“外貌”,从而识得家庭成员的身份信息,并形成为预设身份。
在巡检过程中,巡检机器人通过拍摄得到的家居图像数据,对其中可能包括的人物图像进行针对性的提取,并基于提取出的人物图像,获取其身份信息。如果确定身份信息属于预设身份信息的范畴,即在预设信息内,存在一个身份信息与当前身份信息相匹配,则可以确定拍摄到的人物是家庭成员,此时记录正常的家居巡检事件。
而如果当前身份信息与全部的预设身份均不匹配,则说明居家环境中存在陌生人,此时判断为异常事件,具体为第一事件。当检测到该第一事件时,巡检机器人生成第一报警信息,第一报警信息用于警告用户,家中存在陌生人。
其中，巡检机器人可以将该第一报警信息发送到用户手机，即第一终端上，从而对用户进行有效的警示。
能够理解的是,当巡检机器人检测到第一事件,也即家中存在陌生人时,巡检机器人可以实时调整当前的巡检路线,从而使巡检机器人跟随人物信息对应的人物,也即陌生人行走,并实时拍摄人物信息对应的人物的人物图像,实现对陌生人的实时监控。
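The identity-matching step described above (compare the detected person against enrolled family members, and treat an unmatched identity as the first event) can be sketched with cosine similarity over face embeddings. The embedding representation and the threshold are assumptions for illustration.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def identify(face_embedding, preset_identities, threshold=0.8):
    """Match a detected face embedding against enrolled family members.
    Returns the matched name, or None for a stranger (the 'first event')."""
    best_name, best_score = None, threshold
    for name, reference in preset_identities.items():
        score = cosine_similarity(face_embedding, reference)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

A `None` result would trigger the first alarm information to the first terminal (the user's phone) and, optionally, the follow-and-record behavior described above.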
在上述至少一个技术方案中,异常事件包括第二事件,第二事件为目标家居设备的状态异常事件,且第二事件与第二数据相对应;确定异常事件对应的处理程序,包括:控制巡检机器人生成控制指令,其中,控制指令用于指示目标家居设备处理第二事件;将控制指令发送至目标家居设备。
在该技术方案中,在巡检机器人的巡检过程中,当巡检机器人进入目标区域后,会对目标区域内的家电设备或家居设备进行有针对性的巡检,其中,这些待巡检的家电设备或家居设备,即上述目标家居设备。
具体地,目标家居设备可以是智能家电设备,或智能家居设备,巡检机器人与这些目标家居设备连接在同一个物联网中。当巡检机器人在目标区域内发现目标家居设备处于异常状态时,判断当前的家居巡检事件为异常事件,具体为第二事件。
举例来说,第二事件可以是“白天卧室照明灯开启”、“冰箱门未关”等。当确定存在第二事件时,巡检机器人根据第二事件对应的目标家居设备,和第二事件的事件类型,生成对应的控制指令。
以上述举例中的“白天卧室照明灯开启”为例,如果巡检机器人发现白天卧室的照明灯未关,则可以生成“关闭卧室灯”的控制指令,并通过智能网关将该指令发送至卧室内的照明灯的智能开关,或通过云服务器的方式,远程控制卧室内的照明灯的智能开关,从而将对应的照明灯关闭,来解决该第二事件。
同理,当巡检机器人发现冰箱门未关,则生成“关闭冰箱门”的控制指令,将该指令发送至冰箱的控制器,从而通过对应设置的马达来关闭冰箱门,从而处理第二事件。
本申请实施例通过基于物联网的控制指令,来控制家居设备对异常事件进行排除,实现了灵活方便的家居环境检测和异常处理。
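Generating the control instruction for a second event, as described above, amounts to looking up the event type and target home device and choosing a delivery route (smart gateway, or a cloud server for remote control). The table entries and field names below are illustrative assumptions.

```python
def build_control_instruction(event_type, device, gateway_reachable=True):
    """Map a second-event type on a target home device to a control
    instruction that the IoT can deliver."""
    actions = {
        ("light_on_daytime", "bedroom_light"): "turn_off",
        ("door_open", "fridge"): "close_door",
    }
    action = actions.get((event_type, device))
    if action is None:
        raise KeyError(f"no processing program for {event_type} on {device}")
    route = "smart_gateway" if gateway_reachable else "cloud_server"
    return {"device": device, "action": action, "route": route}
```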
在该技术方案中,异常事件包括第三事件,第三事件与第一数据和/或第二数据相对应;确定异常事件对应的处理程序,包括:生成第三事件对应的第二报警信息,将第二报警信息发送至第二终端。
在该技术方案中,异常事件还包括第三事件,具体地,第三事件包括如火情、水淹等严重情况,当巡检机器人通过图像处理,发现了如火焰、烟雾、积水等情况,或通过温度传感器、烟雾传感器或积水传感器检测到对应的环境信息时,巡检机器人立即生成第二报警信息,并发送至对应的第二终端,其中,第二终端包括用户持有的手机,也包括物业终端或火警终端,从而保证这类紧急事件能够在第一时间内被处理,提高巡检的时效性和可靠性。
在上述至少一个技术方案中,第三事件包括N个事件类别,N为正整数;生成第三事件对应的第二报警信息,包括:获取场景信息;根据场景信息,确定第三事件对应的N个事件类别的优先级顺序;根据N个事件类别和优先级顺序,生成第二报警信息,其中,第二报警信息包括N个报警内容,N个事件类别与N个报警内容一一对应。
在该技术方案中,第三事件还包括无法自动处理,或未设置自动处理规则的家居异常的情况,这些异常构成为第三事件的多种事件类型。举例来说,用户的窗户不是可自动开关的智能窗户,或用户的照明灯不支持遥控开关,则这两种场景分别为两种事件类型,即“未关窗”和“未关灯”。
一般情况下,针对不同的事件类型,则生成的第二报警信息中,对应包括不同的报警内容。如“未关窗”的事件类型对应于报警内容“请注意,卧室窗未关”,对于“未关灯”的事件类型对应于报警内容“请注意,卧室照明灯未关”。
同时,这些报警内容具有不同的优先度,优先度越高,则报警信息越靠前,用户收到的报警提示越激烈,如提示音越高亢或震动强度越大。而优先度较低的报警内容,可以仅提示文字,不进行提示音或振动的反馈。
针对不同的场景信息,可以对报警内容的优先度进行动态调整。举例来说,场景信息包括白天和晚上,在白天,“未关灯”的优先度高于“未关窗”,而在晚上,则“未关窗”的优先度高于“未关灯”。
通过针对不同的场景信息,对不同的事件类别的优先级顺序进行调整,从而生成不同顺序的报警内容,有利于提高家居巡检的针对性。
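The scene-dependent priority ordering of the N alarm contents, as in the day/night example above, can be sketched as follows; the category names and message texts are assumptions.

```python
# Priority ranking per scene, highest priority first.
PRIORITY_BY_SCENE = {
    "day": ["light_not_off", "window_not_closed"],
    "night": ["window_not_closed", "light_not_off"],
}

ALARM_TEXT = {
    "window_not_closed": "Please note: the bedroom window is not closed",
    "light_not_off": "Please note: the bedroom light is not off",
}

def build_second_alarm(event_categories, scene):
    """Order the N alarm contents by the scene's priority ranking."""
    ranking = PRIORITY_BY_SCENE[scene]
    ordered = sorted(event_categories, key=ranking.index)
    return [ALARM_TEXT[c] for c in ordered]
```

In the day scene the unclosed-light message leads; at night the unclosed-window message leads, matching the example in the text.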
在上述至少一个技术方案中,异常事件包括第四事件,第四事件为自定义事件,且第四事件与第一数据和/或第二数据相对应;
根据图像信息确定家居巡检事件,包括:根据图像信息和预设的图像识别模型,确定家居巡检事件;在根据家居图像数据,确定家居巡检事件之前,方法还包括:接收自定义图像数据集,其中,自定义图像数据集包括第四事件对应的图像;通过自定义图像数据集训练图像识别模型,以使图像识别模型能够根据图像信息识别出第四事件。
在该技术方案中,用户可以对异常事件进行自定义,从而能够对用户自定义的异常事件进行针对性的巡检。举例来说,用户可以将“衣服掉在地上”这一场景设置为异常事件,从而当家居巡检过程中,检测到衣服掉在地上时,向用户发出对应的提示,从而实现更加智能化的巡检。
具体地,在根据拍摄的图像信息确定对应的家居巡检事件时,可以通过预设的图像识别模型,对图像信息进行人工智能识别,从而识别出所拍摄的图像中包括的巡检事件。
在用户自定义异常事件，即第四事件时，可以手动设置第四事件的场景，以第四事件包括“衣服掉在地上”这一场景为例，用户可手动将衣服放在地上的不同位置，并拍摄不同角度的照片，从而形成为自定义图像数据集，该数据集中拍摄的图片越多，则异常巡检的识别准确率越高。
用户通过将该自定义图像数据集输入至上述图像识别模型,从而对图像识别模型进行针对性的训练,从而使训练后的图像识别模型能够识别出用户自定义的第四事件。具体地,当检测到实时拍摄的家居图像数据中,存在“衣服位于地板上”这一情形时,图像识别模型输出的巡检事件中就会包含对应的第四事件,并生成对应的报警内容,提示用户拾取地上的衣服,实现了智能化的家居巡检。
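Extending the recognizer with a user-defined fourth event can be abstracted as follows: the user supplies a labeled image set (here reduced to feature vectors) and the model learns a per-event centroid. A real system would fine-tune a neural image-recognition model; this nearest-centroid sketch only illustrates the train-then-recognize flow, and all names and thresholds are assumptions.

```python
class EventRecognizer:
    def __init__(self):
        self.centroids = {}  # event name -> mean feature vector

    def train_custom_event(self, event_name, feature_vectors):
        """Learn a user-defined event from its labeled example set."""
        n = len(feature_vectors)
        self.centroids[event_name] = [sum(dim) / n for dim in zip(*feature_vectors)]

    def recognize(self, feature_vector, threshold=1.0):
        """Return the closest known event within threshold, else None."""
        best, best_dist = None, threshold
        for name, centroid in self.centroids.items():
            dist = sum((x - y) ** 2 for x, y in zip(feature_vector, centroid)) ** 0.5
            if dist <= best_dist:
                best, best_dist = name, dist
        return best
```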
在上述至少一个技术方案中,确定巡检路径,包括:获取待巡检区域的地图数据;根据地图数据,确定巡检路径。
在该技术方案中,在建立巡检路径时,可以输入待巡检区域的语义地图数据,该语义地图数据包括了用户家庭的语义地图,其中指示有至少一个目标区域的区域范围和位置坐标。
根据该地图数据,建立室内家居场景的三维语义地图,并根据该三维语义地图,自动规划巡检路径,能够理解的是,该巡检路径能够途径室内家居场景中的全部目标区域,从而提高巡检的覆盖范围,实现高精度、高可靠性的室内巡检。
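Planning an inspection path through all target areas of the semantic map can be sketched as a greedy nearest-neighbor ordering over the areas' coordinates. A real planner would also route around walls and obstacles; room names and coordinates are illustrative.

```python
import math

def plan_inspection_path(start, rooms):
    """Order the target areas by repeatedly visiting the nearest
    unvisited one, so the path covers every area in the map.

    start: (x, y) of the robot; rooms: {name: (x, y)}.
    """
    remaining = dict(rooms)
    position, path = start, []
    while remaining:
        name = min(remaining, key=lambda r: math.dist(position, remaining[r]))
        path.append(name)
        position = remaining.pop(name)
    return path
```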
本申请的第二方面提供了一种家居巡检装置,包括:
确定模块,用于确定巡检路径;
拍摄模块,用于沿巡检路径拍摄对应的家居图像数据;
确定模块,还用于根据家居图像数据,确定家居巡检事件;在家居巡检事件包括异常事件的情况下,确定异常事件对应的处理程序;
执行模块,用于执行处理程序。
在该技术方案中，可以通过巡检机器人，对家居空间或家居设备进行巡检，也可以通过设置电连接的多个固定摄像头，来进行巡检。以通过巡检机器人来进行家居巡检为例，该巡检机器人包括走行系统和机器视觉系统，其中，走行系统可以包括走行轮、走行履带等，并依靠导航雷达、视觉系统、红外线传感器等设备，实现沿巡检路径自动进行巡检运动。机器视觉系统能够实时检测机器人四周的环境图像，举例来说，机器视觉系统可以是摄像头，该摄像头可以是全景摄像头，也可以是设置在转台上的广角摄像头，从而对机器人经过区域进行全面拍摄，得到巡检路径上的家居图像数据。
同时,巡检机器人的主控系统中设置有算法处理器,和对应的图像识别算法模型,在拍摄得到家居图像数据后,通过图像算法模型,对拍摄得到的家居图像数据进行人工智能识别,从而识别出巡检路径途径的各个区域中的家居巡检事件,并判断这些家居巡检事件中,是否存在异常事件。
举例来说,巡检路径途径用户家中的各个房间,包括卧室、客厅、卫生间和厨房。当巡检机器人沿巡检路径走行到卧室后,拍摄卧室内的图像信息,并根据该图像信息,结合机器视觉算法和大数据人工智能模型,对卧室内各关键兴趣点的信息进行提取,如卧室窗子的开闭状态,卧室灯光的开关状态等。
同理,当机器人进入卫生间、客厅和厨房等空间后,对卫生间、客厅或厨房内的关键兴趣点信息进行提取,如卫生间水龙头的开闭状态,如厨房内灶台的开关状态,如客厅内冰箱门体的开合状态等。
在识别到这些家居巡检事件后,巡检机器人的控制系统根据内置的判断逻辑,或用户自定义的判断逻辑,判断这些家居巡检事件中,是否包含了异常事件。
举例来说,在白天,日照充足,不需要开启卧室照明灯,此时满足条件“白天”、“卧室照明灯关闭”,则代表卧室灯光状态正常。而如果满足条件“白天”、“卧室内照明灯开启”,则记录为异常事件。
再次举例来说,客厅冰箱门应当处于关闭状态,如果判断客厅的冰箱门处于开启状态,则记录为异常事件。
在确定家居巡检事件中,包含有异常事件时,根据异常事件的对象,和异常事件的类型,巡检机器人查找与之相对应的处理程序,并通过执行对应的处理程序的方式,对这些异常事件进行排除和解决。
举例来说,巡检机器人接入到智能家居的物联网中,以巡检机器人检测到“白天卧室照明灯开启”的异常事件为例,巡检机器人可以通过执行对应的处理程序,来通过物联网控制卧室照明灯关闭,从而实现对异常事件的处理。
本申请实施例中,巡检机器人基于视觉传感器和图像处理技术,来对家居巡检事件进行检测,不依赖于单一的传感器,因此能够针对大范围内的不同事件类型进行准确检测,具有良好的泛用性。同时,通过执行对应的处理程序,来对检测到的异常事件进行处理和排除,从而具有了处理异常的能力,实现了灵活方便的家居环境检测和异常处理。
本申请的第三方面提供了一种非易失性可读存储介质,其上存储有程序或指令,该程序或指令被处理器执行时实现如上述至少一个技术方案中提供的家居巡检方法的步骤,因此,该非易失性可读存储介质也包括如上述至少一个技术方案中提供的家居巡检方法的全部有益效果,为避免重复,在此不再赘述。
本申请的第四方面提供了一种计算机设备,包括:存储器,用于存储程序或指令;处理器,用于执行程序或指令时实现如上述至少一个技术方案中提供的家居巡检方法的步骤,因此,该计算机设备也包括如上述至少一个技术方案中提供的家居巡检方法的全部有益效果,为避免重复,在此不再赘述。
本申请的第五方面提供了一种计算机程序产品，该计算机程序产品被存储在存储介质中，该计算机程序产品被至少一个处理器执行以实现如上述至少一个技术方案中提供的家居巡检方法的步骤，因此，该计算机程序产品也包括如上述至少一个技术方案中提供的家居巡检方法的全部有益效果，为避免重复，在此不再赘述。
附图说明
本申请的上述和/或附加的方面和优点从结合下面附图对实施例的描述中将变得明显和容易理解,其中:
图1示出了根据本申请实施例的家居巡检方法的流程图;
图2示出了根据本申请实施例的巡检机器人的工作逻辑示意图;
图3示出了根据本申请实施例的家居巡检装置的结构框图。
具体实施方式
为了能够更清楚地理解本申请的上述目的、特征和优点,下面结合附图和具体实施方式对本申请进行进一步的详细描述。需要说明的是,在不冲突的情况下,本申请的实施例及实施例中的特征可以相互组合。
在下面的描述中阐述了很多具体细节以便于充分理解本申请,但是,本申请还可以采用其他不同于在此描述的其他方式来实施,因此,本申请的保护范围并不受下面公开的具体实施例的限制。
下面参照图1至图3描述根据本申请一些实施例所述家居巡检方法、非易失性可读存储介质和计算机设备。
实施例一
在本申请的一些实施例中,提供了一种家居巡检方法,图1示出了根据本申请实施例的家居巡检方法的流程图,如图1所示,方法包括:
步骤102,确定巡检路径;
步骤104,沿巡检路径拍摄家居图像数据;
步骤106,根据家居图像数据,确定家居巡检事件;
步骤108,在家居巡检事件包括异常事件的情况下,确定异常事件对应的处理程序;
步骤110,执行处理程序。
在本申请实施例中,可以通过巡检机器人,对家居空间或家居设备进行巡检,也可以通过设置电连接的多个固定摄像头,来进行巡检。以通过巡检机器人来进行家居巡检为例,该巡检机器人包括走行系统和机器视觉系统,其中,走行系统可以包括走行轮、走行履带等,并依靠导航雷达、视觉系统、红外线传感器等设备,实现沿巡检路径自动进行巡检运动。机器视觉系统能够实时检测机器人四周的环境图像,举例来说,机器视觉系统可以是摄像头,该摄像头可以是全景摄像头,也可以是设置在转台上的广角摄像头,从而对机器人经过区域进行全面拍摄,得到巡检路径上的家居图像数据。
同时，巡检机器人的主控系统中设置有算法处理器，和对应的图像识别算法模型，在拍摄得到家居图像数据后，通过图像算法模型，对拍摄得到的家居图像数据进行人工智能识别，从而识别出巡检路径途径的各个区域中的家居巡检事件，并判断这些家居巡检事件中，是否存在异常事件。
举例来说,巡检路径途径用户家中的各个房间,包括卧室、客厅、卫生间和厨房。当巡检机器人沿巡检路径走行到卧室后,拍摄卧室内的图像信息,并根据该图像信息,结合机器视觉算法和大数据人工智能模型,对卧室内各关键兴趣点的信息进行提取,如卧室窗子的开闭状态,卧室灯光的开关状态等。
同理,当机器人进入卫生间、客厅和厨房等空间后,对卫生间、客厅或厨房内的关键兴趣点信息进行提取,如卫生间水龙头的开闭状态,如厨房内灶台的开关状态,如客厅内冰箱门体的开合状态等。
在识别到这些家居巡检事件后,巡检机器人的控制系统根据内置的判断逻辑,或用户自定义的判断逻辑,判断这些家居巡检事件中,是否包含了异常事件。
举例来说,在白天,日照充足,不需要开启卧室照明灯,此时满足条件“白天”、“卧室照明灯关闭”,则代表卧室灯光状态正常。而如果满足条件“白天”、“卧室内照明灯开启”,则记录为异常事件。
再次举例来说,客厅冰箱门应当处于关闭状态,如果判断客厅的冰箱门处于开启状态,则记录为异常事件。
在确定家居巡检事件中,包含有异常事件时,根据异常事件的对象,和异常事件的类型,巡检机器人查找与之相对应的处理程序,并通过执行对应的处理程序的方式,对这些异常事件进行排除和解决。
举例来说,巡检机器人接入到智能家居的物联网中,以巡检机器人检测到“白天卧室照明灯开启”的异常事件为例,巡检机器人可以通过执行对应的处理程序,来通过物联网控制卧室照明灯关闭,从而实现对异常事件的处理。
本申请实施例中,巡检机器人基于视觉传感器和图像处理技术,来对家居巡检事件进行检测,不依赖于单一的传感器,因此能够针对大范围内的不同事件类型进行准确检测,具有良好的泛用性。同时,通过执行对应的处理程序,来对检测到的异常事件进行处理和排除,从而具有了处理异常的能力,实现了灵活方便的家居环境检测和异常处理。
在本申请的一些实施例中,根据家居图像数据,确定家居巡检事件,包括:对家居图像数据进行图像识别,确定家居图像数据中包括的图像信息;根据图像信息确定家居巡检事件。
在本申请实施例中,巡检机器人在按照巡检路径进行巡检工作的过程中,通过图像传感器实时拍摄巡检路径和途径区域内的家居图像数据,具体地,该家居图像数据以视频的形式进行获取和存储。在得到这些家居图像数据后,对家居图像数据,也即拍摄的视频的每一帧图像均进行获取,并进行图像识别,其中,可以针对每一帧图像进行图像识别,也可以对帧图像进行分类后,对同类帧进行整合处理,再进行图像识别,从而降低计算压力。
通过图像识别算法,识别出家居图像数据的每一帧图像中,具体包含的图像信息,举例来说,图像信息可以是家居的图像信息,如窗户的图像、照明灯的图像、冰箱等家电的图像,也可以是人物图像,如用户图像、陌生人图像等。
通过这些图像信息,对当前巡检路径经过的家居环境中的家居巡检事件进行确定,如卧室窗子的开闭状态,如卧室灯光的开关状态、如卫生间水龙头的开闭状态,如厨房内灶台的开关状态,如客厅内冰箱门体的开合状态等。
通过对这些家居巡检事件进行进一步的条件判断,从而确定出其中是否包含有异常事件,并对检测到的异常事件进行处理和排除,实现了灵活方便的家居环境检测和异常处理。
在本申请的一些实施例中,巡检机器人包括传感器组件,用于获取环境信息;根据图像信息确定家居巡检事件,包括:根据图像信息和环境信息,确定家居巡检事件。
在本申请实施例中,巡检机器人还设置有传感器组件,在一些实施方式中,传感器组件就设置在巡检机器人的本体上,在另一些实施方式中,传感器组件还可以分布设置在家居环境中的固定位置上,并通过无线数据网络与巡检机器人本体之间进行数据指令交互。
通过这些传感器组件，能够获取对应的环境信息，具体地，环境信息可以包括氧含量信息、烟雾含量信息、温度信息、雨量信息、光线信息等，通过结合图像信息与这些环境信息，能够对家居环境内的家居巡检事件类型进行更加准确地判断。
举例来说,当巡检机器人通过视觉传感器和图像处理算法,检测到房间窗户处于开启状态时,巡检机器人进一步获取雨量传感器采集到的雨量信息,如果雨量信息显示当前处于下雨状态,则将“窗户开启”视为异常事件,此时通过物联网系统控制窗户自动关闭,如果雨量信息显示当前没有处于下雨状态,则“窗户开启”不被视为异常事件,巡检机器人仅需提示用户窗户未关闭即可,无需执行关窗操作。
通过结合传感器检测到的环境信息,和巡检机器人通过视觉传感器检测到的图像信息,能够对家居巡检事件和异常事件进行更加准确的检测和判断,提高巡检效率。
在本申请的一些实施例中,巡检路径经过至少一个目标区域,家居图像数据包括第一数据和/或第二数据,其中,第一数据为巡检路径的数据,第二数据为目标区域的数据。
在本申请实施例中,巡检路径是用户规划的路径,也可以是系统自动生成的路径。其中,巡检路径是巡检机器人的走行路径,巡检机器人在按照巡检路径进行巡检工作的过程中,巡检机器人会经过待巡检区域内的各个子区域,如用户家庭内的各个房间,包括卧室、客厅、厨房、卫生间等。
其中的每一个子区域,均可视为一个目标区域。在规划巡检路径时,将以这些目标区域作为节点,来对整体巡检路径进行规划,从而使巡检机器人的巡检路径至少途径一个上述目标区域,如巡检机器人依次对卧室、客厅、厨房、卫生间进行巡检,则巡检路径先经过卧室、再进入客厅、再进入厨房,最后到达卫生间。
在巡检过程中,巡检机器人所拍摄并记录的家居图像数据,按照所处区域的不同,具体包括第一数据和第二数据。其中,第一数据具体指的是巡检机器人在沿巡检路径走行过程中,实时拍摄的全部视频数据。而第二数据则是巡检机器人在到达一个目标区域内之后,在目标区域内拍摄得到的视频数据。
具体地,对于不同巡检需求,巡检机器人拍摄视频数据的方式可能不同。其中,对于一些实施方式,巡检机器人在巡检的全程均拍摄家居图像数据,也即无论是目标区域内的家居图像数据,还是到达目标区域之前的路径上的家居图像数据均进行采集,即得到第一数据,从而提高巡检范围。
对于另一些实施方式,可以控制巡检机器人仅拍摄目标区域内的家居图像数据,而对于在到达目标区域之前的路径上,则不拍摄家居图像数据,从而得到第二数据,由于第二数据仅包括目标区域内的家居图像数据,因此能够降低数据存储压力和数据处理压力,从而提高性能。
在本申请的一些实施例中,图像信息包括人物信息,异常事件包括第一事件,第一事件与第一数据和/或第二数据相对应;根据图像信息确定家居巡检事件,包括:根据人物信息确定对应的身份信息;在身份信息与预设身份不匹配的情况下,确定家居巡检事件包括第一事件;确定异常事件对应的处理程序,包括:生成第一事件对应的第一报警信息,将第一报警信息发送至第一终端。
在本申请实施例中,巡检机器人通过拍摄得到的家居图像数据,对家居环境中的人物信息进行识别,从而确定家居环境中的人物身份。具体地,巡检机器人可以通过预先采集家庭成员在各个角度下的人物图像,从而学习家庭成员的“外貌”,从而识得家庭成员的身份信息,并形成为预设身份。
在巡检过程中,巡检机器人通过拍摄得到的家居图像数据,对其中可能包括的人物图像进行针对性的提取,并基于提取出的人物图像,获取其身份信息。如果确定身份信息属于预设身份信息的范畴,即在预设信息内,存在一个身份信息与当前身份信息相匹配,则可以确定拍摄到的人物是家庭成员,此时记录正常的家居巡检事件。
而如果当前身份信息与全部的预设身份均不匹配,则说明居家环境中存在陌生人,此时判断为异常事件,具体为第一事件。当检测到该第一事件时,巡检机器人生成第一报警信息,第一报警信息用于警告用户,家中存在陌生人。
其中，巡检机器人可以将该第一报警信息发送到用户手机，即第一终端上，从而对用户进行有效的警示。
能够理解的是,当巡检机器人检测到第一事件,也即家中存在陌生人时,巡检机器人可以实时调整当前的巡检路线,从而使巡检机器人跟随人物信息对应的人物,也即陌生人行走,并实时拍摄人物信息对应的人物的人物图像,实现对陌生人的实时监控。
在本申请的一些实施例中,异常事件包括第二事件,第二事件为目标家居设备的状态异常事件,且第二事件与第二数据相对应;确定异常事件对应的处理程序,包括:控制巡检机器人生成控制指令,其中,控制指令用于指示目标家居设备处理第二事件;将控制指令发送至目标家居设备。
在本申请实施例中,在巡检机器人的巡检过程中,当巡检机器人进入目标区域后,会对目标区域内的家电设备或家居设备进行有针对性的巡检,其中,这些待巡检的家电设备或家居设备,即上述目标家居设备。
具体地,目标家居设备可以是智能家电设备,或智能家居设备,巡检机器人与这些目标家居设备连接在同一个物联网中。当巡检机器人在目标区域内发现目标家居设备处于异常状态时,判断当前的家居巡检事件为异常事件,具体为第二事件。
举例来说,第二事件可以是“白天卧室照明灯开启”、“冰箱门未关”等。当确定存在第二事件时,巡检机器人根据第二事件对应的目标家居设备,和第二事件的事件类型,生成对应的控制指令。
以上述举例中的“白天卧室照明灯开启”为例,如果巡检机器人发现白天卧室的照明灯未关,则可以生成“关闭卧室灯”的控制指令,并通过智能网关将该指令发送至卧室内的照明灯的智能开关,或通过云服务器的方式,远程控制卧室内的照明灯的智能开关,从而将对应的照明灯关闭,来解决该第二事件。
同理,当巡检机器人发现冰箱门未关,则生成“关闭冰箱门”的控制指令,将该指令发送至冰箱的控制器,从而通过对应设置的马达来关闭冰箱门,从而处理第二事件。
本申请实施例通过基于物联网的控制指令,来控制家居设备对异常事件进行排除,实现了灵活方便的家居环境检测和异常处理。
在本申请实施例中,异常事件包括第三事件,第三事件与第一数据和/或第二数据相对应;确定异常事件对应的处理程序,包括:生成第三事件对应的第二报警信息,将第二报警信息发送至第二终端。
在本申请实施例中,异常事件还包括第三事件,具体地,第三事件包括如火情、水淹等严重情况,当巡检机器人通过图像处理,发现了如火焰、烟雾、积水等情况,或通过温度传感器、烟雾传感器或积水传感器检测到对应的环境信息时,巡检机器人立即生成第二报警信息,并发送至对应的第二终端,其中,第二终端包括用户持有的手机,也包括物业终端或火警终端,从而保证这类紧急事件能够在第一时间内被处理,提高巡检的时效性和可靠性。
在本申请的一些实施例中,第三事件包括N个事件类别,N为正整数;生成第三事件对应的第二报警信息,包括:获取场景信息;根据场景信息,确定第三事件对应的N个事件类别的优先级顺序;根据N个事件类别和优先级顺序,生成第二报警信息,其中,第二报警信息包括N个报警内容,N个事件类别与N个报警内容一一对应。
在本申请实施例中,第三事件还包括无法自动处理,或未设置自动处理规则的家居异常的情况,这些异常构成为第三事件的多种事件类型。举例来说,用户的窗户不是可自动开关的智能窗户,或用户的照明灯不支持遥控开关,则这两种场景分别为两种事件类型,即“未关窗”和“未关灯”。
一般情况下,针对不同的事件类型,则生成的第二报警信息中,对应包括不同的报警内容。如“未关窗”的事件类型对应于报警内容“请注意,卧室窗未关”,对于“未关灯”的事件类型对应于报警内容“请注意,卧室照明灯未关”。
同时,这些报警内容具有不同的优先度,优先度越高,则报警信息越靠前,用户收到的报警提示越激烈,如提示音越高亢或震动强度越大。而优先度较低的报警内容,可以仅提示文字,不进行提示音或振动的反馈。
针对不同的场景信息,可以对报警内容的优先度进行动态调整。举例来说,场景信息包括白天和晚上,在白天,“未关灯”的优先度高于“未关窗”,而在晚上,则“未关窗”的优先度高于“未关灯”。
通过针对不同的场景信息,对不同的事件类别的优先级顺序进行调整,从而生成不同顺序的报警内容,有利于提高家居巡检的针对性。
在本申请的一些实施例中,异常事件包括第四事件,第四事件为自定义事件,且第四事件与第一数据和/或第二数据相对应;
根据图像信息确定家居巡检事件,包括:根据图像信息和预设的图像识别模型,确定家居巡检事件;在根据家居图像数据,确定家居巡检事件之前,方法还包括:接收自定义图像数据集,其中,自定义图像数据集包括第四事件对应的图像;通过自定义图像数据集训练图像识别模型,以使图像识别模型能够根据图像信息识别出第四事件。
在本申请实施例中,用户可以对异常事件进行自定义,从而能够对用户自定义的异常事件进行针对性的巡检。举例来说,用户可以将“衣服掉在地上”这一场景设置为异常事件,从而当家居巡检过程中,检测到衣服掉在地上时,向用户发出对应的提示,从而实现更加智能化的巡检。
具体地,在根据拍摄的图像信息确定对应的家居巡检事件时,可以通过预设的图像识别模型,对图像信息进行人工智能识别,从而识别出所拍摄的图像中包括的巡检事件。
在用户自定义异常事件,即第四事件时,可以手动设置第四事件的场景,以第四事件包括“衣服掉在地上”这一场景为例,用户可手动将衣服放在地上的不同位置,并拍摄不同角度的照片,从而形成为自定义图像数据集,该数据集中拍摄的图片越多,则异常巡检的识别准确率越高。
用户通过将该自定义图像数据集输入至上述图像识别模型,从而对图像识别模型进行针对性的训练,从而使训练后的图像识别模型能够识别出用户自定义的第四事件。具体地,当检测到实时拍摄的家居图像数据中,存在“衣服位于地板上”这一情形时,图像识别模型输出的巡检事件中就会包含对应的第四事件,并生成对应的报警内容,提示用户拾取地上的衣服,实现了智能化的家居巡检。
在本申请的一些实施例中,确定巡检路径,包括:获取待巡检区域的地图数据;根据地图数据,确定巡检路径。
在本申请实施例中，在建立巡检路径时，可以输入待巡检区域的语义地图数据，该语义地图数据包括了用户家庭的语义地图，其中指示有至少一个目标区域的区域范围和位置坐标。
根据该地图数据,建立室内家居场景的三维语义地图,并根据该三维语义地图,自动规划巡检路径,能够理解的是,该巡检路径能够途径室内家居场景中的全部目标区域,从而提高巡检的覆盖范围,实现高精度、高可靠性的室内巡检。
在本申请的一些实施例中,确定巡检路径,包括:接收设置指令;根据设置指令,确定巡检路径。
在本申请实施例中,设置指令,即用户对巡检机器人的巡检路径进行设置的指令,在接收到设置指令后,巡检机器人根据用户的设置指令,建立对应的巡检路径,并按照该巡检路径进行巡检作业。
其中,以通过巡检机器人进行家居巡检为例,设置指令可以是一次性对完整的巡检指令进行设置的指令,也可以是对巡检机器人进行实时遥控的指令。当巡检机器人完成一次巡检后,巡检机器人对本次巡检的路径进行保存,在后续巡检工作中,可以直接调用对应的路径,从而减少计算开销。
在本申请的一些实施例中,巡检机器人是智能家居系统中的一部分,是家居异常检测的主体,与各种智能家电互联,并且可与手机进行无线通信;可在手机APP(Application,应用程序)上远程遥控巡检机器人,并且可以在手机上实时查看家居环境,巡检机器人有语音交互功能。
其中,巡检机器人可以设定两种巡检模式,第一种模式就是自动巡检,按照一定的轨迹自动巡检,也可以按照用户自定义方法巡检,第二种就是人工远程操控巡检。
巡检机器人内存储了家人的人脸身份信息,用户需要在手机APP上注册,并进行家人人脸身份信息录入;巡检机器人可控制各种家居设备,可以查看各种家居设备的运行情况;巡检机器人内部嵌入各种相关的传感器,用以辅助机器视觉方法上的盲区。
图2示出了根据本申请实施例的巡检机器人的工作逻辑示意图，如图2所示，巡检机器人的工作系统，包括用户终端、巡检机器人本体和服务器，其中，巡检机器人包括视觉传感器和非视觉辅助传感器，其中，视觉传感器包括摄像头，能够获取家居实时场景图像，非视觉传感器包括烟雾传感器、温度传感器、雨量传感器等，通过视觉传感器和非视觉辅助传感器，来对家居状态进行检测和分析。
用户终端即用户手机APP,用户通过手机APP建立巡检任务,并结合家居异常数据库,来指定目标巡检区域或目标家居设备,系统根据用户指定自动规划巡检路线,并发送至巡检机器人。
巡检机器人本体包括主控系统,主控系统能够控制异常处理模块和家居控制模块,异常处理模块能够对家居异常进行检测和识别,家居控制模块则可以向家居设备发送控制指令,从而控制家居设备处理异常。
服务器与巡检机器人之间进行数据指令交互,当巡检机器人检测到异常情况时,向服务器发送对应的警报信息,服务器向用户终端发送异常警报,从而提示用户异常。
在巡检机器人工作的过程中,包括以下五个步骤:
步骤一,首先用户可以在用户手机APP终端上导入预先存储的室内家居场景三维语义地图,选择巡检模式,第一种可选择的巡检模式是根据语义地图自动巡检;第二种可选择的巡检模式是根据用户需求实现远程人工操控巡检。对于远程操控机器人巡检,用户只需在手机APP端直接对机器人进行实时远程遥控,人工检查是否有异常信息。
步骤二，对于第一种根据语义地图自动巡检的方法，首先用户在手机APP终端可以选择需要检测的设备或区域，巡检机器人根据当前位置自动规划巡检路线，根据用户自定义的巡检路线开始巡检；系统自动保存本次巡检任务路线，用户下次使用时可以一键选择。
步骤三,步骤一和步骤二准备就绪后,进入步骤三的家居异常检测阶段。首先通过巡检机器人视觉传感器实时获取家居异常检测过程中的每一帧,巡检机器人到达指定区域后,对可能发生异常的家居设备或区域进行检测,检测到感兴趣的家居设备后机器人会根据目标位置调整位姿,使得巡检机器人使用的视觉传感器能够始终对准待检测的设备或待测区域。
步骤四，当巡检机器人检测到家居异常时，系统对家居异常做进一步分析。其中，家居异常可以包括以下情况：冰箱未关、灯未关、窗户开、起火、检测到陌生人等。
具体地,当巡检机器人检测到家居异常时,将异常信息传输到机器人内部的异常处理模块中做进一步分析和处理。
机器人主控系统不仅会将异常信息记录上传至云端,而且会将异常信息发送至用户端,用户可根据异常信息做出相应的措施,机器人也会根据异常信息通过智能家居控制模块自行解决异常问题。
当检测到窗户未关时,首先通过天气传感器获取气象信息,若天气是下雨刮风等恶劣天气,则辅助系统判定为家居异常,系统将异常信息上报至用户,同时运用智能家居控制系统将智能窗户关闭。
当视觉传感器和烟雾传感器检测到灶台或者其他区域有火焰时,检测的优先级相同,即不管是哪个传感器检测到火焰,都应立即向用户报警。
当检测到任何位置有陌生人时,巡检机器人会对陌生人进行跟踪,全程将录像同步至云端备份,并立即提示用户。
当检测到灯未关时,由巡检机器人向智能家居控制模块发送关闭指令,智能家居控制模块接收到关闭指令后将灯关闭。
例如将衣服掉落在地上视为家居异常,用户可以通过自己输入异常场景(衣服掉在地上)的图像进行训练,使得这一场景也加入到异常数据库中,从而支持该场景的检测。
本申请实施例通过机器视觉外加其他硬件传感器的辅助,实现对家居异常的检测,不仅能够结合家居场景下的背景知识,也能够结合其他传感器的经验知识,辅助视觉方法上检测的盲区,有效的提高了机器人的异常检测能力。
且机器人具备异常检测能力,还具备解决异常问题的能力,一定程度上保障了家居安全。将多种异常的检测集成在机器人内部,根据不同的家居异常情况识别出不同的异常类别,还可以根据需求更新和扩充家居异常数据库。利用巡检机器人实现自动多点巡检,使用灵活方便,可扩展性强,适合绝大部分家居环境的异常检测。
实施例二
在本申请的一些实施例中,提供了一种家居巡检装置,图3示出了根据本申请实施例的家居巡检装置的结构框图,如图3所示,家居巡检装置300包括:
确定模块302,用于确定巡检路径;
拍摄模块304,用于沿巡检路径拍摄家居图像数据;
确定模块302,还用于根据家居图像数据,确定家居巡检事件;在家居巡检事件包括异常事件的情况下,确定异常事件对应的处理程序;
执行模块306,用于执行处理程序。
在本申请实施例中,可以通过巡检机器人,对家居空间或家居设备进行巡检,也可以通过设置电连接的多个固定摄像头,来进行巡检。以通过巡检机器人来进行家居巡检为例,该巡检机器人包括走行系统和机器视觉系统,其中,走行系统可以包括走行轮、走行履带等,并依靠导航雷达、视觉系统、红外线传感器等设备,实现沿巡检路径自动进行巡检运动。机器视觉系统能够实时检测机器人四周的环境图像,举例来说,机器视觉系统可以是摄像头,该摄像头可以是全景摄像头,也可以是设置在转台上的广角摄像头,从而对机器人经过区域进行全面拍摄,得到巡检路径上的家居图像数据。
同时,巡检机器人的主控系统中设置有算法处理器,和对应的图像识别算法模型,在拍摄得到家居图像数据后,通过图像算法模型,对拍摄得到的家居图像数据进行人工智能识别,从而识别出巡检路径途径的各个区域中的家居巡检事件,并判断这些家居巡检事件中,是否存在异常事件。
举例来说,巡检路径途径用户家中的各个房间,包括卧室、客厅、卫生间和厨房。当巡检机器人沿巡检路径走行到卧室后,拍摄卧室内的图像信息,并根据该图像信息,结合机器视觉算法和大数据人工智能模型,对卧室内各关键兴趣点的信息进行提取,如卧室窗子的开闭状态,卧室灯光的开关状态等。
同理,当机器人进入卫生间、客厅和厨房等空间后,对卫生间、客厅或厨房内的关键兴趣点信息进行提取,如卫生间水龙头的开闭状态,如厨房内灶台的开关状态,如客厅内冰箱门体的开合状态等。
在识别到这些家居巡检事件后,巡检机器人的控制系统根据内置的判断逻辑,或用户自定义的判断逻辑,判断这些家居巡检事件中,是否包含了异常事件。
举例来说,在白天,日照充足,不需要开启卧室照明灯,此时满足条件“白天”、“卧室照明灯关闭”,则代表卧室灯光状态正常。而如果满足条件“白天”、“卧室内照明灯开启”,则记录为异常事件。
再次举例来说,客厅冰箱门应当处于关闭状态,如果判断客厅的冰箱门处于开启状态,则记录为异常事件。
在确定家居巡检事件中,包含有异常事件时,根据异常事件的对象,和异常事件的类型,巡检机器人查找与之相对应的处理程序,并通过执行对应的处理程序的方式,对这些异常事件进行排除和解决。
举例来说,巡检机器人接入到智能家居的物联网中,以巡检机器人检测到“白天卧室照明灯开启”的异常事件为例,巡检机器人可以通过执行对应的处理程序,来通过物联网控制卧室照明灯关闭,从而实现对异常事件的处理。
本申请实施例中,巡检机器人基于视觉传感器和图像处理技术,来对家居巡检事件进行检测,不依赖于单一的传感器,因此能够针对大范围内的不同事件类型进行准确检测,具有良好的泛用性。同时,通过执行对应的处理程序,来对检测到的异常事件进行处理和排除,从而具有了处理异常的能力,实现了灵活方便的家居环境检测和异常处理。
实施例三
在本申请的一些实施例中,提供了一种非易失性可读存储介质,其上存储有程序或指令,该程序或指令被处理器执行时实现如上述至少一个实施例中提供的家居巡检方法的步骤,因此,该非易失性可读存储介质也包括如上述至少一个实施例中提供的家居巡检方法的全部有益效果,为避免重复,在此不再赘述。
实施例四
在本申请的一些实施例中，提供了一种巡检机器人，包括如上述至少一个实施例中提供的家居巡检装置，和/或如上述至少一个实施例中提供的非易失性可读存储介质，因此，该巡检机器人也包括如上述至少一个实施例中提供的家居巡检装置和/或如上述至少一个实施例中提供的非易失性可读存储介质的全部有益效果，为避免重复，在此不再赘述。
实施例五
在本申请的一些实施例中,提供了一种计算机设备,包括:存储器,用于存储程序或指令;处理器,用于执行程序或指令时实现如上述至少一个实施例中提供的家居巡检方法的步骤,因此,该计算机设备也包括如上述至少一个实施例中提供的家居巡检方法的全部有益效果,为避免重复,在此不再赘述。在一些实施方式中,计算机设备包括巡检机器人。
实施例六
在本申请的一些实施例中,提供了一种计算机程序产品,该计算机程序产品被处理器执行时实现如上述至少一个实施例中提供的家居巡检方法的步骤,因此,该计算机程序也包括如上述至少一个实施例中提供的家居巡检方法的全部有益效果,为避免重复,在此不再赘述。
本申请的描述中,术语“多个”则指两个或两个以上,除非另有明确的限定,术语“上”、“下”等指示的方位或位置关系为基于附图所述的方位或位置关系,仅是为了便于描述本申请和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此不能理解为对本申请的限制;术语“连接”、“安装”、“固定”等均应做广义理解,例如,“连接”可以是固定连接,也可以是可拆卸连接,或一体地连接;可以是直接相连,也可以通过中间媒介间接相连。对于本领域的普通技术人员而言,可以根据具体情况理解上述术语在本申请中的具体含义。
在本申请的描述中,术语“一个实施例”、“一些实施例”、“具体实施例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或特点包含于本申请的至少一个实施例或示例中。在本申请中,对上述术语的示意性表述不一定指的是相同的实施例或实例。而且,描述的具体特征、结构、材料或特点可以在任何的一个或多个实施例或示例中以合适的方式结合。
以上所述仅为本申请的优选实施例而已,并不用于限制本申请,对于本领域的技术人员来说,本申请可以有各种更改和变化。凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (14)

  1. 一种家居巡检方法,其中,包括:
    确定巡检路径;
    沿所述巡检路径拍摄对应的家居图像数据;
    根据所述家居图像数据,确定家居巡检事件;
    在所述家居巡检事件包括异常事件的情况下,确定所述异常事件对应的处理程序;
    执行所述处理程序。
  2. 根据权利要求1所述的家居巡检方法，其中，所述根据所述家居图像数据，确定家居巡检事件，包括：
    对所述家居图像数据进行图像识别,确定所述家居图像数据中包括的图像信息;
    根据所述图像信息确定所述家居巡检事件。
  3. 根据权利要求2所述的家居巡检方法,其中,还包括:
    获取家居环境信息;
    所述根据所述图像信息确定所述家居巡检事件,包括:
    根据所述图像信息和所述家居环境信息,确定所述家居巡检事件。
  4. 根据权利要求2所述的家居巡检方法,其中,所述巡检路径经过至少一个目标区域,所述家居图像数据包括第一数据和/或第二数据,其中,所述第一数据为所述巡检路径的数据,所述第二数据为所述目标区域的数据。
  5. 根据权利要求4所述的家居巡检方法,其中,所述图像信息包括人物信息,所述异常事件包括第一事件,所述第一事件与所述第一数据和/或所述第二数据相对应;
    所述根据所述图像信息确定所述家居巡检事件,包括:
    根据所述人物信息确定对应的身份信息;
    在所述身份信息与预设身份不匹配的情况下,确定所述家居巡检事件包括所述第一事件;
    所述确定所述异常事件对应的处理程序,包括:
    生成所述第一事件对应的第一报警信息,将所述第一报警信息发送至第一终端。
  6. 根据权利要求4所述的家居巡检方法,其中,所述异常事件包括第二事件,所述第二事件为目标家居设备的状态异常事件,且所述第二事件与所述第二数据相对应;
    所述确定所述异常事件对应的处理程序,包括:
    生成控制指令,其中,所述控制指令用于指示所述目标家居设备处理所述第二事件;
    将所述控制指令发送至所述目标家居设备。
  7. 根据权利要求4所述的家居巡检方法,其中,所述异常事件包括第三事件,所述第三事件与所述第一数据和/或所述第二数据相对应;
    所述确定所述异常事件对应的处理程序,包括:
    生成所述第三事件对应的第二报警信息,将所述第二报警信息发送至第二终端。
  8. 根据权利要求7所述的家居巡检方法,其中,所述第三事件包括N个事件类别,N为正整数;
    所述生成所述第三事件对应的第二报警信息,包括:
    获取场景信息;
    根据所述场景信息,确定所述第三事件对应的N个事件类别的优先级顺序;
    根据所述N个事件类别和所述优先级顺序,生成所述第二报警信息,其中,所述第二报警信息包括N个报警内容,所述N个事件类别与所述N个报警内容一一对应。
  9. 根据权利要求4所述的家居巡检方法,其中,所述异常事件包括第四事件,所述第四事件为自定义事件,且所述第四事件与所述第一数据和/或所述第二数据相对应;
    所述根据所述图像信息确定所述家居巡检事件,包括:
    根据所述图像信息和预设的图像识别模型,确定所述家居巡检事件;
    在所述根据所述家居图像数据,确定家居巡检事件之前,所述方法还包括:
    接收自定义图像数据集,其中,所述自定义图像数据集包括所述第四事件对应的图像;
    通过所述自定义图像数据集训练所述图像识别模型,以使所述图像识别模型能够根据所述图像信息识别出所述第四事件。
  10. 根据权利要求1至9中任一项所述的家居巡检方法,其中,所述确定巡检路径,包括:
    获取待巡检区域的地图数据;
    根据所述地图数据,确定所述巡检路径。
  11. 一种家居巡检装置,其中,包括:
    确定模块,用于确定巡检路径;
    拍摄模块,用于沿所述巡检路径拍摄对应的家居图像数据;
    所述确定模块,还用于根据所述家居图像数据,确定家居巡检事件;在所述家居巡检事件包括异常事件的情况下,确定所述异常事件对应的处理程序;
    执行模块,用于执行所述处理程序。
  12. 一种非易失性可读存储介质,其上存储有程序或指令,其中,所述程序或指令被处理器执行时实现如权利要求1至10中任一项所述的家居巡检方法的步骤。
  13. 一种计算机设备,其中,包括:
    存储器,用于存储程序或指令;
    处理器,用于执行所述程序或指令时实现如权利要求1至10中任一项所述的家居巡检方法的步骤。
  14. 一种计算机程序产品，其中，所述计算机程序产品被存储在存储介质中，所述计算机程序产品被至少一个处理器执行以实现如权利要求1至10中任一项所述的家居巡检方法的步骤。
PCT/CN2022/116850 2022-01-24 2022-09-02 家居巡检方法、非易失性可读存储介质和计算机设备 WO2023138063A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210081422.0A CN114489070A (zh) 2022-01-24 2022-01-24 家居巡检方法、非易失性可读存储介质和计算机设备
CN202210081422.0 2022-01-24

Publications (1)

Publication Number Publication Date
WO2023138063A1 true WO2023138063A1 (zh) 2023-07-27

Family

ID=81474855

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/116850 WO2023138063A1 (zh) 2022-01-24 2022-09-02 家居巡检方法、非易失性可读存储介质和计算机设备

Country Status (2)

Country Link
CN (1) CN114489070A (zh)
WO (1) WO2023138063A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489070A (zh) * 2022-01-24 2022-05-13 美的集团(上海)有限公司 家居巡检方法、非易失性可读存储介质和计算机设备
CN116700247B (zh) * 2023-05-30 2024-03-19 东莞市华复实业有限公司 一种家居机器人的智能巡航管理方法及系统
CN117708382A (zh) * 2023-10-12 2024-03-15 广州信邦智能装备股份有限公司 巡检数据处理方法及智慧工厂巡检系统和相关介质程序

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030229474A1 (en) * 2002-03-29 2003-12-11 Kaoru Suzuki Monitoring apparatus
US20050091684A1 (en) * 2003-09-29 2005-04-28 Shunichi Kawabata Robot apparatus for supporting user's actions
US20050096790A1 (en) * 2003-09-29 2005-05-05 Masafumi Tamura Robot apparatus for executing a monitoring operation
CN204790566U (zh) * 2015-07-16 2015-11-18 高世恒 一种多功能智能家居机器人
CN105654648A (zh) * 2016-03-28 2016-06-08 浙江吉利控股集团有限公司 防盗监控装置及系统和方法
CN106598052A (zh) * 2016-12-14 2017-04-26 南京阿凡达机器人科技有限公司 一种基于环境地图的机器人安防巡检方法及其机器人
CN107186731A (zh) * 2017-06-29 2017-09-22 上海未来伙伴机器人有限公司 一种防卫机器人
CN108354526A (zh) * 2018-02-11 2018-08-03 深圳市沃特沃德股份有限公司 扫地机器人的安防方法及装置
CN108540780A (zh) * 2018-06-08 2018-09-14 苏州清研微视电子科技有限公司 基于扫地机器人设备的智能移动家庭监控系统
CN108942955A (zh) * 2018-07-04 2018-12-07 广东技术师范学院 一种家用监控机器人
CN113658406A (zh) * 2021-08-13 2021-11-16 广州果乐文化传播有限公司 智能巡防机器人
CN114489070A (zh) * 2022-01-24 2022-05-13 美的集团(上海)有限公司 家居巡检方法、非易失性可读存储介质和计算机设备

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102058918B1 (ko) * 2012-12-14 2019-12-26 삼성전자주식회사 홈 모니터링 방법 및 장치
CN113726606B (zh) * 2021-08-30 2022-12-27 杭州申昊科技股份有限公司 异常检测方法及装置、电子设备、存储介质

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030229474A1 (en) * 2002-03-29 2003-12-11 Kaoru Suzuki Monitoring apparatus
US20050091684A1 (en) * 2003-09-29 2005-04-28 Shunichi Kawabata Robot apparatus for supporting user's actions
US20050096790A1 (en) * 2003-09-29 2005-05-05 Masafumi Tamura Robot apparatus for executing a monitoring operation
CN204790566U (zh) * 2015-07-16 2015-11-18 高世恒 一种多功能智能家居机器人
CN105654648A (zh) * 2016-03-28 2016-06-08 浙江吉利控股集团有限公司 防盗监控装置及系统和方法
CN106598052A (zh) * 2016-12-14 2017-04-26 南京阿凡达机器人科技有限公司 一种基于环境地图的机器人安防巡检方法及其机器人
CN107186731A (zh) * 2017-06-29 2017-09-22 上海未来伙伴机器人有限公司 一种防卫机器人
CN108354526A (zh) * 2018-02-11 2018-08-03 深圳市沃特沃德股份有限公司 扫地机器人的安防方法及装置
CN108540780A (zh) * 2018-06-08 2018-09-14 苏州清研微视电子科技有限公司 基于扫地机器人设备的智能移动家庭监控系统
CN108942955A (zh) * 2018-07-04 2018-12-07 广东技术师范学院 一种家用监控机器人
CN113658406A (zh) * 2021-08-13 2021-11-16 广州果乐文化传播有限公司 智能巡防机器人
CN114489070A (zh) * 2022-01-24 2022-05-13 美的集团(上海)有限公司 家居巡检方法、非易失性可读存储介质和计算机设备

Also Published As

Publication number Publication date
CN114489070A (zh) 2022-05-13

Similar Documents

Publication Publication Date Title
WO2023138063A1 (zh) 家居巡检方法、非易失性可读存储介质和计算机设备
US10375361B1 (en) Video camera and sensor integration
US10623622B1 (en) Monitoring system configuration technology
US11143521B2 (en) System and method for aiding responses to an event detected by a monitoring system
US11151864B2 (en) System and method for monitoring a property using drone beacons
CN205334101U (zh) 一种智能家居系统
US20220351598A1 (en) Enhanced audiovisual analytics
US10756919B1 (en) Connected automation controls using robotic devices
AU2019295856B2 (en) Object tracking using disparate monitoring systems
CN105939236A (zh) 控制智能家居设备的方法及装置
US10706699B1 (en) Projector assisted monitoring system
US11972352B2 (en) Motion-based human video detection
WO2017146313A1 (ko) 사물인터넷과 광대역 레이더 센싱 기술을 이용한 가정용 인텔리전트 스마트 모니터링시스템
CN105159128A (zh) 一种智能家居设备的控制方法和控制系统
CN111844039A (zh) 一种基于机器人控制的智慧空间系统
US11074471B2 (en) Assisted creation of video rules via scene analysis
AU2019231258B2 (en) System and method for preventing false alarms due to display images
CN108132607A (zh) 一种智能家居控制系统及控制方法
US11846941B2 (en) Drone graphical user interface
US11734932B2 (en) State and event monitoring
Chiu et al. A smart home system with security and electrical appliances
Dong et al. An intelligent embedded cloud monitoring system design
KR20200115723A (ko) 영상 감시장치, 영상 분석 서버장치 및 그 학습 방법들
CN115052110B (zh) 安保方法、安保系统及计算机可读存储介质
Qi et al. Design of wireless smart home safety system based on visual identity

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22921491

Country of ref document: EP

Kind code of ref document: A1