CN115431266A - Inspection method, inspection device and inspection robot - Google Patents

Info

Publication number
CN115431266A
CN115431266A (application number CN202211024331.XA)
Authority
CN
China
Prior art keywords
inspection
inspection robot
abnormal object
robot
identified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211024331.XA
Other languages
Chinese (zh)
Inventor
赵振亮 (Zhao Zhenliang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Damo Institute Hangzhou Technology Co Ltd
Original Assignee
Alibaba Damo Institute Hangzhou Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Damo Institute Hangzhou Technology Co Ltd filed Critical Alibaba Damo Institute Hangzhou Technology Co Ltd
Priority to CN202211024331.XA priority Critical patent/CN115431266A/en
Publication of CN115431266A publication Critical patent/CN115431266A/en
Pending legal-status Critical Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00: Manipulators not otherwise provided for
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J 9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661: Programme controls characterised by task planning, object-oriented languages
    • B25J 9/1664: Programme controls characterised by motion, path, trajectory planning
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1689: Teleoperation
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 1/00: Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
    • G07C 1/20: Checking timed patrols, e.g. of watchman

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)

Abstract

The embodiments of the application disclose an inspection method, an inspection device and an inspection robot. The method comprises the following steps: acquiring map data of an inspection area, and controlling an inspection robot to perform fixed-point inspection according to the map data; if an abnormal object is identified during the fixed-point inspection, detecting the motion track of the abnormal object, and controlling the inspection robot to follow the abnormal object according to the motion track and perform alarm processing. The application can reduce the labor cost of inspection and improve the inspection effect.

Description

Inspection method, inspection device and inspection robot
Technical Field
The application relates to the technical field of computer applications, and in particular to an inspection method, an inspection device and an inspection robot.
Background
The inspection work plays an important role in production and life, and aims to discover and feed back various hidden dangers in time. The following three main modes exist in the traditional inspection work:
In the first mode, inspection personnel walk back and forth in the inspection area to observe. This mode consumes labor cost and cannot be carried out manually in some special scenes, such as high-temperature, oxygen-deficient or high-risk environments.
In the second mode, cameras are installed at fixed positions, and inspection personnel observe the real-time images shot by the cameras in a security room. In this mode, the installation positions of the cameras are fixed, so blind spots exist; a warning cannot be given the moment an abnormality is found; and considerable labor cost is still consumed.
The third mode is to arrange a fixed-point inspection robot in an inspection area, perform inspection according to a fixed route, and timely feed back when an abnormality is found. However, in this way, the inspection robot can only walk along a fixed route, and therefore the flexibility is poor and the inspection effect is not good.
Disclosure of Invention
In view of this, the application provides an inspection method, an inspection device and an inspection robot, so as to reduce labor cost and improve inspection effect.
The application provides the following scheme:
in a first aspect, an inspection method is provided, which includes:
acquiring map data of an inspection area, and controlling an inspection robot to perform fixed-point inspection according to the map data;
if an abnormal object is identified during the fixed-point inspection, detecting the motion track of the abnormal object, and controlling the inspection robot to follow the abnormal object according to the motion track and perform alarm processing.
According to an implementation manner of the embodiment of the present application, the acquiring the map data of the patrol area includes:
controlling the inspection robot to move in the inspection area, sensing and positioning the environment with the sensors of the inspection robot during the movement, and constructing the map data of the inspection area from the sensed environment information and the positioning information; or,
and acquiring map data of the inspection area which is constructed in advance and providing the map data to the inspection robot for fixed-point inspection.
According to an implementable manner in an embodiment of the present application, the identifying the abnormal object in the fixed point inspection process includes:
acquiring image information acquired by the inspection robot by using a visual sensor in the fixed-point inspection process;
performing target identification by using the image information;
if the preset object type is identified, determining that an abnormal object is identified; or, if the identified object is not a preset object type, determining that the abnormal object is identified.
According to an implementation manner in the embodiment of the present application, detecting the motion trajectory of the abnormal object includes:
determining the distance and direction of the abnormal object relative to the inspection robot at each moment by utilizing information acquired by a visual sensor and/or an acoustic sensor of the inspection robot;
and determining the motion trail of the abnormal object by combining the position information obtained by the positioning device of the inspection robot at each moment and the distance and the direction of the abnormal object relative to the inspection robot.
According to an implementation manner in the embodiment of the present application, controlling the inspection robot to follow the abnormal object according to the motion trajectory includes:
and controlling the inspection robot to move according to the motion track, and controlling the speed of the inspection robot so as to keep the distance between the inspection robot and the abnormal object within a preset distance range.
According to an implementation manner in the embodiment of the present application, the performing of the alarm processing includes at least one of:
playing an alarm audio;
displaying alarm information on a screen of the inspection robot;
establishing voice communication connection with a management terminal, and playing voice information sent by the management terminal and/or sending collected voice information to the management terminal;
and establishing video communication connection with a management terminal, and playing the video information sent by the management terminal and/or sending the collected video information to the management terminal.
According to an implementation manner in the embodiment of the present application, after the performing the alarm processing, the method further includes:
and if the abnormal object cannot be identified within the set time length or the alarm processing is finished, continuing the fixed point inspection.
In a second aspect, an inspection device is provided, the device comprising:
a map acquisition unit configured to acquire map data of a patrol area;
the motion control unit is configured to control the inspection robot to perform fixed-point inspection according to the map data;
an abnormality identification unit configured to identify an abnormal object in the fixed point inspection process;
a trajectory detection unit configured to detect a motion trajectory of an abnormal object if the abnormal object is identified by the abnormality identification unit;
the motion control unit is further configured to control the inspection robot to follow the abnormal object according to the motion track;
and the alarm processing unit is configured to perform alarm processing if the abnormal object is identified by the abnormal identification unit.
According to a third aspect, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of any of the first aspects described above.
According to a fourth aspect, there is provided a patrol robot, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform the steps of the method of any of the first aspects described above.
According to the specific embodiments provided by the present application, the present application can have the following technical effects:
1) In the present application, inspection is performed by an inspection robot. Compared with having inspection personnel patrol on foot, or observe installed cameras from a security room, this greatly reduces labor cost.
2) When the inspection robot detects an abnormal object during fixed-point inspection, it can detect the motion track of the abnormal object, follow the abnormal object and perform alarm processing; the flexibility is high and the inspection effect is better.
3) During fixed-point inspection, the inspection robot can perform target recognition of abnormal objects using the image information collected by its visual sensor; compared with a camera installed at a fixed position, the viewing angle is more flexible.
Of course, it is not necessary for any product to achieve all of the above-described advantages at the same time for the practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the embodiments will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 illustrates an exemplary system architecture to which embodiments of the present application may be applied;
fig. 2 is a flowchart of a polling method provided in the embodiment of the present application;
FIG. 3 is a schematic diagram of one example provided by an embodiment of the present application;
fig. 4 is a schematic block diagram of an inspection device provided in an embodiment of the present application;
fig. 5 is a schematic block diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of protection of the present application.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, meaning that three relationships may exist; e.g., A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrases "if determined" or "if (a stated condition or event) is detected" may be interpreted as "when determined" or "in response to determining" or "when (a stated condition or event) is detected" or "in response to detecting (a stated condition or event)," depending on the context.
FIG. 1 illustrates an exemplary system architecture to which embodiments of the present application may be applied. As shown in fig. 1, the system architecture may include an inspection robot, an inspection device, and a server, and may further include a management terminal. The inspection robot and the server and the management terminal can communicate through a network. The network may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The inspection robot is a device that carries out inspection tasks through technologies such as navigation and positioning, path planning, image analysis and environment perception. No manual intervention is needed, and it can judge abnormal objects and raise alarms automatically, thereby realizing automated management.
The inspection device is used for acquiring the data collected by the various sensors of the inspection robot and making decisions based on that data to control the inspection robot to execute inspection tasks. The inspection device can be arranged in the inspection robot, or it can be arranged in the server so as to control inspection robots uniformly. However, with the continuous improvement of inspection-robot performance, the inspection device is usually arranged in the inspection robot itself. The inspection device may be implemented as a plurality of pieces of software or software modules (for example, to provide distributed services), or as a single piece of software or software module, which is not specifically limited herein.
While executing the inspection task, the inspection robot can upload an inspection report or an inspection log to the server, either periodically or triggered by a specific event. The server stores the inspection reports or inspection logs uploaded by the inspection robots so as to meet requirements such as query, statistics and analysis.
In addition, the inspection robot can establish communication connection with the management terminal through the server, for example, video communication connection or voice communication connection is established with the management terminal, and the video stream or voice stream from the management terminal is played.
The server may be a single server, a server group formed by a plurality of servers, or a cloud server. The cloud server, also called a cloud computing server or cloud host, is a host product in a cloud computing service system, intended to overcome the defects of high management difficulty and weak service expansibility in conventional physical hosts and Virtual Private Server (VPS) services.
It should be understood that the numbers of inspection robots, inspection devices, servers and management terminals in fig. 1 are merely illustrative. Any number of inspection robots, inspection devices, servers and management terminals can be provided according to implementation needs.
Fig. 2 is a flowchart of an inspection method according to an embodiment of the present disclosure, where the inspection method may be performed by an inspection device in the system architecture shown in fig. 1. As shown in fig. 2, the method may include the steps of:
step 202: and acquiring map data of the inspection area.
Step 204: and controlling the inspection robot to perform fixed-point inspection according to the map data.
Step 206: determine whether an abnormal object is identified during the fixed-point inspection; if so, execute step 208; otherwise, continue the fixed-point inspection.
Step 208: and detecting the motion track of the abnormal object, controlling the inspection robot to follow the abnormal object according to the motion track and performing alarm processing.
As can be seen from the above flow, when the inspection robot in the present application detects an abnormal object during fixed-point inspection, it can detect the motion track of the abnormal object and be controlled to follow the abnormal object and perform alarm processing; the flexibility is higher and the inspection effect is better.
The inspection method shown in the above flow can be applied to various application scenarios.
For example, the inspection robot performs personnel inspection in an inspection area; when it finds an offender, a suspicious person or the like, it follows that person and raises an alarm so as to deter or expel them.
For another example, the inspection robot inspects the intelligent equipment in an inspection area, for example, the inspection robot inspects the logistics robot in a logistics industrial park, and if the logistics robot is found to be abnormal, the inspection robot follows the abnormal logistics robot to acquire detailed abnormal information and give an alarm, so that the abnormal logistics robot can be repaired in time.
The method can also be applied to other application scenarios, which are not listed here one by one. In the following embodiments, personnel inspection performed by the inspection robot is taken as an example for description.
The steps in the above-described flow are described in detail below. First, the above step 202, i.e., "obtaining map data of inspection area", will be described in detail with reference to the embodiments.
The inspection area is an area, defined in advance, that the inspection robot is required to inspect; it can be a residential community, a building, a school, a factory or the like. The map data of the inspection area include position information of the inspection area, environment information in the inspection area and the like. The environment information in the inspection area may include roads, obstacles and so on.
The inspection robot can be controlled to move in an inspection area, a sensor of the inspection robot is used for sensing and positioning the environment in the moving process, and the environment information and the positioning information obtained by sensing are used for constructing map data of the inspection area.
During environment sensing, a laser radar, a visual sensor and the like on the inspection robot are used to acquire surrounding environment information during the movement so as to construct a local environment map. Positioning can be realized by the positioning device on the inspection robot. After the environment information and the positioning information are obtained, the map data of the inspection area can be constructed using existing map-construction methods; since these are mature technologies at present, details are not described here.
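As a non-limiting illustration of the mapping step above, sensed range readings from the robot's pose could be accumulated into a simple occupancy grid; the function name, grid representation and cell size are assumptions for this sketch, not part of the disclosure:

```python
import math

def update_grid(grid, robot_xy, heading, ranges, angle_step, cell=0.5):
    """Mark cells hit by range readings as occupied (1) in a dict-based grid.

    grid: dict mapping (ix, iy) -> 1 for occupied cells.
    robot_xy: robot position (x, y) in metres, from the positioning device.
    heading: robot heading in radians.
    ranges: range readings in metres, one per beam.
    angle_step: angular spacing between consecutive beams (radians).
    """
    x, y = robot_xy
    for i, r in enumerate(ranges):
        ang = heading + i * angle_step
        # world coordinates of the obstacle hit by this beam
        hx, hy = x + r * math.cos(ang), y + r * math.sin(ang)
        grid[(int(hx // cell), int(hy // cell))] = 1
    return grid
```

Repeating this update as the robot moves through the inspection area yields a rough obstacle map that a full map-construction pipeline would then refine.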
As another realizable mode, the map data of the inspection area, which has been obtained in advance, may be preset in the inspection robot, or may be transmitted to the inspection robot through the server. The manner in which the map data of the inspection area is obtained in advance is not limited herein, and may be, for example, constructed in a manner of mapping or the like, constructed in advance by other sensing devices, constructed in advance by acquiring data such as images and point clouds by an acquisition vehicle, or the like.
The above step 204, i.e., "controlling the inspection robot to perform fixed point inspection according to the map data" will be described in detail with reference to the embodiments.
When the inspection robot is controlled to perform fixed-point inspection, a plurality of inspection points can be set in the map data of the inspection area, the inspection robot stores the position information (such as coordinate data) of the inspection points, and then at least one inspection path is formed in the inspection area according to the position information of the inspection points. The inspection robot inspects according to the inspection path, and the process is called fixed-point inspection.
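For illustration only, the fixed-point inspection along a path formed from the stored inspection-point coordinates could be sketched as the following simplified 2D loop; the function name, step size and waypoint representation are assumptions, not part of the disclosure:

```python
import math

def patrol_step(position, waypoints, idx, step=0.5):
    """Advance one control step of fixed-point inspection.

    position: current robot position (x, y).
    waypoints: list of inspection-point coordinates, visited cyclically.
    idx: index of the waypoint currently being approached.
    Returns the new position and (possibly advanced) waypoint index.
    """
    tx, ty = waypoints[idx]
    x, y = position
    d = math.hypot(tx - x, ty - y)
    if d < step:                       # waypoint reached: snap and advance
        return (tx, ty), (idx + 1) % len(waypoints)
    # otherwise move `step` metres along the straight line to the waypoint
    return (x + step * (tx - x) / d, y + step * (ty - y) / d), idx
```

Calling `patrol_step` repeatedly makes the robot cycle through the inspection points, which is the "fixed-point inspection" behavior described above.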
In the embodiments of the application, the inspection robot executes the fixed-point inspection task under normal conditions, but it can identify abnormal objects through its sensors during the fixed-point inspection. The identification of abnormal objects may rely mainly on a visual sensor, assisted by other sensors such as an infrared sensor or an acoustic sensor.
As one of the realizable modes, the method can acquire the image information acquired by the inspection robot by using the visual sensor in the fixed-point inspection process; then, carrying out target identification by utilizing the image information, and if a preset object type is identified, determining that an abnormal object is identified, wherein the method is similar to a blacklist mode; or if the identified object is not the preset object type, determining that the abnormal object is identified, similar to the white list mode.
The task of target recognition, which is mainly to find out the target of interest in the image to determine the category of the target, is one of the core problems in the field of computer vision. The manner of the target recognition may include a manner of template matching, a manner of image matching, a manner based on a deep learning model, and the like, and the application does not limit the specific manner of the target recognition.
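The blacklist/whitelist decision described above, applied to whatever object types the target-recognition step returns, could be sketched as follows (the type labels are hypothetical examples, not from the disclosure):

```python
def is_abnormal(detected_types, watch_types, mode="blacklist"):
    """Decide whether any detected object type counts as abnormal.

    blacklist mode: abnormal if a detected type IS in watch_types
    (e.g. "person_without_mask" in a public area).
    whitelist mode: abnormal if a detected type is NOT in watch_types
    (e.g. anyone other than "medical_staff" in an isolation area).
    """
    if mode == "blacklist":
        return any(t in watch_types for t in detected_types)
    return any(t not in watch_types for t in detected_types)
```

The recognition model itself (deep learning, template matching, etc.) is deliberately left out, since the application does not limit its specific form.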
By taking the inspection robot to inspect the illegal personnel in the inspection area as an example, the inspection robot can acquire images through the camera and perform target recognition on the images. For example, a person who does not wear a mask may be set as an offender in advance for a specific public area, and therefore, when performing target recognition, whether the person in the image wears the mask may be recognized by using a deep learning model, template matching, or the like, and if the person does not wear the mask, the offender may be confirmed to be recognized.
For another example, a person carrying a stick, a knife or the like may be preset as a suspicious person for a school area. When performing target recognition, whether the person in the image carries a stick, a knife or the like can be identified using a deep learning model, template matching or other methods, and if so, it is determined that a suspicious person is identified.
For another example, medical staff, hotel attendants, and the like may be set in advance as legal persons for the isolated area. When the target is identified, whether the person in the image is a medical person wearing a protective clothing or a hotel attendant wearing the protective clothing can be identified by adopting a deep learning model, template matching and other modes, if not, the person is possibly an isolated person who should not come out of a room, and the illegal person is determined to be identified. Or, images of the isolation personnel can be obtained and stored in advance, the collected images are matched with the prestored images of the isolation personnel in the inspection process, and if the matching result shows that the personnel in the images are the isolation personnel, the illegal personnel are determined to be identified.
The following describes in detail the above step 208, that is, "detect the motion trajectory of the abnormal object, control the inspection robot to follow the abnormal object according to the motion trajectory and perform alarm processing" with reference to the embodiment.
After the abnormal object is detected, the motion trail of the abnormal object can be detected by using a sensor of the inspection robot. As one of the realizable modes, the distance and the direction of the abnormal object relative to the inspection robot at each moment can be determined by utilizing the information acquired by the visual sensor and/or the acoustic sensor of the inspection robot; and then determining the motion trail of the abnormal object by combining the position information obtained by the positioning device of the inspection robot at each moment and the distance and the direction of the abnormal object relative to the inspection robot.
The visual sensor may be one or more cameras, video cameras, still cameras and the like. For example, a multi-view camera may be used to acquire depth information, attitude information and the like of the abnormal object in the image, so as to further determine the distance and direction of the abnormal object relative to the inspection robot.
Wherein the acoustic sensor may be a microphone array or the like. For example, sound information of the abnormal object may be collected by using a microphone array, and the position and direction of the sound source (i.e., the abnormal object) may be calculated according to a time difference of sound reaching the microphone array, so as to further determine the distance and direction of the abnormal object with respect to the inspection robot.
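The time-difference calculation mentioned above can be illustrated with the simplest two-microphone, far-field case; the formula sin(theta) = c·dt/d is standard acoustics, but the function name and parameters are assumptions for this sketch:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def bearing_from_tdoa(delta_t, mic_spacing):
    """Estimate the sound-source bearing (radians, relative to the array
    broadside) from the arrival-time difference between two microphones,
    under a far-field plane-wave assumption: sin(theta) = c * dt / d."""
    s = SPEED_OF_SOUND * delta_t / mic_spacing
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)
```

A real microphone array would combine several such pairwise estimates; here a zero time difference means the source lies straight ahead of the pair.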
After the distance and the direction of the abnormal object relative to the inspection robot at each moment are obtained, the position information (for example, coordinate information) of the abnormal object at each moment can be obtained by adopting geometric calculation in combination with information such as the position and the direction of the inspection robot, so that the motion trail of the abnormal object can be obtained.
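The geometric calculation described above reduces to converting a relative (distance, bearing) observation into world coordinates using the robot's own pose; a minimal sketch, with hypothetical names:

```python
import math

def object_position(robot_pose, distance, bearing):
    """World position of the observed object, given the robot's pose
    (x, y, heading in radians) and the object's distance and bearing
    relative to the robot, by simple geometry."""
    x, y, heading = robot_pose
    a = heading + bearing  # absolute direction to the object
    return (x + distance * math.cos(a), y + distance * math.sin(a))

def build_trajectory(observations):
    """The motion track is just these positions accumulated over time;
    each observation is (robot_pose, distance, bearing)."""
    return [object_position(pose, d, b) for pose, d, b in observations]
```

Collecting one such position per moment yields the motion track of the abnormal object that the robot then follows.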
When the inspection robot is controlled to move according to the motion track of the abnormal object, the speed of the inspection robot can be controlled so that the distance between the inspection robot and the abnormal object stays within a preset distance range. This avoids the robot being so far from the abnormal object that the alarm has little effect, and also avoids it being so close that the abnormal object feels harassed. For example, the distance between the inspection robot and the offender can be controlled within the range of 2 to 5 meters.
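A crude speed-selection rule for keeping the following distance in the stated 2 to 5 meter band might look like the following; the thresholds and speeds are illustrative assumptions only:

```python
def follow_speed(distance, lo=2.0, hi=5.0, v_follow=1.2):
    """Pick the robot's speed (m/s) so its distance to the followed
    object stays within [lo, hi] metres: stop when too close, speed up
    when falling behind, otherwise hold a nominal following speed."""
    if distance < lo:
        return 0.0              # too close: let the object pull ahead
    if distance > hi:
        return v_follow * 1.5   # too far: close the gap
    return v_follow
```

A production controller would smooth these transitions (e.g. proportionally to the distance error) rather than switch between fixed speeds.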
The main purpose of alarm processing is to inform abnormal objects, and also can inform management personnel of the information of the abnormal objects. Therefore, performing alarm processing may include, but is not limited to, the following:
the first mode is as follows: and playing the alarm audio.
For example, the inspection robot may play a voice alert such as "please wear the mask" when it identifies an offending person who is not wearing the mask. For another example, if the inspection robot recognizes the isolated person, a voice alert such as "please return to your room and do not move around" may be played.
The alarm audio can be prestored in the inspection robot, and the inspection robot broadcasts it.
The second mode is as follows: and displaying alarm information on a screen of the inspection robot.
For example, when the inspection robot identifies an illegal person who does not wear a mask, video information or picture information for publicizing that the mask is worn can be displayed on the screen. For another example, if the inspection robot recognizes the isolated person, video information or picture information of the isolated person notice may be displayed on the screen.
The third mode is as follows: and establishing voice communication connection with the management terminal, and playing the voice information sent by the management terminal and/or sending the collected voice information to the management terminal.
For example, after the inspection robot recognizes an isolated person, it can establish a voice communication connection with the management terminal in time. The manager at the management terminal can send, through this connection, voice information prompting the isolated person to return to the room quickly, which the inspection robot plays; the robot can also send the voice of the isolated person back to the management terminal.
The fourth mode is as follows: and establishing video communication connection with the management terminal, and playing the video information sent by the management terminal and/or sending the collected video information to the management terminal.
For example, after identifying the staff holding the stick in the campus, the inspection robot quickly establishes video communication interception with the management terminal, and the management staff or police staff and the like at the management terminal can know the actual situation in a video mode and can further placate and warn the staff holding the stick in the video mode.
The above modes may be used alternatively, or may be used in combination with each other, or may also be used in other modes to perform alarm processing, which are not listed here.
After the alarm processing is performed, as one implementable manner, if the abnormal object cannot be identified within a set time length (for example, the abnormal object has disappeared from the inspection robot's "field of view"), the inspection robot can be controlled to resume fixed-point inspection, that is, to return to the fixed path.
As another implementable manner, if the alarm processing has finished, for example the duration of broadcasting the alarm voice exceeds a preset threshold, or the video communication established with the management terminal has ended, the inspection robot can likewise be controlled to resume fixed-point inspection along the fixed path.
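The two resume conditions above (target lost for a set duration, or alarm processing finished) can be combined into one small supervisor. The 30-second timeout and the method names are illustrative assumptions:

```python
import time

class PatrolSupervisor:
    """Decide when to stop following an abnormal object and resume
    fixed-point inspection along the fixed path."""

    def __init__(self, lost_timeout_s=30.0):
        self.lost_timeout_s = lost_timeout_s
        self.last_seen = time.monotonic()

    def update(self, target_visible, alarm_finished, now=None):
        now = time.monotonic() if now is None else now
        if target_visible:
            self.last_seen = now          # refresh the last-seen timestamp
        target_lost = (now - self.last_seen) > self.lost_timeout_s
        # Resume patrol when the target has been gone long enough,
        # or when the alarm processing itself has ended.
        return "resume_patrol" if (target_lost or alarm_finished) else "keep_following"
```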
Two complete examples are listed below:
example 1: as shown in fig. 3, the inspection robot is set to perform inspection in public areas in the isolated hotel. The inspection robot moves in public areas of isolated hotels in advance, environment sensing is carried out by utilizing sensors such as laser radars, vision sensors and the like of the inspection robot in the moving process, and positioning is carried out by utilizing a positioning device of the inspection robot, so that map data of the public areas of the isolated hotels are constructed.
The inspection robot is controlled to perform fixed-point inspection along a fixed path (indicated by a dotted line in the figure) according to the map data. The inspection robot can acquire images by using the visual sensor in the inspection process and recognize targets by using the acquired image information. For example, a deep learning model is used to identify whether a person is identified in the image information and whether the person is a medical person or a hotel attendant. In the recognition, the person can recognize the garment, the accessory, the characters on the garment, and the like. An isolated person may be considered identified if the person in the image is identified as not being a medical person or a hotel attendant. At the moment, the distance and the direction of the isolation personnel can be detected through sensors such as a camera and a microphone array, and the position information of the isolation personnel at each moment is determined by combining the position information of the inspection robot, so that the motion trail of the isolation personnel is determined. The inspection robot is controlled to follow the isolation personnel according to the motion track, and the speed of the inspection robot is controlled so that the inspection robot and the isolation personnel can keep a reasonable distance range all the time. And the system reports 'please return to the room of the user without walking at will' in a voice alarm mode. If the voice alarm lasts for a certain time, the isolation personnel can still be identified, namely, the isolation personnel does not leave, the video connection can be established with the management terminal, and the manager advises the isolation personnel. If the isolated person is not identified within the set time, the inspection robot can return to the fixed path to continue the fixed-point inspection.
Example 2: an inspection robot is arranged to patrol a logistics park. Map data of the logistics park can be imported into the inspection robot in advance, and the robot is controlled to perform fixed-point inspection along a fixed path according to the map data.
During inspection, the inspection robot acquires images with its vision sensor and performs target recognition on the acquired image information. For example, a deep learning model is used to identify whether a logistics robot in the image is abnormal; if the model recognizes an abnormality such as entanglement, abnormal operation, or damaged parts, an abnormal object can be considered identified. At that point, the distance and direction of the abnormal logistics robot can be detected through sensors such as the camera, and combined with the inspection robot's position information to determine the abnormal robot's position at each moment, and thus its motion trajectory. The inspection robot is controlled to follow the abnormal logistics robot along this trajectory and to keep collecting image or video information so as to obtain more detailed abnormality information. Alarm processing is performed in time, for example by establishing a video communication connection with the management terminal and transmitting the collected image or video information over it, so that the manager discovers the abnormality promptly and responds quickly. When the alarm is over, the inspection robot can return to the fixed path and continue fixed-point inspection.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
According to an embodiment of another aspect, an inspection device is provided. Fig. 4 illustrates a schematic block diagram of the inspection device according to one embodiment. The inspection device can be implemented in an inspection robot, or deployed at a server. As shown in fig. 4, the apparatus 400 includes: a map acquisition unit 401, a motion control unit 402, an abnormality identification unit 403, a trajectory detection unit 404, and an alarm processing unit 405. The main functions of each unit are as follows:
a map acquisition unit 401 configured to acquire map data of the patrol area.
A motion control unit 402 configured to control the inspection robot to perform fixed point inspection according to the map data.
An abnormality identification unit 403 configured to identify an abnormal object in the fixed-point inspection process.
A trajectory detection unit 404 configured to detect a motion trajectory of the abnormal object if the abnormal object is identified by the abnormality identification unit 403.
The motion control unit 402 is also configured to control the inspection robot to follow the abnormal object according to the motion track.
An alarm processing unit 405 configured to perform alarm processing if the abnormal object is identified by the abnormality identification unit 403.
As one of the realizable manners, the map acquisition unit 401 may be specifically configured to: control the inspection robot to move in the inspection area, sense and position the environment with the robot's sensors during the movement, and construct map data of the inspection area from the sensed environment information and positioning information.
During environment sensing, sensors such as the lidar and vision sensor acquire environmental information around the inspection robot as it moves, so that a local environment map can be constructed. Positioning can be achieved with the positioning device on the inspection robot. Once the environment information and positioning information have been obtained, the map data of the inspection area can be constructed using existing map construction methods.
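As a rough sketch of how sensed environment information and positioning information can be fused into map data, the fragment below marks lidar returns in a 2-D occupancy grid. The grid origin, the 0.1 m resolution, and the scan format are assumptions made for illustration; a real system would use an established map construction method:

```python
import math
import numpy as np

def update_occupancy_grid(grid, robot_xy, robot_yaw, scan, resolution=0.1):
    """Accumulate lidar hits into a 2-D occupancy grid.

    grid      -- numpy array of occupancy counts, indexed [row, col]
    robot_xy  -- (x, y) position from the positioning device, in meters
    robot_yaw -- robot heading in radians
    scan      -- iterable of (bearing_rad, range_m) lidar returns
    """
    for bearing, rng in scan:
        # Transform each return from the robot frame into the world frame.
        wx = robot_xy[0] + rng * math.cos(robot_yaw + bearing)
        wy = robot_xy[1] + rng * math.sin(robot_yaw + bearing)
        col = int(round(wx / resolution))
        row = int(round(wy / resolution))
        if 0 <= row < grid.shape[0] and 0 <= col < grid.shape[1]:
            grid[row, col] += 1   # evidence that this cell is occupied
    return grid
```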
As another implementable manner, the map acquisition unit 401 may be specifically configured to: obtain pre-constructed map data of the inspection area and provide it to the inspection robot for fixed-point inspection.
The manner in which the map data of the inspection area is constructed in advance is not limited here; for example, it may be built by surveying and mapping, constructed in advance by other sensing devices, or constructed from data such as images and point clouds collected by an acquisition vehicle.
As one of the realizable manners, the abnormality identification unit 403 may be specifically configured to: acquire image information collected by the inspection robot with its vision sensor during fixed-point inspection; perform target recognition on the image information; and, if a preset object type is recognized, determine that an abnormal object is identified; or, if the recognized object is not of a preset object type, determine that an abnormal object is identified.
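The two decision variants (a preset "alert" type is recognized, or the recognized type falls outside a preset "allowed" set) can be sketched as follows. The class labels and mode names are hypothetical, not part of this disclosure:

```python
# Hypothetical label sets; a deployed system would configure these per scene.
ALERT_TYPES = {"person_without_mask"}                 # blacklist variant
ALLOWED_TYPES = {"medical_staff", "hotel_attendant"}  # whitelist variant

def is_abnormal(detected_type, mode="whitelist"):
    """Return True if a recognized object type counts as an abnormal object.

    mode="blacklist": abnormal when the type IS a preset alert type.
    mode="whitelist": abnormal when the type is NOT a preset allowed type.
    """
    if mode == "blacklist":
        return detected_type in ALERT_TYPES
    return detected_type not in ALLOWED_TYPES
```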
As one of the realizable manners, the trajectory detection unit 404 may be specifically configured to: determining the distance and direction of the abnormal object relative to the inspection robot at each moment by utilizing information acquired by a visual sensor and/or an acoustic sensor of the inspection robot; and determining the motion trail of the abnormal object by combining the position information obtained by the positioning device of the inspection robot at each moment and the distance and direction of the abnormal object relative to the inspection robot.
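The trajectory computation described above is essentially a frame transform repeated at each moment: the target's relative distance and bearing are combined with the robot's own pose to obtain a world-frame position. A minimal sketch, assuming an (x, y, yaw) pose convention:

```python
import math

def target_world_position(robot_pose, rel_distance, rel_bearing):
    """Convert a detection given in the robot frame (distance plus bearing
    relative to the robot's heading) into a world-frame position."""
    x, y, yaw = robot_pose
    tx = x + rel_distance * math.cos(yaw + rel_bearing)
    ty = y + rel_distance * math.sin(yaw + rel_bearing)
    return (tx, ty)

def motion_trajectory(poses, detections):
    """Accumulate the per-moment target positions into a motion trajectory."""
    return [target_world_position(pose, dist, bearing)
            for pose, (dist, bearing) in zip(poses, detections)]
```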
As one of the realizable ways, the motion control unit 402 may be specifically configured to: and controlling the inspection robot to move according to the movement track, and controlling the speed of the inspection robot to ensure that the distance between the inspection robot and the abnormal object is within a preset distance range.
As several realizable modes, the alarm processing unit 405 may perform at least one of the following alarm processing modes:
playing an alarm audio;
displaying alarm information on a screen of the inspection robot;
establishing voice communication connection with a management terminal, and playing voice information sent by the management terminal and/or sending collected voice information to the management terminal;
and establishing video communication connection with the management terminal, and playing the video information sent by the management terminal and/or sending the collected video information to the management terminal.
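Since the four modes may be combined freely, the alarm processing unit can be thought of as dispatching over a selected set of modes. The handler names below are hypothetical stand-ins for the robot's real audio, screen, and communication interfaces:

```python
class AlarmRobot:
    """Stub robot that records which alarm actions were triggered."""
    def __init__(self):
        self.calls = []
    def play_alarm_audio(self):           self.calls.append("audio")
    def show_alarm_on_screen(self):       self.calls.append("screen")
    def open_voice_link_to_manager(self): self.calls.append("voice")
    def open_video_link_to_manager(self): self.calls.append("video")

def handle_alarm(robot, modes):
    """Dispatch one or more of the four alarm modes in order."""
    actions = {
        "audio":  robot.play_alarm_audio,
        "screen": robot.show_alarm_on_screen,
        "voice":  robot.open_voice_link_to_manager,
        "video":  robot.open_video_link_to_manager,
    }
    for mode in modes:
        actions[mode]()
```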
As one of the realizable manners, the motion control unit 402 is further configured to: if the abnormality identification unit 403 cannot identify the abnormal object within the set time length after the alarm processing unit 405 performs alarm processing, or if the alarm processing of the alarm processing unit 405 has finished, continue to control the inspection robot to perform fixed-point inspection.
In addition, in the above-mentioned apparatus, the inspection robot may include a display screen, a microphone, a speaker, and the like. After an abnormal object is identified and a voice or video communication connection is established with the management terminal, the abnormal object can, in addition to hearing the alarm, talk with the manager by voice or video to explain the specific situation. If the manager confirms that the abnormal object identified by the inspection robot can be released, the manager can send an instruction through the management terminal instructing the inspection robot to continue fixed-point inspection.
It should be noted that the embodiments of the present application may use user data. In practical applications, user-specific personal data may be used in the scheme described herein only within the scope permitted by the applicable laws and regulations of the relevant country and subject to their requirements (for example, with the user's explicit consent, after informing the user, etc.).
In addition, the present application also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of the method described in any of the preceding method embodiments.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the method of any of the preceding method embodiments.
And an electronic device, which may be embodied as an inspection robot, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform the steps of the method of any of the preceding method embodiments.
Fig. 5 exemplarily shows an architecture of an electronic device, and may specifically include a processor 510, a video display adapter 511, a disk drive 512, an input/output interface 513, a network interface 514, and a memory 520. The processor 510, the video display adapter 511, the disk drive 512, the input/output interface 513, the network interface 514, and the memory 520 may be communicatively connected by a communication bus 530.
The processor 510 may be implemented by a general-purpose CPU, a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, and is configured to execute relevant programs to implement the technical solutions provided in the present application.
The memory 520 may be implemented in the form of ROM (Read-Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 520 may store an operating system 521 for controlling the operation of the electronic device 500 and a Basic Input/Output System (BIOS) 522 for controlling low-level operations of the electronic device 500. A web browser 523, a data storage management system 524, an inspection device 525, and so on may also be stored. The inspection device 525 may be an application program that implements the operations of the foregoing steps in this embodiment of the present application. When the technical solution provided in the present application is implemented by software or firmware, the relevant program code is stored in the memory 520 and called by the processor 510 for execution.
The input/output interface 513 is used for connecting an input/output module to realize information input and output. The i/o module may be configured as a component within the device (not shown) or may be external to the device to provide corresponding functionality. Wherein the input devices may include a keyboard, mouse, touch screen, microphone, various sensors, etc., and the output devices may include a display, speaker, vibrator, indicator light, etc.
The network interface 514 is used for connecting a communication module (not shown in the figure) to realize communication interaction between the device and other devices. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, bluetooth and the like).
Bus 530 includes a path that transfers information between the various components of the device, such as processor 510, video display adapter 511, disk drive 512, input/output interface 513, network interface 514, and memory 520.
It should be noted that although the above-mentioned devices only show the processor 510, the video display adapter 511, the disk drive 512, the input/output interface 513, the network interface 514, the memory 520, the bus 530, etc., in a specific implementation, the device may also include other components necessary for normal operation. Furthermore, it will be understood by those skilled in the art that the apparatus described above may also include only the components necessary to implement the solution of the present application, and not necessarily all of the components shown in the figures.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus a necessary general-purpose hardware platform. Based on this understanding, the technical solutions of the present application, in essence or in the part contributing over the prior art, may be embodied in the form of a computer program product, which may be stored in a storage medium such as ROM/RAM, a magnetic disk, or an optical disk, and which includes instructions for causing a computer device (a personal computer, a server, a network device, etc.) to execute the methods described in the embodiments or parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, the system or system embodiments, which are substantially similar to the method embodiments, are described in a relatively simple manner, and reference may be made to some descriptions of the method embodiments for relevant points. The above-described system and system embodiments are only illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
The technical solutions provided by the present application are introduced in detail above, and specific examples are applied in the present application to explain the principles and embodiments of the present application, and the descriptions of the above examples are only used to help understanding the method and the core ideas of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, the specific embodiments and the application range may be changed. In view of the above, the description should not be taken as limiting the application.

Claims (10)

1. A routing inspection method is characterized by comprising the following steps:
acquiring map data of a polling area, and controlling a polling robot to perform fixed-point polling according to the map data;
if an abnormal object is identified in the fixed point inspection process, detecting the motion track of the abnormal object, and controlling the inspection robot to follow the abnormal object according to the motion track and perform alarm processing.
2. The method of claim 1, wherein the obtaining map data for the inspection area comprises:
controlling the inspection robot to move in the inspection area, sensing and positioning the environment by using a sensor of the inspection robot in the moving process, and constructing map data of the inspection area by using the environment information and the positioning information obtained by sensing; or,
and obtaining map data of the inspection area which is constructed in advance and providing the map data for the inspection robot to perform fixed-point inspection.
3. The method of claim 1, wherein identifying an anomalous object in the fixed point inspection process comprises:
acquiring image information acquired by the inspection robot by using a visual sensor in the fixed-point inspection process;
carrying out target identification by using the image information;
if the preset object type is identified, determining that an abnormal object is identified; or, if the identified object is not the preset object type, determining that the abnormal object is identified.
4. The method of claim 1, wherein detecting the motion trajectory of the abnormal object comprises:
determining the distance and direction of the abnormal object relative to the inspection robot at each moment by using information acquired by a visual sensor and/or an acoustic sensor of the inspection robot;
and determining the motion trail of the abnormal object by combining the position information obtained by the positioning device of the inspection robot at each moment and the distance and the direction of the abnormal object relative to the inspection robot.
5. The method of claim 1, wherein controlling the inspection robot to follow the abnormal object according to the motion trajectory comprises:
and controlling the inspection robot to move according to the movement track, and controlling the speed of the inspection robot so as to ensure that the distance between the inspection robot and the abnormal object is within a preset distance range.
6. The method of claim 1, wherein the performing alarm processing comprises at least one of:
playing an alarm audio;
displaying alarm information on a screen of the inspection robot;
establishing voice communication connection with a management terminal, and playing voice information sent by the management terminal and/or sending collected voice information to the management terminal;
and establishing video communication connection with a management terminal, and playing the video information sent by the management terminal and/or sending the collected video information to the management terminal.
7. The method according to any of claims 1 to 6, characterized in that after said alarm handling, the method further comprises:
and if the abnormal object cannot be identified within the set time length or the alarm processing is finished, continuing the fixed point inspection.
8. An inspection device, the device comprising:
a map acquisition unit configured to acquire map data of a patrol area;
the motion control unit is configured to control the inspection robot to perform fixed-point inspection according to the map data;
an abnormality identification unit configured to identify an abnormal object in the fixed point inspection process;
a trajectory detection unit configured to detect a motion trajectory of an abnormal object if the abnormal object is identified by the abnormality identification unit;
the motion control unit is further configured to control the inspection robot to follow the abnormal object according to the motion track;
and the alarm processing unit is configured to perform alarm processing if the abnormal object is identified by the abnormal identification unit.
9. An inspection robot, comprising:
one or more processors; and
a memory associated with the one or more processors for storing program instructions that, when read and executed by the one or more processors, perform the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202211024331.XA 2022-08-24 2022-08-24 Inspection method, inspection device and inspection robot Pending CN115431266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211024331.XA CN115431266A (en) 2022-08-24 2022-08-24 Inspection method, inspection device and inspection robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211024331.XA CN115431266A (en) 2022-08-24 2022-08-24 Inspection method, inspection device and inspection robot

Publications (1)

Publication Number Publication Date
CN115431266A true CN115431266A (en) 2022-12-06

Family

ID=84243684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211024331.XA Pending CN115431266A (en) 2022-08-24 2022-08-24 Inspection method, inspection device and inspection robot

Country Status (1)

Country Link
CN (1) CN115431266A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116125997A (en) * 2023-04-14 2023-05-16 北京安录国际技术有限公司 Intelligent inspection control method and system for robot
CN117708382A (en) * 2023-10-12 2024-03-15 广州信邦智能装备股份有限公司 Inspection data processing method, intelligent factory inspection system and related medium program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106023251A (en) * 2016-05-16 2016-10-12 西安斯凯智能科技有限公司 Tracking system and tracking method
CN107680195A (en) * 2017-11-13 2018-02-09 国网内蒙古东部电力有限公司 A kind of transformer station intelligent robot inspection Computer Aided Analysis System and method
CN109571468A (en) * 2018-11-27 2019-04-05 深圳市优必选科技有限公司 Security protection crusing robot and security protection method for inspecting
CN109676618A (en) * 2018-12-10 2019-04-26 江门市蓬江区联诚达科技发展有限公司 Security protection crusing robot and its automatic detecting method
CN113084776A (en) * 2021-03-19 2021-07-09 上海工程技术大学 Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion
CN113434728A (en) * 2021-08-25 2021-09-24 阿里巴巴达摩院(杭州)科技有限公司 Video generation method and device
WO2022021739A1 (en) * 2020-07-30 2022-02-03 国网智能科技股份有限公司 Humanoid inspection operation method and system for semantic intelligent substation robot


Similar Documents

Publication Publication Date Title
CN108615321B (en) Security pre-warning system and method based on radar detecting and video image behavioural analysis
CN115431266A (en) Inspection method, inspection device and inspection robot
CN204288415U (en) Intelligent early-warning firefighting robot
CN108297058A (en) Intelligent security guard robot and its automatic detecting method
CN106341661B (en) Patrol robot
CN108297059A (en) Novel intelligent security robot and its automatic detecting method
CN108284427A (en) Security robot and its automatic detecting method
CN103235562A (en) Patrol-robot-based comprehensive parameter detection system and method for substations
CN104299351A (en) Intelligent early warning and fire extinguishing robot
CN114155601A (en) Vision-based method and system for detecting dangerous behaviors of operating personnel
JP7505609B2 (en) Optical fiber sensing system and behavior identification method
CN111753780B (en) Transformer substation violation detection system and violation detection method
JP2013131159A (en) Area monitoring system
CN108638082A (en) Security robot system based on Internet of Things
CN115659452B (en) Intelligent patrol method, intelligent patrol system and computer readable storage medium
CN115376269B (en) Fire monitoring system based on unmanned aerial vehicle
JP2022526071A (en) Situational awareness monitoring
CN110703760A (en) Newly-increased suspicious object detection method for security inspection robot
CN111064935B (en) Intelligent construction site personnel posture detection method and system
JP2005253189A (en) Distribution line patrol system and method
CN115082765B (en) On-site operation management method and on-site operation management system
CN108416953B (en) Intelligent optical fiber perimeter alarm system
CN115661966A (en) Inspection system and method based on augmented reality
CN115802002A (en) Safety monitoring method for electric power operation
CN115857536A (en) Unmanned aerial vehicle intelligent inspection method, device, equipment and medium for workshop equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination