CN112697134A - Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof - Google Patents

Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof

Info

Publication number: CN112697134A
Application number: CN202011591796.4A
Authority: CN (China)
Prior art keywords: inspection robot, information, indoor, environment, perception
Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 梁之立, 潘太
Current Assignee: Nanjing Yusheng Robot Technology Co Ltd
Original Assignee: Nanjing Yusheng Robot Technology Co Ltd
Priority date: 2020-12-29
Filing date: 2020-12-29
Publication date: 2021-04-23
Application filed by Nanjing Yusheng Robot Technology Co Ltd

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an environment perception method, system and device for an indoor inspection robot, and a computer readable storage medium thereof. The method comprises the following steps: constructing a coordinate system of the inspection robot, and acquiring the robot's coordinate information from the included angle between the robot and a coordinate axis; acquiring environment perception data at that coordinate position through depth visual perception, audio recognition and feature analysis; constructing a local map of the inspection robot from the collected environment perception data and judging its running track; and planning the running angle of the inspection robot according to feedback from the collected environment perception data. The method extracts beams through a probability heuristic model and projects them onto grid map cells, extracts consistent feature information from the redundant information shared by the laser and visual data, and performs feature-level information fusion, further improving the responsiveness of the perception system.

Description

Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof
Technical Field
The invention relates to an intelligent sensing control technology, in particular to an environment sensing method, system and equipment of an indoor inspection robot and a computer readable storage medium thereof.
Background
With the development of information technology in China and the demands of social progress, information equipment rooms such as control centers, computer rooms, network rooms and program control rooms are increasingly common. Controlling temperature, humidity and cleanliness inside a data center room is a precondition for the normal operation of the equipment. At present, environment perception and operation control for data center rooms are immature, and incomplete operation systems lead to unstable operation of the room, which in turn forces large-scale maintenance and creates many management problems.
The working mode of existing 2D lidar is limited: in scenes where indoor environment features are nearly identical, feature matching easily fails, producing positioning errors or outright positioning failure. Using a single sensor for simultaneous localization and mapping suffers from low precision, interference and insufficient reliability. Moreover, existing data interaction cannot transmit single-step operation instructions or task-completion notifications, and fault points cannot be accurately located when data errors occur, further reducing the operating efficiency of the system.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an environment sensing method, system, device and computer readable storage medium for an indoor inspection robot, which can globally manage system operation through multi-sensing data fusion and multi-task device management.
The technical scheme of the invention is as follows:
an environment perception method of an indoor inspection robot, the method comprising:
step S100: constructing a coordinate system of the inspection robot to acquire coordinate information of the inspection robot through an included angle between the inspection robot and a coordinate axis;
step S200: acquiring environment perception data at the coordinate position of the inspection robot through depth visual perception, audio recognition and feature analysis;
step S300: constructing a local map of the inspection robot according to the acquired environment sensing data, and judging the running track of the inspection robot;
step S400: and planning the running angle of the inspection robot according to the feedback information of the acquired environment sensing data.
Specifically, when the coordinate information of the inspection robot is matched with the collected environment perception data, an indoor coordinate system and a local coordinate system are formed, where the local pose vector of the inspection robot is:

P = (X_2, Y_1, θ_X, θ_Y)^T

The position of the inspection robot, (X_2, Y_1), is obtained from the pose vector and is expressed in the local coordinate system; θ_X is the included angle between the running direction of the inspection robot and the X axis of the indoor coordinate system. The position in the indoor coordinate system is (X, Y), and θ_Y is the included angle between the running direction of the inspection robot and the Y axis of the indoor coordinate system.
Specifically, the collected environment perception data are fused, through depth visual perception and multi-sensor data fusion, by a Bayesian method: beams are extracted through a probability heuristic model and projected onto grid map cells, consistent feature information is extracted from the redundant information shared by the laser and visual data, and feature-level information fusion is performed.
Specifically, the Bayesian method is formed from the depth visual perception and the multi-sensor data fusion. Given the measurement vector Z and the unknown state vector X, the probability of the state X_K at time K is obtained as follows:

P(X_K | Z_K) = P(Z_K | X_K) · P(X_K | Z_{K-1}) / P(Z_K | Z_{K-1})

wherein P(Z_K | X_K) represents the given sensor measurement model, P(X_K | Z_{K-1}) represents the model of the given transition system, and P(Z_K | Z_{K-1}) represents a normalizing probability density;

beams are extracted through the probability heuristic model according to the Bayesian method and projected onto grid map cells, and a model is then built from the sensor's uncertainty probability to obtain the grid map, represented as:

P(cell occupied) = p_occ, if |r − δ| ≤ ε
P(cell occupied) = p_free, if u ≤ δ ≤ r − ε

where r represents the sensor measurement distance, ε the sensor observation distance error, u the sensor minimum measurement, δ the ray distance from the sensor to the grid cell, and p_occ and p_free are fixed probabilities.
Specifically, the travel path of the inspection robot is obtained from the robot's local and indoor coordinate systems together with the collected environment perception data, through the following steps:
step S210: segment the collected image information, binarize the image and filter out noise;
step S220: inverse projection transformation — convert the segmentation result of the collected image information into a top-view space through an inverse projection transformation algorithm;
step S230: matched-feature information fusion — compute the coordinate position relation using the environment perception components carried by the inspection robot, realizing matched fusion of the lane line data between the local and indoor coordinate systems.
An environment perception system of an indoor inspection robot is also provided, the system comprising:
a position information unit, configured to construct the coordinate system of the inspection robot and acquire the robot's coordinate information from the included angle between the robot and a coordinate axis;
an environment acquisition unit, configured to acquire environment perception data at the robot's coordinate position through depth visual perception, audio recognition and feature analysis;
an environment acquisition and judgment unit, configured to construct a local map of the inspection robot from the collected environment perception data and judge its running track; and
an environment acquisition feedback unit, configured to plan the running angle of the inspection robot according to feedback from the collected environment perception data.
Specifically, the environment acquisition unit includes, but is not limited to, a laser radar, a millimeter wave radar, or a camera.
A computer device comprises a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the above environment sensing method of the indoor inspection robot.
A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the above-described environment sensing method of an indoor inspection robot.
Advantageous effects: the invention provides an environment sensing method, system and device for an indoor inspection robot, and a computer readable storage medium thereof. By traversing the inspection task points, local coordinate points and indoor coordinate points one by one, multi-feature matching of the data is achieved, which reduces positioning errors and detection failures. Beams are extracted through a probability heuristic model and projected onto grid map cells, consistent feature information is extracted from the redundant information shared by the laser and visual data, and feature-level information fusion is performed, ensuring the stability of data transmission and the consistency of the output data.
Drawings
Fig. 1 is a top view of an application scenario of an environment sensing method of an indoor inspection robot in an embodiment.
Fig. 2 is a schematic direction diagram of an environment sensing method of the indoor inspection robot in one embodiment.
Fig. 3 is a schematic flow chart of an environment sensing method of the indoor inspection robot in one embodiment.
Fig. 4 is a schematic flow chart of an image acquisition process in the environment sensing method of the indoor inspection robot in another embodiment.
FIG. 5 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The environment sensing method of the indoor inspection robot provided by the embodiments of the present application can be applied to the application environment shown in fig. 1. The environment sensing device 101 of the indoor inspection robot is mounted on the robot; device 102 is a target inspected by device 101, and lane line 103 is the target track line followed by device 101. The environment sensing device 101 first acquires the coordinate information of the inspection robot from the included angle between the robot and a coordinate axis.
The inspection robot's coordinate system and the indoor coordinate system are then rectified according to this coordinate information, so that the robot's position information is acquired in real time. Device 101 follows the configured lane line 103 to run its route and inspect device 102. It then collects the surroundings of the running robot through the environment acquisition unit and feeds any information that obstructs the robot's operation back to the processor, which replans the robot's running angle to guarantee smooth operation.
In one embodiment, as shown in fig. 3, a method for sensing an environment of an indoor inspection robot includes:
step S100: constructing a coordinate system of the inspection robot to acquire coordinate information of the inspection robot through an included angle between the inspection robot and a coordinate axis;
specifically, in the running process of the inspection robot, the position information changes in real time, and the indoor position and the inspection robot position also change; therefore, coordinate information of the inspection robot is obtained through the included angle between the inspection robot and the coordinate axis in the step, real-time orientation data updating is convenient to provide, the inspection robot which continuously operates is adjusted based on an indoor coordinate system, and the accuracy of position information is improved.
Step S200: acquiring environment perception data at the coordinate position of the inspection robot through depth visual perception, audio recognition and feature analysis;
specifically, in this step, pass through the robot coordinate information is patrolled and examined in the acquisition of the contained angle between robot and the coordinate axis of patrolling and examining, and pass through robot surrounding environment is patrolled and examined in coordinate position degree of depth vision perception and audio frequency identification and feature analysis's collection, and the surrounding environment when judging to patrol and examine the robot and go further adjusts the direction of travel according to surrounding environment's transform.
Step S300: constructing a local map of the inspection robot according to the acquired environment sensing data, and judging the running track of the inspection robot;
specifically, in the step, the surrounding environment of the inspection robot is collected, the surrounding environment of the inspection robot during running is judged, a local map of the inspection robot is constructed through information collection of multiple sensors, image information is collected through segmentation, binarization processing is carried out on the image, noise is filtered, and the result of the segmented and collected image information is converted into a top view space through an inverse projection transformation algorithm; and in the matching characteristic information fusion, the coordinate position relation is calculated by utilizing the environment sensing part collected by the inspection robot, so that the matching fusion of lane line data of a local coordinate system and an indoor coordinate system is realized.
Step S400: and planning the running angle of the inspection robot according to the feedback information of the acquired environment sensing data.
Specifically, in this step the surroundings of the travelling robot are judged based on the collected environment data and converted into the top-view space, and the running azimuth of the inspection robot is planned, ensuring safety during inspection.
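The disclosure stops at the prose level; purely as an illustrative sketch (not the patent's implementation), steps S100–S400 can be read as one perception cycle like the Python stub below, in which every function name and sensor value is an assumption made here for demonstration:

```python
import math
import random

# Illustrative stubs; a real robot would read these values from hardware.
def read_pose():
    # S100: coordinate information derived from the included angle between
    # the robot's heading and the coordinate axes.
    x, y = 1.0, 2.0                 # assumed local position
    theta = math.radians(30.0)      # assumed angle to the X axis
    return x, y, theta

def sense_environment(pose):
    # S200: depth-vision / audio perception collected at this coordinate.
    return {"pose": pose, "obstacle_ahead": random.random() < 0.2}

def build_local_map(sensed):
    # S300: a toy "local map" that only records whether the track is blocked.
    return {"blocked": sensed["obstacle_ahead"]}

def plan_heading(local_map, theta):
    # S400: replan the running angle from the map feedback.
    return theta + math.radians(15.0) if local_map["blocked"] else theta

def perception_cycle():
    x, y, theta = read_pose()                    # S100
    sensed = sense_environment((x, y, theta))    # S200
    local_map = build_local_map(sensed)          # S300
    return x, y, plan_heading(local_map, theta)  # S400

print(perception_cycle())
```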
In one embodiment, as shown in fig. 2, when the coordinate information of the inspection robot is matched with the collected environment perception data, an indoor coordinate system and a local coordinate system are formed, where the local pose vector of the inspection robot is:

P = (X_2, Y_1, θ_X, θ_Y)^T

The position of the inspection robot, (X_2, Y_1), is obtained from the pose vector and is expressed in the local coordinate system; θ_X is the included angle between the running direction of the inspection robot and the X axis of the indoor coordinate system. The position in the indoor coordinate system is (X, Y), and θ_Y is the included angle between the running direction of the inspection robot and the Y axis of the indoor coordinate system.
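The embodiment defines the two frames and the heading angles but leaves the mapping between them implicit. A minimal sketch, assuming a standard 2D rigid transform with a hypothetical origin and rotation of the local frame inside the indoor frame, could look like this:

```python
import math

def local_to_indoor(x2, y1, theta_x, frame_origin, frame_rotation):
    """Map the local position (X2, Y1) and heading angle theta_x (angle
    between the running direction and the indoor X axis, in radians) into
    the indoor coordinate system.

    frame_origin and frame_rotation describe where the local frame sits in
    the indoor frame; both are assumptions, as the patent does not give
    the transform explicitly.
    """
    x0, y0 = frame_origin
    c, s = math.cos(frame_rotation), math.sin(frame_rotation)
    x = x0 + c * x2 - s * y1
    y = y0 + s * x2 + c * y1
    theta_y = math.pi / 2.0 - theta_x   # angle to the Y axis is complementary
    return x, y, theta_y

# Example: local pose (2.0, 1.0), heading 30 deg, local frame at (5, 3), unrotated.
print(local_to_indoor(2.0, 1.0, math.radians(30.0), (5.0, 3.0), 0.0))
```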
In one embodiment, the collected environment perception data are fused, through depth visual perception and multi-sensor data fusion, by a Bayesian method: beams are extracted through a probability heuristic model and projected onto grid map cells, consistent feature information is extracted from the redundant information shared by the laser and visual data, and feature-level information fusion is performed.
In one embodiment, the depth visual perception and the multi-sensor data fusion form a Bayesian method. Given the measurement vector Z and the unknown state vector X, the probability of the state X_K at time K is obtained as follows:

P(X_K | Z_K) = P(Z_K | X_K) · P(X_K | Z_{K-1}) / P(Z_K | Z_{K-1})

wherein P(Z_K | X_K) represents the given sensor measurement model, P(X_K | Z_{K-1}) represents the model of the given transition system, and P(Z_K | Z_{K-1}) represents a normalizing probability density;

beams are extracted through the probability heuristic model according to the Bayesian method and projected onto grid map cells, and a model is then built from the sensor's uncertainty probability to obtain the grid map, represented as:

P(cell occupied) = p_occ, if |r − δ| ≤ ε
P(cell occupied) = p_free, if u ≤ δ ≤ r − ε

where r represents the sensor measurement distance, ε the sensor observation distance error, u the sensor minimum measurement, δ the ray distance from the sensor to the grid cell, and p_occ and p_free are fixed probabilities.
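To make the per-cell update above concrete, here is a minimal sketch in a log-odds formulation (a numerically stable equivalent of the probability form); the fixed probabilities p_occ and p_free, the error ε and the minimum measurement u are assumed values, since the patent does not fix them:

```python
import math

P_OCC, P_FREE, P_PRIOR = 0.7, 0.3, 0.5     # assumed fixed probabilities

def log_odds(p):
    return math.log(p / (1.0 - p))

def inverse_sensor_model(delta, r, eps=0.1, u=0.05):
    """Reconstructed inverse sensor model: delta is the ray distance from
    the sensor to the grid cell, r the measured distance, eps the
    observation error and u the minimum measurement (values assumed)."""
    if delta < u:
        return P_PRIOR                      # below minimum range: no information
    if abs(delta - r) <= eps:
        return P_OCC                        # cell at the measured range: hit
    if delta < r - eps:
        return P_FREE                       # cell in front of the hit: free space
    return P_PRIOR                          # behind the hit: unobserved

def update_cell(l_prev, delta, r):
    """Bayesian log-odds update of one grid cell for one beam."""
    return l_prev + log_odds(inverse_sensor_model(delta, r)) - log_odds(P_PRIOR)

# One beam measuring r = 2.0 m, updating cells sampled every 0.25 m along the ray.
cells = {round(0.25 * i, 2): 0.0 for i in range(1, 10)}
for delta in cells:
    cells[delta] = update_cell(cells[delta], delta, r=2.0)
probabilities = {d: 1.0 / (1.0 + math.exp(-v)) for d, v in cells.items()}
print(probabilities)  # cells before 2.0 m drift toward free; the cell at 2.0 m toward occupied
```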
In one embodiment, as shown in fig. 4, the step of obtaining the travel path of the inspection robot from the robot's local and indoor coordinate systems and the collected environment perception data includes:
Step S210: segment the collected image information, binarize the image and filter out noise;
Specifically, in this step the image binarization sets the gray value of each pixel to 0 or 255, giving the whole image an unambiguous black-and-white appearance.
Step S220: inverse projection transformation — convert the segmentation result of the collected image information into a top-view space through an inverse projection transformation algorithm;
Specifically, in this step the binarized black-and-white image is converted into the top-view space and the walking route of the inspection robot is estimated.
Step S230: matched-feature information fusion — compute the coordinate position relation using the environment perception components carried by the inspection robot, realizing matched fusion of the lane line data between the local and indoor coordinate systems.
Specifically, in this step the coordinate position relation is computed from the data collected by the robot's environment perception components, matching the robot's local coordinates to the indoor coordinate information.
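Steps S210 and S220 map naturally onto standard OpenCV calls. The sketch below is only an illustration under stated assumptions: the synthetic array stands in for a camera frame, and the four source corners of the inverse projection are placeholder values that a real system would obtain from camera calibration:

```python
import cv2
import numpy as np

# Synthetic grayscale frame standing in for a camera image of a lane line.
img = np.zeros((240, 320), dtype=np.uint8)
cv2.line(img, (140, 239), (170, 0), 255, 5)          # a bright "lane line"

# S210: median filtering for noise, then binarization (every pixel 0 or 255).
img = cv2.medianBlur(img, 5)
_, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)

# S220: inverse projection transform into a top-view space. The source
# trapezoid below is a placeholder; calibration would supply real corners.
src = np.float32([[100, 239], [220, 239], [200, 60], [120, 60]])
dst = np.float32([[100, 239], [220, 239], [220, 0], [100, 0]])
H = cv2.getPerspectiveTransform(src, dst)
top_view = cv2.warpPerspective(binary, H, (320, 240))

# S230 would then match the lane-line pixels in top_view against the
# indoor-coordinate lane model; that matching is robot-specific.
print(top_view.shape, int(top_view.max()))
```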
In one embodiment, an environment sensing system for an indoor inspection robot is provided, the system comprising:
a position information unit, configured to construct the coordinate system of the inspection robot and acquire the robot's coordinate information from the included angle between the robot and a coordinate axis;
an environment acquisition unit, configured to acquire environment perception data at the robot's coordinate position through depth visual perception, audio recognition and feature analysis;
an environment acquisition and judgment unit, configured to construct a local map of the inspection robot from the collected environment perception data and judge its running track; and
an environment acquisition feedback unit, configured to plan the running angle of the inspection robot according to feedback from the collected environment perception data.
In one embodiment, the environment acquisition unit includes, but is not limited to, a laser radar, a millimeter wave radar or a camera. The unit is chosen according to actual requirements, i.e. one or a combination of laser radar, millimeter wave radar and camera is selected, so that external information is obtained in real time while the inspection robot operates.
In one embodiment, there is provided a computer apparatus including a memory storing a computer program and a processor implementing the steps of the above-described environment sensing method of an indoor inspection robot when the processor executes the computer program.
In one embodiment, as shown in fig. 5, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the steps of the above environment sensing method of the indoor inspection robot.
In one embodiment, the computer program, when executed by a processor, implements the following steps: acquiring the coordinate information of the inspection robot from the included angle between the robot and a coordinate axis, based on the constructed coordinate system of the inspection robot;
acquiring environment perception data at the coordinate position of the inspection robot through depth visual perception, audio recognition and feature analysis;
constructing a local map of the inspection robot from the collected environment perception data, and judging the running track of the inspection robot;
and planning the running angle of the inspection robot according to feedback from the collected environment perception data.
In one embodiment, the computer program, when executed by a processor, implements the following steps: fusing the collected environment perception data by a Bayesian method through depth visual perception and multi-sensor data fusion, extracting beams through a probability heuristic model and projecting them onto grid map cells, extracting consistent feature information from the redundant information shared by the laser and visual data, and performing feature-level information fusion.
In one embodiment, the computer program, when executed by a processor, implements the following steps: obtaining the travel path of the inspection robot based on the robot's local and indoor coordinate systems and the collected environment perception data;
segmenting the collected image information, binarizing the image and filtering out noise;
inverse projection transformation — converting the segmentation result of the collected image information into a top-view space through an inverse projection transformation algorithm;
and matched-feature information fusion — computing the coordinate position relation using the environment perception components carried by the inspection robot, realizing matched fusion of the lane line data between the local and indoor coordinate systems.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above.
Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others.
Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory.
Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily; for brevity, not all possible combinations are described, but as long as a combination of features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application; their description is specific and detailed, but should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An environment perception method of an indoor inspection robot is characterized by comprising the following steps:
constructing a coordinate system of the inspection robot to acquire coordinate information of the inspection robot through an included angle between the inspection robot and a coordinate axis;
acquiring environment perception data at the coordinate position of the inspection robot through depth visual perception, audio recognition and feature analysis;
constructing a local map of the inspection robot according to the acquired environment sensing data, and judging the running track of the inspection robot;
and planning the running angle of the inspection robot according to the feedback information of the acquired environment sensing data.
2. The environment perception method for the indoor inspection robot according to claim 1, wherein, when the coordinate information of the inspection robot is matched with the collected environment perception data, an indoor coordinate system and a local coordinate system are formed, the local pose vector of the inspection robot being:

P = (X_2, Y_1, θ_X, θ_Y)^T

wherein the position of the inspection robot, (X_2, Y_1), obtained from the pose vector, is expressed in the local coordinate system; θ_X is the included angle between the running direction of the inspection robot and the X axis of the indoor coordinate system; the position in the indoor coordinate system is (X, Y); and θ_Y is the included angle between the running direction of the inspection robot and the Y axis of the indoor coordinate system.
3. The environment perception method for the indoor inspection robot according to claim 1, wherein the collected environment perception data are fused by a Bayesian method through depth vision perception and multi-sensor data fusion; the Bayesian method extracts beams through a probability heuristic model and projects them onto grid map cells, extracts consistent feature information from the redundant information shared by the laser and visual data, and performs feature-level information fusion.
4. The environmental perception method for the indoor inspection robot according to claim 1, wherein the depth vision perception and the multi-sensor data fusion form a Bayesian method; given the measurement vector Z and the unknown state vector X, the probability of the state X_K at time K is obtained as follows:

P(X_K | Z_K) = P(Z_K | X_K) · P(X_K | Z_{K-1}) / P(Z_K | Z_{K-1})

wherein P(Z_K | X_K) represents the given sensor measurement model, P(X_K | Z_{K-1}) represents the model of the given transition system, and P(Z_K | Z_{K-1}) represents a normalizing probability density;

beams are extracted through the probability heuristic model according to the Bayesian method and projected onto grid map cells, and a model is then built from the sensor's uncertainty probability to obtain the grid map, represented as:

P(cell occupied) = p_occ, if |r − δ| ≤ ε
P(cell occupied) = p_free, if u ≤ δ ≤ r − ε

wherein r represents the sensor measurement distance, ε the sensor observation distance error, u the sensor minimum measurement, δ the ray distance from the sensor to the grid cell, and p_occ and p_free represent fixed probabilities.
5. The environment perception method for the indoor inspection robot according to claim 1, wherein the certainty judgment of the grid map is updated according to the Bayesian method, so that the observation information is obtained in the following manner:

P(E | O) = P(O | E) · P(E) / [P(O | E) · P(E) + P(O | Ē) · P(Ē)]
P(E | Ō) = P(Ō | E) · P(E) / [P(Ō | E) · P(E) + P(Ō | Ē) · P(Ē)]

wherein P(E) denotes the prior probability, P(O | E) and P(Ō | E) represent the observation model, O represents an occupied grid cell, Ō an unoccupied grid cell, and E the obstacle-present event; P(E | O) and P(E | Ō) represent the overall incoming observation information, further guaranteeing the accuracy of the inspection robot's detection information.
6. The environment sensing method for the indoor inspection robot according to claim 1, wherein the travel path of the inspection robot is obtained from the robot's local and indoor coordinate systems and the collected environment perception data, comprising the following steps:
segmenting the collected image information, binarizing the image and filtering out noise;
inverse projection transformation — converting the segmentation result of the collected image information into a top-view space through an inverse projection transformation algorithm;
and matched-feature information fusion — computing the coordinate position relation using the environment perception components carried by the inspection robot, realizing matched fusion of the lane line data between the local and indoor coordinate systems.
7. An environment perception system of an indoor inspection robot, wherein the system comprises:
a position information unit, configured to construct the coordinate system of the inspection robot and acquire the robot's coordinate information from the included angle between the robot and a coordinate axis;
an environment acquisition unit, configured to acquire environment perception data at the robot's coordinate position through depth visual perception, audio recognition and feature analysis;
an environment acquisition and judgment unit, configured to construct a local map of the inspection robot from the collected environment perception data and judge its running track; and
an environment acquisition feedback unit, configured to plan the running angle of the inspection robot according to feedback from the collected environment perception data.
8. The environment sensing system of the indoor inspection robot according to claim 7, wherein the environment acquisition unit includes, but is not limited to, a laser radar, a millimeter wave radar, or a camera.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor when executing the computer program performs the steps of the method of environmental awareness for an indoor inspection robot according to any one of claims 1 to 6.
10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, performs the steps of the method of environmental awareness for an indoor inspection robot of any of claims 1 to 6.
CN202011591796.4A 2020-12-29 2020-12-29 Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof Withdrawn CN112697134A (en)

Priority Applications (1)

CN202011591796.4A — priority date 2020-12-29, filing date 2020-12-29 — Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof

Applications Claiming Priority (1)

CN202011591796.4A — priority date 2020-12-29, filing date 2020-12-29 — Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof

Publications (1)

CN112697134A — published 2021-04-23

Family

ID=75511627

Family Applications (1)

CN202011591796.4A — priority date 2020-12-29, filing date 2020-12-29 — Environment sensing method, system and equipment of indoor inspection robot and computer readable storage medium thereof

Country Status (1)

Country Link
CN (1) CN112697134A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113640802A (en) * 2021-07-30 2021-11-12 国网上海市电力公司 Robot space positioning method and system based on multiple fusion sensors
CN113640802B (en) * 2021-07-30 2024-05-17 国网上海市电力公司 Robot space positioning method and system based on multiple fusion sensors
CN114814877A (en) * 2022-06-21 2022-07-29 山东金宇信息科技集团有限公司 Tunnel data acquisition method, equipment and medium based on inspection robot
CN115309119A (en) * 2022-08-20 2022-11-08 深圳市鹏翔运达机械科技有限公司 Control method and system of workshop inspection robot, computer equipment and medium
CN115468560A (en) * 2022-11-03 2022-12-13 国网浙江省电力有限公司宁波供电公司 Quality inspection method, robot, device and medium based on multi-sensor information fusion
CN115955296A (en) * 2023-03-14 2023-04-11 北京城市轨道交通咨询有限公司 Unmanned inspection-based rail transit operation and maintenance data transmission method and device
CN115955296B (en) * 2023-03-14 2023-05-12 北京城市轨道交通咨询有限公司 Unmanned inspection-based rail transit operation and maintenance data transmission method and device


Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination
WW01 — Invention patent application withdrawn after publication
Application publication date: 2021-04-23