CN113008241A - Robot positioning method, device, robot and storage medium - Google Patents

Robot positioning method, device, robot and storage medium

Info

Publication number
CN113008241A
CN113008241A (application CN202110252006.8A)
Authority
CN
China
Prior art keywords
point cloud
cloud data
robot
machine room
distance
Prior art date
Legal status
Granted
Application number
CN202110252006.8A
Other languages
Chinese (zh)
Other versions
CN113008241B (en)
Inventor
姚秀军
桂晨光
王超
马福强
陈建楠
王峰
崔丽华
Current Assignee
Jingdong Shuke Haiyi Information Technology Co Ltd
Original Assignee
Jingdong Shuke Haiyi Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jingdong Shuke Haiyi Information Technology Co Ltd
Priority to CN202110252006.8A
Publication of CN113008241A
Application granted
Publication of CN113008241B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C3/00 - Registering or indicating the condition or the working of machines or other apparatus, other than vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The embodiment of the invention provides a robot positioning method and device, a robot, and a storage medium. The method comprises the following steps: when the robot is located in a machine room inspection area, screening, from sector-shaped point cloud data, the required point cloud data corresponding to a reference cross-section in the robot's direction of motion; labeling the required point cloud data, screening out the residual point cloud data in the sector-shaped point cloud data, and performing enhancement processing on the required point cloud data to obtain enhanced point cloud data; and inputting the required, enhanced, and residual point cloud data into a preset optimizer and acquiring the robot position that the preset optimizer outputs. In this way, in the embodiment of the invention, the reference cross-section at the end of the machine room air duct is used to assist robot positioning: the cross-section provides a positioning constraint along the air duct direction and significantly improves the robot's positioning accuracy in that direction.

Description

Robot positioning method, device, robot and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a robot positioning method and device, a robot and a storage medium.
Background
With the continuous development of artificial intelligence, robots have expanded into business, industrial, and other scenarios, bringing great convenience to production and daily life. The machine room is one typical application scenario: with the growth of cloud and financial services, server rooms keep increasing in scale and degree of standardization, which makes robot inspection of machine rooms both feasible and increasingly popular.
A robot's self-positioning accuracy and repeated-positioning accuracy in a laser environment are key technical indicators for machine room inspection. A commonly used technique is inside-out positioning, which performs active positioning directly from environmental information and requires no modification of the environment, making it well suited to the machine room environment.
However, even with the inside-out positioning method, the robot's self-positioning and repeated-positioning accuracy remain problematic in the machine room environment. In particular, in a machine room with multiple rows of long corridors, as shown in fig. 1, positioning deviation along the machine room air duct direction arises very easily.
Disclosure of Invention
The embodiment of the invention aims to provide a robot positioning method, a robot positioning device, a robot and a storage medium, so as to achieve the beneficial effect of remarkably improving the positioning precision of the robot in the air duct direction of a machine room. The specific technical scheme is as follows:
in a first aspect of embodiments of the present invention, there is provided, first, a robot positioning method, including:
under the condition that the robot is located in a machine room inspection area, screening required point cloud data corresponding to a reference section of the robot in the motion direction from the fan-shaped point cloud data;
wherein the robot invokes an electromagnetic wave device to perform electromagnetic wave scanning of the robot's viewing-angle area, and echo signals generated by scanning that area are processed to form the sector-shaped point cloud data;
marking the required point cloud data, screening residual point cloud data in the sector point cloud data, and performing enhancement processing on the required point cloud data to obtain enhanced point cloud data;
and inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the robot position output by the preset optimizer.
In an optional embodiment, the screening, from the sector point cloud data, desired point cloud data corresponding to a reference cross section of the robot motion direction includes:
and screening point cloud data meeting a linear equation from the fan-shaped point cloud data in a linear fitting mode to obtain required point cloud data corresponding to the reference section of the robot in the motion direction.
In an optional embodiment, the enhancing the desired point cloud data to obtain enhanced point cloud data includes:
and determining the distance between the robot and the reference section, and enhancing the required point cloud data based on the distance to obtain enhanced point cloud data.
In an optional embodiment, the enhancing the desired point cloud data based on the distance to obtain enhanced point cloud data includes:
judging whether the distance exceeds a distance threshold value;
under the condition that the distance exceeds the distance threshold value, increasing the intensity value of the required point cloud data to obtain enhanced point cloud data;
and under the condition that the distance does not exceed the distance threshold, increasing the number of the required point cloud data to obtain enhanced point cloud data.
In an alternative embodiment, the increasing the number of the desired point cloud data to obtain enhanced point cloud data includes:
and increasing the number of the required point cloud data in an interpolation mode according to a linear equation to obtain enhanced point cloud data, wherein the linear equation comprises a least square linear equation.
In an optional embodiment, before the required point cloud data corresponding to the reference section of the robot moving direction is screened from the fan-shaped point cloud data under the condition that the robot is located in the machine room inspection area, the method further includes:
and acquiring the mass center position of the current robot, and judging whether the robot is positioned in the machine room inspection area or not according to the mass center position of the robot.
In an optional implementation manner, the determining, according to the position of the center of mass of the robot, whether the robot is located in a machine room inspection area includes:
judging whether the robot centroid position is located in the machine room inspection area;
and determining that the robot is located in the machine room inspection area under the condition that the center of mass of the robot is located in the machine room inspection area.
In a second aspect of embodiments of the present invention, there is also provided a robot positioning apparatus, the apparatus including:
the data screening module is used for screening required point cloud data corresponding to a reference section of the robot in the motion direction from the fan-shaped point cloud data under the condition that the robot is located in a machine room inspection area;
wherein the robot invokes an electromagnetic wave device to perform electromagnetic wave scanning of the robot's viewing-angle area, and echo signals generated by scanning that area are processed to form the sector-shaped point cloud data;
the data labeling module is used for labeling the required point cloud data and screening residual point cloud data in the sector point cloud data;
the data enhancement module is used for enhancing the required point cloud data to obtain enhanced point cloud data;
and the position acquisition module is used for inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer and acquiring the robot position output by the preset optimizer.
In an optional embodiment, the data filtering module is specifically configured to:
and screening point cloud data meeting a linear equation from the fan-shaped point cloud data in a linear fitting mode to obtain required point cloud data corresponding to the reference section of the robot in the motion direction.
In an optional embodiment, the data enhancement module specifically includes:
the distance determining submodule is used for determining the distance between the robot and the reference cross section;
and the data enhancement submodule is used for enhancing the required point cloud data based on the distance to obtain enhanced point cloud data.
In an alternative embodiment, the data enhancer module comprises:
a distance judgment subunit, configured to judge whether the distance exceeds a distance threshold;
the intensity value increasing unit is used for increasing the intensity value of the required point cloud data to obtain enhanced point cloud data under the condition that the distance exceeds the distance threshold;
and the number increasing unit is used for increasing the number of the required point cloud data to obtain enhanced point cloud data under the condition that the distance does not exceed the distance threshold.
In an optional embodiment, the number increasing unit is specifically configured to:
and increasing the number of the required point cloud data in an interpolation mode according to a linear equation to obtain enhanced point cloud data, wherein the linear equation comprises a least square linear equation.
In an optional embodiment, the apparatus further comprises:
and the robot judgment module is used for acquiring the current robot mass center position and judging whether the robot is located in the machine room inspection area or not according to the robot mass center position.
In an optional embodiment, the robot determination module is specifically configured to:
judging whether the robot centroid position is located in the machine room inspection area;
and determining that the robot is located in the machine room inspection area under the condition that the center of mass of the robot is located in the machine room inspection area.
In a third aspect of the embodiments of the present invention, there is further provided a robot, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete mutual communication through the communication bus;
a memory for storing a computer program;
a processor for implementing the robot positioning method according to any one of the first aspect described above when executing a program stored in the memory.
In a fourth aspect of the embodiments of the present invention, there is also provided a storage medium having instructions stored therein, which when run on a computer, cause the computer to perform the robot positioning method according to any one of the first aspect.
In a fifth aspect of embodiments of the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the robot positioning method of any one of the above first aspects.
According to the technical scheme provided by the embodiment of the invention, under the condition that the robot is located in a machine room inspection area, required point cloud data corresponding to a reference section of the robot in the motion direction are screened from the fan-shaped point cloud data, wherein the robot calls an electromagnetic wave device to perform electromagnetic wave scanning on the robot view angle area, echo signals generated by scanning the robot view angle area are processed to form the fan-shaped point cloud data, the required point cloud data are marked, residual point cloud data in the fan-shaped point cloud data are screened, the required point cloud data are enhanced to obtain enhanced point cloud data, the required point cloud data, the enhanced point cloud data and the residual point cloud data are input to a preset optimizer, and the position of the robot output by the preset optimizer is obtained. Therefore, in the embodiment of the invention, the reference section at the end of the air duct of the machine room is used for robot-assisted positioning, the reference section can provide positioning constraint for the air duct direction of the machine room, and the positioning precision of the air duct direction of the machine room of the robot is obviously improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
FIG. 1 is a schematic illustration of a machine room environment shown in an embodiment of the present invention;
fig. 2 is a schematic flow chart illustrating an implementation of a robot positioning method according to an embodiment of the present invention;
fig. 3 is a schematic flow chart illustrating another robot positioning method according to an embodiment of the present invention;
FIG. 4 is a schematic view of a distance sensor positioned at a wall section in an embodiment of the present invention;
fig. 5 is a schematic diagram illustrating a distance between a machine room inspection robot and a wall section right in front of the machine room inspection robot according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a robot positioning device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a robot shown in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
As shown in fig. 2, an implementation flow diagram of a robot positioning method provided in an embodiment of the present invention is shown, where the method may be executed by a processor, and specifically may include the following steps:
s201, under the condition that the robot is located in a machine room inspection area, required point cloud data corresponding to a reference section of the robot in the motion direction are screened from the fan-shaped point cloud data.
In the embodiment of the invention, when the robot is located in the machine room inspection area, positioning deviation along the machine room air duct direction arises easily. The reference cross-section at the end of the air duct is therefore used to assist robot positioning: it provides a positioning constraint along the duct direction and significantly improves the robot's positioning accuracy in that direction. Accordingly, the required point cloud data corresponding to the reference cross-section in the robot's direction of motion is screened from the sector-shaped point cloud data.
For example, in the embodiment of the present invention, as shown in fig. 1, when the machine room inspection robot is located in the machine room air duct area, deviation along the duct direction, that is, longitudinal positioning deviation, arises easily. The wall cross-section at the end of the air duct is therefore used for assisted positioning: it provides a longitudinal positioning constraint and significantly improves the inspection robot's longitudinal positioning accuracy. Accordingly, the required point cloud data corresponding to the wall cross-section directly in front of the inspection robot is screened from the sector-shaped point cloud data.
It should be noted that the direction of motion may be taken as directly in front of the robot; the reference cross-section may be the wall cross-section shown in fig. 1 or the cross-section of a reference object arranged at the end of the machine room air duct; and the machine room inspection area may be the machine room air duct area shown in fig. 1. The embodiment of the present invention does not limit these choices.
Here, for the robot, the electromagnetic wave device may be invoked to perform the following steps to form sector-shaped point cloud data: electromagnetic wave scanning is carried out on a robot view angle area (the robot view angle area is a sector area), and echo signals generated by scanning the robot view angle area are processed to form sector point cloud data. The sector point cloud data may be point cloud data in a two-dimensional space.
For example, the machine room inspection robot carries a microwave radar. The radar is invoked to scan the robot's viewing-angle area with microwaves; when a microwave meets an obstacle an echo signal is generated, the radar receives the echo, and data such as the frequency difference, angle, and ToF (Time of Flight) of the echo signal are converted into sector-shaped point cloud data.
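The geometric part of that conversion can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes each echo yields a time of flight and a beam angle, and it ignores the frequency-difference processing a real microwave radar would also perform.

```python
import math

SPEED_OF_LIGHT = 3.0e8  # propagation speed of the microwave, m/s

def echoes_to_points(echoes):
    """Convert (time_of_flight_s, angle_rad) echo pairs into 2D points.

    The one-way range is half the round-trip distance travelled by the wave;
    the angle places the return within the sector-shaped viewing area.
    """
    points = []
    for tof, angle in echoes:
        r = SPEED_OF_LIGHT * tof / 2.0  # one-way range in metres
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# A wall 3 m straight ahead returns an echo after 20 ns at angle 0.
pts = echoes_to_points([(20e-9, 0.0)])
```

Sweeping such conversions over all beam angles of one scan yields the sector-shaped point cloud the method operates on.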
It should be noted that the robot inspects the machine room inspection area along the air duct direction, so the reference cross-section corresponding to that area can be expected to appear in the robot's direction of motion. That is, the required point cloud data corresponding to the reference cross-section should lie in the middle of the whole sector-shaped point cloud, should be continuous, and should have approximately equal distance values.
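That expectation, a central, contiguous run of beams with approximately equal ranges, can be sketched as a simple filter. The function and tolerance below are illustrative assumptions, not taken from the patent:

```python
def central_wall_segment(ranges, tolerance=0.1):
    """From an angle-ordered list of range readings, grow a contiguous
    segment outward from the central beam, keeping a neighbour whenever
    its range differs from the current edge by at most `tolerance` metres.
    Returns the (lo, hi) index bounds of the segment, inclusive."""
    mid = len(ranges) // 2
    lo = hi = mid
    while lo - 1 >= 0 and abs(ranges[lo - 1] - ranges[lo]) <= tolerance:
        lo -= 1
    while hi + 1 < len(ranges) and abs(ranges[hi + 1] - ranges[hi]) <= tolerance:
        hi += 1
    return lo, hi

# Central beams hit a wall about 3 m away; the outer beams miss it.
readings = [6.2, 4.1, 3.02, 3.0, 3.01, 3.05, 5.8]
lo, hi = central_wall_segment(readings)
```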
S202, labeling the required point cloud data, screening residual point cloud data in the sector point cloud data, and performing enhancement processing on the required point cloud data to obtain enhanced point cloud data.
In the embodiment of the invention, the required point cloud data corresponding to the reference cross-section in the robot's direction of motion is labeled (that is, marked), so that the residual point cloud data in the sector-shaped point cloud can be screened out, namely the portion of the sector-shaped point cloud that remains after the required point cloud data is excluded.
In addition, in the embodiment of the invention, for the required point cloud data corresponding to the reference section of the robot in the motion direction, the required point cloud data can be enhanced to obtain corresponding enhanced point cloud data. It should be noted that the desired point cloud data should be continuous, and the distance values are approximate.
S203, inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the position of the robot output by the preset optimizer.
And inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the position of the robot output by the preset optimizer.
For example, the required point cloud data corresponding to the wall cross-section directly in front of the machine room inspection robot, the residual point cloud data of the sector-shaped point cloud, and the enhanced point cloud data obtained by enhancing the required data are input into a CSM (Correlative Scan Matching) optimizer, and the position of the inspection robot output by the CSM optimizer is acquired.
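The correlative-matching idea behind such an optimizer can be sketched in brute-force form, assuming a set of occupied grid cells as the static map. This is a simplified stand-in, not the patent's optimizer: it searches only over x/y offsets (a real CSM also searches rotation) and treats every point equally rather than weighting the enhanced points.

```python
def correlative_match(scan, occupied, resolution, search):
    """Brute-force correlative matching over x/y offsets: score each
    candidate offset by how many scan points land in occupied map cells,
    and return the best offset with its score (first best wins ties)."""
    best_score, best_offset = -1, (0.0, 0.0)
    steps = [i * resolution for i in range(-search, search + 1)]
    for dx in steps:
        for dy in steps:
            score = sum(
                1 for x, y in scan
                if (round((x + dx) / resolution),
                    round((y + dy) / resolution)) in occupied
            )
            if score > best_score:
                best_score, best_offset = score, (dx, dy)
    return best_offset, best_score

# Static map: occupied cells of a wall along y = 3 m, at 0.5 m resolution.
occupied = {(cx, 6) for cx in range(-4, 5)}    # cell row 6 -> y = 3.0 m
scan = [(-1.0, 2.5), (0.0, 2.5), (1.0, 2.5)]   # scan sees the wall 0.5 m short
offset, score = correlative_match(scan, occupied, resolution=0.5, search=2)
```

Note that in this toy map several lateral offsets tie for the best score while the offset toward the wall is uniquely determined: a cross-section perpendicular to the duct constrains exactly the duct-direction coordinate, which is the effect the embodiment exploits.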
According to the technical scheme provided by the embodiment of the invention, under the condition that the robot is located in the machine room inspection area, required point cloud data corresponding to the reference section of the movement direction of the robot is screened from the fan-shaped point cloud data, wherein the robot calls an electromagnetic wave device to perform electromagnetic wave scanning on the view angle area of the robot, processes echo signals generated by scanning the view angle area of the robot to form fan-shaped point cloud data, marks the required point cloud data, screens residual point cloud data in the fan-shaped point cloud data, performs enhancement processing on the required point cloud data to obtain enhanced point cloud data, inputs the required point cloud data, the enhanced point cloud data and the residual point cloud data to a preset optimizer, and obtains the position of the robot output by the preset optimizer.
Therefore, in the embodiment of the invention, the reference section at the end of the air duct of the machine room is used for robot-assisted positioning, the reference section can provide positioning constraint for the air duct direction of the machine room, and the positioning precision of the air duct direction of the machine room of the robot is obviously improved.
In addition, as shown in fig. 3, an implementation flow diagram of another robot positioning method provided in the embodiment of the present invention is shown, where the method may be executed by a processor, and specifically may include the following steps:
s301, obtaining the center of mass position of the current robot, and judging whether the robot is located in the machine room inspection area or not according to the center of mass position of the robot.
In the embodiment of the invention, the machine room inspection area within the whole machine room environment can be labeled in advance by manual annotation. During robot positioning, the current centroid position of the robot is obtained with a positioning algorithm, and whether the robot is located in the machine room inspection area is judged from that centroid position.
For example, a machine room air duct area under the whole machine room environment is marked in advance in a manual marking mode, as shown in fig. 1, in the robot positioning process, the current machine room patrol robot centroid position is obtained through a positioning algorithm, and as shown in fig. 1, whether the machine room patrol robot is located in the machine room air duct area or not is judged according to the machine room patrol robot centroid position.
It should be noted that, as for the positioning algorithm, any current robot positioning algorithm may be used, and the current robot centroid position is obtained by the positioning algorithm, which is not limited in the embodiment of the present invention.
In addition, in the embodiment of the invention, whether the robot's centroid position lies within the machine room inspection area is judged: if the centroid position lies within the area, the robot is determined to be located in the machine room inspection area; otherwise, the robot is determined not to be located in it.
For example, as shown in fig. 1, the center of mass of the machine room inspection robot is located in the machine room air duct area, so that it can be determined that the machine room inspection robot is located in the machine room air duct area.
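Deciding whether the centroid lies inside a pre-annotated inspection area reduces to a point-in-polygon test. A standard ray-casting sketch follows; the function names and the rectangular duct area are illustrative assumptions, not taken from the patent:

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is `point` inside `polygon` (list of (x, y) vertices)?
    Casts a ray in the +x direction and counts edge crossings."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A manually annotated duct area: a rectangle 2 m wide and 10 m long.
duct_area = [(0.0, 0.0), (2.0, 0.0), (2.0, 10.0), (0.0, 10.0)]
in_area = point_in_polygon((1.0, 5.0), duct_area)  # centroid inside the area
```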
And S302, under the condition that the robot is located in the machine room inspection area, screening required point cloud data corresponding to the reference section of the robot in the motion direction from the fan-shaped point cloud data.
When the robot is located in the machine room inspection area, point cloud data satisfying a line equation is screened from the sector-shaped point cloud data by line fitting and taken as the required point cloud data corresponding to the reference cross-section in the robot's direction of motion. The line-fitting approach includes RANSAC (Random Sample Consensus), and the line equation includes a least-squares line equation; the embodiment of the present invention does not limit these choices.
For example, when the machine room inspection robot is located in the machine room air duct area, RANSAC is used to judge, within the sector-shaped point cloud data, whether the number of target points satisfying the least-squares line equation within threshold T1 reaches threshold T2; if threshold T2 is reached, the target points are considered to belong to the wall cross-section. It should be noted that all of the target point cloud data should be continuous and have approximately equal distance values. In this way, the required point cloud data corresponding to the wall cross-section in front of the inspection robot is screened out.
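The RANSAC screening with T1 as a point-to-line distance threshold and T2 as a minimum inlier count could look roughly like this; the iteration count, seed, and sample data are illustrative assumptions, and the continuity check described above is omitted for brevity:

```python
import math
import random

def ransac_line(points, t1, t2, iterations=200, seed=0):
    """Screen points lying on a common line: a point is an inlier when its
    distance to a candidate line is at most t1; the candidate is accepted
    only if at least t2 points agree.  Returns the inlier list or None."""
    rng = random.Random(seed)
    best = []
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b = y2 - y1, x1 - x2          # line a*x + b*y + c = 0
        norm = math.hypot(a, b)
        if norm == 0.0:
            continue                      # degenerate sample, resample
        c = -(a * x1 + b * y1)
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c) / norm <= t1]
        if len(inliers) > len(best):
            best = inliers
    return best if len(best) >= t2 else None

# Five returns from a wall section on the line y = 3, plus two stray returns.
cloud = [(-1.0, 3.0), (-0.5, 3.0), (0.0, 3.0), (0.5, 3.0), (1.0, 3.0),
         (2.0, 1.0), (-2.0, 0.5)]
wall_points = ransac_line(cloud, t1=0.05, t2=4)
```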
And S303, marking the required point cloud data, and screening residual point cloud data in the sector point cloud data.
In the embodiment of the present invention, this step is similar to the step S202, and the details of the embodiment of the present invention are not repeated herein.
S304, determining the distance between the robot and the reference cross section, and enhancing the required point cloud data based on the distance to obtain enhanced point cloud data.
In the embodiment of the invention, the required point cloud data corresponding to the reference section of the robot in the motion direction can be selectively enhanced according to actual requirements to obtain corresponding enhanced point cloud data.
Accordingly, the distance between the robot and the reference cross-section in its direction of motion can be determined, and the required point cloud data corresponding to that cross-section enhanced according to this distance to obtain the enhanced point cloud data.
For example, as shown in fig. 4, a distance sensor is arranged at the wall cross-section in advance, and the distance D between the machine room inspection robot and the wall cross-section in front of it is determined by the distance sensor, as shown in fig. 5. The required point cloud data corresponding to that wall cross-section is then enhanced according to D to obtain the enhanced point cloud data.
In addition, for the distance between the robot and the reference cross-section in its direction of motion, the embodiment of the invention judges whether the distance exceeds a distance threshold. If it does, the intensity values of the required point cloud data are increased to obtain the enhanced point cloud data, which avoids weak intensity values when the robot is far from the reference cross-section. If it does not, the number of required point cloud data is increased to obtain the enhanced point cloud data, which avoids having too few points when the robot is close to the reference cross-section.
For example, for the distance D between the machine room inspection robot and the wall section directly in front of it, the embodiment of the invention judges whether D exceeds a distance threshold (for example, 5 m). When D exceeds the threshold, the intensity values of the required point cloud data are increased to obtain the enhanced point cloud data, avoiding weak intensity values when the machine room inspection robot is far from the wall section.
In addition, when D does not exceed the threshold, the number of required point cloud data is increased to obtain the enhanced point cloud data, avoiding too few points when the machine room inspection robot is close to the wall section. The enhancement mode is thus adjusted automatically according to the distance, so that different enhancement modes can be applied to the required point cloud data to obtain the corresponding enhanced point cloud data.
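The distance-dependent choice of enhancement mode described above can be sketched as follows. The 5 m threshold is the figure from the example; the intensity gain and densification factor, like the Python formulation itself, are illustrative assumptions, since the patent does not disclose an implementation:

```python
import numpy as np

def enhance_point_cloud(points, intensities, distance,
                        threshold=5.0, gain=1.5, factor=2):
    """Choose the enhancement mode from the robot-to-section distance.

    distance > threshold (robot far from the section): echoes are weak,
    so the intensity values are scaled up. Otherwise (robot close to
    the section): few points fall on it, so the cloud is densified by
    inserting linearly interpolated points. gain and factor are
    illustrative values, not figures from the patent.
    """
    if distance > threshold:
        return points, intensities * gain          # far: boost intensity
    # Near: resample so that factor - 1 interpolated points are inserted
    # between each pair of neighbouring original points.
    t_old = np.arange(len(points))
    t_new = np.linspace(0, len(points) - 1, factor * (len(points) - 1) + 1)
    dense_pts = np.column_stack(
        [np.interp(t_new, t_old, points[:, d]) for d in range(points.shape[1])])
    dense_int = np.interp(t_new, t_old, intensities)
    return dense_pts, dense_int
```

With a 3-point cloud and factor 2, the near branch yields 5 points, the midpoints lying halfway between the original neighbours.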
It should be noted that, in the embodiment of the present invention, the number of required point cloud data may be increased by interpolation according to a linear equation to obtain the enhanced point cloud data, where the linear equation includes a least-squares linear equation; this is not limited in the embodiment of the present invention.
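A minimal sketch of interpolation along a least-squares linear equation, under the assumption that the section's points can be fitted as y = m*x + c in the chosen frame (a section lying vertical in that frame would need a rotated frame or a total-least-squares fit); the number of added points is an illustrative parameter:

```python
import numpy as np

def densify_along_fitted_line(points, num_new=50):
    """Fit a least-squares line to the 2-D points of the reference
    section, then sample extra points uniformly along that line.

    Every added point satisfies the fitted linear equation
    y = m*x + c, so it lies on the same section as the originals.
    """
    x, y = points[:, 0], points[:, 1]
    m, c = np.polyfit(x, y, deg=1)          # least-squares line y = m*x + c
    xs = np.linspace(x.min(), x.max(), num_new)
    new_points = np.column_stack([xs, m * xs + c])
    return np.vstack([points, new_points])
```

For collinear input the fitted line reproduces the points exactly, so the augmented cloud stays on the section.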
S305, inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the robot position output by the preset optimizer.
And inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the position of the robot output by the preset optimizer.
Here, the optimization of the preset optimizer is mainly embodied in two parts. First, the search length in the air duct direction of the machine room is increased; that is, the robot is considered to have static map data matching it in the air duct direction, and the matched point cloud should contain most of the required point cloud data. Second, when the least-squares optimization is performed, it is ensured that the optimization proceeds on the basis of the machine room air duct matching result.
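The patent does not disclose the optimizer's internals, but one way to read its emphasis on the required point cloud is as a weighted least-squares step in which the points of the reference section carry larger weights than the remaining points. The following translation-only step is a hypothetical sketch of that idea, not the disclosed method:

```python
import numpy as np

def weighted_translation_fit(scan_pts, map_pts, weights):
    """One least-squares step of a pose optimisation: find the 2-D
    translation t minimising  sum_i w_i * ||scan_i + t - map_i||^2.

    The closed-form minimiser is the weighted mean of the per-point
    differences. Giving the required/enhanced points of the section
    ahead larger weights makes the match follow them, mirroring the
    optimiser's emphasis on the required point cloud data.
    """
    w = weights / weights.sum()             # normalise the weights
    return (map_pts - scan_pts).T @ w       # weighted centroid difference
```

In a full optimizer this step would be embedded in an iterative scan-matching loop that also estimates rotation; the sketch only shows where the weighting enters.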
Corresponding to the above method embodiment, an embodiment of the present invention further provides a robot positioning apparatus. As shown in fig. 6, the apparatus may include: a data screening module 610, a data labeling module 620, a data enhancement module 630 and a position obtaining module 640.
The data screening module 610 is used for screening required point cloud data corresponding to a reference section of the robot in the motion direction from the sector point cloud data under the condition that the robot is located in a machine room inspection area;
the robot calls electromagnetic wave equipment to perform electromagnetic wave scanning on a robot view angle area, and echo signals generated by scanning the robot view angle area are processed to form the sector point cloud data;
a data labeling module 620, configured to label the required point cloud data, and screen remaining point cloud data in the sector point cloud data;
a data enhancement module 630, configured to perform enhancement processing on the required point cloud data to obtain enhanced point cloud data;
a position obtaining module 640, configured to input the required point cloud data, the enhanced point cloud data, and the remaining point cloud data into a preset optimizer, and obtain a robot position output by the preset optimizer.
In a specific implementation manner of the embodiment of the present invention, the data filtering module 610 is specifically configured to:
and screening point cloud data meeting a linear equation from the sector point cloud data in a linear fitting mode to obtain the required point cloud data corresponding to the reference section of the robot in the motion direction.
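A sketch of the straight-line screening performed by the data screening module 610, under assumed parameters (residual tolerance, iteration count) and with a simple iterative trim standing in for whatever robust fit an implementation might actually use (e.g. RANSAC):

```python
import numpy as np

def screen_section_points(points, tol=0.1, iters=3):
    """Screen the points that satisfy the linear equation of the
    reference section by least-squares line fitting.

    Iteratively fits y = m*x + c and keeps points whose perpendicular
    distance to the line is below tol; the kept points form the
    required point cloud, the rest the remaining point cloud.
    tol (metres) and iters are illustrative choices.
    """
    mask = np.ones(len(points), dtype=bool)
    for _ in range(iters):
        m, c = np.polyfit(points[mask, 0], points[mask, 1], deg=1)
        # Perpendicular distance from every point to the fitted line.
        resid = np.abs(m * points[:, 0] - points[:, 1] + c) / np.hypot(m, 1.0)
        new_mask = resid < tol
        if new_mask.sum() < 2 or np.array_equal(new_mask, mask):
            break                       # too few inliers, or converged
        mask = new_mask
    return points[mask], points[~mask]
```

The refit after trimming matters: a single pass over data containing outliers fits a skewed line and can reject genuine section points.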
In a specific implementation manner of the embodiment of the present invention, the data enhancement module 630 specifically includes:
the distance determining submodule is used for determining the distance between the robot and the reference cross section;
and the data enhancement submodule is used for enhancing the required point cloud data based on the distance to obtain enhanced point cloud data.
In a specific implementation of an embodiment of the invention, the data enhancement submodule comprises:
a distance judgment unit, configured to judge whether the distance exceeds a distance threshold;
an intensity value increasing unit, configured to increase the intensity values of the required point cloud data to obtain enhanced point cloud data under the condition that the distance exceeds the distance threshold;
and a number increasing unit, configured to increase the number of the required point cloud data to obtain enhanced point cloud data under the condition that the distance does not exceed the distance threshold.
In a specific implementation manner of the embodiment of the present invention, the number increasing unit is specifically configured to:
and increasing the number of the required point cloud data in an interpolation mode according to a linear equation to obtain enhanced point cloud data, wherein the linear equation comprises a least square linear equation.
In a specific implementation manner of the embodiment of the present invention, the apparatus further includes:
and the robot judgment module is used for acquiring the current robot mass center position and judging whether the robot is located in the machine room inspection area or not according to the robot mass center position.
In a specific implementation manner of the embodiment of the present invention, the robot determination module is specifically configured to:
judging whether the robot centroid position is located in the machine room inspection area;
and determining that the robot is located in the machine room inspection area under the condition that the robot centroid position is located in the machine room inspection area.
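The centroid test of the robot judgment module reduces to a point-in-region check. Assuming, since the patent does not specify the area's shape, that the machine room inspection area is an axis-aligned rectangle in map coordinates, a sketch is:

```python
def robot_in_inspection_area(centroid, area):
    """Judge whether the robot centroid lies inside the machine room
    inspection area, here assumed to be an axis-aligned rectangle.

    centroid: (x, y) centre-of-mass position of the robot.
    area: (x_min, y_min, x_max, y_max) rectangle in map coordinates.
    """
    x, y = centroid
    x_min, y_min, x_max, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max
```

For a polygonal inspection area the same role would be played by a standard point-in-polygon (ray-casting) test.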
The embodiment of the present invention further provides a robot, as shown in fig. 7, including a processor 71, a communication interface 72, a memory 73 and a communication bus 74, where the processor 71, the communication interface 72 and the memory 73 communicate with each other through the communication bus 74,
a memory 73 for storing a computer program;
the processor 71, when executing the program stored in the memory 73, implements the following steps:
under the condition that the robot is located in a machine room inspection area, screening required point cloud data corresponding to a reference section of the robot in the motion direction from the sector point cloud data; the robot calls electromagnetic wave equipment to perform electromagnetic wave scanning on a robot view angle area, and echo signals generated by scanning the robot view angle area are processed to form the sector point cloud data; marking the required point cloud data, screening residual point cloud data in the sector point cloud data, and performing enhancement processing on the required point cloud data to obtain enhanced point cloud data; and inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the robot position output by the preset optimizer.
The communication bus mentioned in the robot may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the robot and other equipment.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment of the present invention, a storage medium is further provided, where instructions are stored, and when the instructions are executed on a computer, the computer is caused to execute the robot positioning method in any one of the above embodiments.
In a further embodiment provided by the present invention, there is also provided a computer program product comprising instructions which, when run on a computer, cause the computer to perform the robot positioning method of any of the above embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device. The computer instructions may be stored on a storage medium or transmitted from one storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A method of robot positioning, the method comprising:
under the condition that the robot is located in a machine room inspection area, screening required point cloud data corresponding to a reference section of the robot in the motion direction from the sector point cloud data;
the robot calls electromagnetic wave equipment to perform electromagnetic wave scanning on a robot view angle area, and echo signals generated by scanning the robot view angle area are processed to form the sector point cloud data;
marking the required point cloud data, screening residual point cloud data in the sector point cloud data, and performing enhancement processing on the required point cloud data to obtain enhanced point cloud data;
and inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer, and acquiring the robot position output by the preset optimizer.
2. The method of claim 1, wherein the step of screening the sector point cloud data for the required point cloud data corresponding to the reference section of the robot in the motion direction comprises:
and screening point cloud data meeting a linear equation from the sector point cloud data in a linear fitting mode to obtain the required point cloud data corresponding to the reference section of the robot in the motion direction.
3. The method of claim 1, wherein the enhancing the required point cloud data to obtain enhanced point cloud data comprises:
and determining the distance between the robot and the reference section, and enhancing the required point cloud data based on the distance to obtain enhanced point cloud data.
4. The method of claim 3, wherein the enhancing the required point cloud data based on the distance to obtain enhanced point cloud data comprises:
judging whether the distance exceeds a distance threshold value;
under the condition that the distance exceeds the distance threshold value, increasing the intensity value of the required point cloud data to obtain enhanced point cloud data;
and under the condition that the distance does not exceed the distance threshold, increasing the number of the required point cloud data to obtain enhanced point cloud data.
5. The method of claim 4, wherein increasing the number of the required point cloud data to obtain enhanced point cloud data comprises:
and increasing the number of the required point cloud data in an interpolation mode according to a linear equation to obtain enhanced point cloud data, wherein the linear equation comprises a least square linear equation.
6. The method according to claim 1, wherein before the step of screening the required point cloud data corresponding to the reference section of the robot in the motion direction from the sector point cloud data under the condition that the robot is located in the machine room inspection area, the method further comprises:
and acquiring the mass center position of the current robot, and judging whether the robot is positioned in the machine room inspection area or not according to the mass center position of the robot.
7. The method according to claim 6, wherein the judging whether the robot is located in a machine room inspection area according to the position of the center of mass of the robot comprises the following steps:
judging whether the robot centroid position is located in the machine room inspection area;
and determining that the robot is located in the machine room inspection area under the condition that the center of mass of the robot is located in the machine room inspection area.
8. A robot positioning device, characterized in that the device comprises:
the data screening module is used for screening required point cloud data corresponding to a reference section of the robot in the motion direction from the sector point cloud data under the condition that the robot is located in a machine room inspection area;
the robot calls electromagnetic wave equipment to perform electromagnetic wave scanning on a robot view angle area, and echo signals generated by scanning the robot view angle area are processed to form the sector point cloud data;
the data labeling module is used for labeling the required point cloud data and screening residual point cloud data in the sector point cloud data;
the data enhancement module is used for enhancing the required point cloud data to obtain enhanced point cloud data;
and the position acquisition module is used for inputting the required point cloud data, the enhanced point cloud data and the residual point cloud data into a preset optimizer and acquiring the robot position output by the preset optimizer.
9. A robot, characterized by comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any one of claims 1 to 7 when executing a program stored in the memory.
10. A storage medium on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202110252006.8A 2021-03-08 2021-03-08 Robot positioning method, device, robot and storage medium Active CN113008241B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110252006.8A CN113008241B (en) 2021-03-08 2021-03-08 Robot positioning method, device, robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110252006.8A CN113008241B (en) 2021-03-08 2021-03-08 Robot positioning method, device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN113008241A true CN113008241A (en) 2021-06-22
CN113008241B CN113008241B (en) 2022-11-08

Family

ID=76408702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110252006.8A Active CN113008241B (en) 2021-03-08 2021-03-08 Robot positioning method, device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN113008241B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105203551A (en) * 2015-09-11 2015-12-30 尹栋 Car-mounted laser radar tunnel detection system, autonomous positioning method based on tunnel detection system and tunnel hazard detection method
US20170374342A1 (en) * 2016-06-24 2017-12-28 Isee, Inc. Laser-enhanced visual simultaneous localization and mapping (slam) for mobile devices
CN109902425A (en) * 2019-03-11 2019-06-18 南京林业大学 The tunnel cross-section extracting method of ground formula point cloud
CN110297256A (en) * 2019-07-23 2019-10-01 国网四川省电力公司信息通信公司 Emergency route generation method is maked an inspection tour based on the man-machine room of laser radar scanning data machine
US20200110410A1 (en) * 2018-10-04 2020-04-09 Nidec Corporation Device and method for processing map data used for self-position estimation, mobile body, and control system for mobile body
CN111882694A (en) * 2020-07-15 2020-11-03 中国工商银行股份有限公司 Machine room inspection method and device and robot
CN112000109A (en) * 2020-09-10 2020-11-27 广西亚像科技有限责任公司 Position correction method for power inspection robot, power inspection robot and medium
CN112445215A (en) * 2019-08-29 2021-03-05 阿里巴巴集团控股有限公司 Automatic guided vehicle driving control method, device and computer system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JI Siyuan et al.: "Tunnel cross-section extraction method based on laser scanning point cloud", Engineering of Surveying and Mapping (《测绘工程》) *

Also Published As

Publication number Publication date
CN113008241B (en) 2022-11-08

Similar Documents

Publication Publication Date Title
US9576375B1 (en) Methods and systems for detecting moving objects in a sequence of image frames produced by sensors with inconsistent gain, offset, and dead pixels
CN109946680B (en) External parameter calibration method and device of detection system, storage medium and calibration system
CN110749901B (en) Autonomous mobile robot, map splicing method and device thereof, and readable storage medium
Henson et al. Attitude-trajectory estimation for forward-looking multibeam sonar based on acoustic image registration
CN109974699B (en) Robot and map autonomous exploration method and device thereof
CN112987724B (en) Path optimization method, path optimization device, robot and storage medium
KR20190084460A (en) Method and system for noise-robust sound-based respiratory disease detection
CN108121018B (en) Detection point positioning accuracy evaluation method and device
Zhang et al. Uncertainty model for template feature matching
JP7224592B2 (en) Information processing device, information processing method, and program
CN113091736B (en) Robot positioning method, device, robot and storage medium
CN113008241B (en) Robot positioning method, device, robot and storage medium
CN113759348A (en) Radar calibration method, device, equipment and storage medium
US20230206402A1 (en) Context aware object geotagging
CN111113405A (en) Method for robot to obtain position service and robot
CN113822372A (en) Unmanned aerial vehicle detection method based on YOLOv5 neural network
CN115063461A (en) Error elimination method and device and electronic equipment
Ghadian et al. Recursive sparsity-based MVDR algorithm for interference cancellation in sensor arrays
CN110068834B (en) Road edge detection method and device
CN109831737B (en) Bluetooth positioning method, device, equipment and system based on confidence degree
CN110399892B (en) Environmental feature extraction method and device
Almanza-Medina et al. Motion estimation of an underwater platform using images from two sonar sensors
CN115908243B (en) Method, device, equipment and storage medium for dividing nondestructive testing image
CN110471077B (en) Positioning method and device
CN113128516B (en) Edge extraction method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Technology Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

GR01 Patent grant