CN115164882A - Laser distortion removing method, device and system and readable storage medium - Google Patents

Laser distortion removing method, device and system and readable storage medium

Info

Publication number
CN115164882A
CN115164882A (application CN202210826873.2A)
Authority
CN
China
Prior art keywords
laser
robot
error
projection
distortion removal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210826873.2A
Other languages
Chinese (zh)
Inventor
刘心怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202210826873.2A priority Critical patent/CN115164882A/en
Publication of CN115164882A publication Critical patent/CN115164882A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a laser distortion removal method, an apparatus, a system, and a readable storage medium. The method comprises the following steps: determining, based on an inertial measurement device, whether the robot is currently in a tilted state; calculating the horizontal distance of the erroneous laser projection of the robot when the robot is in a tilted state; and screening out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection, and filtering the data in that range out as erroneously projected laser points. The method acquires the posture of the robot in real time or at regular intervals and, when the robot is tilted, filters out the erroneous target positions, that is, the erroneously projected laser points, by a least squares method based on the calculated horizontal distance of the erroneous laser projection. This prevents the robot from treating uneven ground as an obstacle and failing to navigate while travelling, and improves the positioning accuracy and path-planning capability of the robot during navigation.

Description

Laser distortion removing method, device and system and readable storage medium
Technical Field
The present invention relates to the field of laser technology, and more particularly, to a method, an apparatus, a system and a readable storage medium for removing laser distortion.
Background
Lidar sensors measure distance accurately over a wide range and are widely used in robotics, for example in robot mapping, localization, and navigation.
Depending on their field of view, lidar sensors can be divided into two-dimensional (2D) lidar and 3D lidar. 2D lidar sensors are widely used for mapping, localization, and navigation of indoor robots. Unlike 3D lidar sensors, they carry no height information and assign only 2D coordinates to detected objects, so they are mainly used to build two-dimensional maps and recognize obstacles.
However, when such a sensor is mounted low on a small robot, the robot's posture is easily affected by the environment. When the robot travels over surfaces with height variations, such as carpets, slopes, or floor tiles with decorative relief, the body easily tilts. The lidar sensor is then affected by the tilt of the body, the laser is projected onto the nearby ground and wrongly recognized as an obstacle, and the positioning accuracy and path planning of the robot during navigation are seriously degraded.
Disclosure of Invention
In view of this, the present invention provides a laser distortion removal method, comprising:
determining, based on an inertial measurement device, whether the robot is currently in a tilted state;
calculating the horizontal distance of the erroneous laser projection of the robot when the robot is in a tilted state;
and screening out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection, and filtering the data in that range out as erroneously projected laser points.
Preferably, the determining, based on the inertial measurement device, whether the robot is currently in a tilted state comprises:
acquiring current pose data of the robot based on the inertial measurement device, wherein the current pose data comprises a pitch angle, the pitch angle being the pitch angle of the inertial measurement device;
determining whether the pitch angle exceeds a preset angle threshold;
and if so, determining that the robot is in a tilted state.
Preferably, the current pose data further comprises the vertical height of the inertial measurement device above the ground;
and the calculating the horizontal distance of the erroneous laser projection of the robot comprises:
calculating the horizontal distance of the erroneous laser projection from the vertical height and the pitch angle.
Preferably, the horizontal distance of the erroneous laser projection is calculated using the following formula:
L = H / tan(θ)
wherein L is the horizontal distance of the erroneous laser projection, H is the vertical height, and θ is the pitch angle.
Preferably, the screening out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection comprises:
acquiring, based on a polar coordinate system, a plurality of actual position points in the advancing direction of the robot according to the horizontal distance of the erroneous laser projection;
and screening out, by a least squares method, the erroneous points among the actual position points as the positions of the erroneously projected laser data.
Preferably, the acquiring, based on a polar coordinate system, a plurality of actual position points in the advancing direction of the robot according to the horizontal distance of the erroneous laser projection comprises:
acquiring, based on the polar coordinate system, a plurality of actual position points within a rectangular region of length L in the advancing direction of the robot.
Preferably, the screening out, by a least squares method, the erroneous points among the actual position points as the positions of the erroneously projected laser data comprises:
calculating a linear equation by the least squares method to obtain theoretical values;
substituting the actual position points into the linear equation to obtain actual values;
determining whether the difference between an actual value and the corresponding theoretical value is smaller than a preset comparison threshold;
and if so, determining that the actual position point corresponding to that actual value is an erroneous point, and taking the erroneous point as an erroneous target position.
In addition, in order to solve the above problem, the present application also provides a laser distortion removal apparatus, comprising:
a confirmation module, configured to determine, based on an inertial measurement device, whether the robot is currently in a tilted state;
a calculation module, configured to calculate the horizontal distance of the erroneous laser projection of the robot when the robot is in a tilted state;
and a filtering module, configured to screen out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection and to filter the data in that range out as erroneously projected laser points.
In addition, in order to solve the above problem, the present application further provides a laser distortion removal system, comprising a memory and a processor, wherein the memory stores a laser distortion removal program and the processor runs the laser distortion removal program to cause the laser distortion removal system to execute the laser distortion removal method described above.
In addition, to solve the above problem, the present application also provides a computer-readable storage medium having a laser distortion removal program stored thereon which, when executed by a processor, implements the laser distortion removal method described above.
The invention provides a laser distortion removal method, an apparatus, a system, and a readable storage medium, wherein the laser distortion removal method comprises the following steps: determining, based on an inertial measurement device, whether the robot is currently in a tilted state; calculating the horizontal distance of the erroneous laser projection of the robot when the robot is in a tilted state; and screening out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection, and filtering the data in that range out as erroneously projected laser points. The method acquires the posture of the robot in real time or at regular intervals and, when the robot is tilted, filters out the erroneous target positions, that is, the erroneously projected laser points, by a least squares method based on the calculated horizontal distance of the erroneous laser projection. This prevents the robot from treating uneven ground as an obstacle and failing to navigate while travelling, and improves the positioning accuracy and path-planning capability of the robot during navigation.
Drawings
FIG. 1 is a schematic structural diagram of the hardware operating environment involved in embodiments of the laser distortion removal method of the present invention;
FIG. 2 is a schematic flowchart of embodiment 1 of the laser distortion removal method of the present invention;
FIG. 3 is a schematic diagram of the laser path when the robot is tilted, according to embodiment 1 of the laser distortion removal method of the present invention;
FIG. 4 is a schematic flowchart of embodiment 2 of the laser distortion removal method of the present invention;
FIG. 5 is a schematic flowchart of embodiment 3 of the laser distortion removal method of the present invention;
FIG. 6 is a schematic diagram of the calculation of L when the robot is tilted, according to embodiment 3 of the laser distortion removal method of the present invention;
FIG. 7 is a schematic flowchart of embodiment 4 of the laser distortion removal method of the present invention;
FIG. 8 is a schematic diagram of the calculation of the linear equation in embodiment 4 of the laser distortion removal method of the present invention;
FIG. 9 is a schematic diagram of the module connections of the laser distortion removal apparatus of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
Reference will now be made in detail to the embodiments of the present invention, wherein like reference numerals refer to like or similar elements or elements having like or similar functions throughout.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and may, for example, denote a fixed connection, a detachable connection, or an integral formation; a mechanical or an electrical connection; a direct connection or an indirect connection through an intermediate medium; or an internal communication between two elements or an interaction between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a schematic structural diagram of a hardware operating environment of a terminal according to an embodiment of the present invention.
The laser distortion removal system provided by the invention may be a PC, or a mobile terminal device such as a smartphone, a tablet computer, or a portable computer. The laser distortion removal system may comprise a processor 1001 (e.g., a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to realize connection and communication between these components. The user interface 1003 may comprise a display screen and an input unit such as a keyboard or a remote control, and may optionally also comprise a standard wired interface and a wireless interface. The network interface 1004 may optionally comprise a standard wired interface and a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory such as a disk memory, and may optionally be a storage device separate from the processor 1001. Optionally, the laser distortion removal system may further comprise RF (Radio Frequency) circuitry, audio circuitry, a Wi-Fi module, and the like. In addition, the laser distortion removal system may also be provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described in detail here.
Those skilled in the art will appreciate that the structure of the laser distortion removal system shown in fig. 1 does not constitute a limitation; the system may include more or fewer components than shown, combine certain components, or use a different arrangement of components. As shown in fig. 1, the memory 1005, as a computer-readable storage medium, may include an operating system, a data interface control program, a network connection program, and a laser distortion removal program.
The invention provides a laser distortion removal method, an apparatus, a system, and a readable storage medium. The method confirms the state of the robot in real time based on the inertial measurement device; when the robot is in a tilted state, the horizontal distance of the erroneous laser projection is obtained and used, through a least squares method, to identify and filter out the unwanted erroneously projected laser points.
Example 1:
referring to fig. 2, embodiment 1 of the present invention provides a laser distortion removal method, including:
step S100, confirming whether the robot is in an inclined state currently or not based on an inertia measuring device;
the Inertial Measurement Unit is an IMU (Inertial Measurement Unit), also called an Inertial Measurement Unit, for measuring the three-axis attitude angle (or angular velocity) and the acceleration of the object. Gyroscopes and accelerometers are the main components of the IMU, the accuracy of which directly affects the accuracy of the inertial system. Generally, the IMU is mounted at the center of gravity of the object being tested.
The robot may be any robot capable of moving in a plane within a certain range; examples include, but are not limited to, sweeping robots, mopping robots, cleaning robots, exhibition-hall guide robots, and disinfection robots.
The tilted state is a state in which the robot body is inclined relative to the horizontal plane because the robot has encountered an uneven surface while travelling. For example, a sweeping robot sits low to the ground; when it drives onto a carpet, a floor mat, or a fiber fabric, travels over floor tiles with an uneven surface, or crosses a small area of the floor that has an inclination, the robot tilts in that area. That is the tilted state.
In this embodiment, the invention is particularly directed at a robot equipped with a 2D lidar. While the robot is moving, the 2D lidar projects laser beams in the advancing direction. Because the 2D lidar has no height information, it can only assign detected objects a two-dimensional coordinate mapping relative to the robot's plane of travel. If the robot becomes tilted after encountering an uneven surface, then, taking the position of the lidar as the reference, two situations can occur as the robot moves forward. In state A (see fig. 3), the lidar projects toward the nearby ground; the whole robot is tilted forward and downward, with the tail end (relative to the advancing direction) raised, for example at a 30° angle to the horizontal plane. In state B, the whole robot is tilted forward and upward and the lidar projects into the air; the front end of the robot is raised, for example at a 30° angle to the horizontal plane.
In state A, the lidar projects toward the nearby ground because the robot is tilted forward and downward with its tail end raised. The beam projected in the advancing direction, which should travel straight ahead parallel to the horizontal plane, instead strikes the nearby ground; the system misidentifies this return and reports an obstacle that should not exist. This is the situation addressed in the present embodiment.
State B can also be corrected by the algorithm in the present embodiment.
Step S200, calculating the horizontal distance of the erroneous laser projection of the robot when the robot is in a tilted state;
Here, the horizontal distance of the erroneous laser projection is the distance, measured along the ground, between the point where the tilted beam strikes the nearby ground in the tilted state (state A) and the point directly below the lidar; this is the horizontal distance of the laser projection.
Step S300, screening out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection, and filtering the data in that range out as erroneously projected laser points.
It should be noted that the least squares method is a mathematical tool widely applied in many fields of data processing, such as error estimation, uncertainty analysis, system identification, and prediction.
As described above, calculation and screening can be performed by the least squares method based on the horizontal distance of the erroneous laser projection. For example, the least squares fit yields predicted data that are compared with the actual data; from this comparison the distorted laser paths or points can be found, and the erroneous target positions can be removed and filtered out.
The method provided by this embodiment acquires the posture of the robot in real time or at regular intervals and, when the robot is in a tilted state, filters out the erroneous target positions, that is, the erroneously projected laser points, by a least squares method based on the calculated horizontal distance of the erroneous laser projection. This prevents the robot from treating uneven ground as an obstacle and failing to navigate while travelling, and improves the positioning accuracy and path-planning capability of the robot during navigation.
Example 2:
referring to fig. 4, a second embodiment of the present invention provides a laser distortion removal method according to embodiment 1, where in step S100, the method for confirming a current tilt state of a robot based on an inertial measurement unit includes:
step S110, acquiring current pose data of the robot based on the inertial measurement unit; wherein the current pose data comprises a pitch angle; wherein the pitch angle is the pitch angle of the inertial measurement unit;
the inertial measurement unit, i.e. the IMU sensor unit, includes components such as an accelerometer and a gyroscope in an IMU sensor unit. For example, three single-axis accelerometers and three single-axis gyroscopes may be included, the accelerometers detecting acceleration signals of the object in three independent axes of the carrier coordinate system, and the gyroscopes detecting angular velocity signals of the carrier relative to the navigation coordinate system, measuring angular velocity and acceleration of the object in three-dimensional space, and solving the attitude of the object accordingly.
In the above way, the inertial measurement unit can acquire the pose data of the current robot, namely the pitch angle data of the robot, in real time or at regular time.
Step S120, determining whether the pitch angle exceeds a preset angle threshold;
In this determination, the obtained value of the pitch angle is compared with the angle threshold.
The pitch angle is the angle between the x axis of the robot body coordinate system and the horizontal plane, i.e., the angle between the ground and a vector that is parallel to the axis of the robot base and points in the advancing direction of the robot. When the x axis of the body coordinate system lies above the XOY plane of the inertial coordinate system, the pitch angle is positive and the body is tilted upward; in that case no obstacle misidentification occurs, and the case is not considered. Otherwise the body is tilted downward, the laser strikes the ground obliquely, and the pitch angle is negative.
The angle threshold is a preset threshold used to evaluate the current pose of the robot. It may be a single value or a range of values; for example, the angle threshold may be -3°.
Step S130, if so, determining that the robot is in a tilted state.
In this embodiment, when the pitch angle obtained by the inertial measurement device exceeds the angle threshold, the robot is determined to be in a tilted state. The IMU acquires data in real time or at regular intervals, and the preset angle threshold is used for comparison to determine the current posture of the robot. This compensates for the lack of height information in the 2D lidar and improves the accuracy of robot state recognition.
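As a non-limiting illustration only, the tilt check of steps S110 to S130 can be sketched as follows. The Python snippets in this description are illustrative sketches rather than part of the claimed method; here the IMU reading, the -3° example threshold, and the reading of "exceeding the threshold" as tilting downward more steeply than 3° are assumptions drawn from the surrounding text.

```python
# Illustrative sketch of steps S110-S130 (not the claimed implementation).
# Assumptions: pitch is in degrees, negative when the body tilts forward/downward
# (laser aimed toward the ground), and "exceeding" the -3 degree example threshold
# is interpreted as pitch < -3 degrees.

ANGLE_THRESHOLD_DEG = -3.0  # example threshold from the description


def is_tilted(pitch_deg: float, threshold_deg: float = ANGLE_THRESHOLD_DEG) -> bool:
    """Return True when the body pitches forward/downward past the threshold."""
    return pitch_deg < threshold_deg


if __name__ == "__main__":
    pitch_deg = -5.0  # in practice this value would come from the IMU driver
    print(is_tilted(pitch_deg))  # True: the robot is treated as tilted
```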
Example 3:
referring to fig. 5, a laser distortion removal method is provided in embodiment 3 of the present invention based on embodiment 2. And the current pose data further comprises the vertical height of the inertial measurement unit from the ground.
In step S200, calculating a horizontal distance of an erroneous laser projection of the robot includes:
and step S210, calculating to obtain the horizontal distance of the error laser projection according to the vertical height and the pitch angle.
The vertical height is the distance, in the vertical direction, from the lidar on the robot to the ground plane.
The horizontal distance of the erroneous laser projection is the distance between the point on the ground struck by the beam projected in the advancing direction of the robot and the point directly below the lidar; when the laser is projected erroneously, this is how far ahead of the fuselage the erroneous point lies.
In this way, the horizontal distance of the erroneous laser projection can be calculated from the vertical height and the pitch angle.
Further, referring to fig. 6, in the step S210, the horizontal distance of the erroneous laser projection is calculated by using the following formula:
L = H / tan(θ)
wherein L is the horizontal distance of the erroneous laser projection, H is the vertical height, and θ is the pitch angle.
As described above, the value of L can be calculated using the tangent function.
In this embodiment, trigonometry is used: the vertical height H (the height of the sensor above the ground) and the pitch angle obtained from the inertial measurement device are combined through the tangent function to give the horizontal distance L of the erroneous laser projection. The value of L is thus calculated in real time or at regular intervals whenever the robot is in a tilted state and the laser is projected onto the ground in the advancing direction. This compensates for the lack of height information in the 2D lidar and improves the accuracy of robot state recognition.
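As a non-limiting illustration, the calculation of step S210 can be sketched as follows; the reconstructed formula L = H / tan(θ) and the example numbers are assumptions based on fig. 6 and the description above.

```python
# Illustrative sketch of step S210 (not the claimed implementation).
# Assumption: L = H / tan(theta), with H the vertical height of the lidar
# above the ground and theta the magnitude of the downward pitch angle.
import math


def erroneous_projection_distance(height_m: float, pitch_deg: float) -> float:
    """Horizontal distance from the lidar to where the tilted beam meets the ground."""
    theta = math.radians(abs(pitch_deg))  # use the magnitude of the downward tilt
    return height_m / math.tan(theta)


# Example: a lidar mounted 0.10 m above the floor and tilted down by 3 degrees
# projects onto the ground roughly 1.9 m ahead of the robot.
print(round(erroneous_projection_distance(0.10, -3.0), 2))
```

The small-angle behaviour is intuitive: the shallower the tilt, the farther ahead the beam lands, so even a slight inclination can place false returns well inside the navigation range.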
Example 4:
referring to fig. 7, a method for removing laser distortion according to embodiment 4 of the present invention is provided based on embodiment 3. In step S300, screening out a range of the error laser data by using a least square method based on the horizontal distance of the error laser projection, includes:
step S310, acquiring a plurality of actual position points of the robot in the advancing direction based on a polar coordinate system based on the horizontal distance of the error laser projection;
a polar coordinates system (polar coordinates) is a coordinate system consisting of poles, polar axes, and polar diameters in a plane. A point O is taken on the plane and is called a pole. Starting from O, a ray Ox is drawn, called the polar axis. Further, a unit length is defined, and the predetermined angle is usually positive in the counterclockwise direction. Thus, the position of any point P on the plane can be determined by the length rho of the line segment OP and the angle theta from Ox to OP, and the ordered number pair (rho, theta) is called the polar coordinate of the point P and is marked as P (rho, theta); ρ is the polar diameter of the point P, and θ is the polar angle of the point P.
The actual position point in the advance direction of the robot is obtained based on the polar coordinate system by using the horizontal distance L of the erroneous laser projection as the length distance.
The actual position points are position points actually obtained by a plurality of laser beams projected by the laser radar directly in front of the robot while the robot is traveling.
Step S320, screening out, by a least squares method, the erroneous points among the actual position points as the positions of the erroneously projected laser data.
Further, step S310, acquiring, based on the polar coordinate system, a plurality of actual position points in the advancing direction of the robot according to the horizontal distance of the erroneous laser projection, comprises:
Step S311, acquiring, based on the polar coordinate system, a plurality of actual position points within a rectangular region of length L in the advancing direction of the robot.
Since the moving direction of the robot is known, a virtual rectangular region is defined so that the data can be screened and processed more precisely; this region is a rectangle extending from the robot body in the advancing direction for a length equal to the horizontal distance L of the erroneous laser projection.
As described above, all the actual position points in the forward direction that fall within this rectangular region are acquired.
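As a non-limiting illustration, step S311 can be sketched as follows; the scan format (range and bearing per beam in the body frame, with bearing 0 pointing straight ahead) and the half-width of the virtual rectangle are assumptions introduced only for this example.

```python
# Illustrative sketch of step S311 (not the claimed implementation).
# Assumptions: each scan point is a (range_m, bearing_rad) pair in the body frame,
# bearing 0 points straight ahead, and the virtual rectangle has an assumed
# half-width of 0.25 m.
import math
from typing import List, Tuple


def points_in_forward_rectangle(
    scan: List[Tuple[float, float]],  # (range_m, bearing_rad) per beam
    region_length_m: float,           # the horizontal distance L from step S210
    half_width_m: float = 0.25,       # assumed half-width of the virtual rectangle
) -> List[Tuple[float, float]]:
    """Convert polar scan points to body-frame x, y and keep those inside the rectangle."""
    selected = []
    for r, bearing in scan:
        x = r * math.cos(bearing)  # forward distance from the body
        y = r * math.sin(bearing)  # lateral offset
        if 0.0 <= x <= region_length_m and abs(y) <= half_width_m:
            selected.append((x, y))
    return selected
```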
Further, referring to fig. 8, in step S320, screening out, by a least squares method, the erroneous points among the actual position points as the positions of the erroneously projected laser data comprises:
Step S321, calculating a linear equation by the least squares method to obtain theoretical values;
the filtering step corresponds to a process of converting the points projected by the laser into a polar coordinate system.
As described above, the linear equation y = ax + b over the rectangular region can be computed by the least squares method, yielding the values of a and b, where x and y are the coordinates corresponding to the theoretical values.
Step S322, substituting the actual position point into the linear equation to obtain an actual value;
in the above, the points projected by the laser to the advancing direction, that is, all the actual position points are brought into the linear equation, and the actual coordinates corresponding to the actual position points are obtained, that is, the actual values.
Step S323, judging whether the difference value between the actual value and the theoretical value is smaller than a preset comparison threshold value;
the comparison threshold is a preset threshold for evaluating the difference between the theoretical value and the actual value. For example, it may be 0.2m.
In step S324, if yes, it is determined that the actual position point corresponding to the actual value is an error point, and the error point is used as the target position of the error.
If the difference between the actual value and the theoretical value is smaller than the comparison threshold, it can be said that the actual value is close to the theoretical value, and it indicates that the actual position point corresponding to the actual value is the point of the laser error projection, and the point is the error target position, so that the point needs to be filtered out further.
For example, referring to fig. 3, the laser sensor is at a height H above the ground (an intrinsic characteristic of the machine), and the orientation reading from the IMU (inertial measurement unit) gives the pitch angle θ. If θ exceeds the preset angle threshold, it can be determined that the beam may be projected onto the ground because of the inclination. The approximate horizontal distance between the erroneous laser projection and the robot is then obtained as
L = H / tan(θ)
and the possibly erroneous laser data are screened out according to this distance. All potentially erroneous laser data are decomposed into the corresponding x, y coordinates in the body coordinate system. By solving A·[x] + b = [y] in the least squares sense, the most likely A and b are found. Each possibly erroneous data point is then substituted into this equation: if the difference between the predicted y and the actual y is too large, the point does not lie on the line of the erroneous laser projection; otherwise it does. Finally, the data lying on the erroneous-projection line are filtered out.
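As a non-limiting illustration, the least squares fit and the residual-based filtering of steps S321 to S324 can be sketched as follows; the point format (body-frame x, y pairs) is an assumption carried over from the previous example, and the 0.2 m comparison threshold is the example value given above.

```python
# Illustrative sketch of steps S321-S324 and the final filtering
# (not the claimed implementation). Assumes at least two input points.
from typing import List, Tuple


def fit_line_least_squares(points: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Return (a, b) minimising sum((a*x + b - y)^2) over the points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    denom = n * sxx - sx * sx
    if denom == 0.0:  # all points share one x: fall back to a horizontal line
        return 0.0, sy / n
    a = (n * sxy - sx * sy) / denom
    b = (sy - a * sx) / n
    return a, b


def filter_erroneous_points(
    points: List[Tuple[float, float]],
    comparison_threshold_m: float = 0.2,  # example threshold from the description
) -> List[Tuple[float, float]]:
    """Drop points close to the fitted line (erroneous ground projections); keep the rest."""
    a, b = fit_line_least_squares(points)
    kept = []
    for x, y in points:
        residual = abs(y - (a * x + b))
        if residual >= comparison_threshold_m:
            kept.append((x, y))
        # points with residual < threshold lie on the erroneous-projection
        # line and are filtered out
    return kept
```

Under this sketch, points lying close to the fitted line are interpreted as the strip of floor struck by the tilted beam and are discarded, while points far from the line are kept as genuine returns; this matches the rule of steps S323 and S324 that a small difference between actual and theoretical values marks an erroneous point.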
In summary, the technical problem to be solved by this embodiment is as follows: when a small service robot navigates across tilted ground, the tilt causes the laser to be projected onto the ground erroneously, so the robot detects a false obstacle, the path is blocked, and planning fails.
By adopting the method provided in this embodiment, a simple and effective way of detecting robot tilt is provided on mobile robot equipment, and the laser erroneously projected onto the ground due to the tilt is filtered out. The least squares method is used to detect the laser data to be filtered. The method can be applied to slopes, mobile robots, mapping, path planning, and control.
In this embodiment, whether the robot is tilted is detected by reading the data of the inertial measurement device; the approximate position at which the laser is erroneously projected onto the ground is calculated from the height of the robot's lidar sensor and the tilt angle of the robot; the equation of the points erroneously projected onto the ground is computed by the least squares method; and the distortion points are then filtered out according to this equation. As a result, the robot is not affected by the tilted state caused by uneven ground while travelling, and its positioning accuracy and path-planning capability during navigation are improved.
Further, referring to fig. 9, the present invention also provides a laser distortion removal apparatus including:
a confirmation module 10, configured to determine, based on the inertial measurement device, whether the robot is currently in a tilted state;
a calculation module 20, configured to calculate the horizontal distance of the erroneous laser projection of the robot when the robot is in a tilted state;
and a filtering module 30, configured to screen out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection and to filter the data in that range out as erroneously projected laser points.
In addition, the invention also provides a laser distortion removal system, which comprises a memory and a processor, wherein the memory is used for storing a laser distortion removal program, and the processor runs the laser distortion removal program to enable the laser distortion removal system to execute the laser distortion removal method.
Furthermore, the present invention also provides a computer-readable storage medium having stored thereon a laser distortion removal program which, when executed by a processor, implements the laser distortion removal method as described above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention. The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A laser distortion removal method, comprising:
determining, based on an inertial measurement device, whether a robot is currently in a tilted state;
calculating a horizontal distance of an erroneous laser projection of the robot when the robot is in a tilted state;
and screening out a range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection, and filtering data in that range out as erroneously projected laser points.
2. The laser distortion removal method of claim 1, wherein the determining, based on the inertial measurement device, whether the robot is currently in a tilted state comprises:
acquiring current pose data of the robot based on the inertial measurement device, wherein the current pose data comprises a pitch angle, the pitch angle being the pitch angle of the inertial measurement device;
determining whether the pitch angle exceeds a preset angle threshold;
and if so, determining that the robot is in a tilted state.
3. The laser distortion removal method of claim 2, wherein
the current pose data further comprises a vertical height of the inertial measurement device above the ground;
and the calculating the horizontal distance of the erroneous laser projection of the robot comprises:
calculating the horizontal distance of the erroneous laser projection from the vertical height and the pitch angle.
4. The laser distortion removal method of claim 3, wherein the horizontal distance of the erroneous laser projection is calculated using the following formula:
L = H / tan(θ)
wherein L is the horizontal distance of the erroneous laser projection, H is the vertical height, and θ is the pitch angle.
5. The laser distortion removal method of claim 4, wherein the screening out the range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection comprises:
acquiring, based on a polar coordinate system, a plurality of actual position points in an advancing direction of the robot according to the horizontal distance of the erroneous laser projection;
and screening out, by a least squares method, erroneous points among the actual position points as positions of erroneously projected laser data.
6. The laser distortion removal method of claim 5, wherein the acquiring, based on a polar coordinate system, a plurality of actual position points in the advancing direction of the robot according to the horizontal distance of the erroneous laser projection comprises:
acquiring, based on the polar coordinate system, a plurality of actual position points within a rectangular region of length L in the advancing direction of the robot.
7. The laser distortion removal method of claim 5, wherein the screening out, by a least squares method, erroneous points among the actual position points as positions of erroneously projected laser data comprises:
calculating a linear equation by the least squares method to obtain theoretical values;
substituting the actual position points into the linear equation to obtain actual values;
determining whether a difference between an actual value and a corresponding theoretical value is smaller than a preset comparison threshold;
and if so, determining that the actual position point corresponding to that actual value is an erroneous point, and taking the erroneous point as an erroneous target position.
8. A laser distortion removal apparatus, comprising:
a confirmation module, configured to determine, based on an inertial measurement device, whether a robot is currently in a tilted state;
a calculation module, configured to calculate a horizontal distance of an erroneous laser projection of the robot when the robot is in a tilted state;
and a filtering module, configured to screen out a range of erroneous laser data by a least squares method based on the horizontal distance of the erroneous laser projection and to filter data in that range out as erroneously projected laser points.
9. A laser distortion removal system comprising a memory for storing a laser distortion removal program and a processor running the laser distortion removal program to cause the laser distortion removal system to perform the laser distortion removal method of any one of claims 1-7.
10. A computer-readable storage medium having stored thereon a laser distortion removal program which, when executed by a processor, implements the laser distortion removal method according to any one of claims 1 to 7.
CN202210826873.2A 2022-07-13 2022-07-13 Laser distortion removing method, device and system and readable storage medium Pending CN115164882A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210826873.2A CN115164882A (en) 2022-07-13 2022-07-13 Laser distortion removing method, device and system and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210826873.2A CN115164882A (en) 2022-07-13 2022-07-13 Laser distortion removing method, device and system and readable storage medium

Publications (1)

Publication Number Publication Date
CN115164882A true CN115164882A (en) 2022-10-11

Family

ID=83492629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210826873.2A Pending CN115164882A (en) 2022-07-13 2022-07-13 Laser distortion removing method, device and system and readable storage medium

Country Status (1)

Country Link
CN (1) CN115164882A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination