CN117169848A - Method for filtering glass noise, laser radar and robot - Google Patents


Publication number
CN117169848A
CN117169848A (application CN202311120813.XA)
Authority
CN
China
Prior art keywords
laser
point
spot
glass noise
glass
Prior art date
Legal status: Pending
Application number
CN202311120813.XA
Other languages
Chinese (zh)
Inventor
何昌传 (He Changchuan)
陈悦 (Chen Yue)
Current Assignee
Shenzhen Camsense Technologies Co Ltd
Original Assignee
Shenzhen Camsense Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Camsense Technologies Co Ltd filed Critical Shenzhen Camsense Technologies Co Ltd
Priority to CN202311120813.XA priority Critical patent/CN117169848A/en
Publication of CN117169848A publication Critical patent/CN117169848A/en

Classifications

    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The embodiments of the application relate to the technical field of robots and disclose a method for filtering glass noise points. For each laser point in the point cloud data, the attribute category of the laser point is marked according to its spot shape, where the attribute category reflects whether the laser point is a suspected glass noise point. If the laser point is marked as a suspected glass noise point, the point-cloud shape near the laser point is analyzed to determine whether it is a glass noise point. If it is, it is removed from the point cloud data. By combining two-step screening on spot shape and point-cloud shape, the method not only filters glass noise accurately but also saves considerable computing power. It can therefore run on low-compute devices (such as a single-chip microcomputer) and is well suited to implementation on the lidar itself, ensuring that the point cloud data finally output by the lidar is accurate, reducing noise interference, and benefiting the accuracy of robot mapping and obstacle avoidance.

Description

Method for filtering glass noise, laser radar and robot
Technical Field
The embodiment of the application relates to the technical field of robots, in particular to a method for filtering glass noise, a laser radar and a robot.
Background
With the continuous development of technology, lidar is widely applied in fields such as robots and unmanned vehicles. Lidar (Light Detection and Ranging) is a radar system that emits a laser beam to detect characteristic quantities such as the position or speed of a target. A lidar includes a transmitter that sends a detection signal (laser) toward a target and a receiver that receives the signal (reflected light) returned from the target. By comparing the received signal with the transmitted signal and processing it appropriately, the lidar obtains information about the target, such as distance, azimuth, altitude, speed, attitude, and even shape.
Among them, the ranging methods used for lidar are the triangulation method and the time-of-flight (TOF) method. In triangulation, the lidar emits laser light that strikes an object and is reflected; the reflected light is received by the receiver, forming a light spot, and the centroid of the spot is extracted to obtain a laser point. As laser points are generated continuously, point cloud data is produced. However, laser light striking multi-layer glass undergoes multiple reflections and is affected by sunlight, dust on the glass, and so on, which introduces significant uncertainty into centroid extraction. For objects such as glass, the point cloud generated by scanning therefore contains a large number of noise points (glass noise points), which interfere with indoor SLAM mapping and positioning.
Disclosure of Invention
In view of the above, some embodiments of the present application provide a method for filtering glass noise, a lidar, and a robot. The method is applied to the lidar, which can then accurately filter glass noise from point cloud data, reduce noise interference, and improve the accuracy of robot mapping and obstacle avoidance.
In a first aspect, an embodiment of the present application provides a method for filtering glass noise, including:
for each laser point in the point cloud data, marking the attribute category of the laser point according to the light spot form of the laser point, wherein the attribute category reflects whether the laser point is a suspected glass noise point or not;
if the laser point is marked as a suspected glass noise point, analyzing the point cloud form near the laser point, and determining whether the laser point is the glass noise point;
if the laser point is a glass noise point, the laser point is removed from the point cloud data.
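The three steps above can be sketched as a minimal filter loop. The two predicate functions are hypothetical placeholders for the spot-shape and point-cloud-shape checks detailed in the later embodiments, not names from the patent:

```python
def filter_glass_noise(points, is_suspected_by_spot_shape, is_noise_by_point_cloud):
    """Two-step removal of glass noise points from point cloud data."""
    kept = []
    for i, point in enumerate(points):
        # Step 1: coarse screen by spot shape -> mark suspected glass noise
        if is_suspected_by_spot_shape(point):
            # Step 2: fine screen by the point-cloud shape near the point
            if is_noise_by_point_cloud(points, i):
                continue  # confirmed glass noise: drop it
        kept.append(point)
    return kept
```

Only suspected points pay for the second, more expensive check, which is what makes the scheme viable on low-compute devices.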
In some embodiments, marking the attribute category of the laser point according to the spot shape of the laser point includes:
carrying out gray statistics on the light spot of the laser point to obtain a spot curve;
determining the spot's effective area in the spot curve;
and marking the attribute category of the laser point according to the waveform characteristics of the curve within the spot's effective area.
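One plausible reading of these steps, with the spot curve taken as column-wise gray sums of the spot image and the effective area taken as the contiguous above-threshold span around the peak (both are assumptions; the patent does not fix these definitions):

```python
def spot_curve(spot_image):
    """Column-wise gray statistics of the spot image (a list of pixel rows),
    giving a 1-D spot curve of summed gray values."""
    return [sum(col) for col in zip(*spot_image)]

def effective_area(curve, gray_threshold):
    """Effective area: the contiguous span around the curve's peak whose
    values exceed gray_threshold. Returns (left_index, right_index)."""
    peak = curve.index(max(curve))
    left = peak
    while left > 0 and curve[left - 1] > gray_threshold:
        left -= 1
    right = peak
    while right < len(curve) - 1 and curve[right + 1] > gray_threshold:
        right += 1
    return left, right
```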
In some embodiments, marking the attribute category of the laser point according to the waveform characteristics of the curve within the spot's effective area includes:
counting, on each side outside the spot's effective area, the number of pixels whose gray value is greater than a first gray threshold, to obtain a left pixel count and a right pixel count;
determining, for each side of the spot's effective area, whether a gradient direction change occurs, to obtain a left and a right gradient-direction-change result;
marking the attribute type of the laser spot according to the number of the left pixels, the number of the right pixels, the left gradient direction change result and the right gradient direction change result.
In some embodiments, the marking the attribute category of the laser spot according to the number of left pixels, the number of right pixels, the left gradient direction change result, and the right gradient direction change result includes:
if a gradient direction change occurs on the left side and the left pixel count is greater than a first threshold, determining that the left attribute category is suspected glass noise;
if a gradient direction change occurs on the right side and the right pixel count is greater than a second threshold, determining that the right attribute category is suspected glass noise;
when both the left and right attribute categories are suspected glass noise, marking the attribute category of the laser point as suspected glass noise.
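A sketch of this left/right test under stated assumptions: "gradient direction change" is read as a sign flip in the first difference of the curve segment outside the effective area, and all threshold values are illustrative tuning parameters rather than values from the patent:

```python
def _gradient_direction_changes(segment):
    """True if the first difference of the segment changes sign anywhere."""
    diffs = [b - a for a, b in zip(segment, segment[1:])]
    signs = [d for d in diffs if d != 0]
    return any(s1 * s2 < 0 for s1, s2 in zip(signs, signs[1:]))

def suspected_by_sides(curve, left_idx, right_idx, gray_thr, first_thr, second_thr):
    """Mark a laser point as suspected glass noise only when BOTH sides
    outside the effective area [left_idx, right_idx] show a gradient
    direction change and a large enough count of bright pixels."""
    left_seg = curve[:left_idx]
    right_seg = curve[right_idx + 1:]
    left_suspect = (_gradient_direction_changes(left_seg)
                    and sum(v > gray_thr for v in left_seg) > first_thr)
    right_suspect = (_gradient_direction_changes(right_seg)
                     and sum(v > gray_thr for v in right_seg) > second_thr)
    return left_suspect and right_suspect
```

A clean spot has monotone, dark flanks, so both conditions fail on at least one side and the point is not marked.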
In some embodiments, marking the attribute category of the laser point according to the waveform characteristics of the curve within the spot's effective area includes:
counting the number of pixels within the spot's effective area whose gray value is greater than a second gray threshold, to obtain a region pixel count, and marking the attribute category of the laser point according to the region pixel count; or alternatively,
marking the attribute category of the laser point according to the width of the spot's effective area.
In some embodiments, marking the attribute category of the laser point according to the waveform characteristics of the curve within the spot's effective area includes:
if the number of peaks of the curve within the spot's effective area is greater than or equal to a third threshold, marking the attribute category of the laser point as suspected glass noise.
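A minimal peak counter for this check. It counts strict local maxima only; plateau handling and the exact value of the "third threshold" are left open by the patent, so the default of 2 below is an assumption (a normal spot should show a single peak):

```python
def count_peaks(curve):
    """Count strict local maxima (a rise immediately followed by a fall)."""
    return sum(1 for i in range(1, len(curve) - 1)
               if curve[i - 1] < curve[i] > curve[i + 1])

def suspected_by_peak_count(curve, third_threshold=2):
    """Mark as suspected glass noise when the spot curve shows at least
    third_threshold peaks (multiple reflections split the spot)."""
    return count_peaks(curve) >= third_threshold
```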
In some embodiments, analyzing the point-cloud shape near the laser point to determine whether the laser point is a glass noise point includes:
sliding a window of preset length over the point cloud data, and, when the laser point is located at the middle position of the window, determining whether it is a glass noise point according to the average distance of the laser points in the window and/or the effective laser points in the window.
In some embodiments, determining whether the laser point is a glass noise point according to the average distance of the laser points in the sliding window and/or the effective laser points in the sliding window includes:
determining an average distance between the laser point and the effective laser points in the sliding window, and if the average distance is greater than a distance threshold, determining that the laser point is a glass noise point; or alternatively,
if the number of effective laser points in the sliding window is smaller than a count threshold, determining that the laser point is a glass noise point.
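One hedged reading of this sliding-window fine screen, where each point is a (distance, is_effective) pair and the window length and both thresholds are illustrative values, not values from the patent:

```python
def is_glass_noise(points, index, window=5, dist_threshold=0.3, count_threshold=3):
    """Fine screen for a suspected point: slide a window so the point sits
    at its middle, then test (a) the count of effective laser points around
    it and (b) its average distance gap to those effective points."""
    half = window // 2
    lo, hi = max(0, index - half), min(len(points), index + half + 1)
    effective = [dist for j, (dist, ok) in enumerate(points[lo:hi], start=lo)
                 if ok and j != index]
    if len(effective) < count_threshold:
        return True  # too few effective neighbours: isolated point, treat as noise
    center_dist = points[index][0]
    avg_gap = sum(abs(center_dist - d) for d in effective) / len(effective)
    return avg_gap > dist_threshold
```

A point whose distance jumps away from its neighbours (the signature of glass noise at the edges of a pane) produces a large average gap and is dropped; a point on a solid wall sits close to its neighbours and survives.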
In a second aspect, an embodiment of the present application provides a lidar, including:
at least one processor, and
a memory communicatively coupled to the at least one processor, wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
In a third aspect, embodiments of the present application provide a robot comprising a lidar of the second aspect.
The embodiments of the application have the following beneficial effects. Different from the prior art, the method for filtering glass noise provided by the embodiments marks, for each laser point in the point cloud data, the attribute category of the laser point according to its spot shape, where the attribute category reflects whether the laser point is a suspected glass noise point. If the laser point is marked as suspected glass noise, the point-cloud shape near it is analyzed to determine whether it is a glass noise point. If it is, it is removed from the point cloud data.
In this embodiment, glass noise is screened in two steps. First, laser points suspected of being glass noise are marked according to spot shape. Then, for each suspected point, the nearby point-cloud shape is analyzed to determine whether it really is glass noise, and confirmed glass noise points are removed from the point cloud data. In other words, coarse screening by spot shape quickly marks suspected glass noise points, and fine screening by nearby point-cloud shape accurately confirms them. Compared with applying either coarse or fine screening to all laser points in the point cloud, this two-step combination of spot shape and point-cloud shape both filters glass noise accurately and saves substantial computing power. The method can therefore run on low-compute devices (such as a single-chip microcomputer) and is well suited to implementation on the lidar itself, ensuring that the point cloud finally output by the lidar is accurate, reducing noise interference, and benefiting the accuracy of robot mapping and obstacle avoidance.
Drawings
One or more embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements, and in which the figures of the drawings are not to be taken in a limiting sense, unless otherwise indicated.
FIG. 1 is a schematic illustration of an application environment according to some embodiments of the application;
FIG. 2 is a schematic diagram of a map created by a robot in some embodiments of the application;
FIG. 3 is a flow chart of a method of filtering glass noise in some embodiments of the application;
FIG. 4 is a schematic diagram of a triangulation method according to some embodiments of the present application;
FIG. 5 is a schematic diagram of a spot curve according to some embodiments of the application;
FIG. 6 is a schematic diagram of a spot curve with multiple peaks in some embodiments of the application;
FIG. 7 is a schematic diagram of a light spot curve with peak shifting in some embodiments of the application;
FIG. 8 is a graph illustrating a spot curve with burrs around peaks in some embodiments of the application;
FIG. 9 is a schematic diagram of a sliding window in some embodiments of the application;
FIG. 10 is a schematic view of a point cloud before and after filtering glass noise in some embodiments of the application;
FIG. 11 is a schematic diagram of a lidar according to some embodiments of the application;
fig. 12 is a schematic structural view of a robot according to some embodiments of the present application.
Detailed Description
The present application will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present application, but are not intended to limit the application in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present application.
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
It should be noted that, if not in conflict, the features of the embodiments of the present application may be combined with each other, which is within the protection scope of the present application. In addition, while functional block division is performed in a device diagram and logical order is shown in a flowchart, in some cases, the steps shown or described may be performed in a different order than the block division in the device, or in the flowchart. Moreover, the words "first," "second," "third," and the like as used herein do not limit the data and order of execution, but merely distinguish between identical or similar items that have substantially the same function and effect.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. The term "and/or" as used in this specification includes any and all combinations of one or more of the associated listed items.
In addition, the technical features of the embodiments of the present application described below may be combined with each other as long as they do not collide with each other.
In the embodiments of the application, the robot may be a mobile device capable of providing a functional service, for example a cleaning robot, a pet robot, a delivery robot, a nursing robot, a remote monitoring robot, a sweeping robot, or the like. The robot is provided with a lidar, and it builds maps, avoids obstacles, and positions itself based on the laser point cloud obtained by lidar scanning.
The method of filtering glass noise in the present application will be described below by way of example, taking a cleaning robot as the robot.
Referring to fig. 1, fig. 1 is a schematic view of an application environment for filtering glass noise according to an embodiment of the present application. As shown in fig. 1, the cleaning robot 100 is located on a floor, which may be the floor of a living room, an office, an outdoor area, or the like. The place where the cleaning robot 100 is located contains obstacles such as a desk, a flowerpot, a sofa, or a glass door.
The free walking of the cleaning robot 100 is mainly accomplished by means of several modules of mapping, positioning, navigation and obstacle avoidance. It will be appreciated that these modules are implemented by the sensors and corresponding control programs.
In some embodiments, the cleaning robot 100 is provided with a lidar and a visible light camera. The lidar scans the surroundings of the cleaning robot 100 to obtain a laser point cloud, and the visible light camera photographs the surroundings to acquire images. Both are communicatively connected to the controller and send the laser point cloud and the images to it, respectively. The controller calls a map-building program preset in the memory of the cleaning robot 100 and builds a semantic map based on the laser point cloud and the images. The map-building program may correspond to a SLAM (Simultaneous Localization and Mapping) algorithm, which is not described in detail here. The semantic map is saved in the memory of the cleaning robot 100. When the cleaning robot moves to work, the controller calls the semantic map as the basis for autonomous positioning, path planning, and obstacle avoidance.
It is understood that the SLAM algorithm covers both positioning and navigation. During positioning, the lidar is controlled to rotate at high speed and emit laser light, measuring the distance between the cleaning robot and obstacles; combined with the semantic map, the relative position between the cleaning robot and the obstacles is determined, realizing positioning. In some embodiments, the cleaning robot 100 may also be positioned visually based on the visible light camera. During navigation, a path is planned based on the current position and the cleaning destination, and the robot navigates along the path.
The cleaning robot 100 may detect an obstacle based on image information collected by the visible light camera and point cloud information collected by the laser radar. For example, an obstacle is identified from the image information using a pre-trained object detection model or semantic segmentation model. For example, an obstacle is detected based on the distance calculated from the point cloud information. When the cleaning robot 100 approaches an obstacle to some extent, the cleaning robot is controlled to change the traveling direction to avoid the obstacle.
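The distance-based part of this obstacle check can be sketched as follows; the scan representation as (angle, distance) pairs and the safety margin value are illustrative assumptions, not from the patent:

```python
def nearest_obstacle_distance(point_cloud):
    """Smallest range reading in a lidar scan given as (angle, distance) pairs."""
    return min(dist for _, dist in point_cloud)

def should_turn(point_cloud, safety_margin=0.25):
    """Change travel direction once the nearest obstacle comes closer
    than the safety margin (in the same units as the scan distances)."""
    return nearest_obstacle_distance(point_cloud) < safety_margin
```

This is exactly where glass noise hurts: spurious points with wrong distances make the robot misjudge the nearest obstacle, which motivates the filtering method of this application.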
The cleaning robot 100 may be configured in any suitable shape in order to achieve a specific business function operation, for example, in some embodiments, the cleaning robot 100 may be a SLAM system-based mobile robot. Among them, the cleaning robot 100 includes, but is not limited to, a sweeping robot, a dust collecting robot, a mopping robot, a washing robot, or the like.
In some embodiments, the cleaning robot 100 includes a main body and a driving wheel part, a sensing unit, a lidar, and a controller. The main body may be generally oval, triangular, D-shaped, or another shape. The controller is arranged on the main body. The main body is the body structure of the cleaning robot 100; its shape and materials (such as hard plastic, or metals such as aluminum and iron) can be chosen according to the actual needs of the cleaning robot 100, for example the flat cylindrical shape common to sweeping robots.
The driving wheel part is mounted on the main body and drives the robot to move over the surface to be cleaned. In some embodiments, the driving wheel part includes a left driving wheel, a right driving wheel, and an omnidirectional wheel, the left and right driving wheels being mounted on opposite sides of the main body. The omnidirectional wheel, mounted at a forward position on the bottom of the main body, is a movable caster that can rotate 360 degrees horizontally, allowing the cleaning robot to turn flexibly. The left driving wheel, right driving wheel, and omnidirectional wheel are installed to form a triangle, improving the stability of the robot's motion.
In some embodiments, the sensing unit is used for collecting some motion parameters of the robot and various data of the environmental space, and the sensing unit comprises various suitable sensors, such as a gyroscope, an odometer, a magnetic field meter, an accelerometer or a speedometer, and the like.
In some embodiments, the lidar is provided to the body of the cleaning robot 100, for example: the lidar is provided to a moving chassis of the body of the cleaning robot 100. The lidar is used to sense the condition of obstacles in the surrounding environment of the mobile cleaning robot 100, obtain the distance of surrounding objects, and send to the controller so that the controller controls the robot to walk based on the distance of the surrounding objects. In some embodiments, the lidar comprises a pulsed lidar, a continuous wave lidar, or the like, and the mobile chassis comprises a robotic mobile chassis such as a universal chassis, a vaulted mobile chassis, or the like.
In some embodiments, the controller is disposed inside the main body, is an electronic computing core built in the robot main body, and is configured to perform a logic operation step to implement intelligent control of the robot. The controller is electrically connected with the left driving wheel, the right driving wheel and the omnidirectional wheel respectively. The controller is used as a control core of the robot and is used for controlling the robot to walk, retreat, avoid the obstacle and process some business logic.
It is to be appreciated that the controller may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a single-chip microcomputer, an ARM processor, or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof. The controller may also be any conventional processor, controller, microcontroller, or state machine, or may be implemented as a combination of computing devices, e.g., a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP, or one or more of a micro-control unit (MCU), an FPGA, or a system on chip (SoC).
It will be appreciated that the memory of the cleaning robot 100 in the embodiments of the present application includes, but is not limited to: FLASH memory, NAND flash memory, vertical NAND flash memory (VNAND), NOR flash memory, resistive random-access memory (RRAM), magnetoresistive random-access memory (MRAM), ferroelectric random-access memory (FRAM), spin-transfer-torque random-access memory (STT-RAM), and the like.
It should be noted that, according to the task to be completed, besides the above functional modules, one or more other different functional modules (such as a water storage tank, a cleaning device, etc.) may be mounted on the main body of the cleaning robot, and cooperate with each other to perform the corresponding task.
However, the above-described lidar is an optical detection sensor that generates point cloud information by collecting reflected light. At present, more and more building exterior walls are built with two or more layers of tempered glass. Laser light striking multi-layer glass undergoes multiple reflections and is affected by sunlight, dust on the glass, and so on, which introduces great uncertainty into the extraction of the spot centroid. For objects such as glass, the point cloud generated by scanning contains a large number of noise points (glass noise points), which interfere with the cleaning robot's SLAM mapping, positioning, obstacle avoidance, and so on.
In some scenarios, the lidar transmitter emits laser light that strikes the surface of multi-layer glass and is reflected; the reflected light is received by the receiver, forming a light spot, whose centroid is extracted to obtain a laser point. As scanning proceeds, laser points are generated continuously, producing point cloud data in which noise points (glass noise points) with distance jumps appear at both ends. As shown in fig. 2, the distance between the cleaning robot and the glass is misjudged, burrs appear in the SLAM map, and the mapped wall is thicker than it should be, seriously affecting the cleaning robot's normal operation and the user experience.
In some glass-noise filtering schemes known to the inventors, the point cloud data is post-processed by an upper computer. However, this easily over-filters the point cloud and requires substantial computing power, occupying upper-computer resources.
In view of the above problems, some embodiments of the present application provide a method for filtering glass noise, in which, for each laser point in point cloud data, the attribute type of the laser point is marked according to the spot shape of the laser point, and the attribute type reflects whether the laser point is a suspected glass noise. If the laser point is marked as a suspected glass noise point, the point cloud form near the laser point is analyzed to determine whether the laser point is a glass noise point. If the laser point is a glass noise point, the laser point is removed from the point cloud data.
In this embodiment, glass noise is screened in two steps. First, laser points suspected of being glass noise are marked according to spot shape. Then, for each suspected point, the nearby point-cloud shape is analyzed to determine whether it really is glass noise, and confirmed glass noise points are removed from the point cloud data. In other words, coarse screening by spot shape quickly marks suspected glass noise points, and fine screening by nearby point-cloud shape accurately confirms them. Compared with applying either coarse or fine screening to all laser points in the point cloud, this two-step combination of spot shape and point-cloud shape both filters glass noise accurately and saves substantial computing power. The method can therefore run on low-compute devices (such as a single-chip microcomputer) and is well suited to implementation on the lidar itself, ensuring that the point cloud finally output by the lidar is accurate, reducing noise interference, and benefiting the accuracy of robot mapping and obstacle avoidance.
It will be appreciated from the foregoing that the method for filtering glass noise provided by the embodiments of the present application may be implemented by various types of electronic devices having computing processing capabilities, such as by lidar, by a controller of a robot, or by other devices having computing processing capabilities.
The method for filtering glass noise provided by the embodiment of the application is described below in connection with exemplary applications and implementations of the lidar provided by the embodiment of the application. Referring to fig. 3, fig. 3 is a flowchart illustrating a method for filtering glass noise according to an embodiment of the application.
It will be appreciated that the lidar is mounted to a robot, and in particular, the subject of execution of the method of filtering glass noise is one or more processors of the lidar.
As shown in fig. 3, the method S100 may specifically include the following steps:
s10: and marking the attribute type of the laser points according to the light spot form of each laser point in the point cloud data, wherein the attribute type reflects whether the laser points are suspected glass noise points or not.
The point cloud data is a laser point cloud obtained by scanning a laser radar in a working environment (such as a house). The spot shape refers to the shape feature of a spot formed on a receiver of the laser radar after the receiver receives the reflected laser. In some embodiments, the morphological feature may be a gray scale distribution feature of the light spot.
For easy understanding, the working principle of the lidar is briefly described here: the lidar includes a transmitter, a receiver, a processor, and a rotating mechanism. The transmitter is a device for emitting laser light and may be, for example, a gas laser, a solid-state laser, a semiconductor laser, or a free-electron laser. The receiver is a device for receiving laser light and may be, for example, a charge-coupled device (CCD) sensor.
The processor is mainly responsible for controlling the transmitter to emit laser light and for processing the laser signals received by the receiver to calculate the distance information of the target object. The processor may be an integrated circuit chip with signal processing capability, such as a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The rotating mechanism is a laser radar installation framework and is used for direction adjustment. In some embodiments, the rotation mechanism may include a rotating base that is rotated by a belt. The transmitter, receiver and processor are disposed on a rotating mechanism that rotates at a steady rotational speed, whereby the lidar can scan the surrounding environment and generate point cloud data.
The ranging method adopted by the lidar is triangulation; fig. 4 is a schematic diagram of its principle. In triangulation, the transmitter emits a beam of laser light that strikes the target object at a certain incident angle; the laser is reflected and scattered at the object's surface, and the reflected laser is converged by a lens at another angle to form a spot on the receiver. Since the transmitter and receiver are separated by a distance s, objects at different distances are, by the optical path, imaged at different positions on the receiver. As shown in fig. 4, s is the distance between the focal points of the transmitter and the receiver (i.e., the reference line), d is the distance between the transmitter and the target object, α is the heading angle, p is the vertical distance from the target object to the reference line, and f is the focal length. The dashed line through the focal point parallel to the laser direction intersects the receiver at a known point A, and the center of the imaging point B formed on the receiver after laser reflection is at a distance x from A. From the geometry of similar triangles, the distance p = f * s / x.
In some embodiments, the ranging formula for triangulation is as follows:
d=n1/(n2-cx) (1)
where n1 and n2 are ranging parameters obtained by calibrating the lidar, and cx is the centroid of the light spot (i.e., its brightest place), namely the abscissa of the spot on the receiver in pixels; when the receiver is 480 pixels wide, cx ranges from 1 to 480. The n1 parameter is related to the structure of the lidar, and the n2 parameter is related to the position of the light spot on the receiver.
The point cloud data generated by the lidar using triangulation includes the angle, distance, and brightness of each laser point. The angle is the polar angle of the scanning point where the laser hits the target object, the distance is the distance from that scanning point to the lidar, and the brightness is the brightness of the spot on the receiver, i.e., its gray level.
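As an illustration of formula (1), a minimal sketch of converting a centroid into a distance; the calibration values n1 and n2 below are made-up placeholders for demonstration, not values from the application:

```python
def triangulation_distance(cx, n1, n2):
    """Triangulation ranging formula (1): d = n1 / (n2 - cx), where cx is the
    spot centroid (pixel abscissa on the receiver) and n1, n2 are calibrated
    ranging parameters."""
    denom = n2 - cx
    if denom <= 0:
        raise ValueError("centroid must lie below the calibration asymptote n2")
    return n1 / denom

# Hypothetical calibration values, chosen only to show the shape of the formula.
n1, n2 = 120000.0, 500.0
d1 = triangulation_distance(400.0, n1, n2)  # centroid near the asymptote -> larger d
d2 = triangulation_distance(100.0, n1, n2)
```

Note how the distance grows without bound as cx approaches n2, which is why small centroid errors (the glass noise cases below) translate into large ranging errors.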
In some application scenarios, laser light is reflected and scattered differently by surfaces of different materials. For example, laser light is reflected multiple times inside multi-layer glass and is affected by sunlight, dust on the glass, and the like, which changes the shape of the spot. That is, the spot form allows a rough distinction between a normal laser point and a noise point.
Here, the attribute type of the laser spot is marked according to the spot form of the laser spot, and reflects whether the laser spot is a suspected glass noise.
Wherein the attribute category includes normal laser points or suspected glass noise points. It will be appreciated that if a laser spot is marked as a suspected glass noise, it is indicated that the laser spot may be a glass noise.
In some embodiments, the foregoing step S10 specifically includes:
s11: and carrying out gray statistics on the light spots of the laser spots to obtain light spot curves.
S12: and marking the attribute category of the laser spot according to the spot curve.
It can be understood that the spot received by the receiver is formed by the reflected laser being converged and imaged through the lens, so the spot occupies a certain range of pixels (as noted above, the receiver width is measured in pixels) and has a gray-level distribution that is high (bright) in the middle and low (dark) at the edges. Therefore, gray-level statistics on the spot yield a spot curve. As shown in fig. 5, the abscissa of the spot curve is the pixel position (in pixels), and the ordinate is the gray value.
It will be appreciated that different spots correspond to different waveforms in the spot curve. For example, the peak of the spot curve may be shifted to one side, causing the spot centroid to shift accordingly; or multiple peaks or relatively small peaks may appear, making the spot selection inaccurate. Thus, the attribute category of the laser point can be distinguished from the spot curve and marked.
In this embodiment, the light spot is converted into a light spot curve, which is beneficial to identifying the light spot characteristics and thus to the accuracy of the marking attribute category.
In some embodiments, the foregoing step S12 specifically includes:
s121: and determining the effective area of the light spot in the light spot curve.
As can be seen from fig. 5, the gray value in the middle area of the spot curve is high, corresponding to the spot center; the gray values on two sides of the light spot curve are low, and the gray values correspond to the light spot edges. Here, a spot effective area in the spot curve is extracted. The light spot effective area may be an area where the peak is located, i.e. an area with a large gray scale.
In some embodiments, a spot effective area (also called a blob area) is extracted from the spot curve. The extraction algorithm adopted is a BLOB extraction algorithm, which is an algorithm for dividing the foreground and the background of the image according to a certain rule. It will be appreciated that the BLOB extraction algorithm is an existing algorithm and will not be described in detail here.
As shown in fig. 5, the spot effective area (blob area) is the area between vertical lines 1# and 3#. The width blob_width of the spot effective area is the abscissa of vertical line 3# minus the abscissa of vertical line 1#. A gray-weighted calculation within the blob area yields the centroid cx; vertical line 2# in fig. 5 marks the centroid position.
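The blob extraction and gray-weighted centroid can be sketched as follows. This is a simplified stand-in for the BLOB algorithm (thresholded region growth around the brightest pixel), not the application's exact implementation, and the example curve and threshold are illustrative:

```python
def extract_blob(curve, thr):
    """Simplified BLOB extraction: grow a window around the brightest pixel of the
    spot curve while gray values stay above thr. Returns (start, end) inclusive."""
    peak = max(range(len(curve)), key=lambda i: curve[i])
    start = peak
    while start > 0 and curve[start - 1] > thr:
        start -= 1
    end = peak
    while end < len(curve) - 1 and curve[end + 1] > thr:
        end += 1
    return start, end

def gray_weighted_centroid(curve, start, end):
    """Gray-weighted centroid cx inside the blob area (vertical line 2# in fig. 5)."""
    weights = curve[start:end + 1]
    return sum(i * w for i, w in zip(range(start, end + 1), weights)) / sum(weights)

# Illustrative spot curve: gray value per receiver pixel.
spot_curve = [2, 3, 10, 40, 90, 100, 80, 30, 5, 2]
start, end = extract_blob(spot_curve, 4)
blob_width = end - start          # abscissa of 3# minus abscissa of 1#
cx = gray_weighted_centroid(spot_curve, start, end)
```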
S122: and marking the attribute category of the laser spot according to the waveform characteristics of the curve in the effective area of the light spot.
Here, the attribute category of the laser spot (belonging to a normal laser spot or a glass noise spot) is identified from the waveform characteristics of the curve in the blob area, such as the peak characteristics, and the attribute category is marked for the laser spot.
In some embodiments, the inventors collected spot data from different lidars at different distances and angles in multi-layer glass scenes and summarized three spot characteristics that cause glass noise points: multiple peaks, offset peaks, and burrs on both sides of the main peak.
In some embodiments, the step S122 specifically includes:
(a) If the number of peaks of the curve in the spot effective area is greater than or equal to a third threshold, mark the attribute category of the laser point as a suspected glass noise point.
The third threshold is an empirical value set by those skilled in the art based on the characteristics of the point cloud in the glass scene, for example, the third threshold may be 2 or 3.
As shown in fig. 6 (a) or (b), 3 peaks appear in the curve within the spot effective area. Multiple peaks can introduce errors into the determination of the spot effective area and thereby cause large ranging deviations. For this type of spot, the attribute category of the laser point can be determined by counting the peaks and then marked.
If the number of peaks of the curve in the spot effective area is greater than or equal to the third threshold, multiple peaks appear and the spot brightness is not concentrated; the point is not a normal laser point, so it is marked as a suspected glass noise point.
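Peak counting over the blob area can be sketched as follows; the local-maximum rule and the third threshold of 2 are illustrative choices, not the application's exact criterion:

```python
def count_peaks(curve, start, end):
    """Count local maxima of the spot curve within [start, end] inclusive."""
    peaks = 0
    for i in range(max(start, 1), min(end, len(curve) - 2) + 1):
        if curve[i] > curve[i - 1] and curve[i] >= curve[i + 1]:
            peaks += 1
    return peaks

def suspected_by_peak_count(curve, start, end, third_threshold=2):
    """Suspected glass noise when the peak count reaches the third threshold."""
    return count_peaks(curve, start, end) >= third_threshold

single_peak = [1, 5, 20, 60, 100, 70, 25, 6, 1]  # normal spot: one concentrated peak
multi_peak = [1, 40, 10, 60, 15, 55, 8, 1]       # glass-like spot: 3 peaks
```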
In some embodiments, the step S122 specifically includes:
(b) Count, on each side of the spot effective area, the number of outer pixels whose gray value is greater than a first gray threshold, obtaining a left pixel count and a right pixel count.
(c) Determine, for each side of the spot effective area, whether a gradient direction change occurs, obtaining a left gradient direction change result and a right gradient direction change result.
(d) Mark the attribute category of the laser point according to the left pixel count, the right pixel count, the left gradient direction change result, and the right gradient direction change result.
It can be understood that, because the gray level (brightness) of the spot is concentrated at the centroid, under normal conditions the gray values inside the spot effective area are large, those on both sides are small, and the gray value changes with a steep gradient at the area boundary. If the spot centroid shifts, as shown in fig. 7 (a) or (b), the spot and its peak shift in the spot curve, the gray-value gradient at the area boundary becomes shallow, and the spot is hard to distinguish.
In this embodiment, the attribute type of the laser spot is determined from the gray value size case of both sides and the gradient direction change case. Specifically, the number of pixels with outer gray values larger than a first gray threshold value at two sides of the effective area of the light spot is counted respectively, and the number of pixels at the left side and the number of pixels at the right side are obtained.
Wherein the first gray threshold is an empirical value determined by a person skilled in the art based on the spot characteristics in a glass scene. In some embodiments, the first gray threshold is determined using the following formula:
gray_thr1=min_gray+0.5*(max_gray-min_gray) (2)
where gray_thr1 is the first gray threshold, max_gray is the maximum gray value in the spot effective area (or in the area obtained by flaring the spot effective area outward, for example by 50 pixels on each side), and min_gray is the minimum gray value in the spot effective area (or in the flared area).
In this embodiment, the first gray threshold is determined based on gray values in the spot effective area or its flare area, so that the first gray threshold is more reasonable.
And counting the number of pixels with gray values larger than a first gray threshold value on the left side of the light spot effective area or the left side of the outer expansion area, and obtaining the left pixel number left_pixel_cnt.
Count the number of pixels whose gray value is greater than the first gray threshold on the right side of the spot effective area (or of the flared area), obtaining the right pixel count right_pixel_cnt.
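The first gray threshold of formula (2) and the per-side pixel counts can be sketched as follows; the flare width of 3 pixels and the example curve are illustrative stand-ins (the application mentions flaring by, e.g., 50 pixels):

```python
def first_gray_threshold(curve, start, end, flare=50):
    """Formula (2): gray_thr1 = min_gray + 0.5 * (max_gray - min_gray),
    computed over the blob area flared outward by `flare` pixels per side."""
    lo = max(0, start - flare)
    hi = min(len(curve), end + 1 + flare)
    region = curve[lo:hi]
    return min(region) + 0.5 * (max(region) - min(region))

def side_pixel_counts(curve, start, end, thr):
    """Count pixels brighter than thr outside the blob area, on each side."""
    left_pixel_cnt = sum(1 for g in curve[:start] if g > thr)
    right_pixel_cnt = sum(1 for g in curve[end + 1:] if g > thr)
    return left_pixel_cnt, right_pixel_cnt

# Illustrative curve with one abnormally bright pixel on each side of the blob (3..8).
side_curve = [5, 60, 8, 10, 40, 90, 100, 80, 30, 7, 65, 6]
thr1 = first_gray_threshold(side_curve, 3, 8, flare=3)
left_cnt, right_cnt = side_pixel_counts(side_curve, 3, 8, thr1)
```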
And respectively determining whether gradient direction changes occur at two sides of the effective area of the light spot to obtain a left gradient direction change result and a right gradient direction change result. The gradient direction change refers to a change trend (gradient direction) of a curve in the light spot curve.
In some embodiments, the gradient value is calculated using the following formula:
delta_gradient(i)=P(i)-P(i+step) (3)
wherein P (i) is the gray value of the ith pixel, P (i+step) is the gray value of the (i+step) th pixel, step is the number of steps, and delta_gradient (i) is the gradient value. In some embodiments, step may be 3, taking into account local gray value fluctuations.
In some embodiments, the gradient direction change result is determined as follows:
flag_change = 1, if delta_gradient(i) * delta_gradient(i+1) < 0 and P(i+1) > gray_thr; otherwise flag_change = 0
where gray_thr is a gray threshold, for example the first gray threshold. flag_change = 1 means the gradient direction has changed; flag_change = 0 means it has not.
In this embodiment, if the gradient value of the i-th pixel and the gradient value of the (i+1)-th pixel differ in sign, the trend of the curve has reversed, i.e., the gradient direction may have changed. On this basis, the gray value of the (i+1)-th pixel is further compared with the gray threshold; if P(i+1) > gray_thr, an abnormally large gray value exists outside the spot effective area, so a gradient direction change is determined, i.e., flag_change = 1.
It can be understood that the left side and the right side of the effective area or the outer expansion area of the light spot are respectively judged by the above-mentioned flag_change formula, so as to obtain a left side gradient direction change result left_flag_change and a right side gradient direction change result right_flag_change.
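The gradient computation and the gradient-direction-change check described above can be sketched as follows; the example curves, scan range, and threshold are made up for illustration:

```python
def delta_gradient(curve, i, step=3):
    """Gradient value at pixel i: P(i) - P(i + step); step = 3 smooths local
    gray-value fluctuations."""
    return curve[i] - curve[i + step]

def gradient_direction_changed(curve, lo, hi, gray_thr, step=3):
    """flag_change: 1 when two consecutive gradient values differ in sign AND the
    gray value at the next pixel still exceeds gray_thr (an abnormally bright pixel
    outside the blob area); otherwise 0. Scans pixels lo .. hi-1, so the curve
    must extend at least hi + 1 + step pixels."""
    for i in range(lo, hi):
        if (delta_gradient(curve, i, step) * delta_gradient(curve, i + 1, step) < 0
                and curve[i + 1] > gray_thr):
            return 1
    return 0

falling = [100, 80, 60, 40, 20, 10, 5, 3, 2, 1]  # monotone falling side: no change
bumpy = [100, 80, 60, 40, 20, 70, 65, 3, 2, 1]   # bright bump outside the blob
no_change = gradient_direction_changed(falling, 0, 4, 50)
changed = gradient_direction_changed(bumpy, 0, 4, 50)
```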
Finally, determining the attribute category of the laser point according to the left pixel number left_pixel_cnt, the right pixel number right_pixel_cnt, the left gradient direction change result left_flag_change and the right gradient direction change result right_flag_change, and marking the attribute category of the laser point.
In some embodiments, determining and marking the attribute category of the laser point according to the left pixel count left_pixel_cnt, the right pixel count right_pixel_cnt, the left gradient direction change result left_flag_change, and the right gradient direction change result right_flag_change specifically includes: if the left gradient direction changes (i.e., left_flag_change = 1) and the left pixel count left_pixel_cnt is greater than a first threshold, the left attribute category is determined to be a suspected glass noise point; if the right gradient direction changes (i.e., right_flag_change = 1) and the right pixel count right_pixel_cnt is greater than a second threshold, the right attribute category is determined to be a suspected glass noise point. When both the left attribute category and the right attribute category are suspected glass noise points, the laser point is marked as a suspected glass noise point.
The first threshold is a critical value for judging the left pixel count left_pixel_cnt; when left_pixel_cnt is greater than the first threshold, there are many abnormal gray values on the left, and the laser point may be a suspected glass noise point. In some embodiments, the first threshold may be 0.4 times blob_width (the width of the spot effective area). Combining the left gradient direction (whether it changed) with the left pixel count (whether left_pixel_cnt exceeds the first threshold) allows the left attribute category to be determined accurately.
The second threshold is a critical value for judging the right pixel count right_pixel_cnt; when right_pixel_cnt is greater than the second threshold, there are many abnormal gray values on the right, and the laser point may be a suspected glass noise point. In some embodiments, the second threshold may be 0.4 times blob_width. Combining the right gradient direction with the right pixel count allows the right attribute category to be determined accurately.
In this embodiment, suspected glass noise is judged on both sides separately, and the laser point is marked as a suspected glass noise point when both the left and right attribute categories are suspected glass noise points, which makes the marked attribute category more accurate.
In some embodiments, the attribute category of the laser point may be determined as follows:
left_flag = 1, if left_flag_change = 0 and left_pixel_cnt > 0.4 * blob_width; otherwise left_flag = 0
right_flag = 1, if right_flag_change = 0 and right_pixel_cnt > 0.4 * blob_width; otherwise right_flag = 0
flag = 1, if (left_flag = 1 or right_flag = 1) and blob_width < the preset threshold; otherwise flag = 0
where left_flag represents the left attribute category and right_flag represents the right attribute category. The left attribute category is determined from the left side of the spot effective area in the spot curve, and the right attribute category from its right side. flag represents the finally determined attribute category, i.e., the attribute category the laser point is marked with.
For the left side of the effective area of the light spot in the light spot curve, if the left side gradient direction change result left_flag_change=0 (namely, the normal laser spot) and the number of left pixels left_pixel_cnt is greater than 0.4 times of blob_width (the width of the effective area of the light spot), determining that left_flag=1, namely, the left side attribute type is suspected glass noise point. Otherwise, left_flag=0 is determined, i.e., the left attribute category is normal.
Similarly, for the right side of the effective area of the light spot in the light spot curve, if the right side gradient direction change result is right_flag_change=0 (i.e. the normal laser spot), and the number of right pixels right_pixel_cnt is greater than 0.4 times of blob_width (the width of the effective area of the light spot), determining that the right_flag=1, i.e. the right attribute type is suspected glass noise. Otherwise, it is determined that right_flag=0, i.e., the right attribute category is normal.
Finally, the final attribute class flag is comprehensively determined according to the left attribute class, the right attribute class and the blob_width (the width of the effective area of the light spot). In this embodiment, if the attribute type on the left side is the suspected glass noise or the attribute type on the right side is the suspected glass noise, and the blob_width is smaller than the preset threshold, it is determined that flag=1, that is, the attribute type of the laser spot is the suspected glass noise, otherwise, flag=0, and the attribute type of the laser spot is the normal laser spot.
In some embodiments, the preset threshold is 0.6×ref_width, where ref_width is the fit blob width. In some embodiments, due to structural differences between radars, a large amount of radar data is collected to fit a functional relationship between centroid and ref_width, and in addition, to ensure uniformity of functions, offset due to the structure is considered. The fitted function is as follows:
ref_width=k (cx-offset) +b, where k, offset, b is the calibration parameter and cx is the centroid.
In this embodiment, the attribute category flag of the laser point is determined by comprehensively considering the left attribute category, the right attribute category, blob_width, and other dimensions, so that laser points with an offset peak (a shifted spot centroid) can be effectively distinguished, making the attribute category of the laser point more accurate.
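A hedged sketch of the offset-peak decision described in this embodiment. The condition structure follows the textual description (a side is suspicious when its gradient direction did not change yet it has more than 0.4 * blob_width bright pixels; the final flag additionally requires blob_width below 0.6 * ref_width); treat it as one illustrative reading, not the application's exact formula, and the fit parameters below are placeholders:

```python
def fitted_ref_width(cx, k, offset, b):
    """Fitted blob width: ref_width = k * (cx - offset) + b, with radar-specific
    calibration parameters k, offset, b."""
    return k * (cx - offset) + b

def offset_peak_flag(left_flag_change, right_flag_change,
                     left_pixel_cnt, right_pixel_cnt,
                     blob_width, ref_width):
    """flag = 1 (suspected glass noise) when either side is suspicious and the
    blob is narrower than 0.6 * ref_width; otherwise 0."""
    left_flag = 1 if left_flag_change == 0 and left_pixel_cnt > 0.4 * blob_width else 0
    right_flag = 1 if right_flag_change == 0 and right_pixel_cnt > 0.4 * blob_width else 0
    return 1 if (left_flag or right_flag) and blob_width < 0.6 * ref_width else 0
```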
In some embodiments, the step S122 specifically includes:
(e) Count the number of pixels in the spot effective area whose gray value is greater than a second gray threshold, obtaining an area pixel count.
(f) Mark the attribute category of the laser point according to the area pixel count.
It will be appreciated that in the spot curves shown in fig. 8 (a) or (b), the main peak is relatively bright but is not necessarily the region of the actual laser spot, so the calculated centroid will also deviate. Unlike the multiple-peak case, the remaining peaks are relatively low compared with the main peak. In this case blob_width is relatively small, while the width of the blob bottom region is significantly larger than blob_width.
In this embodiment, the attribute category of the laser spot is determined from the gray value size in the spot effective area. Specifically, counting the number of pixels with gray values larger than a second gray threshold in the effective area of the light spot to obtain the number of pixels in the area.
Wherein the second gray level threshold is an empirical value determined by one skilled in the art based on the characteristics of the spot in the glass scene. In some embodiments, the second gray level threshold is determined using the following equation:
gray_thr2=min_gray+0.2*(max_gray-min_gray)
where gray_thr2 is the second gray threshold, max_gray is the maximum gray value in the spot effective area (or in the area obtained by flaring the spot effective area outward, for example by 50 pixels on each side), and min_gray is the minimum gray value in the spot effective area (or in the flared area).
In this embodiment, the second gray level threshold is determined based on gray level values in the effective area of the spot or its flared area, so that the second gray level threshold is more reasonable.
The number of pixels in the spot effective area whose gray value is greater than the second gray threshold is counted, obtaining the area pixel count pixel_cnt. The attribute category of the laser point is then determined from this count and marked. In some embodiments, the area pixel count pixel_cnt is compared with a threshold; when pixel_cnt is smaller than the threshold, the spot centroid is abnormal and the point may be a glass noise point, so the attribute category of the laser point is determined to be a suspected glass noise point.
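The second gray threshold and the area pixel count can be sketched as follows; the flare width and example curve are illustrative:

```python
def second_gray_threshold(curve, start, end, flare=50):
    """gray_thr2 = min_gray + 0.2 * (max_gray - min_gray), computed over the
    blob area flared outward by `flare` pixels per side."""
    lo = max(0, start - flare)
    hi = min(len(curve), end + 1 + flare)
    region = curve[lo:hi]
    return min(region) + 0.2 * (max(region) - min(region))

def area_pixel_count(curve, start, end, thr):
    """Number of pixels inside the blob area brighter than the second gray threshold."""
    return sum(1 for g in curve[start:end + 1] if g > thr)

# Illustrative narrow main peak: few pixels rise above the low second threshold.
area_curve = [5, 5, 10, 30, 100, 35, 12, 5, 5]
thr2 = second_gray_threshold(area_curve, 2, 6, flare=2)
cnt = area_pixel_count(area_curve, 2, 6, thr2)
```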
In some embodiments, the step S122 specifically includes:
(g) And marking the attribute category of the laser spot according to the width of the effective area of the spot.
It can be understood that if the width of the effective area of the light spot is smaller than the threshold value, the effective area of the light spot is narrower, and the centroid of the light spot is abnormal and possibly is a glass noise point, so that the attribute category of the laser spot is determined to be a suspected glass noise point.
In some embodiments, the attribute category of the laser point is determined from blob_width, ref_width, and pixel_cnt. Here flag is the attribute category of the laser point: flag = 1 indicates a suspected glass noise point, and flag = 0 a normal laser point. blob_width is the width of the spot effective area, ref_width is the fitted blob width, and pixel_cnt is the area pixel count.
In this embodiment, the attribute category flag of the laser point is determined from the two dimensions of the spot effective area width and the area pixel count, so that laser points with burrs near the main peak (abnormal brightness near the spot centroid) can be effectively distinguished, making the attribute category of the laser point more accurate.
S20: if the laser point is marked as a suspected glass noise point, the point cloud form near the laser point is analyzed to determine whether the laser point is a glass noise point.
For a laser point marked as a suspected glass noise point, the point cloud morphology near it is further analyzed to determine whether it is a glass noise point. The point cloud near the laser point includes several laser points to its left and/or right, and the point cloud morphology includes the distribution of distance, brightness, etc. of the laser points in that point cloud.
That is, in the step S10, coarse screening is performed according to the spot shape, so that the suspected glass noise can be marked quickly. In this step S20, the suspected glass noise is finely screened according to the nearby point cloud morphology, so that it can be accurately determined whether it is a glass noise.
In some embodiments, the foregoing step S20 specifically includes: sliding a sliding window of preset length over the point cloud data, and, when the laser point is located at the middle position of the sliding window, determining whether the laser point is a glass noise point according to the morphology of the laser points within the sliding window.
The preset length is an empirical value set by those skilled in the art based on practical situations. In some embodiments, the preset length is an odd number, for example 11, so that the laser point can sit exactly in the middle of the sliding window. It will be appreciated that the sliding window is a virtual window that traverses the point cloud to collect the points near the laser point; it can be characterized by its length and displacement, for example a preset length of 11 and a single-step displacement of 1 laser point (one unit of angular resolution).
As shown in fig. 9, the current laser spot located in the middle of the sliding window is a suspected glass noise spot, and the plurality of laser spots in the sliding window includes 5 consecutive laser spots on one side of the current laser spot and 5 consecutive laser spots on the other side.
Thus, whether the laser point is a glass noise point can be determined from the distribution of the 11 laser points, such as their distances and the valid points among them. Illustratively, if the distance of the current laser point deviates greatly, the current laser point is a glass noise point; if the number of valid laser points among the 11 is small, the current laser point is a glass noise point.
In some embodiments, the determining whether the laser spot is a glass noise spot according to the morphology of the plurality of laser spots in the sliding window specifically includes: and determining whether the laser spot is a glass noise spot according to the average distance of the laser spot in the sliding window and/or the effective laser spot in the sliding window.
Taking the sliding window in fig. 9 as an example, the average distance between the valid laser points in the sliding window and the current laser point is calculated; when this average distance is greater than or equal to a distance threshold, the point cloud distance within the sliding window changes sharply. Since preliminary screening has already been performed in step S10, when the average distance is too large the current laser point is determined to be a glass noise point.
In some embodiments, whether the current laser point is a glass noise point is determined based on the valid laser points in the sliding window. It will be appreciated that point cloud data may contain invalid laser points, such as points collected toward empty space or within the radar's blind zone, which have no distance output; a valid laser point is one with a distance output. Thus, based on the valid laser points, whether the current laser point is a glass noise point can be determined. For example, if the number of valid laser points is less than a number threshold, the point cloud is defective and the current laser point is determined to be a glass noise point.
In some embodiments, the following formulas are used to determine whether the current laser point is a glass noise point:
result = 1, if sum_dist > factor * current_dist * valid_cnt or valid_cnt < cnt_thr; otherwise result = 0
sum_dist = Σ |dist(valid_idx) - current_dist|
wherein,
factor=2.5*2*sin(angle_resolution*0.5)
wherein result = 1 represents that the current laser point is determined to be a glass noise point, and result = 0 that it is a normal laser point. sum_dist is the sum of the absolute differences between the distances of the valid laser points within the sliding window and the distance of the current laser point. dist is the distance of a valid laser point within the sliding window; current_dist is the distance of the current laser point; valid_idx is the index of a valid laser point. factor is a distance coefficient related to the angular resolution angle_resolution. valid_cnt is the number of valid laser points within the sliding window; cnt_thr is a number threshold, i.e., the minimum number of valid laser points within the sliding window.
By adopting the above formula, when sum_dist is greater than factor current_dist valid_cnt or valid_cnt < cnt_thr, result=1 is determined, that is, the current laser point is determined to be the glass noise point. Otherwise, result=0 is determined, i.e. the current laser spot is determined to be a normal laser spot. The above formula comprehensively considers a plurality of factors, and can accurately judge whether the laser point is a glass noise point.
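A sketch of the sliding-window decision, under stated assumptions: angle_resolution is taken in radians, the current point is excluded from the valid points, and cnt_thr = 5 is a placeholder value:

```python
import math

def is_glass_noise(window, center_idx, angle_resolution_rad, cnt_thr=5):
    """Sliding-window check. `window` is a list of distances (None = invalid point);
    center_idx indexes the current (suspected) laser point. Returns result:
    1 when sum_dist > factor * current_dist * valid_cnt or valid_cnt < cnt_thr,
    with factor = 2.5 * 2 * sin(angle_resolution / 2); otherwise 0."""
    current_dist = window[center_idx]
    valid = [d for i, d in enumerate(window) if d is not None and i != center_idx]
    valid_cnt = len(valid)
    if valid_cnt < cnt_thr:
        return 1  # point cloud defect around the current point
    factor = 2.5 * 2 * math.sin(angle_resolution_rad * 0.5)
    sum_dist = sum(abs(d - current_dist) for d in valid)
    return 1 if sum_dist > factor * current_dist * valid_cnt else 0

flat_window = [1000.0] * 11                           # smooth wall: kept
jump_window = [1000.0] * 5 + [1500.0] + [1000.0] * 5  # center point juts out: noise
```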
S30: if the laser point is a glass noise point, the laser point is removed from the point cloud data.
It can be appreciated that in the case where the laser point is a glass noise point, the laser point is removed from the point cloud data to achieve filtering of the glass noise point. Therefore, the point cloud data after glass noise is filtered, noise interference is reduced, and the accuracy of robot image construction and obstacle avoidance is facilitated.
As shown in fig. 10, the point cloud data is highlighted as white dotted lines. Before glass noise filtering, the point cloud on both sides of the double-layer glass is disordered and radial; after filtering, the overall shape of the point cloud is normal. This further verifies that the method for filtering glass noise has a good filtering effect: the radial glass noise points are filtered out while the normal glass point cloud is retained.
In addition, the method can acquire richer laser spot characteristic information and solves the glass noise problem at the radar bottom layer, which benefits subsequent SLAM mapping and positioning. More importantly, its computing power requirement is low, making it feasible to implement on an ordinary single-chip microcontroller.
In summary, the method for filtering glass noise provided by the embodiment of the application screens glass noise points in two steps. First, laser points suspected of being glass noise are marked according to the spot form. Then, for each suspected glass noise point, the nearby point cloud morphology is further analyzed to determine whether it is a glass noise point, and confirmed glass noise points are removed from the point cloud data. That is, coarse screening by spot form marks suspected glass noise points quickly, and fine screening of those points by nearby point cloud morphology determines accurately whether they are glass noise points. Compared with applying either coarse or fine screening to all laser points in the point cloud data, combining the spot form and the point cloud morphology in two steps not only filters glass noise accurately but also effectively saves computing power. The method can therefore run on low-compute devices (such as a single-chip microcontroller) and is easy to implement on a lidar, so that the point cloud data finally output by the lidar is accurate, noise interference is reduced, and the accuracy of robot mapping and obstacle avoidance is improved.
The embodiment of the application also provides a laser radar, referring to fig. 11, fig. 11 is a schematic hardware structure diagram of the laser radar according to the embodiment of the application.
As shown in fig. 11, the lidar 300 includes at least one processor 301 and a memory 302 that are communicatively coupled (for example over a bus; one processor is taken as an example in fig. 11).
The processor 301 is configured to provide computing and control capabilities to control the lidar 300 to perform corresponding tasks, for example, to control the lidar 300 to perform the method of filtering glass noise in any of the method embodiments described above.
The processor 301 may be a general purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), a hardware chip, or any combination thereof; it may also be a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic device (programmable logic device, PLD), or a combination thereof. The PLD may be a complex programmable logic device (complex programmable logic device, CPLD), a field-programmable gate array (field-programmable gate array, FPGA), generic array logic (generic array logic, GAL), or any combination thereof.
The memory 302 serves as a non-transitory computer readable storage medium, and may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to a method for filtering glass noise in an embodiment of the present application. The processor 301 may implement the method for filtering glass noise in any of the above method embodiments by running the non-transitory software program, instructions and modules stored in the memory 302, that is, may implement each of the processes implemented in fig. 3-9, and for avoiding repetition, the description is omitted here.
In particular, the memory 302 may include Volatile Memory (VM), such as random access memory (random access memory, RAM); the memory 302 may also include a non-volatile memory (NVM), such as read-only memory (ROM), flash memory (flash memory), hard disk (HDD) or Solid State Drive (SSD), or other non-transitory solid state storage devices; memory 302 may also include a combination of the types of memory described above.
In an embodiment of the application, the memory 302 may also include memory located remotely from the processor, which may be connected to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In the embodiment of the present application, the lidar 300 may further have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
Referring to fig. 12 again, fig. 12 is a schematic structural diagram of a robot according to an embodiment of the application.
As shown in fig. 12, the robot 400 includes: the system comprises a laser radar 300 and a controller 401, wherein the laser radar 300 is in communication connection with the controller 401, and the controller 401 is used for performing mapping or positioning and the like based on point cloud data which is sent by the laser radar 300 and is used for enabling the robot 400 to complete related services (such as cleaning services).
It is understood that the lidar 300 has the same structure and function as the lidar in the above embodiment, and will not be described in detail herein.
It should be noted that the above-described apparatus embodiments are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
From the above description of embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a general purpose hardware platform, or may be implemented by hardware. Those skilled in the art will appreciate that all or part of the processes implementing the methods of the above embodiments may be implemented by a computer program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and where the program may include processes implementing the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), or the like.
Finally, it should be noted that the above embodiments only illustrate the technical solution of the present application and do not limit it. The technical features of the above embodiments, or of different embodiments, may be combined within the idea of the application, the steps may be implemented in any order, and many other variations of the different aspects of the application exist that are not described in detail for the sake of brevity. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (10)

1. A method of filtering glass noise, comprising:
for each laser point in the point cloud data, marking an attribute category of the laser point according to the light spot morphology of the laser point, wherein the attribute category reflects whether the laser point is a suspected glass noise point;
if the laser point is marked as a suspected glass noise point, analyzing the point cloud form near the laser point, and determining whether the laser point is the glass noise point;
and if the laser point is a glass noise point, eliminating the laser point from the point cloud data.
2. The method of claim 1, wherein the marking the attribute category of the laser point according to the light spot morphology of the laser point comprises:
carrying out gray statistics on the light spot of the laser point to obtain a light spot curve;
determining a light spot effective area in the light spot curve;
and marking the attribute category of the laser point according to the waveform characteristics of the curve in the light spot effective area.
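The steps of claim 2 can be sketched as follows, assuming the spot is given as a small gray-level image and the "effective area" is taken to be the contiguous run of columns whose summed gray value exceeds a threshold; both assumptions are illustrative, not stated in the patent.

```python
def spot_curve(spot_image):
    """Gray statistics: sum the gray values of each column of the spot
    image to obtain a 1-D light spot curve."""
    return [sum(column) for column in zip(*spot_image)]

def effective_area(curve, threshold):
    """Return the (first, last) column indices where the curve exceeds the
    threshold, i.e. the light spot effective area, or None if no column
    qualifies."""
    indices = [i for i, v in enumerate(curve) if v > threshold]
    return (indices[0], indices[-1]) if indices else None
```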
3. The method of claim 2, wherein the marking the attribute category of the laser point according to the waveform characteristics of the curve in the light spot effective area comprises:
counting, on each of the two sides outside the light spot effective area, the number of pixels whose gray values are greater than a first gray threshold, to obtain a number of left pixels and a number of right pixels;
respectively determining whether gradient direction changes occur at two sides of the light spot effective area to obtain a left gradient direction change result and a right gradient direction change result;
marking the attribute category of the laser point according to the number of the left pixels, the number of the right pixels, the left gradient direction change result and the right gradient direction change result.
4. A method according to claim 3, wherein said marking the attribute category of the laser spot based on the number of left pixels, the number of right pixels, and the left and right gradient direction change results, comprises:
if the left gradient direction is changed and the number of the left pixels is larger than a first threshold value, determining that the left attribute type is a suspected glass noise point;
if the right gradient direction is changed and the number of the right pixels is larger than a second threshold value, determining that the right attribute type is a suspected glass noise point;
and marking the attribute category of the laser point as the suspected glass noise point under the condition that the left attribute category is the suspected glass noise point and the right attribute category is the suspected glass noise point.
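Claims 3 and 4 combine two features on each side outside the effective area: a gradient direction change and a count of above-threshold pixels. A hypothetical sketch follows; the threshold values and the sign-change test used for "gradient direction change" are assumptions, not taken from the patent.

```python
def gradient_direction_changes(values):
    """True if the first differences of the curve change sign somewhere,
    i.e. the gradient direction flips in this side region."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return any(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:]))

def mark_suspect(curve, lo, hi, gray_thr, left_thr, right_thr):
    """Mark the laser point as a suspected glass noise point when BOTH
    sides outside the effective area [lo, hi] show a gradient-direction
    change and enough above-threshold pixels (claims 3-4)."""
    left, right = curve[:lo], curve[hi + 1:]
    left_ok = (gradient_direction_changes(left)
               and sum(v > gray_thr for v in left) > left_thr)
    right_ok = (gradient_direction_changes(right)
                and sum(v > gray_thr for v in right) > right_thr)
    return left_ok and right_ok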
5. The method of claim 2, wherein the marking the attribute category of the laser point according to the waveform characteristics of the curve in the light spot effective area comprises:
counting the number of pixels in the light spot effective area whose gray values are greater than a second gray threshold, to obtain a number of in-area pixels;
marking the attribute category of the laser point according to the number of in-area pixels; or
marking the attribute category of the laser point according to the width of the light spot effective area.
6. The method of claim 2, wherein the marking the attribute category of the laser point according to the waveform characteristics of the curve in the light spot effective area comprises:
if the number of wave crests of the curve in the light spot effective area is greater than or equal to a third threshold, marking the attribute category of the laser point as a suspected glass noise point.
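Claim 6 counts wave crests in the effective-area curve: a multi-peaked spot suggests the return mixed reflections from the glass surface and from whatever lies behind it. A sketch, taking a "wave crest" to mean a strict local maximum (an assumption, as the patent does not define it):

```python
def count_peaks(curve):
    """Count strict local maxima (wave crests) in the spot curve."""
    return sum(b > a and b > c for a, b, c in zip(curve, curve[1:], curve[2:]))

def suspect_by_peaks(curve, peak_threshold=2):
    """Claim-6 style test: mark as suspected glass noise when the crest
    count reaches the (hypothetical) third threshold."""
    return count_peaks(curve) >= peak_threshold
```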
7. The method of claim 1, wherein the analyzing the point cloud morphology in the vicinity of the laser point to determine whether the laser point is a glass noise point comprises:
sliding a sliding window of preset length over the point cloud data, and, when the laser point is located at the middle position of the sliding window, determining whether the laser point is a glass noise point according to the average distance of the laser points in the sliding window and/or the effective laser points in the sliding window.
8. The method of claim 7, wherein the determining whether the laser point is a glass noise point according to the average distance of the laser points in the sliding window and/or the effective laser points in the sliding window comprises:
determining an average distance between the laser point and the effective laser points in the sliding window;
if the average distance is greater than a distance threshold, determining that the laser point is a glass noise point; or
and if the number of the effective laser points in the sliding window is smaller than a number threshold value, determining that the laser points are glass noise points.
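The sliding-window test of claims 7 and 8 can be sketched as follows. The window length, the thresholds, and the representation of points as (x, y) tuples with None for invalid returns are illustrative assumptions.

```python
import math

def is_glass_noise(points, i, win=7, dist_thr=0.3, count_thr=3):
    """Centre a sliding window of length `win` on point i and flag it as
    glass noise if the window holds too few effective (valid) points, or
    if the mean distance from point i to the other effective points in
    the window exceeds dist_thr."""
    half = win // 2
    window = [p for p in points[max(0, i - half): i + half + 1] if p is not None]
    if len(window) < count_thr:          # too few effective laser points
        return True
    others = [p for p in window if p != points[i]]
    if not others:                       # no neighbours to compare against
        return False
    avg = sum(math.dist(points[i], p) for p in others) / len(others)
    return avg > dist_thr                # point is isolated from neighbours
```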
9. A lidar, comprising:
at least one processor, and
a memory communicatively coupled to the at least one processor, wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
10. A robot comprising the lidar of claim 9.
CN202311120813.XA 2023-08-31 2023-08-31 Method for filtering glass noise, laser radar and robot Pending CN117169848A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311120813.XA CN117169848A (en) 2023-08-31 2023-08-31 Method for filtering glass noise, laser radar and robot


Publications (1)

Publication Number Publication Date
CN117169848A true CN117169848A (en) 2023-12-05

Family

ID=88944304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311120813.XA Pending CN117169848A (en) 2023-08-31 2023-08-31 Method for filtering glass noise, laser radar and robot

Country Status (1)

Country Link
CN (1) CN117169848A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117368880A (en) * 2023-12-07 2024-01-09 中国气象局人工影响天气中心 Millimeter wave cloud radar turbulence clutter filtering method
CN117368880B (en) * 2023-12-07 2024-02-06 中国气象局人工影响天气中心 Millimeter wave cloud radar turbulence clutter filtering method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000, Floor 1801, Block C, Minzhi Stock Commercial Center, North Station Community, Minzhi Street, Longhua District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Huanchuang Technology Co.,Ltd.

Address before: 518000, Floor 1801, Block C, Minzhi Stock Commercial Center, North Station Community, Minzhi Street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN CAMSENSE TECHNOLOGIES Co.,Ltd.