CN111427355B - Obstacle data processing method, device, equipment and storage medium - Google Patents

Obstacle data processing method, device, equipment and storage medium

Info

Publication number
CN111427355B
CN111427355B
Authority
CN
China
Prior art keywords
data
obstacle
data points
parallel threads
point group
Prior art date
Legal status
Active
Application number
CN202010284857.6A
Other languages
Chinese (zh)
Other versions
CN111427355A (en)
Inventor
王超
姚秀军
桂晨光
郭新然
蔡禹丞
蔡小龙
马福强
李振
崔丽华
Current Assignee
Jingdong Technology Information Technology Co Ltd
Original Assignee
Jingdong Technology Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jingdong Technology Information Technology Co Ltd
Priority to CN202010284857.6A
Publication of CN111427355A
Priority to PCT/CN2021/085984 (published as WO2021208797A1)
Application granted
Publication of CN111427355B
Status: Active


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas

Abstract

The application provides an obstacle data processing method, device, equipment and storage medium. Data points collected by each laser radar are grouped into data point groups, and the data points within each group are then processed in parallel, which improves data processing efficiency. The robot control part extracts obstacle distances from the processed obstacle data and performs path planning, which improves the response rate of the robot and prevents it from colliding with obstacles.

Description

Obstacle data processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of robotics, and in particular, to a method, an apparatus, a device, and a storage medium for processing obstacle data.
Background
The single-line lidar is a very important sensor in robot obstacle detection. Depending on the number of lidars used, detection schemes are classified into single-lidar detection and multi-lidar detection.
To detect ground obstacles, lidars are generally mounted on the belly of the robot, so a single lidar can only detect obstacles in one direction, and the detection range of multiple lidars is wider than that of a single lidar. In multi-lidar detection, each lidar collects data as follows: its internal structure rotates through 360 degrees, changing the emission angle γ of each emitted laser pulse; the distance d between an obstacle and the lidar is determined from the time difference between laser emission and reception; and the angles γ and distances d acquired over one full rotation of the internal structure are output as one frame of data. After each lidar collects its data, the data of the multiple lidars must be fused, and the fused data is then analyzed and processed to detect obstacles in multiple directions. A common fusion method is to process the data points in each frame of data detected by each lidar sequentially.
However, because the existing fusion method processes the data points in each frame of each lidar sequentially, data fusion takes a long time and the fusion rate is slow.
Disclosure of Invention
The application provides an obstacle data processing method, device, equipment and storage medium to solve the problem that the existing data fusion approach, which processes the data points in each frame of each laser radar sequentially, takes a long time and fuses slowly.
In a first aspect, the present application provides an obstacle data processing method, including:
acquiring current frame obstacle data acquired by each laser radar, wherein the current frame obstacle data comprises a plurality of data points;
for each current frame of obstacle data, grouping the data points in that frame to generate at least one data point group;
all data points in each data point group are processed in parallel to generate a converted data set.
Optionally, for each current frame of obstacle data, grouping the data points in the current frame of obstacle data specifically includes: grouping the data points in the current frame of obstacle data according to the total number of parallel threads to generate at least one data point group.
Optionally, grouping the data points in the current frame of obstacle data according to the total number of parallel threads to generate at least one data point group specifically includes: judging whether the total number of data points in the current frame of obstacle data is an integer multiple of the total number of parallel threads; if so, dividing the current frame of obstacle data evenly by the total number of parallel threads, so that the number of data points in each data point group equals the total number of parallel threads; if not, grouping the current frame of obstacle data by the total number of parallel threads, so that one data point group contains fewer data points than the total number of parallel threads and every remaining data point group contains exactly that number.
The embodiment has the following specific beneficial effects: the data points are grouped according to the total number of the parallel threads, so that the parallel threads can be fully utilized, idle threads caused by mismatching of the total number of the data points of the data point group and the total number of the parallel threads are avoided, and the processing efficiency can be further improved.
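As a minimal Python sketch of this grouping rule (function and variable names are illustrative, not from the patent), a frame's data points can be sliced into chunks sized to the thread count, leaving at most one smaller trailing group:

```python
def group_data_points(points, total_threads):
    """Split one frame of obstacle data points into groups whose size
    matches the total number of parallel threads; at most the last
    group is smaller than the thread count."""
    return [points[i:i + total_threads]
            for i in range(0, len(points), total_threads)]
```

When the total number of data points is an exact multiple of the thread count, every group is full, matching the "equal division" branch described above.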
Optionally, processing all data points in each data point group in parallel to generate a converted data set specifically includes: N parallel threads process the N data points in each data point group to generate N converted data points, wherein each thread handles the processing of one data point and N is the number of data points in the data point group.
Optionally, the N parallel threads process N data points in each data point group to generate N converted data points, specifically including: the N parallel threads call the same conversion parameter and process N data points in each data point group.
Optionally, the N parallel threads call the same conversion parameter, and process N data points in each data point group, which specifically includes: the N parallel threads call rotation parameters, and the N data points are subjected to rotation transformation to generate N intermediate data points; the N parallel threads call translation parameters, and translate and convert the N data points to generate N converted data points.
Optionally, before the N parallel threads invoke the rotation parameter and perform rotation transformation on the N data points to generate N intermediate data points, the method further includes:
the N parallel threads call the same coordinate conversion function to convert the laser emission angle and the obstacle distance into the position coordinates of the obstacle, wherein each data point comprises the laser emission angle and the obstacle distance.
Optionally, the rotation parameter is the matrix:

R = | cosα  -sinα |
    | sinα   cosα |

and the translation parameter is the vector:

T = | t_x |
    | t_y |

where α is the mounting angle of the lidar and (t_x, t_y) is the mounting position of the lidar.
Optionally, the mounting angle and the mounting position of the lidar are both in the robot coordinate system.
The embodiment has the following specific beneficial effects: because the N threads call the same rotation parameter and the same translation parameter to process the N data points, memory usage can be reduced and processing efficiency further improved.
Optionally, after processing the data points in each data point group in parallel to generate the converted data set, the method further comprises: extracting the distance of at least one obstacle from the converted data set; and, if the distance of any obstacle reaches a preset threshold, generating an instruction for controlling the running of the robot.
The embodiment has the following specific beneficial effects: processing each frame of data with multiple parallel threads increases the processing speed, so the robot can determine its distance from each obstacle and be controlled accordingly. This increases the response speed of the robot and prevents it from colliding with obstacles.
in a second aspect, the present application provides an obstacle data processing device comprising:
the acquisition module is used for acquiring current frame obstacle data acquired by each laser radar, wherein the current frame obstacle data comprises a plurality of data points;
the grouping module is used for grouping data points in the obstacle data of the current frame aiming at each obstacle data of the current frame to generate at least one data point group;
and the processing module is used for carrying out parallel processing on all data points in each data point group so as to generate a conversion data set.
In a third aspect, the present application provides an electronic device, comprising:
a memory for storing a program;
a graphics processing unit (GPU) for executing the program stored in the memory; when the program is executed, the GPU performs the obstacle data processing method according to the first aspect and its optional variants.
In a fourth aspect, the present application provides a robot comprising:
the laser radars are used for collecting obstacle data of the current frame;
a graphics processing unit (GPU) for executing the obstacle data processing method according to the first aspect and its optional variants.
In a fifth aspect, the present application provides a computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the obstacle data processing method of the first aspect and alternatives.
The application provides an obstacle data processing method, device, equipment and storage medium. Data points collected by each laser radar are grouped into data point groups, and the data points within each group are then processed in parallel, which improves data processing efficiency. After receiving the converted data set, the robot control part extracts the obstacle distances and performs path planning, which improves the response rate of the robot and prevents it from colliding with obstacles.
Drawings
Fig. 1 is a diagram of a robot system provided herein;
fig. 2 is an external schematic view of the multi-lidar robot provided in the present application;
FIG. 3 is a diagram illustrating a method for processing obstacle data according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a data packet principle provided in a second embodiment of the present application;
fig. 5 is a schematic diagram of coordinate system rotation conversion provided in the third embodiment of the present application;
fig. 6 is a schematic diagram of translation conversion of a coordinate system according to a third embodiment of the present application;
fig. 7 is a schematic structural diagram of an obstacle data processing device according to a fourth embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the technical solutions in the present application will be clearly and completely described below with reference to the drawings in the present application, and it is apparent that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Fig. 1 is a schematic diagram of the robot system provided in the present application. As shown in fig. 1, the robot system is composed of a mechanical part, a sensing part and a control part. The sensing part is used to sense the environment, and the task of the control part is to drive the mechanical part of the robot to complete the specified motions and functions according to the robot's working instructions and the environmental information fed back by the sensing part.
The single-line lidar is a common sensing part in robot systems. According to the number of lidars, robot systems can be divided into multi-lidar robots and single-lidar robots. Fig. 2 is an external schematic view of the multi-lidar robot provided in the present application. As shown in fig. 2, a plurality of mounting points are provided on the belly of the robot, and a single-line lidar is installed on each mounting point. Each lidar collects obstacle data in one direction, and the obstacle data from the multiple directions are then fused to obtain omnidirectional obstacle data. The control part of the robot generates corresponding control instructions according to the omnidirectional obstacle data to control the running of the robot.
Each single-line lidar acquires data as follows: the internal structure rotates through 360 degrees, changing the emission angle γ of each emitted laser pulse; the distance d between an obstacle and the lidar at emission angle γ is determined from the time difference between laser emission and reception; the pair of emission angle γ and distance d is taken as one data point; and the data points acquired over one rotation of the internal structure are output as one frame of data. After each lidar acquires a frame of obstacle data, the data points in that frame must be processed and fused one by one to generate omnidirectional obstacle data. Because the number of data points in each frame of obstacle data is huge and the prior art processes them one by one, data fusion takes a long time and the fusion rate is slow.
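As an illustration of the frame format described above (a hypothetical helper, not part of the patent), one frame can be modeled as a list of (γ, d) data points, one per beam of a full rotation:

```python
import math

def simulate_lidar_frame(n_beams=360, obstacle_dist=1.0):
    """Hypothetical single-line lidar frame: one (emission angle γ,
    measured distance d) data point per beam over a 360° rotation.
    Here every beam sees an obstacle at the same fixed distance."""
    return [(2 * math.pi * i / n_beams, obstacle_dist)
            for i in range(n_beams)]
```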
The purpose of the application is to provide an obstacle data processing method that solves the problems in the prior art. The inventive concept of the application is as follows: for a data set acquired by one lidar, all data points are processed in the same way with the same conversion parameters; the data set is therefore grouped into at least one data point group, and each data point group is processed in parallel using the same conversion parameters, completing the parallel processing of multiple data points.
Fig. 3 is a diagram illustrating the obstacle data processing method according to the first embodiment of the present application. As shown in fig. 3, the method includes:
s101, acquiring obstacle data of a current frame acquired by each laser radar.
When the laser radars are required to sense the external environment, each laser radar scans the external environment to obtain current frame obstacle data, wherein the current frame obstacle data comprises a plurality of data points, and the data points are specifically a laser emission angle gamma and a distance d between the obstacle and the laser radars when the angle is gamma.
S102, for each current frame of obstacle data, group the data points in the frame to generate at least one data point group.
Wherein after obtaining the current frame of obstacle data, all data points in the current frame of obstacle data are grouped to generate at least one data point group. The number of data points in each data point set is not limited.
S103, all data points in each data point group are processed in parallel to generate a conversion data set.
Wherein, after at least one data point group is obtained, the data point groups are processed in batches: one data point group is processed first, then the next, until all data point groups are processed. For each data point group, all data points in the group are processed at the same time to obtain a converted data point group. The converted data point groups are combined to generate a converted data set, completing the fusion of the data points from the multiple lidars.
In the obstacle data processing method provided by the first embodiment of the present application, all data points in the current frame obstacle data set are grouped and all data points in each data point group are processed in parallel. Because all data points within a group are processed simultaneously, the processing rate is improved compared with the prior-art scheme.
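The batch-sequential, intra-group-parallel flow described above can be sketched as follows (a minimal illustration using Python threads in place of GPU threads; names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def process_frame(groups, convert):
    """Process the data point groups batch by batch; within one group,
    one worker per data point applies the same conversion in parallel."""
    converted = []
    for group in groups:  # groups are handled in sequence
        with ThreadPoolExecutor(max_workers=len(group)) as pool:
            # all points of the current group are converted at the same time
            converted.extend(pool.map(convert, group))
    return converted
```

`pool.map` preserves input order, so the converted data set keeps the same point ordering as the original frame.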
The following focuses on describing a method for processing obstacle data according to a second embodiment of the present application, where the method for processing obstacle data according to the second embodiment of the present application includes the following steps:
s201, acquiring obstacle data of the current frame acquired by each laser radar.
Here, this step is already described in detail in S101, and the repetition is not described here again.
S202, for each current frame of obstacle data, group the data points in the frame to generate at least one data point group.
As shown in fig. 4, because the data points collected by the same lidar share the same processing procedure and processing parameters, the current frame of obstacle data is grouped according to the total number of parallel threads to generate at least one data point group, specifically:
Judge whether the total number of data points in the current frame of obstacle data is an integer multiple of the total number of parallel threads. If so, divide the current frame of obstacle data evenly by the total number of parallel threads, so that the number of data points in each data point group equals the total number of parallel threads. If not, group the current frame of obstacle data by the total number of parallel threads, so that one data point group contains fewer data points than the total number of parallel threads and every remaining data point group contains exactly that number.
S203, all data points in each data point group are processed in parallel to generate a conversion data set.
Wherein, for each data point group, a thread is allocated to each data point in the group, and N parallel threads process all data points of the group to generate N converted data points, where N is the number of data points in the group. Data point groups with the same total number of data points use the same number of parallel threads; groups with different totals use different numbers of parallel threads.
Because the data points in a data point group are collected by the same lidar, the parallel threads can call the same processing parameters when processing all the data in the group, further improving the processing speed.
In the obstacle data processing method provided by the second embodiment, each current frame of obstacle data is grouped according to the total number of parallel threads, so the data points can be processed by the parallel threads and the parallel threads can call the same processing parameters, further improving the processing rate.
The following illustrates an obstacle data processing method provided in the third embodiment of the present application, where the obstacle data processing method provided in the third embodiment of the present application includes the following steps:
s301, acquiring obstacle data of a current frame acquired by each laser radar.
The robot is provided with two lidars, so two types of data points exist in the current frame obstacle data set: data points collected by the first lidar and data points collected by the second lidar. Each data point consists of a laser emission angle γ and the distance d of the obstacle from the lidar at that angle.
S302, data points in the obstacle data of each current frame are grouped according to the obstacle data of each current frame, and at least one data point group is generated.
Wherein each frame of obstacle data is grouped according to the total number of parallel threads. For example, if the total number of parallel threads is 32, each frame of obstacle data is grouped so that every data point group contains 32 data points, or so that at most one data point group contains fewer than 32 data points and every remaining group contains 32.
S303, carrying out parallel processing on all data points in each data point group to generate a conversion data set.
All data points in the data point group are processed in parallel by using 32 parallel threads to generate a conversion data point group, and then the conversion data point groups are combined to generate a conversion data set.
The data points in each data point group are collected by the same laser radar, and the parallel threads can call the same conversion parameter when processing the data points. Wherein the conversion parameters include: rotation parameters and translation parameters. The 32 parallel threads call the rotation parameters, and the rotation transformation is performed on the 32 data points to generate 32 intermediate data points. The 32 parallel threads call the translation parameters, and translate the 32 data points to generate 32 converted data points.
In order to facilitate the parallel threads to process a plurality of data points, the 32 parallel threads call the same coordinate conversion function to convert the laser emission angle and the distance of the obstacle into the position coordinates of the obstacle. The coordinate conversion function specifically includes:
x=dcosγ
y=dsinγ
where (x, y) represents the position coordinates of the obstacle and d represents the distance of the obstacle when the emission angle is γ.
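The coordinate conversion function above can be written directly (a straightforward sketch; the function name is illustrative):

```python
import math

def polar_to_cartesian(gamma, d):
    """Coordinate conversion function from the description:
    x = d·cos(γ), y = d·sin(γ), with γ in radians."""
    return d * math.cos(gamma), d * math.sin(gamma)
```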
When converting the emission angle and the obstacle distance into obstacle coordinate positions, the following rotation parameters and translation parameters are invoked by 32 threads, generating converted data points.
Wherein, the rotation parameter is the matrix:

R = | cosα  -sinα |
    | sinα   cosα |

and the translation parameter is the vector:

T = | t_x |
    | t_y |

where α is the mounting angle of the lidar and (t_x, t_y) is its mounting position. Both are expressed in the robot coordinate system, so the converted data points generated by calling the rotation and translation parameters are data points in the robot coordinate system.
The derivation of the above rotation and translation parameters is explained below. To convert data points from the laser coordinate system to the robot coordinate system, the conversion is divided into a rotation process and a translation process.
As shown in fig. 5, the intermediate coordinate system is obtained by rotating the laser coordinate system by an angle α, and the coordinates of the same point P in the two coordinate systems are related as follows:
x = OE·cosα
  = (x″ - y″tanα)cosα
  = x″cosα - y″sinα
y = EH + PE
  = x·tanα + y″/cosα
  = (x″cosα - y″sinα)tanα + y″/cosα
  = x″sinα + y″cosα
as shown in fig. 6, the laser coordinate system is translated by a distance (t x ,t y ) The relationship of the same point P on two coordinate systems is as follows:
x=x”+t x
y=y”+t y
after the translation conversion and the rotation conversion are overlapped, the following conversion relation can be obtained:
Figure BDA0002448131060000091
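A scalar sketch of this rotation-then-translation relation (the function name is illustrative; a GPU implementation would apply it across all points of a group at once):

```python
import math

def laser_to_robot(x2, y2, alpha, tx, ty):
    """Map a point (x″, y″) in the laser coordinate system into the
    robot coordinate system: rotate by the mounting angle α, then
    translate by the mounting position (t_x, t_y)."""
    x = x2 * math.cos(alpha) - y2 * math.sin(alpha) + tx
    y = x2 * math.sin(alpha) + y2 * math.cos(alpha) + ty
    return x, y
```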
the CudaMat tool is locally loaded in the robot control chip to process each data point group and generate conversion data points.
S304, extracting the distance of at least one obstacle from the conversion data set.
The converted data points in the converted data set reflect obstacle information in the two directions covered by the robot's lidars; by performing semantic extraction on the converted data points, the distance from at least one obstacle to the robot is extracted.
S305, if the distance of any obstacle reaches a preset threshold value, generating a command for controlling the running of the robot.
After receiving the distance from an obstacle to the robot, the control part of the robot judges whether that distance reaches the preset threshold. If it does, the control part generates an instruction for controlling the running of the robot so as to prevent a collision with the obstacle; if it does not, no instruction needs to be generated.
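A minimal sketch of this threshold check (the function name and the "STOP" instruction are hypothetical stand-ins for the patent's unspecified driving instruction; "reaches the threshold" is read here as the distance being at or below it):

```python
def control_command(obstacle_distances, threshold):
    """Emit a driving instruction if any extracted obstacle distance
    reaches the preset threshold; otherwise emit nothing."""
    if any(d <= threshold for d in obstacle_distances):
        return "STOP"
    return None
```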
In the obstacle data processing method provided by the third embodiment, data points collected by the same lidar are grouped so that the parallel threads call the same conversion parameter to process all data in a group, improving data processing efficiency. After receiving the obstacle data, the control module of the robot can quickly generate an instruction to avoid a collision between the robot and the obstacle.
Fig. 7 is a schematic structural diagram of the obstacle data processing apparatus provided in the fourth embodiment of the present application. As shown in fig. 7, the obstacle data processing apparatus 400 includes:
an acquisition module 401, configured to acquire current frame obstacle data acquired by each lidar, where the current frame obstacle data includes a plurality of data points;
a grouping module 402, configured to group data points in the obstacle data of the current frame for each current frame obstacle data, and generate at least one data point group;
a processing module 403, configured to process all data points in each data point group in parallel to generate a converted data set.
Optionally, the grouping module 402 is specifically configured to: and grouping the data points in the obstacle data of the current frame according to the total number of the parallel threads to generate at least one data point group.
Optionally, the grouping module 402 is specifically configured to: judge whether the total number of data points in the current frame of obstacle data is an integer multiple of the total number of parallel threads; if so, divide the current frame of obstacle data evenly by the total number of parallel threads, so that the number of data points in each data point group equals the total number of parallel threads; if not, group the current frame of obstacle data by the total number of parallel threads, so that one data point group contains fewer data points than the total number of parallel threads and every remaining data point group contains exactly that number.
Optionally, the processing module 403 is specifically configured to: n parallel threads process N data points in each data point group to generate N conversion data points, wherein each thread is used for carrying the processing process of one data point, and N is the number of the data points in the data point group.
Optionally, the processing module 403 is specifically configured to: the N parallel threads call the same conversion parameter and process N data points in each data point group.
Optionally, the processing module 403 is specifically configured to: the N parallel threads call rotation parameters, and the N data points are subjected to rotation transformation to generate N intermediate data points; the N parallel threads call translation parameters, and translate and convert the N data points to generate N converted data points.
Optionally, the apparatus further comprises a preprocessing module 404 configured to: before the N parallel threads call the rotation parameter and rotationally transform the N data points to generate N intermediate data points, have the N parallel threads call the same coordinate conversion function to convert the laser emission angle and the obstacle distance into the position coordinates of the obstacle, wherein each data point comprises the laser emission angle and the obstacle distance.
Optionally, the rotation parameters are:
$$R = \begin{pmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{pmatrix}$$
the translation parameters are:
$$T = \begin{pmatrix} t_x \\ t_y \end{pmatrix}$$
wherein $\alpha$ is the mounting angle of the lidar and $(t_x, t_y)$ is the mounting position of the lidar.
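Taken together, each thread maps a lidar-frame point $(x, y)$ to robot-frame coordinates $(x', y')$ by applying the rotation first and the translation second; this combined affine form is implied by, though not explicitly written in, the two steps above:

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix}$$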
Optionally, the mounting angle and the mounting position of the lidar are both expressed in the robot coordinate system.
Optionally, the apparatus further comprises a control module 405, where the control module 405 is configured to, after the data points in each data point group are processed in parallel to generate the converted data set: extract the distance of at least one obstacle from the converted data set; and, for each obstacle, if the distance of the obstacle reaches a preset threshold, generate a command for controlling the travel of the robot.
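The control logic of module 405 could be sketched as follows; the function name, the comparison direction (a distance is taken to "reach" the threshold when it falls to or below it), and the "stop" command string are assumptions for illustration:

```python
def control_commands(obstacle_distances, threshold):
    """For each obstacle distance extracted from the converted data set,
    generate a travel-control command when the distance reaches
    (falls to or below) the preset threshold."""
    commands = []
    for i, d in enumerate(obstacle_distances):
        if d <= threshold:
            commands.append((i, "stop"))  # hypothetical command format
    return commands

# Obstacles at 0.4 m and 2.5 m with a 0.5 m threshold: only the first triggers a command
print(control_commands([0.4, 2.5], 0.5))  # -> [(0, 'stop')]
```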
Fig. 8 is a schematic structural diagram of an electronic device in a fifth embodiment of the present application. As shown in Fig. 8, the electronic device 500 provided in this embodiment includes: a transmitter 501, a receiver 502, a memory 503, and a graphics processor 504.
A transmitter 501 for transmitting instructions and data;
a receiver 502 for receiving instructions and data;
a memory 503 for storing computer-executable instructions;
a graphics processor 504 (Graphics Processing Unit, GPU) for executing the computer-executable instructions stored in the memory to implement the steps of the obstacle data processing method in the above embodiments; for details, reference may be made to the description of the foregoing obstacle data processing method embodiments.
Alternatively, the memory 503 may be provided separately or integrated with the graphics processor 504.
When the memory 503 is provided separately, the electronic device further comprises a bus connecting the memory 503 and the graphics processor 504.
The embodiment of the application also provides a robot, which comprises:
a plurality of lidars for acquiring obstacle data;
and a graphics processor (GPU) for executing computer-executable instructions stored in a memory to implement the steps of the obstacle data processing method in the above embodiments; for details, reference may be made to the description of the foregoing obstacle data processing method embodiments.
An embodiment of the present application also provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the obstacle data processing method performed by the electronic device.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The foregoing storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A method of obstacle data processing, comprising:
acquiring current frame obstacle data collected by each lidar, wherein the current frame obstacle data comprises a plurality of data points;
determining whether the total number of data points in the current frame obstacle data is an integer multiple of the total number of parallel threads; if so, dividing the current frame obstacle data evenly according to the total number of parallel threads, so that the number of data points in each data point group equals the total number of parallel threads;
if not, grouping the current frame obstacle data according to the total number of parallel threads, so that the number of data points in one data point group is smaller than the total number of parallel threads and the number of data points in each remaining data point group equals the total number of parallel threads;
invoking, by N parallel threads, the same conversion parameters and processing the N data points in each data point group to generate N converted data points, wherein each thread handles the processing of one data point and N is the number of data points in the data point group.
2. The method of claim 1, wherein the N parallel threads invoking the same conversion parameters and processing the N data points in each data point group specifically comprises:
the N parallel threads invoking rotation parameters and rotationally transforming the N data points to generate N intermediate data points; and
the N parallel threads invoking translation parameters and translating the N intermediate data points to generate N converted data points.
3. The method of claim 2, wherein before the N parallel threads invoke the rotation parameters to rotationally transform the N data points to generate the N intermediate data points, the method further comprises:
the N parallel threads invoking the same coordinate conversion function to convert the laser emission angle and the distance of the obstacle into the position coordinates of the obstacle, wherein each data point comprises: the laser emission angle and the distance of the obstacle.
4. The method of claim 2, wherein:
the rotation parameters are:
$$R = \begin{pmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{pmatrix}$$
the translation parameters are:
$$T = \begin{pmatrix} t_x \\ t_y \end{pmatrix}$$
wherein $\alpha$ is the mounting angle of the lidar and $(t_x, t_y)$ is the mounting position of the lidar.
5. The method of claim 4, wherein the mounting angle and mounting position of the lidar are both in a robot coordinate system.
6. The method of claim 1, wherein after the data points in each data point group are processed in parallel to generate the converted data set, the method further comprises:
extracting the distance of at least one obstacle from the converted data set;
if the distance of any obstacle reaches a preset threshold, generating a command for controlling the travel of the robot.
7. An obstacle data processing device, comprising:
an acquisition module, configured to acquire current frame obstacle data collected by each lidar, wherein the current frame obstacle data comprises a plurality of data points;
a grouping module, configured to group, for each frame of current obstacle data, the data points in the current frame obstacle data to generate at least one data point group;
a processing module, configured to process all the data points in each data point group in parallel to generate a converted data set;
the processing module is specifically configured to have N parallel threads invoke the same conversion parameters and process the N data points in each data point group to generate N converted data points, wherein each thread handles the processing of one data point and N is the number of data points in the data point group;
the grouping module is specifically configured to: determine whether the total number of data points in the current frame obstacle data is an integer multiple of the total number of parallel threads; if so, divide the current frame obstacle data evenly according to the total number of parallel threads, so that the number of data points in each data point group equals the total number of parallel threads; if not, group the current frame obstacle data according to the total number of parallel threads, so that the number of data points in one data point group is smaller than the total number of parallel threads and the number of data points in each remaining data point group equals the total number of parallel threads.
8. An electronic device, comprising:
a memory for storing a program;
a graphics processor (GPU) for executing the program stored in the memory, the GPU being configured to perform the obstacle data processing method according to any one of claims 1 to 6 when the program is executed.
9. A robot, comprising:
a plurality of lidars for collecting current frame obstacle data; and
a graphics processor (GPU) for performing the obstacle data processing method according to any one of claims 1 to 6.
10. A computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the obstacle data processing method as claimed in any one of claims 1 to 6.
CN202010284857.6A 2020-04-13 2020-04-13 Obstacle data processing method, device, equipment and storage medium Active CN111427355B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010284857.6A CN111427355B (en) 2020-04-13 2020-04-13 Obstacle data processing method, device, equipment and storage medium
PCT/CN2021/085984 WO2021208797A1 (en) 2020-04-13 2021-04-08 Obstacle data processing method and apparatus, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010284857.6A CN111427355B (en) 2020-04-13 2020-04-13 Obstacle data processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111427355A CN111427355A (en) 2020-07-17
CN111427355B true CN111427355B (en) 2023-05-02

Family

ID=71558256

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010284857.6A Active CN111427355B (en) 2020-04-13 2020-04-13 Obstacle data processing method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN111427355B (en)
WO (1) WO2021208797A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111427355B (en) * 2020-04-13 2023-05-02 京东科技信息技术有限公司 Obstacle data processing method, device, equipment and storage medium
US11624831B2 (en) 2021-06-09 2023-04-11 Suteng Innovation Technology Co., Ltd. Obstacle detection method and apparatus and storage medium
CN113255559B (en) * 2021-06-09 2022-01-11 深圳市速腾聚创科技有限公司 Data processing method, device and storage medium
CN115561736B (en) * 2022-10-25 2023-10-13 山东莱恩光电科技股份有限公司 Laser radar maintenance-free shield and radar

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103308897A (en) * 2013-05-14 2013-09-18 中国科学院电子学研究所 Method for fast realizing signal processing of passive radar based on GPU (Graphics Processing Unit)
CN105866790A (en) * 2016-04-07 2016-08-17 重庆大学 Laser radar barrier identification method and system taking laser emission intensity into consideration
CN106908783A (en) * 2017-02-23 2017-06-30 苏州大学 Obstacle detection method based on multi-sensor information fusion
JP2019516146A (en) * 2017-03-27 2019-06-13 平安科技(深▲せん▼)有限公司Ping An Technology (Shenzhen) Co.,Ltd. Robot obstacle avoidance control system, method, robot and storage medium
CN110908374A (en) * 2019-11-14 2020-03-24 华南农业大学 Mountain orchard obstacle avoidance system and method based on ROS platform

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104573646B (en) * 2014-12-29 2017-12-12 长安大学 Chinese herbaceous peony pedestrian detection method and system based on laser radar and binocular camera
CN109188394A (en) * 2018-11-21 2019-01-11 深圳市速腾聚创科技有限公司 Laser radar circuit system and laser radar
CN110667783A (en) * 2019-08-30 2020-01-10 安徽科微智能科技有限公司 Unmanned boat auxiliary driving system and method thereof
CN110927740B (en) * 2019-12-06 2023-09-08 合肥科大智能机器人技术有限公司 Mobile robot positioning method
CN111427355B (en) * 2020-04-13 2023-05-02 京东科技信息技术有限公司 Obstacle data processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111427355A (en) 2020-07-17
WO2021208797A1 (en) 2021-10-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Technology Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Beijing Economic and Technological Development Zone, Beijing 100176

Applicant before: BEIJING HAIYI TONGZHAN INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant