CN110888144B - Laser radar data synthesis method based on sliding window - Google Patents
- Publication number
- CN110888144B (application CN201911228578.1A)
- Authority
- CN
- China
- Prior art keywords
- data
- laser radar
- image
- point cloud
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention discloses a laser radar data synthesis method based on a sliding window. The method combines the advantage of the low cost of a low-beam laser radar with the high precision of a multi-beam laser radar model.
Description
Technical Field
The invention relates to the technical field of automatic driving, in particular to a laser radar data synthesis method based on a sliding window.
Background
Laser radar plays an important role in automatic driving and, by the number of beams, can be divided into low-beam and multi-beam types. Low-beam laser radars have been developed for many years and their cost is well controlled, so they have the advantage of low production cost and a low threshold for deployment on a vehicle. A multi-beam laser radar has a large number of laser beams, so its measurement accuracy is greatly improved over the low-beam type, but the high cost brought by the additional beams is one of the obstacles to its popularization.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a laser radar data synthesis method based on a sliding window, which allows the data collected by a low-beam laser radar to be applied to a multi-beam laser radar model, combining the low cost of the low-beam laser radar with the high precision of the multi-beam laser radar model.
In order to achieve the purpose, the invention adopts the following technical scheme:
the laser radar data synthesis method based on the sliding window comprises the following steps:
S1, converting the data collected by a low-beam laser radar to two dimensions to obtain a two-dimensional point cloud image;
S2, after the point cloud data obtained by the low-beam laser radar has been converted to two dimensions, sampling the data: 512 pixels are sampled at even intervals from each row of the two-dimensional point cloud image; this sampling is performed on the data of every frame of the low-beam laser radar, and each sampled frame yields a five-channel two-dimensional image A_i, i = 0, 1, 2, …, where i is the frame index; the height of A_i equals the number of beams As of the low-beam laser radar that collected the data, and its width is 512;
S3, performing data expansion on the image A_i obtained in step S2 to generate initial expansion data: the number of beams of the target multi-beam laser radar is Bs, so Bs/As − 1 rows of blank expansion data must be inserted below each row of A_i, i.e. A_i must be expanded by Bs − As blank rows in total; the newly inserted blank expansion data are all 0, i.e. all five channel values of every newly inserted pixel are 0; the five-channel two-dimensional image obtained by expanding A_i is denoted C_i, its height is denoted Ch and equals Bs, and its width is denoted Cw and equals 512;
determining the size of a sliding window and selecting frames to be merged by utilizing the sliding window:
determining the size of the sliding window, denoted S, from the quantitative relation between As and Bs; S is calculated by the following formula:
S=Bs/As;
sliding the window over the five-channel two-dimensional images A_i obtained in step S2, starting from frame 0, i.e. from A_0, and selecting S consecutive data frames each time as the frames to be merged; the frame at the head of the window is denoted A_f, so A_f and A_{f+k} (k = 1, 2, …, S − 1) are all frames to be merged;
s4, merging the frames to be merged selected in the step S3, and filling the frames into the initial expansion data:
denote the rows of A_i as A_{i,n}, n = 0, 1, 2, …, As − 1, and the rows of C_i as C_{i,m}, m = 0, 1, 2, …, Bs − 1; the frontmost frame selected by the sliding window is A_f, and the frames to be merged are filled into the extended image C_f corresponding to A_f:
the nth row of A_{f+k} is filled in turn into the blank expansion data of row S × n + k of C_f, n = 0, 1, 2, …, As − 1, k = 1, 2, …, S − 1; thus each row of the filled C_f can be represented as C_{f, S×n+k} = A_{f+k, n}, n = 0, 1, 2, …, As − 1, k = 0, 1, …, S − 1;
the data generated by merging the S data frames serves as a new data frame whose timestamp is the same as that of A_f; the new data obtained in this way can be used for a multi-beam laser radar model.
Further, the specific process of step S1 is as follows:
the data collected by the low-beam laser radar is a series of point cloud data in a three-dimensional rectangular coordinate system; each point in the point cloud carries four values, the x-coordinate, the y-coordinate, the z-coordinate and the reflection intensity i of the point, and from x, y and z the straight-line distance r from the point to the low-beam laser radar acquisition viewpoint, i.e. the view-angle center, can be calculated;
based on the five values x, y, z, i and r of each point in the point cloud, the data collected by the low-beam laser radar is regarded as a multi-channel image in which each pixel is the data of one point in the point cloud; the image height is determined by the number of beams As of the low-beam laser radar that collected the input data, the image width is 512, and the image has 5 channels: the x-coordinate, the y-coordinate and the z-coordinate of the point cloud data, the reflection intensity i of the point, and its distance r from the view-angle center;
in the three-dimensional rectangular coordinate system, the coordinates of each point in the point cloud are used to obtain its angles relative to the three coordinate axes and the planes they enclose: the angle relative to the xy plane is denoted α and the angle relative to the positive direction of the x axis is denoted β; using α and β, the point cloud in the three-dimensional rectangular coordinate system is projected into a two-dimensional spherical coordinate system in which the coordinates of each point are (α, β), and the relations between α, β and x, y, z are expressed as:
α = arctan(z / √(x² + y²)), β = arctan(y / x);
for the point cloud data obtained in the two-dimensional spherical coordinate system, the angle between adjacent beams of the low-beam laser radar is denoted Δα and the angle of each rotation step of the laser radar beams is denoted Δβ; using α, β, Δα and Δβ, the point cloud data in the two-dimensional spherical coordinate system is projected into a two-dimensional rectangular coordinate system, and the two-dimensional coordinates (p, q) of each projected point are calculated by the following formula:
p = α / Δα, q = β / Δβ.
the invention has the beneficial effects that: the method can realize that the data acquired by the low-beam laser radar is applied to the multi-beam laser radar model, and integrates the advantages of low cost of the low-beam laser radar and high precision of the multi-beam laser radar model.
Drawings
FIG. 1 is a schematic diagram of the process, in an embodiment of the present invention, of filling the frames to be merged selected by the sliding window into the extended image C_f corresponding to A_f;
FIG. 2 is a schematic diagram of the process of merging the frames to be merged and filling them into the initial expansion data in an embodiment of the present invention.
Detailed Description
The present invention will be further described below with reference to the accompanying drawings. It should be noted that this embodiment is based on the above technical scheme and provides a detailed implementation and a specific operating process, but the protection scope of the present invention is not limited to this embodiment.
The embodiment provides a laser radar data synthesis method based on a sliding window, which comprises the following steps:
S1, two-dimensionalizing the radar data:
In this step, the input data collected by the low-beam laser radar needs to be adapted to the input data structure of the target model. The data collected by the low-beam laser radar is a series of point cloud data in a three-dimensional rectangular coordinate system, and each point in the point cloud carries four values: the x-coordinate, the y-coordinate, the z-coordinate and the reflection intensity i of the point. In addition, from x, y and z, the straight-line distance r from the point to the low-beam laser radar acquisition viewpoint, i.e. the view-angle center, can be calculated.
Based on the five values x, y, z, i and r of each point in the point cloud, the input supported by the method can be regarded as a multi-channel image in which each pixel is the data of one point in the point cloud; the image height is determined by the number of beams As of the low-beam laser radar that collected the input data, the image width is 512, and the image has 5 channels: the x-coordinate, the y-coordinate and the z-coordinate of the point cloud data, the reflection intensity i of the point, and its distance r from the view-angle center.
First, the point cloud data obtained by the low-beam laser radar in the three-dimensional rectangular coordinate system is converted to two dimensions. In the three-dimensional rectangular coordinate system, the coordinates of each point in the point cloud can be used to obtain its angles relative to the three coordinate axes and the planes they enclose: the angle relative to the xy plane is denoted α and the angle relative to the positive direction of the x axis is denoted β; using α and β, the point cloud in the three-dimensional rectangular coordinate system can be projected into a two-dimensional spherical coordinate system in which the coordinates of each point are (α, β), and the relations between α, β and x, y, z are expressed as:
α = arctan(z / √(x² + y²)), β = arctan(y / x);
for the point cloud data obtained in the two-dimensional spherical coordinate system, the angle between adjacent beams of the low-beam laser radar is denoted Δα and the angle of each rotation step of the laser radar beams is denoted Δβ; using α, β, Δα and Δβ, the point cloud data in the two-dimensional spherical coordinate system can be projected into a two-dimensional rectangular coordinate system, and the two-dimensional coordinates (p, q) of each projected point are calculated by the following formula:
p = α / Δα, q = β / Δβ;
thereby obtaining a two-dimensional point cloud image;
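As a minimal sketch in Python (NumPy), the two-dimensionalization above can be written as follows; the exact angle and index formulas appear only in the patent drawings, so the standard range-image projection is assumed here, and the function name `project_to_2d` is illustrative:

```python
import numpy as np

def project_to_2d(points, delta_alpha, delta_beta):
    """Project an N x 4 point cloud (x, y, z, intensity) onto 2-D indices.

    alpha is the angle of a point relative to the xy plane, beta its angle
    relative to the positive x axis; the arctan-based formulas below are an
    assumption, since the patent gives them only in the drawings.
    """
    x, y, z, i = points[:, 0], points[:, 1], points[:, 2], points[:, 3]
    r = np.sqrt(x ** 2 + y ** 2 + z ** 2)            # distance r to the viewpoint
    alpha = np.arctan2(z, np.sqrt(x ** 2 + y ** 2))  # angle to the xy plane
    beta = np.arctan2(y, x)                          # angle to the +x axis
    p = np.floor(alpha / delta_alpha).astype(int)    # row index
    q = np.floor(beta / delta_beta).astype(int)      # column index
    channels = np.stack([x, y, z, i, r], axis=1)     # the five channels
    return p, q, channels
```

Each (p, q) pair indexes one pixel of the two-dimensional point cloud image, whose five channels hold x, y, z, i and r.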
and S2, sampling radar data.
After the point cloud data obtained by the low-beam laser radar has been converted to two dimensions, the data is sampled: 512 pixels are sampled at even intervals from each row of the two-dimensional point cloud image; this sampling is performed on the data of every frame of the low-beam laser radar, and each sampled frame yields a five-channel two-dimensional image A_i, i = 0, 1, 2, …, where i is the frame index; the height of A_i equals the number of beams As of the low-beam laser radar that collected the data, and its width is 512.
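A sketch of this sampling step, assuming that "sampled at an average interval" means 512 evenly spaced column indices per row (the helper name `sample_512` is not from the patent):

```python
import numpy as np

def sample_512(frame):
    """Sample 512 evenly spaced pixels from each row of a two-dimensional
    point cloud image of shape (As, W, 5), giving an (As, 512, 5) image A_i."""
    As, W, _ = frame.shape
    cols = np.linspace(0, W - 1, num=512).round().astype(int)  # even spacing
    return frame[:, cols, :]
```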
S3, generating initial expansion data:
Perform data expansion on the image A_i obtained in step S2 to generate the initial expansion data:
The number of beams of the target multi-beam laser radar is Bs, so Bs/As − 1 rows of blank expansion data must be inserted below each row of A_i, i.e. A_i must be expanded by Bs − As blank rows in total; the newly inserted blank expansion data are all 0, i.e. all five channel values of every newly inserted pixel are 0; the five-channel two-dimensional image obtained by expanding A_i is denoted C_i, its height is denoted Ch and equals Bs, and its width is denoted Cw and equals 512.
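The expansion step can be sketched as follows; row n of A_i lands on row (Bs/As)·n of the extended image C_i, with zero rows in between (`expand_frame` is an illustrative name):

```python
import numpy as np

def expand_frame(A, Bs):
    """Insert Bs/As - 1 all-zero rows below each row of an (As, 512, 5)
    image A_i, producing C_i with height Ch = Bs and width Cw = 512."""
    As = A.shape[0]
    S = Bs // As                      # rows of C_i per row of A_i
    C = np.zeros((Bs,) + A.shape[1:], dtype=A.dtype)
    C[::S] = A                        # row n of A_i -> row S*n of C_i
    return C
```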
Determining the size of a sliding window and selecting frames to be merged by utilizing the sliding window:
The size of the sliding window is determined by the quantitative relation between As and Bs, and the data frames to be merged are selected accordingly; the data collected by the low-beam laser radar is stored frame by frame, the sliding window slides over the data frames, the data frames inside the window are merged into one frame, and the timestamp of the new data frame generated by merging is that of the first frame in the window; the sliding window size S is calculated by the following formula:
S=Bs/As;
Slide the window over the five-channel two-dimensional images A_i obtained in step S2, starting from frame 0, i.e. from A_0, and select S consecutive data frames each time as the frames to be merged; the frame at the head of the window is denoted A_f, so A_f and A_{f+k} (k = 1, 2, …, S − 1) are all frames to be merged.
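Selecting the frames covered by one position of the sliding window is then a simple slice (`select_window` is a hypothetical helper):

```python
def select_window(frames, As, Bs, f):
    """Return the S = Bs/As consecutive frames A_f, ..., A_{f+S-1} covered by
    one position of the sliding window; the merged frame later takes the
    timestamp of the first frame A_f."""
    S = Bs // As
    return frames[f:f + S]
```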
S4, merging the frames to be merged selected in the step S3, and filling the frames into the initial expansion data:
For the five-channel two-dimensional image A_i obtained from each frame of low-beam laser radar data, the corresponding extended five-channel two-dimensional image C_i is generated. Denote the rows of A_i as A_{i,n} (n = 0, 1, 2, …, As − 1) and the rows of C_i as C_{i,m} (m = 0, 1, 2, …, Bs − 1). The frontmost frame selected by the sliding window is A_f, and the frames to be merged are filled into the extended image C_f corresponding to A_f, as shown in FIG. 1:
Because C_f already contains the data of A_f, only the non-frontmost frames to be merged, A_{f+k} (k = 1, 2, …, S − 1), need to be filled in: the nth row of A_{f+k} is filled in turn into the blank expansion data of row S × n + k of C_f (n = 0, 1, 2, …, As − 1). Denoting the rows of A_{f+k} as A_{f+k,n} (n = 0, 1, 2, …, As − 1) and the rows of C_f as C_{f,m} (m = 0, 1, 2, …, Bs − 1), each row of the filled C_f can be represented as C_{f, S×n+k} = A_{f+k, n}, k = 0, 1, …, S − 1,
as shown in fig. 2.
The data generated by merging the S data frames serves as a new data frame, whose timestamp is the same as that of A_f.
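Under the assumption that each frame is an (As, 512, 5) NumPy array, the whole merge reduces to a strided assignment (`merge_window` is an illustrative name):

```python
import numpy as np

def merge_window(frames, Bs):
    """Interleave S = Bs/As frames into one (Bs, 512, 5) frame C_f:
    row n of A_{f+k} fills row S*n + k of C_f, so k = 0 contributes the
    original rows of A_f and k = 1..S-1 fill the blank expansion rows."""
    S = len(frames)
    As = frames[0].shape[0]
    assert S * As == Bs, "window size must satisfy S = Bs/As"
    C = np.zeros((Bs,) + frames[0].shape[1:], dtype=frames[0].dtype)
    for k, frame in enumerate(frames):
        C[k::S] = frame               # rows S*n + k for n = 0..As-1
    return C
```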
So far, the obtained new data can be used for a multi-beam laser radar model.
Various corresponding changes and modifications can be made by those skilled in the art based on the above technical solutions and concepts, and all such changes and modifications should be included in the protection scope of the present invention.
Claims (1)
1. A laser radar data synthesis method based on a sliding window is characterized by comprising the following steps:
S1, converting the data collected by a low-beam laser radar to two dimensions to obtain a two-dimensional point cloud image;
S2, after the point cloud data obtained by the low-beam laser radar has been converted to two dimensions, sampling the data: 512 pixels are sampled at even intervals from each row of the two-dimensional point cloud image; this sampling is performed on the data of every frame of the low-beam laser radar, and each sampled frame yields a five-channel two-dimensional image A_i, i = 0, 1, 2, …, where i is the frame index; the height of A_i equals the number of beams As of the low-beam laser radar that collected the data, and its width is 512;
S3, performing data expansion on the image A_i obtained in step S2 to generate initial expansion data: the number of beams of the target multi-beam laser radar is Bs, so Bs/As − 1 rows of blank expansion data must be inserted below each row of A_i, i.e. A_i must be expanded by Bs − As blank rows in total; the newly inserted blank expansion data are all 0, i.e. all five channel values of every newly inserted pixel are 0; the five-channel two-dimensional image obtained by expanding A_i is denoted C_i, its height is denoted Ch and equals Bs, and its width is denoted Cw and equals 512;
determining the size of a sliding window and selecting frames to be merged by utilizing the sliding window:
determining the size of the sliding window, denoted S, from the quantitative relation between As and Bs; S is calculated by the following formula:
S=Bs/As;
sliding the window over the five-channel two-dimensional images A_i obtained in step S2, starting from frame 0, i.e. from A_0, and selecting S consecutive data frames each time as the frames to be merged; the frame at the head of the window is denoted A_f, so A_f and A_{f+k} (k = 1, 2, …, S − 1) are all frames to be merged;
s4, merging the frames to be merged selected in the step S3, and filling the frames into the initial expansion data:
denote the rows of A_i as A_{i,n}, n = 0, 1, 2, …, As − 1, and the rows of C_i as C_{i,m}, m = 0, 1, 2, …, Bs − 1; the frontmost frame selected by the sliding window is A_f, and the frames to be merged are filled into the extended image C_f corresponding to A_f:
the nth row of A_{f+k} is filled in turn into the blank expansion data of row S × n + k of C_f, n = 0, 1, 2, …, As − 1, k = 1, 2, …, S − 1; thus each row of the filled C_f can be represented as C_{f, S×n+k} = A_{f+k, n}, n = 0, 1, 2, …, As − 1, k = 0, 1, …, S − 1;
the data generated by merging the S data frames serves as a new data frame whose timestamp is the same as that of A_f; the new data obtained in this way can be used for a multi-beam laser radar model;
the specific process of step S1 is as follows:
the data collected by the low-beam laser radar is a series of point cloud data in a three-dimensional rectangular coordinate system; each point in the point cloud carries four values, the x-coordinate, the y-coordinate, the z-coordinate and the reflection intensity i of the point, and from x, y and z the straight-line distance r from the point to the low-beam laser radar acquisition viewpoint, i.e. the view-angle center, can be calculated;
based on the five values x, y, z, i and r of each point in the point cloud, the data collected by the low-beam laser radar is regarded as a multi-channel image in which each pixel is the data of one point in the point cloud; the image height is determined by the number of beams As of the low-beam laser radar that collected the input data, the image width is 512, and the image has 5 channels: the x-coordinate, the y-coordinate and the z-coordinate of the point cloud data, the reflection intensity i of the point, and its distance r from the view-angle center;
in the three-dimensional rectangular coordinate system, the coordinates of each point in the point cloud are used to obtain its angles relative to the three coordinate axes and the planes they enclose: the angle relative to the xy plane is denoted α and the angle relative to the positive direction of the x axis is denoted β; using α and β, the point cloud in the three-dimensional rectangular coordinate system is projected into a two-dimensional spherical coordinate system in which the coordinates of each point are (α, β), and the relations between α, β and x, y, z are expressed as:
α = arctan(z / √(x² + y²)), β = arctan(y / x);
for the point cloud data obtained in the two-dimensional spherical coordinate system, the angle between adjacent beams of the low-beam laser radar is denoted Δα and the angle of each rotation step of the laser radar beams is denoted Δβ; using α, β, Δα and Δβ, the point cloud data in the two-dimensional spherical coordinate system is projected into a two-dimensional rectangular coordinate system, and the two-dimensional coordinates (p, q) of each projected point are calculated by the following formula:
p = α / Δα, q = β / Δβ.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911228578.1A CN110888144B (en) | 2019-12-04 | 2019-12-04 | Laser radar data synthesis method based on sliding window |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110888144A CN110888144A (en) | 2020-03-17 |
CN110888144B true CN110888144B (en) | 2023-04-07 |
Family
ID=69750386
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911228578.1A Active CN110888144B (en) | 2019-12-04 | 2019-12-04 | Laser radar data synthesis method based on sliding window |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110888144B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111881414B (en) * | 2020-07-29 | 2024-03-15 | 中南大学 | Synthetic aperture radar image quality assessment method based on decomposition theory |
WO2022032516A1 (en) * | 2020-08-12 | 2022-02-17 | 深圳市速腾聚创科技有限公司 | Laser radar and detection method therefor, storage medium, and detection system |
CN112347421A (en) * | 2020-10-16 | 2021-02-09 | 中国地质调查局沈阳地质调查中心 | Method and system for highlighting and enhancing broken-line-shaped gravity anomaly information |
CN116819489A (en) * | 2023-08-25 | 2023-09-29 | 摩尔线程智能科技(北京)有限责任公司 | Dynamic object detection method, model training method, device, equipment and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5384573A (en) * | 1990-10-29 | 1995-01-24 | Essex Corporation | Image synthesis using time sequential holography |
WO2018205119A1 (en) * | 2017-05-09 | 2018-11-15 | 深圳市速腾聚创科技有限公司 | Roadside detection method and system based on laser radar scanning |
CN109003276A (en) * | 2018-06-06 | 2018-12-14 | 上海国际汽车城(集团)有限公司 | Antidote is merged based on binocular stereo vision and low line beam laser radar |
CN110221603A (en) * | 2019-05-13 | 2019-09-10 | 浙江大学 | A kind of long-distance barrier object detecting method based on the fusion of laser radar multiframe point cloud |
CN110264416A (en) * | 2019-05-28 | 2019-09-20 | 深圳大学 | Sparse point cloud segmentation method and device |
CN110363820A (en) * | 2019-06-28 | 2019-10-22 | 东南大学 | It is a kind of based on the object detection method merged before laser radar, image |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2430392B1 (en) * | 2009-05-15 | 2015-07-22 | Michigan Aerospace Corporation | Range imaging lidar |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 2024-04-09. Patentee after: Jiaxing Suoya Intelligent Technology Co., Ltd., G0333, 2nd Floor, Building A, Innovation Building, No. 705 Asia Pacific Road, Daqiao Town, Nanhu District, Jiaxing City, Zhejiang Province, 314006, China. Patentee before: Jilin University, No. 2699 Qianjin Street, Changchun, Jilin, 130000, China. |