CN113837981B - Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment - Google Patents
- Publication number
- CN113837981B CN113837981B CN202111390247.5A CN202111390247A CN113837981B CN 113837981 B CN113837981 B CN 113837981B CN 202111390247 A CN202111390247 A CN 202111390247A CN 113837981 B CN113837981 B CN 113837981B
- Authority
- CN
- China
- Prior art keywords
- point cloud
- dimensional point
- cloud data
- scaling factor
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses an automated fusion method for three-dimensional point clouds acquired by multiple terminal devices, which addresses the difficulty of automatic fusion caused by differing scaling coefficients among point clouds acquired by different terminal instruments. The invention provides a coarse-to-fine, multi-step method for computing the point cloud scaling coefficient, which calculates the relative scaling coefficient between different point clouds, improves the efficiency of fusing point clouds obtained from different devices, and meets the need for complementary point clouds from different devices in specific scenes. A key point of the method is that it places no restriction on how the point clouds are acquired; it offers strong generality and a high degree of automation.
Description
Technical Field
The invention relates to a three-dimensional point cloud fusion method, in particular to an automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment.
Background
Three-dimensional imaging technology today has important applications in fields such as industry and national defense. Current three-dimensional techniques include, but are not limited to, stereoscopic imaging, multi-view imaging, oblique photography, structured-light three-dimensional imaging, and lidar three-dimensional imaging. The key of three-dimensional imaging is to acquire depth information of an object or scene, or the corresponding three-dimensional point cloud data, so as to provide a machine with depth perception or a three-dimensional spatial data model. After years of development the technology has matured and found considerable application in entertainment, industrial inspection, national defense, and other fields. However, for complex application scenarios such as urban building modeling, it is difficult for a single three-dimensional imaging technique to acquire complete point cloud data; point clouds acquired by multiple techniques must often be fused to obtain relatively complete data. Point cloud fusion has therefore become a key technology for complete three-dimensional point cloud acquisition in these scenarios.
Existing point cloud fusion technology mainly targets the fusion of point clouds from the same source. For example, a common robot vacuum reconstructs a three-dimensional map with a single lidar; during map reconstruction the acquisition process is highly continuous, and the combination of a single source with continuous acquisition stabilizes map fusion. In some application scenarios, however, neither a common source nor continuous acquisition can be guaranteed, and the difficulty of point cloud fusion rises sharply. One bottleneck in fusing different point clouds is that the scaling factor between them may be impossible to estimate, or, owing to differing calibration methods and imaging principles, it is difficult to ensure that point clouds acquired by different imaging techniques maintain highly consistent scales. In addition, because acquisition may be discontinuous across methods, or the same method may use different coordinate systems, the relative spatial positions of the point clouds are not known in advance, so the scaling coefficient cannot be estimated directly by combining direct scaling with a fine registration method.
In summary, a key problem in fusing point clouds from different sources is that the scaling factor between them is difficult to estimate. Existing commercial software provides a rich set of point cloud tools, including scaling factor evaluation and point cloud fusion, but it requires considerable manual intervention to achieve good results. This is extremely inefficient, or even inapplicable, for application scenarios that require fusing a large number of point clouds from different sources.
Disclosure of Invention
Purpose of the invention: in view of the deficiencies of the prior art, the invention aims to provide an automated fusion method for three-dimensional point clouds acquired by multi-terminal equipment.
To solve this technical problem, the invention discloses an automated fusion method for three-dimensional point clouds acquired by multi-terminal equipment, comprising the following steps:
step 1, obtaining three-dimensional point cloud data A and three-dimensional point cloud data B of two different sources;
step 2, calculating the center of gravity O_A of A and the center of gravity O_B of B; calculating the average distance d_A from each point a_i in A to O_A and the average distance d_B from each point of B to O_B; taking A as the reference, calculating the scaling factor s1 = d_A / d_B of B relative to A, and scaling B by s1 to obtain the scaled point cloud B1;
step 3, performing coarse registration of A and B1 to obtain corresponding feature point sequences {p_k} and {q_k}, where the subscript k denotes the k-th feature point; calculating the centers of gravity of the feature point sequences and the average distances d_A' and d_B' from the feature points to those centers, obtaining the scaling factor s2 = d_A' / d_B' of B1 relative to A, and scaling B1 by s2 to obtain the point cloud B2;
step 4, performing fine registration of A and B2, calculating a refined scaling factor s3, and scaling B2 by s3 to obtain the point cloud B3;
step 5, around the scaling factor s3, setting a search range r and a step size t to obtain the scaling factor sequence s_k = s3 − r + k·t, k = 0, 1, …, ⌊2r/t⌋, so that the candidates cover [s3 − r, s3 + r], and scaling B2 by each s_k to obtain the corresponding point clouds B3_k;
step 6, calculating through fine registration the mean square error E_k between A and every B3_k; selecting the scaling factor s* corresponding to the minimum error as the final scaling factor, and scaling B2 by s* to obtain the final point cloud B4;
step 7, completing the fusion of point cloud A and point cloud B4 through fine registration.
In step 2 of the invention, the center of gravity O_A of point cloud A is calculated as:
O_A = (1/N_A) · Σ_{i=1}^{N_A} a_i,
where a_i is any point in A and N_A is the number of points in A; the center of gravity O_B of point cloud B is obtained in the same way.
In step 2 of the invention, the average distance d_A from each point a_i in A to the center of gravity O_A is calculated as:
d_A = (1/N_A) · Σ_{i=1}^{N_A} ‖a_i − O_A‖,
and the average distance d_B from each point of B to O_B is calculated in the same way.
In step 2, the average distance from the points of a point cloud to its center of gravity is taken as a measure of the cloud's scale. Taking point cloud A as the reference, the scaling factor of B relative to A is calculated as:
s1 = d_A / d_B.
In step 2 of the invention, point cloud B is scaled by the factor s1, i.e. each point b_j of B is mapped to s1 · b_j, yielding the scaled point cloud B1.
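As a minimal sketch of the step-2 computation (pure Python; the function names `centroid`, `mean_dist_to_centroid`, `coarse_scale_factor`, and `scale` are our own, not the patent's):

```python
def centroid(cloud):
    # O = (1/N) * sum of points; cloud is a list of (x, y, z) tuples
    n = len(cloud)
    return tuple(sum(p[k] for p in cloud) / n for k in range(3))

def mean_dist_to_centroid(cloud):
    # d = (1/N) * sum of Euclidean distances from each point to the centroid
    o = centroid(cloud)
    return sum(
        ((p[0] - o[0]) ** 2 + (p[1] - o[1]) ** 2 + (p[2] - o[2]) ** 2) ** 0.5
        for p in cloud
    ) / len(cloud)

def coarse_scale_factor(cloud_a, cloud_b):
    # s1 = d_A / d_B: ratio of the two clouds' mean centroid distances
    return mean_dist_to_centroid(cloud_a) / mean_dist_to_centroid(cloud_b)

def scale(cloud, s):
    # scale every coordinate by s (about the origin)
    return [(s * x, s * y, s * z) for (x, y, z) in cloud]
```

Note that the mean point-to-centroid distance is invariant to translation and rotation and scales linearly with uniform scaling, which is why its ratio serves as a rough scale estimate: if B is a copy of A shrunk by 0.5, `coarse_scale_factor(A, B)` returns 2.0 regardless of where B sits in space.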
In the present invention, step 3 comprises: performing coarse registration of point cloud A and point cloud B1 based on the RANSAC algorithm, and obtaining the corresponding feature point sequences {p_k} of A and {q_k} of B1;
the centers of gravity of the feature point sequences and the average distances d_A' and d_B' from the feature points to those centers are then calculated, the scaling factor s2 = d_A' / d_B' of B1 relative to A is obtained, and B1 is scaled by s2 to yield the point cloud B2.
In step 4 of the invention, fine registration of point cloud A and point cloud B2 is performed with the ICP algorithm.
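To illustrate the correspond-then-update loop at the heart of ICP, here is a deliberately simplified, translation-only sketch in pure Python (the patent's fine registration uses full rigid ICP with rotation; rotation estimation is omitted here for brevity, and all names are our own):

```python
def nearest(p, cloud):
    # index of the point in `cloud` closest to p (brute-force search)
    return min(range(len(cloud)),
               key=lambda j: sum((p[k] - cloud[j][k]) ** 2 for k in range(3)))

def icp_translation(src, dst, iters=20):
    # Translation-only ICP: repeatedly match each source point to its
    # nearest destination point, then shift src by the mean residual.
    src = [tuple(p) for p in src]
    for _ in range(iters):
        pairs = [(p, dst[nearest(p, dst)]) for p in src]
        # mean residual vector over all current correspondences
        t = tuple(sum(q[k] - p[k] for p, q in pairs) / len(pairs)
                  for k in range(3))
        src = [(p[0] + t[0], p[1] + t[1], p[2] + t[2]) for p in src]
    return src

def mse(a, b):
    # mean squared nearest-neighbour error of cloud a against cloud b
    return sum(sum((p[k] - b[nearest(p, b)][k]) ** 2 for k in range(3))
               for p in a) / len(a)
```

With a small initial offset the nearest-neighbour correspondences are already correct, so a single iteration recovers the translation exactly; larger offsets are what the preceding coarse registration step exists to remove.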
Step 5 of the invention comprises: around the scaling factor s3 obtained in step 4, setting a search range r and a step size t to obtain the sequence s_k = s3 − r + k·t, k = 0, 1, …, ⌊2r/t⌋, so that the candidates cover [s3 − r, s3 + r]; point cloud B2 is then scaled by each candidate s_k to obtain the corresponding point clouds B3_k.
In step 6 of the invention, for every candidate, ICP fine registration is used to calculate the mean square error E_k between point cloud A and the registered point cloud B3_k:
E_k = (1/N) · Σ_{i=1}^{N} ‖a_i − T_k(b_i)‖²,
where (a_i, b_i) are the corresponding point pairs produced by the registration, T_k is the resulting rigid transform, and N is the number of pairs;
the scaling factor s* corresponding to the minimum value among the errors E_k is selected as the final scaling factor, and B2 is scaled by s* to obtain the final point cloud B4.
In step 7 of the invention, the three-dimensional point cloud data A and the three-dimensional point cloud data A are precisely registered according to an icp algorithmFusing the three-dimensional point cloud data.
Advantageous effects:
(1) The distance from each point of a cloud to the cloud's center of gravity is used as a measure of the cloud's scale, so the method neither restricts how the point clouds are acquired nor requires their scales to be known in advance.
(2) The complementary strengths of coarse and fine registration are fully exploited: one pass of coarse registration is combined with multiple passes of fine registration, and the accurate relative scaling coefficient between the point clouds is determined progressively, from coarse to fine.
Drawings
The foregoing and/or other advantages of the invention will become further apparent from the following detailed description of the invention when taken in conjunction with the accompanying drawings.
Fig. 1 is a schematic flow chart of an automated fusion method of three-dimensional point cloud in the present invention.
Detailed Description
The invention provides an automated fusion method for three-dimensional point clouds acquired by multi-terminal equipment, as shown in fig. 1, comprising the following steps:
Step 1: obtain three-dimensional point cloud data A and B of two different sources, acquired by different terminal devices or methods, e.g. stereo vision, oblique photography, lidar, or structured-light scanning.
Step 2: the center of gravity O_A of point cloud A is computed by the formula
O_A = (1/N_A) · Σ_{i=1}^{N_A} a_i,
where a_i is any point of A and N_A is the number of points in A; the center of gravity O_B of B is obtained in the same way. The average distance from each point a_i of A to O_A is then computed as
d_A = (1/N_A) · Σ_{i=1}^{N_A} ‖a_i − O_A‖,
and the average distance d_B of B is computed likewise. Taking the average point-to-centroid distance as a measure of a cloud's scale, and taking A as the reference, the scaling factor of B relative to A is
s1 = d_A / d_B,
and scaling B by s1 yields the point cloud B1.
Step 3: coarse registration of A and B1 is performed with the RANSAC algorithm (see Martin A. Fischler & Robert C. Bolles (June 1981), "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography", Comm. ACM 24 (6): 381–395), yielding the corresponding feature point sequences {p_k} of A and {q_k} of B1. The centers of gravity of the feature sequences and the average feature-to-center distances d_A' and d_B' are computed as in step 2, the scaling factor s2 = d_A' / d_B' is obtained, and scaling B1 by s2 gives the point cloud B2.
Step 4: fine registration of A and B2 is performed with the ICP algorithm (see Besl P. J., McKay N. D., "A method for registration of 3-D shapes", Sensor Fusion IV: Control Paradigms and Data Structures, International Society for Optics and Photonics, 1992, 1611: 586–606), a refined scaling factor s3 is computed by repeating the estimation of steps 2 and 3 on the registered clouds, and scaling B2 by s3 gives the point cloud B3.
Step 5: around s3, a search range r and step size t are set, giving the sequence s_k = s3 − r + k·t, k = 0, 1, …, ⌊2r/t⌋, which covers [s3 − r, s3 + r]; scaling B2 by each candidate s_k gives the corresponding point clouds B3_k.
Step 6: for every candidate, ICP fine registration computes the mean square error E_k between A and B3_k by the formula
E_k = (1/N) · Σ_{i=1}^{N} ‖a_i − T_k(b_i)‖²,
where (a_i, b_i) are the corresponding point pairs produced by the registration, T_k is the resulting rigid transform, and N is the number of pairs; the scaling factor s* with the minimum error is selected as the final scaling factor, and scaling B2 by s* yields the final point cloud B4.
Step 7: point cloud fusion of A and B4 is completed by fine registration with the ICP algorithm.
The invention provides the idea of an automated fusion method for three-dimensional point clouds acquired by multi-terminal equipment; there are many concrete methods and ways to implement the technical scheme, and the above is only a preferred embodiment of the invention. All components not specified in this embodiment can be realized by the prior art.
Claims (10)
1. An automated fusion method for three-dimensional point clouds acquired by multi-terminal equipment, characterized by comprising the following steps:
step 1, obtaining three-dimensional point cloud data A and three-dimensional point cloud data B of two different sources;
step 2, calculating the center of gravity O_A of A and the center of gravity O_B of B, calculating the average distance d_A from each point a_i in A to O_A and the average distance d_B from each point of B to O_B, taking A as the reference, calculating the scaling factor s1 = d_A / d_B of B relative to A, and scaling B by s1 to obtain the scaled point cloud B1;
step 3, performing coarse registration of A and B1 to obtain corresponding feature point sequences {p_k} and {q_k}, wherein the subscript k denotes the k-th feature point; calculating the centers of gravity of the feature point sequences and the average distances d_A' and d_B' from the feature points to the centers of gravity, obtaining the scaling factor s2 = d_A' / d_B' of B1 relative to A, and scaling B1 by s2 to obtain the point cloud B2;
step 4, performing fine registration of A and B2, calculating a refined scaling factor s3, and scaling B2 by s3 to obtain the point cloud B3;
step 5, around the scaling factor s3, setting a search range r and a step size t to obtain the scaling factor sequence s_k = s3 − r + k·t, k = 0, 1, …, ⌊2r/t⌋, and scaling B2 by each s_k to obtain the corresponding point clouds B3_k;
step 6, calculating through fine registration the mean square error E_k between A and every B3_k, selecting the scaling factor s* corresponding to the minimum error as the final scaling factor, and scaling B2 by s* to obtain the final point cloud B4;
step 7, completing the fusion of point cloud A and point cloud B4 through fine registration.
2. The method as claimed in claim 1, wherein calculating the center of gravity O_A of point cloud A in step 2 comprises:
O_A = (1/N_A) · Σ_{i=1}^{N_A} a_i,
wherein a_i is any point in A and N_A is the number of points in A.
3. The method as claimed in claim 2, wherein calculating in step 2 the average distance d_A from each point a_i of point cloud A to the center of gravity O_A comprises:
d_A = (1/N_A) · Σ_{i=1}^{N_A} ‖a_i − O_A‖.
4. The method as claimed in claim 3, wherein in step 2 the average distance from each point of a point cloud to its center of gravity is used as a measure of the cloud's scale, and, taking A as the reference, the scaling factor of B relative to A is calculated as:
s1 = d_A / d_B.
6. The method as claimed in claim 5, wherein the step 3 comprises: performing coarse registration of point cloud A and point cloud B1 based on the RANSAC algorithm and obtaining the corresponding feature point sequences {p_k} and {q_k};
further calculating the centers of gravity of the feature point sequences and the average distances d_A' and d_B' from the feature points to the centers of gravity, calculating the scaling factor s2 = d_A' / d_B' of B1 relative to A, and scaling B1 by s2 to obtain the point cloud B2.
8. The method as claimed in claim 7, wherein the step 5 comprises: around the scaling factor s3 obtained in step 4, setting a search range r and a step size t to obtain the sequence s_k = s3 − r + k·t, k = 0, 1, …, ⌊2r/t⌋; and scaling B2 by each scaling factor s_k to obtain the corresponding point clouds B3_k.
9. The method as claimed in claim 8, wherein in step 6 the mean square error E_k between point cloud A and every point cloud B3_k is calculated through ICP fine registration by the formula:
E_k = (1/N) · Σ_{i=1}^{N} ‖a_i − T_k(b_i)‖²,
wherein (a_i, b_i) are the corresponding point pairs produced by the registration, T_k is the resulting rigid transform, and N is the number of pairs.
10. The method as claimed in claim 9, wherein the step 7 comprises completing the fusion of point cloud A and point cloud B4 through fine registration according to the ICP algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111390247.5A CN113837981B (en) | 2021-11-23 | 2021-11-23 | Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113837981A CN113837981A (en) | 2021-12-24 |
CN113837981B true CN113837981B (en) | 2022-03-08 |
Family
ID=78971618
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111390247.5A Active CN113837981B (en) | 2021-11-23 | 2021-11-23 | Automatic fusion method for acquiring three-dimensional point cloud by multi-terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113837981B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106485690A (en) * | 2015-08-25 | 2017-03-08 | 南京理工大学 | Cloud data based on a feature and the autoregistration fusion method of optical image |
JP2020035448A (en) * | 2018-08-30 | 2020-03-05 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | Method, apparatus, device, storage medium for generating three-dimensional scene map |
CN112102458A (en) * | 2020-08-31 | 2020-12-18 | 湖南盛鼎科技发展有限责任公司 | Single-lens three-dimensional image reconstruction method based on laser radar point cloud data assistance |
CN113223145A (en) * | 2021-04-19 | 2021-08-06 | 中国科学院国家空间科学中心 | Sub-pixel measurement multi-source data fusion method and system for planetary surface detection |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | |

Address after: 210000 No.1, Lingshan South Road, Qixia District, Nanjing City, Jiangsu Province. Applicant after: THE 28TH RESEARCH INSTITUTE OF CHINA ELECTRONICS TECHNOLOGY Group Corp.
Address before: 210007 No.1 East Street, Alfalfa Garden, Qinhuai District, Nanjing, Jiangsu. Applicant before: THE 28TH RESEARCH INSTITUTE OF CHINA ELECTRONICS TECHNOLOGY Group Corp.

| GR01 | Patent grant | |