CN110322553A - The method and system of laser radar point cloud mixed reality scene implementation setting-out - Google Patents
- Publication number
- CN110322553A (application number CN201910621191.6A / CN201910621191A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- scene
- laser radar
- setting
- radar point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 56
- 238000013507 mapping Methods 0.000 claims abstract description 18
- 238000012545 processing Methods 0.000 claims abstract description 18
- 238000004590 computer program Methods 0.000 claims description 27
- 238000005259 measurement Methods 0.000 claims description 17
- 238000006243 chemical reaction Methods 0.000 claims description 12
- 238000003860 storage Methods 0.000 claims description 12
- 239000011159 matrix material Substances 0.000 claims description 10
- 239000000178 monomer Substances 0.000 claims description 8
- 238000005520 cutting process Methods 0.000 claims description 7
- 238000005457 optimization Methods 0.000 claims description 7
- 238000004364 calculation method Methods 0.000 claims 1
- 230000003287 optical effect Effects 0.000 claims 1
- 230000009467 reduction Effects 0.000 abstract description 3
- 230000010354 integration Effects 0.000 abstract description 2
- 238000005516 engineering process Methods 0.000 description 10
- 238000010586 diagram Methods 0.000 description 7
- 230000004927 fusion Effects 0.000 description 7
- 230000004048 modification Effects 0.000 description 5
- 238000012986 modification Methods 0.000 description 5
- 230000008859 change Effects 0.000 description 3
- 230000003993 interaction Effects 0.000 description 3
- 238000004040 coloring Methods 0.000 description 2
- 238000011161 development Methods 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 239000013558 reference substance Substances 0.000 description 2
- 230000003068 static effect Effects 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 239000000654 additive Substances 0.000 description 1
- 230000000996 additive effect Effects 0.000 description 1
- 230000015572 biosynthetic process Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 238000000691 measurement method Methods 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 238000011084 recovery Methods 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
- 238000011282 treatment Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/32—Indexing scheme for image data processing or generation, in general involving image mosaicing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/61—Scene description
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2024—Style variation
Abstract
This application relates to a method and system for implementing setting-out (stakeout) in a mixed reality scene built from laser radar (LiDAR) point clouds. The method includes: obtaining LiDAR point cloud data for the region to be set out; assigning coordinates and attributes to the LiDAR point cloud data to obtain a processed point cloud; constructing a point cloud scene from the processed point cloud; registering the point cloud scene with the real scene of the region to be set out; and performing setting-out on the real scene with reference to the coordinates of the registered point cloud scene. The method directly integrates high-accuracy data from scenes of different data sources, enabling high-precision surveying or precise setting-out in a real, dynamic scene while reducing the workload and cost of repeated acquisition.
Description
Technical field
This application relates to the technical fields of mixed reality and surveying and mapping, and in particular to a method, system, computer device, and storage medium for implementing setting-out in a LiDAR point cloud mixed reality scene.
Background technique
With the development of a new generation of mixed reality (MR) technology and surveying and mapping algorithms, MR technology and intelligent terminals have become widespread and are applied in many fields to solve a variety of problems. For example, MR technology can precisely fuse an existing three-dimensional LiDAR point cloud scene, which carries coordinate and attribute information, with a real scene, making the dynamic real scene measurable with high precision while the fused scene also captures the varied, time-varying information of reality.
At present, a LiDAR point cloud captures high-precision three-dimensional static information of a scene at the moment of scanning. Using non-contact, high-rate laser measurement from the air or the ground, it records the array geometry of terrain and complex object surfaces in the form of a point cloud, depicting and describing the scene across multiple scans with high precision and high fidelity. After the three-dimensional laser point cloud is processed by assigning three-dimensional coordinates and attribute information, the constructed point cloud scene supports high-precision surveying, setting-out, and analysis.
However, a LiDAR scan captures only an instant in time, whereas a real scene changes over time and thus has limited temporal validity. High-precision surveying or precise setting-out on a dynamic scene therefore requires re-scanning to keep the data current, which inevitably causes a substantial increase in acquisition workload and cost.
Mixed reality (MR) technology is a further development of virtual reality. By introducing real-scene information into the virtual environment, it establishes an interactive feedback loop among the virtual world, the real world, and the user, enhancing the user's sense of realism. Because an ordinary real scene carries no measurement calibration or absolute coordinate information, MR technology alone cannot perform high-precision absolute-coordinate surveying or setting-out directly on the real scene, nor can it directly endow the real scene with an absolute-coordinate measurement scale.
The scene described by a high-accuracy three-dimensional laser point cloud and the real scene are two different dimensional forms of existence. At present the two kinds of scenes can exchange data only through a third-party medium; there is still no solution that directly integrates their high-accuracy scene data.
Summary of the invention
Accordingly, to address the problem that the scene described by a three-dimensional laser point cloud and the real scene cannot be directly fused for surveying and setting-out, it is necessary to provide a method, system, computer device, and storage medium for implementing setting-out in a LiDAR point cloud mixed reality scene.
A method for implementing setting-out in a LiDAR point cloud mixed reality scene, the method comprising:
obtaining LiDAR point cloud data for the region to be set out;
assigning coordinates and attributes to the LiDAR point cloud data to obtain a processed LiDAR point cloud;
constructing a point cloud scene from the processed LiDAR point cloud;
registering the point cloud scene with the real scene of the region to be set out;
performing setting-out on the real scene with reference to the coordinates of the registered point cloud scene.
In one embodiment, performing setting-out on the real scene comprises:
performing relative setting-out on the real scene with reference to the relative coordinates of the point cloud scene and according to the coordinate correspondence between the point cloud scene and the real scene;
or
performing absolute-coordinate setting-out in the real scene with reference to the absolute coordinates of the point cloud scene.
In one embodiment, the step of obtaining the LiDAR point cloud data for the region to be set out comprises:
obtaining three-dimensional LiDAR point cloud data for the region to be set out from each acquisition device;
merging and stitching the three-dimensional LiDAR point cloud data from the devices to obtain the LiDAR point cloud data for the region to be set out.
In one embodiment, the step of assigning coordinates and attributes to the LiDAR point cloud data comprises:
converting the relative coordinates of the LiDAR point cloud data into absolute coordinates;
grouping the LiDAR point cloud data by monomer (individual object), wherein each group of LiDAR point cloud data represents one monomer;
assigning attributes to each monomer group of the LiDAR point cloud data.
In one embodiment, the step of constructing the point cloud scene from the processed LiDAR point cloud comprises:
thinning and tiling the processed LiDAR point cloud to build a point cloud scene model structured like an image pyramid;
grouping the point cloud scene model and loading it group by group according to the number of groups;
optimizing the loaded point cloud scene model with a quadtree spatial index to obtain the point cloud scene.
In one embodiment, the step of registering the point cloud scene with the real scene of the region to be set out comprises:
applying a matrix transformation to the point cloud scene, and projecting the transformed point cloud scene into the real scene through an MR device;
registering the point cloud scene with the real scene using the least squares method.
In one embodiment, after the step of registering the point cloud scene with the real scene of the region to be set out, the method further comprises:
performing relative surveying on the real scene and outputting relative measurement coordinates;
converting the relative measurement coordinates through the absolute coordinates of the point cloud scene, and outputting absolute measurement coordinates.
A system for implementing setting-out in a LiDAR point cloud mixed reality scene, the system comprising:
a data acquisition module for obtaining LiDAR point cloud data for the region to be set out;
a point cloud module for assigning coordinates and attributes to the LiDAR point cloud data to obtain a processed LiDAR point cloud;
a point cloud scene construction module for constructing a point cloud scene from the processed LiDAR point cloud;
a registration module for registering the point cloud scene with the real scene of the region to be set out;
a setting-out module for performing setting-out on the real scene with reference to the coordinates of the registered point cloud scene.
A computer device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the computer program, performs the following steps:
obtaining LiDAR point cloud data for the region to be set out;
assigning coordinates and attributes to the LiDAR point cloud data to obtain a processed LiDAR point cloud;
constructing a point cloud scene from the processed LiDAR point cloud;
registering the point cloud scene with the real scene of the region to be set out;
performing setting-out on the real scene with reference to the coordinates of the registered point cloud scene.
A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, performs the following steps:
obtaining LiDAR point cloud data for the region to be set out;
assigning coordinates and attributes to the LiDAR point cloud data to obtain a processed LiDAR point cloud;
constructing a point cloud scene from the processed LiDAR point cloud;
registering the point cloud scene with the real scene of the region to be set out;
performing setting-out on the real scene with reference to the coordinates of the registered point cloud scene.
With the above method, system, computer device, and storage medium for implementing setting-out in a LiDAR point cloud mixed reality scene, LiDAR point cloud data is first obtained and processed, a point cloud scene is then constructed from the processed data, and the point cloud scene is registered and fused with the real scene. This establishes a direct data association between the point cloud scene and the real scene and directly integrates high-accuracy data from different data sources, enabling precise setting-out in a real, dynamic scene while reducing the workload and cost of repeated acquisition.
Detailed description of the invention
Fig. 1 is a flow diagram of the method for implementing setting-out in a LiDAR point cloud mixed reality scene in one embodiment;
Fig. 2 is a flow diagram of registering the point cloud scene with the real scene in one embodiment;
Fig. 3 is a flow diagram of the method for implementing setting-out in a LiDAR point cloud mixed reality scene in one embodiment;
Fig. 4 shows the coordinate transformation relation between the point cloud scene and the real scene in one embodiment;
Fig. 5 shows the matrix transformation of the point cloud scene in one embodiment;
Fig. 6 is a flow diagram of the method for implementing setting-out in a LiDAR point cloud mixed reality scene in another embodiment;
Fig. 7 is a structural diagram of the system for implementing setting-out in a LiDAR point cloud mixed reality scene in one embodiment;
Fig. 8 is an internal structure diagram of the computer device in one embodiment.
Detailed description of embodiments
To make the objects, technical solutions, and advantages of this application clearer, the application is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein serve only to explain the application and are not intended to limit it.
The method for implementing setting-out in a LiDAR point cloud mixed reality scene provided by this application is applied in a terminal, which may be a computer, an MR smart-glasses terminal, or the like.
In one embodiment, as shown in Fig. 1, a method for implementing setting-out in a LiDAR point cloud mixed reality scene is provided. Taking its application to a computer as an example, the method comprises the following steps:
Step S102: obtain the LiDAR point cloud data for the region to be set out.
Here, a point cloud is the set of points on the surface of an object obtained by a measuring instrument, as in reverse engineering; the LiDAR point cloud data is the set of surface points of the region to be set out, acquired with a LiDAR device. The region to be set out is the real space in which setting-out needs to be performed; it may be a room, a building, a mountain, a landscape, a landscape region, and so on.
Step S104: assign coordinates and attributes to the LiDAR point cloud data to obtain the processed LiDAR point cloud.
Specifically, a point cloud obtained by laser measurement contains three-dimensional coordinates and laser reflection intensity. The three-dimensional coordinates are usually relative, so assigning coordinates to the LiDAR point cloud data typically means converting the relative three-dimensional coordinates into absolute three-dimensional coordinates. Assigning attributes to the LiDAR point cloud data includes grouping the points and assigning color, shape, size, purpose, and so on.
In a specific embodiment, the step of assigning coordinates and attributes to the LiDAR point cloud data comprises: converting the relative coordinates of the LiDAR point cloud data into absolute coordinates; grouping the LiDAR point cloud data by monomer, wherein each group of LiDAR point cloud data represents one monomer; and assigning a color attribute to each monomer group of the LiDAR point cloud data.
Here, grouping the LiDAR point cloud by monomer means that, after grouping, each group can represent an individual ground object; the LiDAR point cloud is also thematically classified and colored.
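As an illustration of this coordinate-and-attribute step, the sketch below (not from the patent; the yaw-only scanner pose model and all function names are assumptions for illustration) converts scanner-relative coordinates to absolute ones and tags each monomer group with a color:

```python
import numpy as np

def to_absolute(points_rel, origin_abs, yaw_rad):
    """Convert scanner-relative XYZ points to absolute coordinates, assuming
    the scanner pose is known as an absolute origin plus a yaw rotation about
    the Z axis.  points_rel: (N, 3) array of relative coordinates."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points_rel @ R.T + np.asarray(origin_abs, dtype=float)

def assign_attributes(points_abs, group_ids, palette):
    """Attach a monomer group id and a per-group RGB color to each point."""
    colors = np.array([palette[g % len(palette)] for g in group_ids])
    return {"xyz": points_abs, "group": np.asarray(group_ids), "rgb": colors}
```

A full implementation would use the scanner's complete pose (roll, pitch, yaw, and the geodetic datum of the absolute frame) rather than a single yaw angle.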
Step S106: construct a point cloud scene from the processed LiDAR point cloud.
Constructing a point cloud scene means modeling the processed LiDAR point cloud with suitable algorithms to form a point cloud scene.
Step S108: register the point cloud scene with the real scene of the region to be set out.
Specifically, this typically means loading the point cloud scene and placing it into the real scene, then fusing and matching the point cloud scene with the real scene using algorithms such as the least squares method or human-computer interaction, thereby completing the registration (as shown in Fig. 2).
Step S110: perform setting-out on the real scene with reference to the coordinates of the registered point cloud scene.
In general, setting-out (lofting) sweeps a two-dimensional cross-section along a path to form a complex three-dimensional object; different sections can be assigned along the same path, which allows many complex models to be constructed. In engineering, setting-out means transferring a design from the drawing to the actual site. In this embodiment, setting-out is performed in the real scene according to the coordinate information of the point cloud scene.
In the above method for implementing setting-out in a LiDAR point cloud mixed reality scene, LiDAR point cloud data is first obtained and processed, a point cloud scene is then constructed from the processed data, and the point cloud scene is registered and fused with the real scene. This establishes a direct data association between the point cloud scene and the real scene, directly integrates high-accuracy data from different data sources, and enables high-precision surveying or precise setting-out in a real, dynamic scene while reducing the workload and cost of repeated acquisition.
In one embodiment, performing setting-out on the real scene comprises:
performing relative setting-out on the real scene with reference to the relative coordinates of the point cloud scene and according to the coordinate correspondence between the point cloud scene and the real scene;
or
performing absolute-coordinate setting-out in the real scene with reference to the absolute coordinates of the point cloud scene.
Specifically, absolute and relative coordinates are both important means of fixing geographic position, and understanding their difference greatly helps map-reading and geographic analysis. An absolute coordinate, as the name suggests, is defined against a fixed datum: latitude and longitude on the Earth are absolute coordinates, and such coordinates do not change with the choice of reference object. A relative coordinate locates an object with respect to a reference object. For example, wherever you are, the X, Y, Z values from the coordinate origin to your position are your absolute coordinates, whereas a relative coordinate is taken with respect to a previous point and can be obtained by subtracting that point's absolute coordinates from your own.
In this embodiment, setting-out is divided into relative setting-out and absolute setting-out. Relative setting-out takes a certain point in the point cloud scene (a relative coordinate) as the datum, determines the setting-out position from the coordinate correspondence between the point cloud scene and the real scene (including distance, angle, and similar information), and completes the setting-out at that position. For absolute setting-out, since the point cloud scene carries complete absolute coordinates, the setting-out position can be determined in the point cloud scene from the coordinate information of the object to be set out, and setting-out is then performed at the corresponding position in the real scene.
This method enables precise setting-out in a real, dynamic scene while reducing the workload and cost of repeated acquisition: once the point cloud scene is mixed with the real scene, MR technology can be used to perform high-precision absolute-coordinate setting-out directly on the real scene, both efficiently and accurately.
In one embodiment, as shown in Fig. 3, the step of obtaining the LiDAR point cloud data for the region to be set out comprises:
Step S202: obtain three-dimensional LiDAR point cloud data for the region to be set out from each acquisition device;
Step S204: merge and stitch the three-dimensional LiDAR point cloud data from the devices to obtain the LiDAR point cloud data for the region to be set out.
Specifically, the acquisition devices include LiDAR equipment carried by manned aircraft, unmanned aerial vehicles, ground vehicles, backpacks, and hand-held units. Each device acquires one set of LiDAR data, and the sets are merged and stitched to obtain the LiDAR point cloud data for the region to be set out. Combining multiple LiDAR data sources in this way makes the resulting point cloud data for the region more complete and accurate.
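Under the assumption that each device's cloud has already been transformed into a common coordinate frame, the merge-and-stitch step might be sketched as concatenation followed by voxel de-duplication (the function name and voxel size are illustrative, not from the patent):

```python
import numpy as np

def merge_point_clouds(clouds, voxel=0.05):
    """Merge several LiDAR point clouds that share a coordinate frame and
    thin near-duplicates by keeping one point per voxel cell.
    clouds: iterable of (N_i, 3) arrays; voxel: cell size in scene units."""
    merged = np.vstack([np.asarray(c, dtype=float) for c in clouds])
    keys = np.floor(merged / voxel).astype(np.int64)
    # np.unique on the voxel keys keeps the first point seen in each cell
    _, idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(idx)]
```

In practice the clouds from different platforms would first be co-registered (e.g. via control points or ICP) before this simple concatenation is valid.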
In one embodiment, the step of constructing the point cloud scene from the processed LiDAR point cloud comprises:
thinning and tiling the processed LiDAR point cloud to build a point cloud scene model structured like an image pyramid;
grouping the point cloud scene model and loading it group by group according to the number of groups;
optimizing the loaded point cloud scene model with a quadtree spatial index to obtain the point cloud scene.
Specifically, the massive LiDAR point cloud data is first thinned and tiled to build a point cloud scene model whose point cloud structure resembles an image pyramid; the LiDAR point cloud in the scene model is then grouped and loaded group by group; finally, the loaded point cloud scene model is optimized with a quadtree spatial index to obtain the point cloud scene. Applying these methods together makes the resulting point cloud scene more accurate.
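The thinning/pyramid/quadtree pipeline described above could be sketched roughly as follows. This is a simplified stand-in (repeated voxel thinning for the pyramid, and a flat XY tile grid rather than a true recursive quadtree), with all names assumed:

```python
import numpy as np

def build_pyramid(points, levels=4, base_voxel=0.1):
    """Build a coarse-to-fine 'image pyramid' of a point cloud by voxel
    thinning: level 0 is the full cloud, and each higher level is thinned
    with a voxel twice as large as the previous level's."""
    pyramid = [points]
    for lvl in range(1, levels):
        voxel = base_voxel * (2 ** lvl)
        keys = np.floor(points / voxel).astype(np.int64)
        _, idx = np.unique(keys, axis=0, return_index=True)
        pyramid.append(points[np.sort(idx)])
    return pyramid

def quadtree_tiles(points, depth=2):
    """Index points into 4**depth square tiles in the XY plane -- a flat
    approximation of one level of a quadtree spatial index."""
    lo = points[:, :2].min(axis=0)
    hi = points[:, :2].max(axis=0)
    n = 2 ** depth
    cell = np.maximum((hi - lo) / n, 1e-12)
    ij = np.minimum(((points[:, :2] - lo) / cell).astype(int), n - 1)
    tiles = {}
    for k, (i, j) in enumerate(map(tuple, ij)):
        tiles.setdefault((i, j), []).append(k)  # tile -> point indices
    return tiles
```

A production viewer (e.g. a Potree-style renderer) would store the pyramid levels as separately loadable tiles and descend the quadtree only for tiles in view.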
In one embodiment, the step of registering the point cloud scene with the real scene of the region to be set out comprises: applying a matrix transformation to the point cloud scene and projecting the transformed point cloud scene into the real scene through an MR device; and registering the point cloud scene with the real scene using the least squares method.
Specifically, as shown in Fig. 4, the point cloud scene is first transformed by a matrix and then projected into the real scene through the MR device; after projection, the point cloud scene is registered with the real scene, typically by a stepwise (homotopy) mapping in which the transformation is computed by least squares combined with human-computer interaction.
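The least-squares part of this registration can be illustrated with the standard closed-form Kabsch/Umeyama solution on paired control points (picked, for example, interactively in the MR view). This is a generic sketch of the technique, not the patent's exact procedure:

```python
import numpy as np

def register_least_squares(src, dst):
    """Least-squares rigid registration (Kabsch/Umeyama, no scaling):
    find rotation R and translation t minimizing ||R @ s_i + t - d_i||^2
    over paired control points.  src, dst: (N, 3) corresponding points."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

With at least three non-collinear point pairs this recovers the rigid transform exactly in the noise-free case, and the least-squares optimum otherwise.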
In one embodiment, after the step of registering the point cloud scene with the real scene of the region to be set out, the method further comprises:
performing relative surveying on the real scene and outputting relative measurement coordinates.
Specifically, measuring the real scene usually means selecting feature points in it and measuring them to obtain mapping coordinates; multiple feature points can be chosen, and the more that are selected, the more accurate the mapping result.
The relative measurement coordinates are then converted through the absolute coordinates of the point cloud scene, and absolute measurement coordinates are output.
Specifically, the absolute mapping coordinates can be determined from the relationship between the real scene and the point cloud scene.
In one optional embodiment, the relative measurement coordinates can be converted through the absolute coordinates of the point cloud scene using the following formulas:
y1 = y + S·cos θ
x1 = x + S·sin θ
where (x, y) is the point cloud scene coordinate, (x1, y1) is the real scene coordinate, θ is the azimuth, and S is the distance (as shown in Fig. 5).
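These two formulas translate directly into code. The sketch below assumes θ is an azimuth in radians measured clockwise from the +y (north) axis, which matches the use of sin for x and cos for y:

```python
import math

def relative_to_scene(x, y, s, theta):
    """Convert a relative measurement (distance s at azimuth theta from the
    point-cloud-scene point (x, y)) into the corresponding scene coordinate
    (x1, y1), per x1 = x + s*sin(theta), y1 = y + s*cos(theta).
    theta: azimuth in radians, clockwise from north (+y)."""
    return x + s * math.sin(theta), y + s * math.cos(theta)
```

For example, a point 1 unit due north of the origin (θ = 0) maps to (0, 1), and a point due east (θ = π/2) maps to (1, 0).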
In one of the embodiments, as shown in FIG. 6, the step of registering the point cloud scene with the reality scene of the region to be lofted further includes:
Step 302: obtaining attribute information and coordinate information from the reality scene;
The attribute information includes laser radar point cloud grouping information, color information, and the like. Grouping refers to dividing the laser radar point cloud according to singulation, where each group of laser radar point cloud data represents one monomer; the grouping information includes the grouping mode, the number of groups, and so on. The color information refers to the color of the laser radar point cloud in the reality scene.
Step 304: modifying the coordinates of the laser radar point cloud according to the coordinate information, and modifying the attributes of the laser radar point cloud according to the attribute information, to obtain a modified point cloud scene;
Specifically, when the point cloud scene is registered and fused with the reality scene, there may be some differences between the two, for example in attributes or coordinates. Therefore, in this embodiment the point cloud scene can be modified according to the attribute information and coordinate information of the reality scene. This makes the resulting point cloud scene more accurate, thereby ensuring a higher registration accuracy between the point cloud scene and the reality scene.
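A minimal sketch of the correction step described above, under an assumed array layout (all names, the offset, and the per-group colour lookup are illustrative; the patent does not prescribe a data model): point coordinates are shifted by a measured offset and each singulated group's colour is overwritten with the colour observed in the reality scene.

```python
import numpy as np

def refine_point_cloud(points, colors, groups, coord_offset, group_colors):
    """points: (N, 3) coordinates; colors: (N, 3) RGB; groups: (N,) int
    group ids from singulation; coord_offset: (3,) correction measured
    against the reality scene; group_colors: {group_id: rgb} sampled
    from the reality scene. Returns corrected coordinates and colours."""
    refined = points + np.asarray(coord_offset, dtype=float)
    new_colors = np.array(colors, dtype=float)
    for gid, rgb in group_colors.items():
        new_colors[groups == gid] = rgb       # recolour the whole monomer
    return refined, new_colors

# Tiny example: two monomers, shift everything 1 m in x, recolour group 1.
pts = np.zeros((4, 3))
cols = np.ones((4, 3))
grp = np.array([0, 0, 1, 1])
new_pts, new_cols = refine_point_cloud(pts, cols, grp, (1.0, 0.0, 0.0),
                                       {1: (0.5, 0.5, 0.5)})
```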
It should be understood that although the steps in the flowcharts of FIGS. 1-6 are shown in the order indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, there is no strict ordering constraint on the execution of these steps, and they may be executed in other orders. Moreover, at least some of the steps in FIGS. 1-6 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different times; the execution order of these sub-steps or stages is likewise not necessarily sequential, and they may be executed in turn or alternately with other steps, or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 7, a system for implementing lofting in a laser radar point cloud mixed reality scene includes:
a data acquisition module 10, for obtaining laser radar point cloud data of the region to be lofted;
a point cloud obtaining module 20, for assigning coordinates and attribute processing to the laser radar point cloud data to obtain a processed laser radar point cloud;
a point cloud scene building module 30, for building a point cloud scene from the processed laser radar point cloud;
a registration module 40, for registering the point cloud scene with the reality scene of the region to be lofted;
a lofting module 50, for lofting on the reality scene based on the coordinates of the registered point cloud scene.
In one of the embodiments, the lofting module includes:
a relative lofting module, for performing relative lofting on the reality scene based on the relative coordinates of the point cloud scene and the coordinate correspondence between the point cloud scene and the reality scene;
or
an absolute lofting module, for performing absolute-coordinate lofting in the reality scene based on the absolute coordinates of the point cloud scene.
In one of the embodiments, the data acquisition module includes:
a laser radar data obtaining module, for obtaining three-dimensional laser radar point cloud data of the region to be lofted from each acquisition device;
a point cloud data obtaining module, for fusing and splicing the three-dimensional laser radar point cloud data to obtain the laser radar point cloud data of the region to be lofted.
In one of the embodiments, the point cloud obtaining module includes:
a coordinate conversion module, for converting the relative coordinates of the laser radar point cloud data into absolute coordinates;
a grouping module, for performing singulation grouping on the laser radar point cloud data, where each group of laser radar point cloud data represents one monomer;
an attribute assignment module, for assigning attributes to each singulated group of laser radar point cloud data.
In one of the embodiments, the point cloud scene building module includes:
a point cloud scene model building module, for processing the processed laser radar point cloud by thinning and tiling, and building a point cloud scene model in the form of an image pyramid;
a point cloud scene model loading module, for grouping the point cloud scene model and loading it successively by the number of groups after grouping;
a point cloud scene obtaining module, for optimizing the loaded point cloud scene model using a quadtree spatial index, to obtain the point cloud scene.
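The thinning, image-pyramid, and quadtree ideas used by the point cloud scene building module can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: simple decimation stands in for thinning, the pyramid is a list of progressively sparser clouds, and the quadtree partitions the XY extent into tiles that could be loaded on demand.

```python
import numpy as np

def thin(points, step):
    """Thin the cloud by keeping every step-th point (the simplest form
    of decimation; production pipelines often use voxel-grid thinning)."""
    return points[::step]

def build_pyramid(points, levels=3, step=4):
    """Image-pyramid-style levels of detail: each level keeps roughly
    1/step of the previous level's points."""
    pyramid = [points]
    for _ in range(levels - 1):
        pyramid.append(thin(pyramid[-1], step))
    return pyramid

class QuadtreeNode:
    """Minimal quadtree over the XY extent of the cloud, usable as a
    spatial index so only tiles near the MR viewer need to be loaded."""
    def __init__(self, points, bounds, capacity=64, depth=0, max_depth=8):
        self.bounds = bounds                     # (xmin, ymin, xmax, ymax)
        self.children = []
        if len(points) <= capacity or depth >= max_depth:
            self.points = points                 # leaf tile
            return
        self.points = points[:0]                 # internal node holds none
        xmin, ymin, xmax, ymax = bounds
        xm, ym = (xmin + xmax) / 2.0, (ymin + ymax) / 2.0
        left, bottom = points[:, 0] < xm, points[:, 1] < ym
        quads = [(left & bottom,   (xmin, ymin, xm, ym)),
                 (~left & bottom,  (xm, ymin, xmax, ym)),
                 (left & ~bottom,  (xmin, ym, xm, ymax)),
                 (~left & ~bottom, (xm, ym, xmax, ymax))]
        for mask, qb in quads:
            self.children.append(
                QuadtreeNode(points[mask], qb, capacity, depth + 1, max_depth))

    def count(self):
        """Total points in this subtree (the partition loses nothing)."""
        return len(self.points) + sum(c.count() for c in self.children)

rng = np.random.default_rng(1)
cloud = rng.random((500, 3)) * 100.0
tree = QuadtreeNode(cloud, (0.0, 0.0, 100.0, 100.0), capacity=32)
pyramid = build_pyramid(cloud, levels=3, step=4)
```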
In one of the embodiments, the system further includes:
a relative measurement coordinate output module, for performing relative surveying of the reality scene and outputting relative measurement coordinates;
an absolute measurement coordinate output module, for converting the relative measurement coordinates through the absolute coordinates of the point cloud scene and outputting absolute measurement coordinates.
In one of the embodiments, the registration module includes:
an information extraction module, for obtaining attribute information and coordinate information from the reality scene;
a point cloud scene modification module, for modifying the coordinates of the laser radar point cloud according to the coordinate information and modifying its attributes according to the attribute information, to obtain the modified point cloud scene.
In one embodiment, a computer device is provided. The computer device may be a server, and its internal structure may be as shown in FIG. 8. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device provides computing and control capability. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing fault case data. The network interface of the computer device is used for communicating with an external terminal through a network connection. When the computer program is executed by the processor, a method for implementing lofting in a laser radar point cloud mixed reality scene is realized.
Those skilled in the art will understand that the structure shown in FIG. 8 is merely a block diagram of part of the structure relevant to the solution of the present application and does not constitute a limitation on the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, including a memory, a processor, and a computer program stored on the memory and runnable on the processor; when executing the computer program, the processor performs the following steps:
obtaining laser radar point cloud data of the region to be lofted;
assigning coordinates and attribute processing to the laser radar point cloud data to obtain a processed laser radar point cloud;
building a point cloud scene from the processed laser radar point cloud;
registering the point cloud scene with the reality scene of the region to be lofted;
lofting on the reality scene based on the coordinates of the registered point cloud scene.
In one of the embodiments, when executing the computer program, the processor further performs the following steps:
lofting on the reality scene includes:
performing relative lofting on the reality scene based on the relative coordinates of the point cloud scene and the coordinate correspondence between the point cloud scene and the reality scene;
or
performing absolute-coordinate lofting in the reality scene based on the absolute coordinates of the point cloud scene.
In one of the embodiments, when executing the computer program, the processor further performs the following steps:
obtaining three-dimensional laser radar point cloud data of the region to be lofted from each acquisition device;
fusing and splicing the three-dimensional laser radar point cloud data to obtain the laser radar point cloud data of the region to be lofted.
In one of the embodiments, when executing the computer program, the processor further performs the following steps:
converting the relative coordinates of the laser radar point cloud data into absolute coordinates;
performing singulation grouping on the laser radar point cloud data, where each group of laser radar point cloud data represents one monomer;
assigning attributes to each singulated group of laser radar point cloud data.
In one of the embodiments, when executing the computer program, the processor further performs the following steps:
processing the processed laser radar point cloud by thinning and tiling, and building a point cloud scene model in the form of an image pyramid;
grouping the point cloud scene model, and loading it successively by the number of groups after grouping;
optimizing the loaded point cloud scene model using a quadtree spatial index to obtain the point cloud scene.
In one of the embodiments, when executing the computer program, the processor further performs the following steps after the step of registering the point cloud scene with the reality scene of the region to be lofted:
performing relative surveying of the reality scene, and outputting relative measurement coordinates;
converting the relative measurement coordinates through the absolute coordinates of the point cloud scene, and outputting absolute measurement coordinates.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program performs the following steps:
obtaining laser radar point cloud data of the region to be lofted;
assigning coordinates and attribute processing to the laser radar point cloud data to obtain a processed laser radar point cloud;
building a point cloud scene from the processed laser radar point cloud;
registering the point cloud scene with the reality scene of the region to be lofted;
lofting on the reality scene based on the coordinates of the registered point cloud scene.
In one of the embodiments, when executed by the processor, the computer program further performs the following steps:
lofting on the reality scene includes:
performing relative lofting on the reality scene based on the relative coordinates of the point cloud scene and the coordinate correspondence between the point cloud scene and the reality scene;
or
performing absolute-coordinate lofting in the reality scene based on the absolute coordinates of the point cloud scene.
In one of the embodiments, when executed by the processor, the computer program further performs the following steps in the step of obtaining the laser radar point cloud data of the region to be lofted:
obtaining three-dimensional laser radar point cloud data of the region to be lofted from each acquisition device;
fusing and splicing the three-dimensional laser radar point cloud data to obtain the laser radar point cloud data of the region to be lofted.
In one of the embodiments, when executed by the processor, the computer program further performs the following steps:
converting the relative coordinates of the laser radar point cloud data into absolute coordinates;
performing singulation grouping on the laser radar point cloud data, where each group of laser radar point cloud data represents one monomer;
assigning attributes to each singulated group of laser radar point cloud data.
In one of the embodiments, when executed by the processor, the computer program further performs the following steps:
processing the processed laser radar point cloud by thinning and tiling, and building a point cloud scene model in the form of an image pyramid;
grouping the point cloud scene model, and loading it successively by the number of groups after grouping;
optimizing the loaded point cloud scene model using a quadtree spatial index to obtain the point cloud scene.
In one of the embodiments, when executed by the processor, the computer program further performs the following steps:
performing matrix conversion on the point cloud scene, and projecting the matrix-converted point cloud scene into the reality scene through the MR device;
registering the point cloud scene with the reality scene using least squares.
In one of the embodiments, when executed by the processor, the computer program further performs the following steps after the step of registering the point cloud scene with the reality scene of the region to be lofted:
performing relative surveying of the reality scene, and outputting relative measurement coordinates;
converting the relative measurement coordinates through the absolute coordinates of the point cloud scene, and outputting absolute measurement coordinates.
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a non-volatile computer-readable storage medium, and when executed, may include the processes of the embodiments of the above methods. Any reference to memory, storage, a database, or other media used in the embodiments provided in the present application may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments have been described; however, as long as a combination of these technical features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present patent application shall be subject to the appended claims.
Claims (10)
1. A method for implementing lofting in a laser radar point cloud mixed reality scene, the method comprising:
obtaining laser radar point cloud data of a region to be lofted;
assigning coordinates and attribute processing to the laser radar point cloud data to obtain a processed laser radar point cloud;
building a point cloud scene from the processed laser radar point cloud;
registering the point cloud scene with a reality scene of the region to be lofted;
lofting on the reality scene based on the coordinates of the registered point cloud scene.
2. The method for implementing lofting in a laser radar point cloud mixed reality scene according to claim 1, wherein lofting on the reality scene comprises:
performing relative lofting on the reality scene based on the relative coordinates of the point cloud scene and the coordinate correspondence between the point cloud scene and the reality scene;
or
performing absolute-coordinate lofting in the reality scene based on the absolute coordinates of the point cloud scene.
3. The method for implementing lofting in a laser radar point cloud mixed reality scene according to claim 2, wherein the step of obtaining the laser radar point cloud data of the region to be lofted comprises:
obtaining three-dimensional laser radar point cloud data of the region to be lofted from each acquisition device;
fusing and splicing the three-dimensional laser radar point cloud data to obtain the laser radar point cloud data of the region to be lofted.
4. The method for implementing lofting in a laser radar point cloud mixed reality scene according to claim 1, wherein the step of assigning coordinates and attribute processing to the laser radar point cloud data comprises:
converting the relative coordinates of the laser radar point cloud data into absolute coordinates;
performing singulation grouping on the laser radar point cloud data, wherein each group of laser radar point cloud data represents one monomer;
assigning attributes to each singulated group of laser radar point cloud data.
5. The method for implementing lofting in a laser radar point cloud mixed reality scene according to claim 4, wherein the step of building the point cloud scene from the processed laser radar point cloud comprises:
processing the processed laser radar point cloud by thinning and tiling, and building a point cloud scene model in the form of an image pyramid;
grouping the point cloud scene model, and loading it successively by the number of groups after grouping;
optimizing the loaded point cloud scene model using a quadtree spatial index to obtain the point cloud scene.
6. The method for implementing lofting in a laser radar point cloud mixed reality scene according to claim 5, wherein the step of registering the point cloud scene with the reality scene of the region to be lofted comprises:
performing matrix conversion on the point cloud scene, and projecting the matrix-converted point cloud scene into the reality scene through an MR device;
registering the point cloud scene with the reality scene using least squares.
7. The method for implementing lofting in a laser radar point cloud mixed reality scene according to any one of claims 1 to 6, wherein after the step of registering the point cloud scene with the reality scene of the region to be lofted, the method further comprises:
performing relative surveying of the reality scene, and outputting relative measurement coordinates;
converting the relative measurement coordinates through the absolute coordinates of the point cloud scene, and outputting absolute measurement coordinates.
8. A system for implementing lofting in a laser radar point cloud mixed reality scene, the system comprising:
a data acquisition module, for obtaining laser radar point cloud data of a region to be lofted;
a point cloud obtaining module, for assigning coordinates and attribute processing to the laser radar point cloud data to obtain a processed laser radar point cloud;
a point cloud scene building module, for building a point cloud scene from the processed laser radar point cloud;
a registration module, for registering the point cloud scene with a reality scene of the region to be lofted;
a lofting module, for lofting on the reality scene based on the coordinates of the registered point cloud scene.
9. A computer device, comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910621191.6A CN110322553B (en) | 2019-07-10 | 2019-07-10 | Method and system for lofting implementation of laser radar point cloud mixed reality scene |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110322553A true CN110322553A (en) | 2019-10-11 |
CN110322553B CN110322553B (en) | 2024-04-02 |
Family
ID=68123178
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910621191.6A Active CN110322553B (en) | 2019-07-10 | 2019-07-10 | Method and system for lofting implementation of laser radar point cloud mixed reality scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110322553B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103366250A (en) * | 2013-07-12 | 2013-10-23 | 中国科学院深圳先进技术研究院 | City appearance environment detection method and system based on three-dimensional live-action data |
CN107392944A (en) * | 2017-08-07 | 2017-11-24 | 广东电网有限责任公司机巡作业中心 | Full-view image and the method for registering and device for putting cloud |
US20180108146A1 (en) * | 2016-10-13 | 2018-04-19 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method and apparatus for annotating point cloud data |
US20180232954A1 (en) * | 2017-02-15 | 2018-08-16 | Faro Technologies, Inc. | System and method of generating virtual reality data from a three-dimensional point cloud |
CN108648272A (en) * | 2018-04-28 | 2018-10-12 | 上海激点信息科技有限公司 | Three-dimensional live acquires modeling method, readable storage medium storing program for executing and device |
CN108665536A (en) * | 2018-05-14 | 2018-10-16 | 广州市城市规划勘测设计研究院 | Three-dimensional and live-action data method for visualizing, device and computer readable storage medium |
US20180308283A1 (en) * | 2017-04-20 | 2018-10-25 | TuSimple | Method and device of labeling laser point cloud |
CN109003326A (en) * | 2018-06-05 | 2018-12-14 | 湖北亿咖通科技有限公司 | A kind of virtual laser radar data generation method based on virtual world |
CN109523578A (en) * | 2018-08-27 | 2019-03-26 | 中铁上海工程局集团有限公司 | A kind of matching process of bim model and point cloud data |
CN109633665A (en) * | 2018-12-17 | 2019-04-16 | 北京主线科技有限公司 | The sparse laser point cloud joining method of traffic scene |
CN109945845A (en) * | 2019-02-02 | 2019-06-28 | 南京林业大学 | A kind of mapping of private garden spatial digitalized and three-dimensional visualization method |
US20190206123A1 (en) * | 2017-12-29 | 2019-07-04 | Baidu Online Network Technology (Beijing) Co., Ltd. | Method and apparatus for fusing point cloud data |
Non-Patent Citations (3)
Title |
---|
余飞, "海量激光雷达点云数据的多尺度可视化高效管理" (Multi-scale efficient visual management of massive lidar point cloud data), 《工程勘察》 (Geotechnical Investigation & Surveying), 30 September 2016, pages 69-73 *
张军 et al., "基于粒子群优化的点云场景拼接算法" (Point cloud scene stitching algorithm based on particle swarm optimization), 《国防科技大学学报》 (Journal of National University of Defense Technology), vol. 35, no. 5, pages 174-179 *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112862882A (en) * | 2021-01-28 | 2021-05-28 | 北京格灵深瞳信息技术股份有限公司 | Target distance measuring method, device, electronic apparatus and storage medium |
CN117368869A (en) * | 2023-12-06 | 2024-01-09 | 航天宏图信息技术股份有限公司 | Visualization method, device, equipment and medium for radar three-dimensional power range |
CN117368869B (en) * | 2023-12-06 | 2024-03-19 | 航天宏图信息技术股份有限公司 | Visualization method, device, equipment and medium for radar three-dimensional power range |
Also Published As
Publication number | Publication date |
---|---|
CN110322553B (en) | 2024-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kerkweg et al. | The 1-way on-line coupled atmospheric chemistry model system MECO (n)–Part 1: Description of the limited-area atmospheric chemistry model COSMO/MESSy | |
EP3505868A1 (en) | Method and apparatus for adjusting point cloud data acquisition trajectory, and computer readable medium | |
CN103258345A (en) | Method for extracting parameters of tree branches based on ground laser radar three-dimensional scanning | |
CN111080682B (en) | Registration method and device for point cloud data | |
CN103884321A (en) | Remote-sensing image mapping process | |
CN112884902B (en) | Point cloud registration-oriented target ball position optimization method | |
CN110322553A (en) | The method and system of laser radar point cloud mixed reality scene implementation setting-out | |
Zhang et al. | Three-dimensional cooperative mapping for connected and automated vehicles | |
CN112905831A (en) | Method and system for acquiring coordinates of object in virtual scene and electronic equipment | |
CN106323286A (en) | Transforming method of robot coordinate system and three-dimensional measurement coordinate system | |
CN113870425A (en) | Forest accumulation space mapping method based on random forest and multi-source remote sensing technology | |
Karel et al. | Oriental: Automatic geo-referencing and ortho-rectification of archaeological aerial photographs | |
Howland | 3D Recording in the Field: Style Without Substance? | |
CN111402332B (en) | AGV composite map building and navigation positioning method and system based on SLAM | |
CN110162812B (en) | Target sample generation method based on infrared simulation | |
CN108875184B (en) | Shale organic carbon content prediction method and device based on digital outcrop model | |
CN116976115A (en) | Remote sensing satellite application demand simulation method and device oriented to quantitative analysis and judgment | |
CN106650595A (en) | Land block boundary identification method and boundary identification device | |
JP6761388B2 (en) | Estimator and program | |
CN113822892B (en) | Evaluation method, device and equipment of simulated radar and computer storage medium | |
Olague et al. | Development of a practical photogrammetric network design using evolutionary computing | |
CN116310194A (en) | Three-dimensional model reconstruction method, system, equipment and storage medium for power distribution station room | |
El-Ashmawy | Photogrammetric block adjustment without control points | |
Qiu et al. | DSMNet: Deep high-precision 3D surface modeling from sparse point cloud frames | |
CN114187399A (en) | Method and system for generating illumination simulation image on surface of extraterrestrial star |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |