CN109919828A - A method for judging differences between 3D models - Google Patents

A method for judging differences between 3D models

Info

Publication number
CN109919828A
CN109919828A
Authority
CN
China
Prior art keywords
endpoint
reference coordinates
coordinates point
model
plane picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910038874.9A
Other languages
Chinese (zh)
Other versions
CN109919828B (en)
Inventor
崔岩
刘强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Siwei Times Network Technology Co Ltd
Sino German (zhuhai) Artificial Intelligence Research Institute Co Ltd
Original Assignee
Zhuhai Siwei Times Network Technology Co Ltd
Sino German (zhuhai) Artificial Intelligence Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Siwei Times Network Technology Co Ltd, Sino German (zhuhai) Artificial Intelligence Research Institute Co Ltd filed Critical Zhuhai Siwei Times Network Technology Co Ltd
Priority to CN201910038874.9A priority Critical patent/CN109919828B/en
Publication of CN109919828A publication Critical patent/CN109919828A/en
Application granted granted Critical
Publication of CN109919828B publication Critical patent/CN109919828B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The present invention discloses a method for judging differences between 3D models, comprising the following steps: 1) converting 3D model I and 3D model II into CAD-format models to obtain 3D CAD model I and 3D CAD model II; 2) converting 3D CAD model I and 3D CAD model II into two-dimensional plane figures; 3) randomly determining a reference coordinate point; 4) marking the position of the reference coordinate point in two-dimensional plane figure I, connecting the reference coordinate point to the endpoints of two-dimensional plane figure I, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a first group of comparison values; 5) marking the position of the reference coordinate point in two-dimensional plane figure II, connecting the reference coordinate point to the endpoints of two-dimensional plane figure II, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a second group of comparison values. The method judges differences between 3D models in a simple way and with a fast processing speed.

Description

A method for judging differences between 3D models
Technical field
The present invention relates to a method for judging differences between 3D models.
Background art
With the development of computer technology and improvements in computer hardware, three-dimensional model acquisition technology has advanced rapidly. Three-dimensional models have not only grown enormously in number, but are also used more and more widely. The main application fields include industrial product design, virtual reality, 3D games, architectural design, film and animation, medical diagnosis, molecular biology research, and many other areas. Effective management of this massive amount of model information is particularly important for convenient searching, querying, and reuse. Rapid identification of three-dimensional models has therefore become an urgent problem to be solved.
Summary of the invention
The technical problem to be solved by the present invention is to provide a method for judging differences between 3D models whose judgment mode is simple and whose processing speed is fast.
To solve the above problems, the present invention adopts the following technical scheme:
A method for judging differences between 3D models comprises the following steps:
1) converting 3D model I and 3D model II into CAD-format models to obtain 3D CAD model I and 3D CAD model II;
2) converting the front view of 3D CAD model I and the front view of 3D CAD model II into two-dimensional plane figures;
3) randomly determining a reference coordinate point;
4) marking the position of the reference coordinate point in two-dimensional plane figure I, connecting the reference coordinate point to the endpoints of two-dimensional plane figure I, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a first group of comparison values;
5) marking the position of the reference coordinate point in two-dimensional plane figure II, connecting the reference coordinate point to the endpoints of two-dimensional plane figure II, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a second group of comparison values;
6) summing the first group of comparison values and the second group of comparison values respectively and comparing the two sums; if the two sums are not identical, the 3D models are proven to differ (see the sketch after this list).
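The following Python sketch illustrates the distance-sum comparison of steps 3)-6) under simplifying assumptions; the endpoint lists, the fixed reference point, and the numerical tolerance used here are hypothetical and are not taken from the patent disclosure.

    import math
    import random

    def distance_sums(endpoints_i, endpoints_ii, ref=None):
        """Sum the straight-line distances from one shared reference coordinate
        point to every endpoint of each two-dimensional plane figure."""
        if ref is None:
            # Step 3): randomly determine a reference coordinate point.
            ref = (random.uniform(-100, 100), random.uniform(-100, 100))

        def dist(p):
            return math.hypot(p[0] - ref[0], p[1] - ref[1])

        group_i = [dist(p) for p in endpoints_i]    # first group of comparison values
        group_ii = [dist(p) for p in endpoints_ii]  # second group of comparison values
        return sum(group_i), sum(group_ii)

    # Hypothetical endpoint coordinates of the two front-view plane figures.
    figure_i = [(0, 0), (4, 0), (4, 3), (0, 3)]
    figure_ii = [(0, 0), (4, 0), (4, 2), (0, 2)]
    s1, s2 = distance_sums(figure_i, figure_ii, ref=(10.0, 10.0))
    # Step 6): if the two sums are not identical (within a tolerance), the models differ.
    print("models differ" if not math.isclose(s1, s2, rel_tol=1e-9) else "verification needed")

An exact equality test is fragile for floating-point distances, so the sketch uses a small relative tolerance; this tolerance is an implementation choice, not part of the claimed method.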
Preferably, the method further comprises a step 7): performing verification processing if the two sums obtained after the summation are identical.
Preferably, the specific steps of the verification processing are as follows:
① converting the orthographic axonometric projection of 3D CAD model I and the orthographic axonometric projection of 3D CAD model II into two-dimensional plane figures to obtain two-dimensional plane figure III and two-dimensional plane figure IV;
② marking the position of the reference coordinate point in two-dimensional plane figure III, connecting the reference coordinate point to the endpoints of two-dimensional plane figure III, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a third group of comparison values; marking the position of the reference coordinate point in two-dimensional plane figure IV, connecting the reference coordinate point to the endpoints of two-dimensional plane figure IV, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a fourth group of comparison values;
③ computing the averages of the third group of comparison values and the fourth group of comparison values respectively; if the two averages are inconsistent, the models are judged to differ (see the sketch after this list).
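A minimal sketch of verification step ③, assuming the third and fourth groups of comparison values have already been measured on the axonometric-projection figures III and IV; the sample values below are hypothetical.

    import math

    def averages_match(group_iii, group_iv, rel_tol=1e-9):
        """Compare the averages of the third and fourth groups of comparison values;
        a mismatch indicates that the two 3D models differ."""
        avg_iii = sum(group_iii) / len(group_iii)
        avg_iv = sum(group_iv) / len(group_iv)
        return math.isclose(avg_iii, avg_iv, rel_tol=rel_tol)

    # Hypothetical distance groups taken from figures III and IV.
    group_iii = [3.2, 4.5, 5.1, 6.0]
    group_iv = [3.2, 4.5, 5.1, 6.4]
    print("no difference detected" if averages_match(group_iii, group_iv) else "models differ")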
Preferably, the reference coordinate point is a two-dimensional reference coordinate point.
Preferably, step 3) also includes two-dimensional plane figure localization processing.
Preferably, the specific steps of the two-dimensional plane figure localization processing are as follows:
a) selecting a positioning coordinate point;
b) selecting a coincidence endpoint according to the coordinates of each endpoint of the two-dimensional plane figure, wherein the coincidence endpoint is selected by using the value of x minus y of each endpoint as the comparison basis, and the endpoint with the largest value of x minus y is taken as the coincidence endpoint;
c) overlapping the coincidence endpoint with the positioning coordinate point (see the sketch after this list).
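A minimal sketch of the localization processing in steps a)-c), assuming each plane figure is represented as a list of (x, y) endpoint coordinates; the example figure and positioning coordinate point are hypothetical.

    def localize(endpoints, positioning_point):
        """Pick the coincidence endpoint (largest x - y) and translate the whole
        figure so that this endpoint overlaps the positioning coordinate point."""
        # Step b): the endpoint with the maximum x - y value is the coincidence endpoint.
        coincidence = max(endpoints, key=lambda p: p[0] - p[1])
        # Step c): shift every endpoint by the same offset.
        dx = positioning_point[0] - coincidence[0]
        dy = positioning_point[1] - coincidence[1]
        return [(x + dx, y + dy) for (x, y) in endpoints]

    # Hypothetical figure: (6, 2) has the largest x - y and is moved onto the positioning point.
    figure = [(1, 5), (6, 2), (3, 3)]
    print(localize(figure, (0, 0)))  # [(-5, 3), (0, 0), (-3, 1)]

Aligning both figures to the same positioning coordinate point before measuring distances keeps the comparison independent of where each figure happens to sit in its own coordinate system.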
The beneficial effects of the invention are as follows: the 3D models are converted into two-dimensional models for comparison, and whether the two models differ is judged from the straight-line distances between the reference coordinate point and the two-dimensional models; the judgment mode is simple and the judgment speed is fast.
Brief description of the drawings
In order to describe the technical solutions in the embodiments of the present invention more clearly, the drawings required for the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of the method for judging differences between 3D models according to the present invention.
Fig. 2 is a schematic diagram of the connections between the reference coordinate point and the endpoints of a two-dimensional plane figure in the method for judging differences between 3D models according to the present invention.
Specific embodiment
The technical solution of the present invention is described below. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them; based on these embodiments, those skilled in the art can obtain other embodiments without creative effort.
Embodiment 1
As shown in Figs. 1-2, a method for judging differences between 3D models comprises the following steps:
1) converting 3D model I and 3D model II into CAD-format models to obtain 3D CAD model I and 3D CAD model II;
2) converting the front view of 3D CAD model I and the front view of 3D CAD model II into two-dimensional plane figures;
3) randomly determining a reference coordinate point;
4) marking the position of the reference coordinate point in two-dimensional plane figure I, connecting the reference coordinate point to the endpoints of two-dimensional plane figure I, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a first group of comparison values;
5) marking the position of the reference coordinate point in two-dimensional plane figure II, connecting the reference coordinate point to the endpoints of two-dimensional plane figure II, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a second group of comparison values;
6) summing the first group of comparison values and the second group of comparison values respectively and comparing the two sums; if the two sums are not identical, the 3D models are proven to differ.
In this embodiment, the method further comprises a step 7): performing verification processing if the two sums obtained after the summation are identical.
In this embodiment, the specific steps of the verification processing are as follows:
① converting the orthographic axonometric projection of 3D CAD model I and the orthographic axonometric projection of 3D CAD model II into two-dimensional plane figures to obtain two-dimensional plane figure III and two-dimensional plane figure IV;
② marking the position of the reference coordinate point in two-dimensional plane figure III, connecting the reference coordinate point to the endpoints of two-dimensional plane figure III, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a third group of comparison values; marking the position of the reference coordinate point in two-dimensional plane figure IV, connecting the reference coordinate point to the endpoints of two-dimensional plane figure IV, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a fourth group of comparison values;
③ computing the averages of the third group of comparison values and the fourth group of comparison values respectively; if the two averages are inconsistent, the models are judged to differ.
In this embodiment, step 3) also includes two-dimensional plane figure localization processing.
In this embodiment, the specific steps of the two-dimensional plane figure localization processing are as follows:
a) selecting a positioning coordinate point;
b) selecting a coincidence endpoint according to the coordinates of each endpoint of the two-dimensional plane figure, wherein the coincidence endpoint is selected by using the value of x minus y of each endpoint as the comparison basis, and the endpoint with the largest value of x minus y is taken as the coincidence endpoint;
c) overlapping the coincidence endpoint with the positioning coordinate point.
Embodiment 2
As shown in Figs. 1-2, a method for judging differences between 3D models comprises the following steps:
1) converting 3D model I and 3D model II into CAD-format models to obtain 3D CAD model I and 3D CAD model II;
2) converting the front view of 3D CAD model I and the front view of 3D CAD model II into two-dimensional plane figures;
3) randomly determining a reference coordinate point;
4) marking the position of the reference coordinate point in two-dimensional plane figure I, connecting the reference coordinate point to the endpoints of two-dimensional plane figure I, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a first group of comparison values;
5) marking the position of the reference coordinate point in two-dimensional plane figure II, connecting the reference coordinate point to the endpoints of two-dimensional plane figure II, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a second group of comparison values;
6) summing the first group of comparison values and the second group of comparison values respectively and comparing the two sums; if the two sums are not identical, the 3D models are proven to differ.
In this embodiment, the method further comprises a step 7): performing verification processing if the two sums obtained after the summation are identical.
In this embodiment, the specific steps of the verification processing are as follows:
① converting the orthographic axonometric projection of 3D CAD model I and the orthographic axonometric projection of 3D CAD model II into two-dimensional plane figures to obtain two-dimensional plane figure III and two-dimensional plane figure IV;
② marking the position of the reference coordinate point in two-dimensional plane figure III, connecting the reference coordinate point to the endpoints of two-dimensional plane figure III, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a third group of comparison values; marking the position of the reference coordinate point in two-dimensional plane figure IV, connecting the reference coordinate point to the endpoints of two-dimensional plane figure IV, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a fourth group of comparison values;
③ computing the averages of the third group of comparison values and the fourth group of comparison values respectively; if the two averages are inconsistent, the models are judged to differ.
In this embodiment, the reference coordinate point is a two-dimensional reference coordinate point.
In this embodiment, step 3) also includes two-dimensional plane figure localization processing.
In this embodiment, the specific steps of the two-dimensional plane figure localization processing are as follows:
a) selecting a positioning coordinate point;
b) selecting a coincidence endpoint according to the coordinates of each endpoint of the two-dimensional plane figure, wherein the coincidence endpoint is selected by using the value of x minus y of each endpoint as the comparison basis, and the endpoint with the largest value of x minus y is taken as the coincidence endpoint;
c) overlapping the coincidence endpoint with the positioning coordinate point.
Embodiment 3
As shown in Figs. 1-2, a method for judging differences between 3D models comprises the following steps:
1) converting 3D model I and 3D model II into CAD-format models to obtain 3D CAD model I and 3D CAD model II;
2) converting the front view of 3D CAD model I and the front view of 3D CAD model II into two-dimensional plane figures;
3) randomly determining a reference coordinate point;
4) marking the position of the reference coordinate point in two-dimensional plane figure I, connecting the reference coordinate point to the endpoints of two-dimensional plane figure I, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a first group of comparison values;
5) marking the position of the reference coordinate point in two-dimensional plane figure II, connecting the reference coordinate point to the endpoints of two-dimensional plane figure II, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a second group of comparison values;
6) summing the first group of comparison values and the second group of comparison values respectively and comparing the two sums; if the two sums are not identical, the 3D models are proven to differ.
In this embodiment, the method further comprises a step 7): performing verification processing if the two sums obtained after the summation are identical.
In this embodiment, the specific steps of the verification processing are as follows:
① converting the orthographic axonometric projection of 3D CAD model I and the orthographic axonometric projection of 3D CAD model II into two-dimensional plane figures to obtain two-dimensional plane figure III and two-dimensional plane figure IV;
② marking the position of the reference coordinate point in two-dimensional plane figure III, connecting the reference coordinate point to the endpoints of two-dimensional plane figure III, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a third group of comparison values; marking the position of the reference coordinate point in two-dimensional plane figure IV, connecting the reference coordinate point to the endpoints of two-dimensional plane figure IV, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a fourth group of comparison values;
③ computing the averages of the third group of comparison values and the fourth group of comparison values respectively; if the two averages are inconsistent, the models are judged to differ.
In this embodiment, the reference coordinate point is a two-dimensional reference coordinate point.
In this embodiment, step 3) also includes two-dimensional plane figure localization processing.
In this embodiment, the specific steps of the two-dimensional plane figure localization processing are as follows:
a) selecting a positioning coordinate point;
b) selecting a coincidence endpoint according to the coordinates of each endpoint of the two-dimensional plane figure, wherein the coincidence endpoint is selected by using the value of x minus y of each endpoint as the comparison basis, and the endpoint with the largest value of x minus y is taken as the coincidence endpoint;
c) overlapping the coincidence endpoint with the positioning coordinate point.
In this embodiment, the positioning coordinate point and the reference coordinate point do not coincide.
In this embodiment, the reference coordinate point is not located in the two-dimensional plane figure.
The beneficial effects of the invention are as follows: the 3D models are converted into two-dimensional models for comparison, and whether the two models differ is judged from the straight-line distances between the reference coordinate point and the two-dimensional models; the judgment mode is simple and the judgment speed is fast.
The above description is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any change or replacement that can be conceived without creative effort shall fall within the protection scope of the present invention.

Claims (6)

1. A method for judging differences between 3D models, characterized in that it comprises the following steps:
1) converting 3D model I and 3D model II into CAD-format models to obtain 3D CAD model I and 3D CAD model II;
2) converting the front view of 3D CAD model I and the front view of 3D CAD model II into two-dimensional plane figures;
3) randomly determining a reference coordinate point;
4) marking the position of the reference coordinate point in two-dimensional plane figure I, connecting the reference coordinate point to the endpoints of two-dimensional plane figure I, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a first group of comparison values;
5) marking the position of the reference coordinate point in two-dimensional plane figure II, connecting the reference coordinate point to the endpoints of two-dimensional plane figure II, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a second group of comparison values;
6) summing the first group of comparison values and the second group of comparison values respectively and comparing the two sums; if the two sums are not identical, the 3D models are proven to differ.
2. The method for judging differences between 3D models according to claim 1, characterized in that it further comprises a step 7): performing verification processing if the two sums obtained after the summation are identical.
3. The method for judging differences between 3D models according to claim 2, characterized in that the specific steps of the verification processing are as follows:
① converting the orthographic axonometric projection of 3D CAD model I and the orthographic axonometric projection of 3D CAD model II into two-dimensional plane figures to obtain two-dimensional plane figure III and two-dimensional plane figure IV;
② marking the position of the reference coordinate point in two-dimensional plane figure III, connecting the reference coordinate point to the endpoints of two-dimensional plane figure III, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a third group of comparison values; marking the position of the reference coordinate point in two-dimensional plane figure IV, connecting the reference coordinate point to the endpoints of two-dimensional plane figure IV, and recording the straight-line distance between the reference coordinate point and each endpoint to obtain a fourth group of comparison values;
③ computing the averages of the third group of comparison values and the fourth group of comparison values respectively; if the two averages are inconsistent, the models are judged to differ.
4. The method for judging differences between 3D models according to claim 3, characterized in that the reference coordinate point is a two-dimensional reference coordinate point.
5. The method for judging differences between 3D models according to claim 4, characterized in that step 3) also includes two-dimensional plane figure localization processing.
6. The method for judging differences between 3D models according to claim 5, characterized in that the specific steps of the two-dimensional plane figure localization processing are as follows:
a) selecting a positioning coordinate point;
b) selecting a coincidence endpoint according to the coordinates of each endpoint of the two-dimensional plane figure, wherein the coincidence endpoint is selected by using the value of x minus y of each endpoint as the comparison basis, and the endpoint with the largest value of x minus y is taken as the coincidence endpoint;
c) overlapping the coincidence endpoint with the positioning coordinate point.
CN201910038874.9A 2019-01-16 2019-01-16 Method for judging difference between 3D models Active CN109919828B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910038874.9A CN109919828B (en) 2019-01-16 2019-01-16 Method for judging difference between 3D models


Publications (2)

Publication Number Publication Date
CN109919828A (en) 2019-06-21
CN109919828B CN109919828B (en) 2023-01-06

Family

ID=66960337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910038874.9A Active CN109919828B (en) 2019-01-16 2019-01-16 Method for judging difference between 3D models

Country Status (1)

Country Link
CN (1) CN109919828B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132167A (en) * 2019-06-24 2020-12-25 商汤集团有限公司 Image generation and neural network training method, apparatus, device, and medium


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07105410A (en) * 1993-10-06 1995-04-21 Omron Corp Device and method for finding vertex candidate for three-dimensional model from three-dimensional coordinate data
US6330356B1 (en) * 1999-09-29 2001-12-11 Rockwell Science Center Llc Dynamic visual registration of a 3-D object with a graphical model
US20080100616A1 (en) * 2006-10-30 2008-05-01 Tsunehiko Yamazaki Method for converting two-dimensional drawing into three-dimensional solid model and method for converting attribute
KR101655783B1 (en) * 2015-05-22 2016-09-08 대우조선해양 주식회사 Apparatus and method for comparing tree-demensional model of heterogeneous cad system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Zhiyong, "Research on Geometric Similarity Comparison of Three-Dimensional Models", China Doctoral Dissertations Full-Text Database (Information Science and Technology) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132167A (en) * 2019-06-24 2020-12-25 商汤集团有限公司 Image generation and neural network training method, apparatus, device, and medium
CN112132167B (en) * 2019-06-24 2024-04-16 商汤集团有限公司 Image generation and neural network training method, device, equipment and medium

Also Published As

Publication number Publication date
CN109919828B (en) 2023-01-06

Similar Documents

Publication Publication Date Title
CN111932688B (en) Indoor plane element extraction method, system and equipment based on three-dimensional point cloud
CN107045526B (en) Pattern recognition method for electronic building construction drawing
KR101955035B1 (en) Method for designing a geometrical three-dimensional modeled object
CN109685914A (en) Cutting profile based on triangle grid model mends face algorithm automatically
CN108182318B (en) A method of the plastic geometry mouldability analysis based on UG NX system
CN100353384C (en) Fast method for posting players to electronic game
CN109636919A (en) A kind of virtual museum's construction method, system and storage medium based on holographic technique
CN111898990A (en) Building construction progress management method
CN104199659A (en) Method and device for exporting model information capable of being identified by 3DMAX
CN104143209B (en) Method for engraving three-dimensional model based on line pattern
CN108733911A (en) Building aluminum alloy pattern plate construction code Design method based on three-dimensional digital model
CN111612911A (en) Dynamo-based point cloud BIM automatic modeling method
CN116152444B (en) Automatic adsorption method, device and medium for three-dimensional scene model based on digital twin
CN111340834B (en) Lining plate assembly system and method based on laser radar and binocular camera data fusion
CN109919828A (en) A method of judging difference between 3D model
CN116126809A (en) Building information model data storage conversion method based on national standard
CN111369670A (en) Method for real-time construction of practical training digital twin model
CN108898679A (en) A kind of method of component serial number automatic marking
CN108563915A (en) Vehicle digitizes emulation testing model construction system and method, computer program
CN112183264A (en) Method for judging people lingering under crane boom based on spatial relationship learning
CN108763767A (en) Big data quantity IGS industry pattern POLYGON conversion methods towards VR engines
CN114791800A (en) White-model building edge tracing method and device, computer equipment and storage medium
CN112687004A (en) Indoor ceiling method and system based on AR and artificial intelligence
CN115213038B (en) Polygonal frame selection method for point cloud of automobile sheet metal
CN111985011A (en) Customized combined cabinet grouping design method based on inter-board model information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant