CN113239445A - AR-based indoor pipeline information display method and system - Google Patents

AR-based indoor pipeline information display method and system

Info

Publication number
CN113239445A
Authority
CN
China
Prior art keywords
pipeline
module
marker
information
stereogram
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110653302.9A
Other languages
Chinese (zh)
Other versions
CN113239445B (en)
Inventor
路亚
李腾
杨睿
张科伦
李建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing College of Electronic Engineering
Original Assignee
Chongqing College of Electronic Engineering
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing College of Electronic Engineering filed Critical Chongqing College of Electronic Engineering
Priority to CN202110653302.9A priority Critical patent/CN113239445B/en
Publication of CN113239445A publication Critical patent/CN113239445A/en
Application granted granted Critical
Publication of CN113239445B publication Critical patent/CN113239445B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G06F30/18 Network design, e.g. design based on topological or interconnect aspects of utility systems, piping, heating ventilation air conditioning [HVAC] or cabling
    • G06F30/20 Design optimisation, verification or simulation
    • G06F30/27 Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G06F2113/00 Details relating to the application field
    • G06F2113/14 Pipes
    • G06F2113/16 Cables, cable trees or wire harnesses

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Evolutionary Computation (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Pure & Applied Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Optimization (AREA)
  • Molecular Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Graphics (AREA)
  • Architecture (AREA)
  • Civil Engineering (AREA)
  • Structural Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The scheme belongs to the technical field of augmented reality, and in particular relates to an AR-based indoor pipeline information display method and system. The system comprises: a detection module, which comprehensively detects the embedded pipelines using infrared, ultrasonic and millimeter-wave sensing, the detection information including the type, position and embedding depth of the embedded pipelines as well as the positional relationship between the marker and the wall; a processing module, which processes the detection information and generates a 3D stereogram matching reality; a storage module, which stores the pipeline 3D stereogram and the positional relationship between the marker and the wall; an AR module, which acquires the scene image and, after recognizing the marker, displays the superimposed pattern of the scene image and the 3D stereogram with the marker as the origin; and a marker, the image that triggers the AR module to display the 3D stereogram. Because the superimposed pattern of the on-site picture and the 3D stereogram is anchored at the marker as its origin, positioning accuracy is improved.

Description

AR-based indoor pipeline information display method and system
Technical Field
The scheme belongs to the technical field of augmented reality, and particularly relates to an AR-based indoor pipeline information display method and system.
Background
Whether it is a newly purchased second-hand house or an old house that has been lived in for many years, owners often choose to renovate as the period of occupancy grows in order to enjoy a more comfortable home. Compared with a new house, however, the construction requirements for renovating an old house are quite different and many issues need attention, for example the wire conduits, water pipes and the like buried in the walls or floors. Because the original construction drawings often can no longer be obtained after so long a time (as with old residential communities and second-hand houses), renovation construction is inconvenient for users.
A conventional mobile terminal screen only displays the objects that already exist in the current real scene. For displaying the pipe network of an old house during construction, however, explanatory and descriptive parameter information about pipe network nodes is often needed, so that an operator can quickly obtain richer information about the pipe network scene in the old house and gain a new augmented reality experience. There is therefore an urgent need for a system capable of 3D display of pipelines buried in walls or under floors.
The invention patent with application number CN202010466577.7 discloses an AR-based 3D model display method and system for urban pipe network pipelines, comprising a pipe network node model construction module, a pipeline coordinate mapping module, a three-dimensional pipe network model construction module and an AR display module. The pipe network node model construction module reads a pipe network node index information file, obtains a set of pipe network node index information, and constructs a pipe network node model in combination with an urban three-dimensional map; the pipeline coordinate mapping module reads a two-dimensional pipeline path file, obtains the two-dimensional pipeline path in a relative coordinate system, and maps that relative coordinate system onto the absolute coordinate system of the pipe network node model; the three-dimensional pipe network model construction module builds a three-dimensional channel scene on the pipe network node model processed by the pipeline coordinate mapping module, using a pipe network channel model library, to obtain a three-dimensional pipe network model; and the AR display module determines the display pose of the three-dimensional pipe network model with the pipe network nodes as reference and performs AR display.
Although that scheme can construct a 3D model of urban pipe network pipelines, every time the 3D model is to be displayed the corresponding pipe network node index information file must be entered in order to obtain the geographical position of each pipe network node on the different pipelines at different sites, together with the matching node attribute information. For outdoor pipelines, landmark positioning can reach high accuracy by combining several positioning systems; indoors, however, accurate positioning with such systems is much more difficult, and centimetre-level accuracy is hard to achieve.
Disclosure of Invention
The scheme provides an AR-based indoor pipeline information display method and system, aiming to solve the problem that an indoor 3D model does not match the actual pipeline positions.
In order to achieve the above object, the present solution provides an AR-based indoor pipeline information display system, comprising:
a detection module: comprehensively detects the embedded pipelines using infrared, ultrasonic and millimeter-wave sensing; the detection information includes the type, position and embedding depth of the embedded pipelines, and also the positional relationship between the marker and the wall;
a processing module: processes the detection information and generates a 3D stereogram matching reality;
a storage module: stores the pipeline 3D stereogram and the positional relationship between the marker and the wall;
an AR module: acquires the scene image and displays the 3D stereogram; after recognizing the marker, it displays the superimposed pattern of the scene image and the 3D stereogram with the marker as the origin;
a marker: the image used to trigger the AR module to display the 3D stereogram.
The principle of the scheme is as follows. First, the detection module detects and identifies the pipelines embedded in the indoor walls or floor, obtaining the type, position and embedding depth of the embedded pipelines as well as the positional relationship between the marker and the wall. The AR module then acquires the on-site picture. The detection module transmits the detection information and the on-site picture to the processing module, which generates a 3D stereogram coinciding with the actual positions of the indoor pipelines and stores it in the storage module. The AR module then recognizes the marker, so that the on-site picture and the 3D stereogram can be superimposed onto the real-world view to obtain an augmented reality picture. From this augmented reality picture the workers can read the indoor pipeline information and avoid damaging the pipelines when renovating the old building.
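The patent does not prescribe any particular data structures for this flow; purely as an illustration, the record produced by the detection module and the per-room model kept by the storage module might be organised as follows (all class and field names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class DetectionRecord:
    """One buried pipe as reported by the detection module."""
    pipe_type: str        # e.g. "water pipe" or "wire conduit"
    position_mm: tuple    # (x, y) on the wall, measured from the marker origin
    depth_mm: float       # embedding depth inside the wall
    diameter_mm: float

@dataclass
class RoomModel:
    """What the storage module keeps for one room / one marker."""
    marker_id: str
    marker_to_wall_mm: tuple                    # position of the marker on the wall
    pipes: list = field(default_factory=list)   # DetectionRecord items

def build_room_model(marker_id, marker_to_wall_mm, detections):
    """Processing-module step: bundle the detections into the stored description
    that the AR module later anchors at the recognised marker."""
    return RoomModel(marker_id, marker_to_wall_mm, list(detections))
```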
The beneficial effects of this scheme are:
1. The superimposed pattern of the on-site picture and the 3D stereogram is displayed with the marker as the origin, achieving a positioning accuracy that indoor positioning systems cannot reach, so the pipeline 3D stereogram coincides with the actual indoor pipeline positions. During overhaul work, workers can quickly obtain richer information about the pipeline scene of the old house, which improves working efficiency.
2. The 3D stereogram can be displayed simply by scanning the marker, so operation is convenient and simple.
Further, the detection information also includes a two-dimensional planar electronic pipeline map. The AR module further includes a two-dimensional planar electronic pipeline map display module for displaying this map and for showing the current position of the mobile terminal on it according to real-time position information. When workers carry out indoor renovation they can therefore know their own real-time position, choose different angles and places from which to work, and several renovation workers can work at the same time without interfering with each other.
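The embodiment described later notes that this map is drawn from centimeter-level GNSS-positioned pipeline node data, which implies projecting each fix into local planar map coordinates. A minimal sketch of one common choice, an equirectangular approximation about a map origin (adequate at building scale; the patent does not specify the projection, and the function name is hypothetical):

```python
import math

def gnss_to_map_xy(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Project a GNSS fix to local planar metres relative to a chosen map origin.

    Equirectangular (flat-earth) approximation: its error is far below a
    centimetre over the extent of a single building.
    """
    R = 6378137.0  # WGS-84 equatorial radius in metres
    dlat = math.radians(lat_deg - origin_lat_deg)
    dlon = math.radians(lon_deg - origin_lon_deg)
    x_east = R * dlon * math.cos(math.radians(origin_lat_deg))
    y_north = R * dlat
    return x_east, y_north

# Example: a pipeline node a few metres north-east of the map origin.
x, y = gnss_to_map_xy(29.600027, 106.500032, 29.600000, 106.500000)
```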
Further, the AR module includes a shooting device that performs real-object recognition in the real world using FusionNet, an artificial-intelligence learning technique. FusionNet is a mixture of three neural networks, V-CNN I, V-CNN II and MV-CNN; the MV-CNN network is built on the AlexNet architecture and pre-trained on the ImageNet data set. The three networks are fused at the scoring layer, and the final predicted class is found by computing a linear combination of the scores. V-CNN I and V-CNN II take a voxelized CAD model as input, while MV-CNN takes 2D projections as input. The module uses a standard pre-trained neural network model (AlexNet) as the basis of the 2D network MV-CNN and warm-starts the network for 2D projections of three-dimensional objects from the large-scale 2D image data set ImageNet, so that many features used for 2D image classification do not need to be trained from scratch.
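The key point of FusionNet here is the score-layer fusion. The sketch below is an illustration only, assuming PyTorch: it shows a learned linear combination of the class scores produced by the three networks, with random tensors standing in for the outputs of V-CNN I, V-CNN II and MV-CNN (the backbones and their ImageNet pre-training are not reproduced, and all names are hypothetical).

```python
import torch
import torch.nn as nn

class ScoreFusion(nn.Module):
    """Learned linear combination of per-network class scores (score-layer fusion)."""
    def __init__(self, num_networks: int = 3):
        super().__init__()
        # One weight per network, initialised to an equal-weight average.
        self.weights = nn.Parameter(torch.full((num_networks,), 1.0 / num_networks))

    def forward(self, stacked_scores: torch.Tensor) -> torch.Tensor:
        # stacked_scores: (num_networks, batch, num_classes)
        return torch.einsum("n,nbc->bc", self.weights, stacked_scores)

# Stand-ins for the class scores of V-CNN I, V-CNN II (voxelised CAD input)
# and MV-CNN (2D projections, AlexNet-based backbone).
batch, num_classes = 4, 10
v_cnn_1 = torch.randn(batch, num_classes)
v_cnn_2 = torch.randn(batch, num_classes)
mv_cnn = torch.randn(batch, num_classes)

fusion = ScoreFusion()
fused = fusion(torch.stack([v_cnn_1, v_cnn_2, mv_cnn]))  # (batch, num_classes)
predicted_class = fused.argmax(dim=-1)                   # one label per sample
```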
Further, the detection module also comprises a contrast unit, a medium emitting unit and a processing unit. The contrast unit is fixed on both sides of the wall and allows the working medium to pass through; the medium emitting unit emits a non-parallel (divergent) working medium at corresponding positions on the two sides of the wall in turn; and the processing unit obtains the pipeline layout information, which includes the embedding depth, from the ratio of the image sizes of the contrast unit and the ratio of the two image sizes of the pipeline.
First, the medium emitting unit and a medium receiving unit are arranged on the two sides of the wall, and the medium emitting unit on one side emits the working medium, producing one image associated with each side of the wall: image 1 for the side close to the medium emitting unit and image 2 for the side far from it. The embedding depth and the diameter of the pipeline are then obtained from the known size of the contrast unit together with image 1 and image 2. When the image size of the contrast unit equals that of image 1, the pipeline lies on the side close to the medium emitting unit; when it equals that of image 2, the pipeline lies on the side far from the medium emitting unit. The embedding depth and diameter of the pipeline then follow from the size of the contrast unit and the sizes of the two images.
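The patent does not give explicit formulas for this step, so the sketch below works under an idealised divergent-beam (pinhole) model in which a projected size equals the true size multiplied by the ratio of source-to-detector distance to source-to-object distance. It further assumes that both acquisitions use the same emitter and receiver stand-off distances, that the contrast units sit on the two wall faces, and that "image 1" and "image 2" are the pipe's projected widths in the first acquisition and in the one taken after swapping emitter and receiver; all names are hypothetical.

```python
def pipe_depth_and_diameter(ref_size, ref_img_near, ref_img_far,
                            pipe_img_1, pipe_img_2, wall_thickness):
    """Recover pipe diameter and embedding depth from two divergent-beam projections.

    ref_size:       true size of the contrast (reference) unit
    ref_img_near:   its projected size for the unit on the emitter-side wall face
    ref_img_far:    its projected size for the unit on the receiver-side wall face
    pipe_img_1/2:   the pipe's projected width in the first / swapped acquisition
    wall_thickness: measured wall thickness (same length unit throughout)
    """
    m_near = ref_img_near / ref_size          # magnification at the emitter-side face
    m_far = ref_img_far / ref_size            # magnification at the receiver-side face
    t_over_d = 1.0 / m_far - 1.0 / m_near     # wall thickness / source-detector distance

    # Adding the projection equations of the two acquisitions eliminates the depth.
    diameter = (2.0 / m_near + t_over_d) / (1.0 / pipe_img_1 + 1.0 / pipe_img_2)

    # Depth of the pipe axis from the emitter-side face of the first acquisition,
    # obtained as a fraction of the wall thickness and scaled to length units.
    depth_fraction = (diameter / pipe_img_1 - 1.0 / m_near) / t_over_d
    return depth_fraction * wall_thickness, diameter
```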
Further, the AR module is also provided with renovation steps; the processing module obtains the renovation steps at equal intervals and displays them. When renovation workers renovate a house, the AR module presents the renovation steps in sequence while displaying the 3D model, thereby guiding the workers. Even a worker who does not know the renovation order can carry out the work by following the displayed steps, which improves efficiency, reduces the time lost to on-site investigation, and avoids damaging pipelines during the investigation and renovation process.
Further, a plurality of markers are provided, each marker corresponding to an associated room. When placing a marker, the worker can write on it a number matching the corresponding room, avoiding the situation where several markers later become mixed up and are difficult to distinguish.
The invention also provides an AR-based indoor pipeline information display method, which comprises the following steps:
step one: photograph the indoor scene with the camera device in the AR module to form a captured image;
step two: detect the type, position and embedding depth of the pipelines with the detection module, and send the detected information to the processing module;
step three: obtain the positional relationship between the marker and the wall with the detection module, and feed this information back to the processing module in the cloud;
step four: the processing module processes the pipeline information and the marker-wall position information, generates a 3D stereogram matching reality, and stores it in the storage module;
step five: the AR module scans and recognizes the marker; once the marker is recognized, the superimposed pattern of the on-site picture and the 3D stereogram is displayed with the marker as the origin.
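Step five amounts to standard marker-based registration: estimate the marker's pose in the camera frame and project the stored pipeline model, whose coordinates are expressed relative to the marker origin, into the live image. The patent does not name a particular library; the sketch below assumes OpenCV, a calibrated camera and an already-detected square marker, with hypothetical function and parameter names.

```python
import cv2
import numpy as np

def overlay_pipeline(frame, marker_corners_px, marker_size_m,
                     pipeline_points_m, camera_matrix, dist_coeffs):
    """Draw the stored 3D pipeline (marker-frame coordinates) over the camera frame."""
    # 3D corners of the square marker in its own frame (marker centre = origin, z = 0).
    h = marker_size_m / 2.0
    marker_corners_3d = np.array([[-h,  h, 0.0], [ h,  h, 0.0],
                                  [ h, -h, 0.0], [-h, -h, 0.0]], dtype=np.float32)

    # Marker pose (rotation + translation) in the camera frame from the 2D corners.
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d,
                                  np.asarray(marker_corners_px, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return frame

    # Project the pipeline vertices from the marker frame onto the image plane.
    pts_2d, _ = cv2.projectPoints(np.asarray(pipeline_points_m, dtype=np.float32),
                                  rvec, tvec, camera_matrix, dist_coeffs)
    pts_2d = pts_2d.reshape(-1, 1, 2).astype(np.int32)

    # Draw the pipeline run as a polyline superimposed on the live picture.
    cv2.polylines(frame, [pts_2d], isClosed=False, color=(0, 255, 0), thickness=3)
    return frame
```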
Drawings
FIG. 1 is a logical framework diagram of an embodiment of the AR-based indoor pipeline information display system of the present invention;
FIG. 2 is a flowchart of an embodiment of the AR-based indoor pipeline information display method of the present invention;
FIG. 3 is a diagram of the measurement of indoor wall pipeline information according to the present invention.
Detailed Description
The following is further detailed by means of a specific embodiment.
The embodiment is substantially as shown in FIG. 1:
An AR-based indoor pipeline information display system, comprising:
a detection module: comprehensively detects the embedded pipelines using infrared, ultrasonic and millimeter-wave sensing; the detection information includes the type, position and embedding depth of the embedded pipelines, and also the positional relationship between the marker and the wall;
The detection information also includes a two-dimensional planar electronic pipeline map. The AR module further includes a two-dimensional planar electronic pipeline map display module, which displays this map and shows the current position of the mobile terminal on it according to real-time position information. When workers carry out indoor renovation they can therefore know their own real-time position, choose different angles and places from which to work, and several renovation workers can work at the same time without interfering with each other. The two-dimensional planar electronic pipeline map is drawn from centimeter-level GNSS-positioned pipeline node data.
a processing module: processes the detection information and generates a 3D stereogram matching reality;
a storage module: stores the pipeline 3D stereogram and the positional relationship between the marker and the wall;
an AR module: acquires the scene image and displays the 3D stereogram; after recognizing the marker, it displays the superimposed pattern of the scene image and the 3D stereogram with the marker as the origin;
the AR module comprises shooting equipment, the shooting equipment completes real object identification in the real world by using an artificial intelligence learning technology of fusion Net, the fusion Net is a mixture of three neural networks which are respectively V-CNN I, V-CNN II and MV-CNN, the MV-CNN neural network is constructed based on an AlexNet structure and is pre-trained by an ImageNet data set, the three networks are fused in a scoring layer, and finally predicted classification is found by calculating a scored linear combination. V-CNN I and V-CNN II use a voxelized CAD model, and MV-CNN uses 2D projection as input. The module uses a standard pre-training neural network model (AlexNet) as the basis of the 2D network MV-CNN, and performs warm start pre-training on a network of the three-dimensional object 2D projection based on a large-scale 2D pixel picture data set ImageNet. Many features used for 2D image classification need not be trained from scratch, subject to pre-training.
And the AR module is also provided with a renovation step, and the processing module acquires the renovation step at equal intervals and displays the renovation step. When carrying out house renovation to the renovation personnel, the AR module also demonstrates the renovation step in proper order when showing 3D three-dimensional thing, instructs the staff to carry out the renovation work from this, even the renovation staff does not know how the order of renovation is how, also can carry out the renovation work according to the renovation step, and then can improve the renovation personnel work efficiency, reduces the time that delays among the investigation renovation process, avoids damaging the pipeline at the investigation renovation in-process.
A marker: the image used for triggering the AR module to show the 3D stereogram.
The identifier is provided in plurality, each identifier corresponding to an associated chamber. When the marker is placed, the staff can mark numbers matched with corresponding rooms on the marker, and the situation that a plurality of markers are mixed behind an instrument and are difficult to distinguish is avoided.
As shown in FIG. 3:
The detection module also comprises a contrast unit, a medium emitting unit and a processing unit. The contrast unit is fixed on both sides of the wall and allows the working medium to pass through; the medium emitting unit emits a non-parallel (divergent) working medium at corresponding positions on the two sides of the wall in turn; and the processing unit obtains the pipeline layout information, which includes the embedding depth, from the ratio of the image sizes of the contrast unit and the ratio of the two image sizes of the pipeline. (Specifically, the working medium in this embodiment is X-rays.)
First, the medium emitting unit and the medium receiving unit are arranged on the two sides of the wall, and the medium emitting unit on one side emits the working medium, producing one image associated with each side of the wall: image 1 for the side close to the medium emitting unit and image 2 for the side far from it. The embedding depth and the diameter of the pipeline are then obtained from the known size of the contrast unit together with image 1 and image 2. When the image size of the contrast unit equals that of image 1, the pipeline lies on the side close to the medium emitting unit; when it equals that of image 2, the pipeline lies on the side far from the medium emitting unit. The embedding depth and diameter of the pipeline then follow from the size of the contrast unit and the sizes of the two images.
The specific operation is as follows: the marker 5 is fixed to the wall 1, and the medium emitting unit 2 and the medium receiving unit 3 are arranged on the two sides of the wall 1. When a pipeline is present in the wall 1, contrast units 4 are placed on both sides of the wall 1 between the medium emitting unit 2 and the medium receiving unit 3. The medium emitting unit 2, a portable X-ray machine (i.e. a device comprising a basic circuit and an X-ray tube, the tube emitting divergent X-rays), is then started on the outer side of the wall 1 and emits X-rays, and the medium receiving unit 3 completes the first acquisition on the inner side of the wall 1. The positions of the medium emitting unit 2 and the medium receiving unit 3 are then exchanged, and the second acquisition is completed.
Because the size of the contrast unit is known, the processing module can then obtain the embedding depth and the diameter of the pipeline from the sizes (and the ratio) of the two acquired images, thereby completing the construction of the three-dimensional model, which is finally stored by the storage module. One account may then correspond to one three-dimensional model, and different users may also be distinguished by using different markers 5.
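For illustration only, plugging hypothetical numbers into the pipe_depth_and_diameter sketch given earlier shows how the known contrast-unit size and the two acquisitions determine both quantities:

```python
# All values in millimetres; the inputs below were generated from an ideal geometry
# (emitter 200 mm in front of the wall, detector 200 mm behind it, 100 mm thick wall,
# 20 mm pipe buried 40 mm deep) and are hypothetical.
depth, diameter = pipe_depth_and_diameter(
    ref_size=10.0, ref_img_near=25.0, ref_img_far=16.67,
    pipe_img_1=41.67, pipe_img_2=38.46, wall_thickness=100.0)
# depth ≈ 40 mm from the emitter-side face, diameter ≈ 20 mm
```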
As shown in FIG. 2:
This embodiment also discloses an AR-based indoor pipeline information display method, which comprises the following steps:
step one: photograph the indoor scene with the camera device in the AR module to form a captured image;
step two: detect the type, position and embedding depth of the pipelines with the detection module, and send the detected information to the processing module;
step three: obtain the positional relationship between the marker and the wall with the detection module, and feed this information back to the processing module in the cloud;
step four: the processing module processes the pipeline information and the marker-wall position information, generates a 3D stereogram matching reality, and stores it in the storage module;
step five: the AR module scans and recognizes the marker; once the marker is recognized, the superimposed pattern of the on-site picture and the 3D stereogram is displayed with the marker as the origin.
The foregoing is merely an embodiment of the present invention; common general knowledge such as known specific structures and features of the embodiment is not described here in further detail. It should be noted that those skilled in the art may make several changes and modifications without departing from the structure of the present invention, and these should also be regarded as falling within the protection scope of the present invention without affecting the effectiveness of its implementation or the practicability of the patent. The protection scope of this application shall be determined by the content of the claims, and the detailed description in the specification may be used to interpret the content of the claims.

Claims (7)

1. An AR-based indoor pipeline information display system, characterized by comprising:
a detection module: comprehensively detects the embedded pipelines using infrared, ultrasonic and millimeter-wave sensing; the detection information includes the type, position and embedding depth of the embedded pipelines, and also the positional relationship between the marker and the wall;
a processing module: processes the detection information and generates a 3D stereogram matching reality;
a storage module: stores the pipeline 3D stereogram and the positional relationship between the marker and the wall;
an AR module: acquires the scene image and displays the 3D stereogram; after recognizing the marker, it displays the superimposed pattern of the scene image and the 3D stereogram with the marker as the origin;
a marker: the image used to trigger the AR module to display the 3D stereogram.
2. The AR-based indoor pipeline information display system of claim 1, wherein the detection information further comprises a two-dimensional planar electronic pipeline map; the AR module further comprises a two-dimensional planar electronic pipeline map display module for displaying the map; and the current position of the mobile terminal is displayed on the map according to real-time position information.
3. The AR-based indoor pipeline information display system of claim 1, wherein the detection module further comprises a contrast unit, a medium emitting unit and a processing unit; the contrast unit is fixed on both sides of the wall and allows the working medium to pass through; the medium emitting unit emits a non-parallel working medium at corresponding positions on the two sides of the wall respectively; and the processing unit obtains the pipeline layout information, including embedding depth information, from the ratio of the image sizes of the contrast unit and the ratio of the two image sizes of the pipeline.
4. The AR-based indoor pipeline information display system of claim 1, wherein the AR module comprises a shooting device that performs real-object recognition in the real world using FusionNet, an artificial-intelligence learning technique; FusionNet is a mixture of three neural networks, V-CNN I, V-CNN II and MV-CNN; the MV-CNN network is built on the AlexNet architecture and pre-trained on the ImageNet data set; the three networks are fused at the scoring layer, and the final predicted class is found by computing a linear combination of the scores; V-CNN I and V-CNN II take a voxelized CAD model as input, and MV-CNN takes 2D projections as input; the module uses a standard pre-trained neural network model (AlexNet) as the basis of the 2D network MV-CNN and warm-starts the network for 2D projections of three-dimensional objects from the large-scale 2D image data set ImageNet.
5. The AR-based indoor pipeline information display system of claim 1, wherein the AR module is further provided with renovation steps, and the processing module obtains the renovation steps at equal intervals and displays them.
6. The AR-based indoor pipeline information display system of claim 1, wherein a plurality of markers are provided, each marker corresponding to an associated room.
7. An AR-based indoor pipeline information display method, characterized by comprising the following steps:
step one: photograph the indoor scene with the camera device in the AR module to form a captured image;
step two: detect the type, position and embedding depth of the pipelines with the detection module, and send the detected information to the processing module;
step three: obtain the positional relationship between the marker and the wall with the detection module, and feed this information back to the processing module in the cloud;
step four: the processing module processes the pipeline information and the marker-wall position information, generates a 3D stereogram matching reality, and stores it in the storage module;
step five: the AR module scans and recognizes the marker; once the marker is recognized, the superimposed pattern of the on-site picture and the 3D stereogram is displayed with the marker as the origin.
CN202110653302.9A 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system Active CN113239445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110653302.9A CN113239445B (en) 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110653302.9A CN113239445B (en) 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system

Publications (2)

Publication Number Publication Date
CN113239445A true CN113239445A (en) 2021-08-10
CN113239445B CN113239445B (en) 2023-08-01

Family

ID=77139635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110653302.9A Active CN113239445B (en) 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system

Country Status (1)

Country Link
CN (1) CN113239445B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549801A (en) * 2022-04-25 2022-05-27 深圳市同立方科技有限公司 AR augmented reality water supply and drainage project visualization method, device and system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004178308A (en) * 2002-11-27 2004-06-24 Make Softwear:Kk Image editing device, method and program
CN103687530A (en) * 2012-04-25 2014-03-26 奥林巴斯医疗株式会社 Endoscope image pickup unit and endoscope
CN108122475A (en) * 2017-12-29 2018-06-05 安徽迈普德康信息科技有限公司 Underground utilities localization method based on body unit
CN108954017A (en) * 2017-11-09 2018-12-07 北京市燃气集团有限责任公司 Fuel gas pipeline leakage detection system based on augmented reality
CN109751986A (en) * 2019-01-25 2019-05-14 重庆予胜远升网络科技有限公司 A kind of processing system and method generating AR image according to pipe network attribute data
CN109816794A (en) * 2019-01-25 2019-05-28 重庆予胜远升网络科技有限公司 A kind of three-dimension visible sysem and method based on pipe network attribute data
CN110390731A (en) * 2019-07-15 2019-10-29 贝壳技术有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN111639408A (en) * 2020-05-27 2020-09-08 上海实迅网络科技有限公司 AR technology-based urban pipe network pipeline 3D model display method and system
CN111789606A (en) * 2020-08-10 2020-10-20 上海联影医疗科技有限公司 Imaging system and method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004178308A (en) * 2002-11-27 2004-06-24 Make Softwear:Kk Image editing device, method and program
CN103687530A (en) * 2012-04-25 2014-03-26 奥林巴斯医疗株式会社 Endoscope image pickup unit and endoscope
CN108954017A (en) * 2017-11-09 2018-12-07 北京市燃气集团有限责任公司 Fuel gas pipeline leakage detection system based on augmented reality
CN108122475A (en) * 2017-12-29 2018-06-05 安徽迈普德康信息科技有限公司 Underground utilities localization method based on body unit
CN109751986A (en) * 2019-01-25 2019-05-14 重庆予胜远升网络科技有限公司 A kind of processing system and method generating AR image according to pipe network attribute data
CN109816794A (en) * 2019-01-25 2019-05-28 重庆予胜远升网络科技有限公司 A kind of three-dimension visible sysem and method based on pipe network attribute data
CN110390731A (en) * 2019-07-15 2019-10-29 贝壳技术有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN111639408A (en) * 2020-05-27 2020-09-08 上海实迅网络科技有限公司 AR technology-based urban pipe network pipeline 3D model display method and system
CN111789606A (en) * 2020-08-10 2020-10-20 上海联影医疗科技有限公司 Imaging system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
吕圣卿 (Lü Shengqing) et al.: "A network-based parallel rendering and cross-platform synchronized display system" (一种基于网络的并行渲染和跨平台同步展示系统), Computer Applications and Software (《计算机应用与软件》), vol. 34, no. 10, 15 October 2017 (2017-10-15), page 116 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549801A (en) * 2022-04-25 2022-05-27 深圳市同立方科技有限公司 AR augmented reality water supply and drainage project visualization method, device and system

Also Published As

Publication number Publication date
CN113239445B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
US7398481B2 (en) Virtual environment capture
US9619944B2 (en) Coordinate geometry augmented reality process for internal elements concealed behind an external element
US7139685B2 (en) Video-supported planning of equipment installation and/or room design
US10412594B2 (en) Network planning tool support for 3D data
JP2021520584A (en) Housing data collection and model generation methods
WO2016184255A1 (en) Visual positioning device and three-dimensional mapping system and method based on same
CN103119611A (en) Method and apparatus for image-based positioning
JP2013509665A (en) System and method using 3D and 2D digital images
US10890447B2 (en) Device, system and method for displaying measurement gaps
CN104463969A (en) Building method of model of aviation inclined shooting geographic photos
US11395102B2 (en) Field cooperation system and management device
CN106441298A (en) Method for map data man-machine interaction with robot view image
CN113239445A (en) AR-based indoor pipeline information display method and system
CN113391366A (en) Indoor pipeline three-dimensional model generation method and system
CN108957507A (en) Fuel gas pipeline leakage method of disposal based on augmented reality
Nguyen et al. Interactive syntactic modeling with a single-point laser range finder and camera
CN108954016A (en) Fuel gas pipeline leakage disposal system based on augmented reality
KR20150028533A (en) Apparatus for gathering indoor space information
Liu et al. System development of an augmented reality on-site BIM viewer based on the integration of SLAM and BLE indoor positioning
Zakaria et al. Practical Terrestrial Laser Scanning Field Procedure And Point Cloud Processing For Bim Applications–Tnb Control And Relay Room 132/22KV
Miyake et al. Outdoor markerless augmented reality
JP7133900B2 (en) Shooting position specifying system, shooting position specifying method, and program
JP2021064267A (en) Image processing apparatus and image processing method
TWI780734B (en) Method for displaying hidden object through augmented reality
EP3674660A2 (en) A computer-implemented method, a computer program product, and an apparatus for facilitating placement of a surveying tool

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant