CN113239445B - AR-based indoor pipeline information display method and system - Google Patents


Info

Publication number
CN113239445B
CN113239445B (application CN202110653302.9A)
Authority
CN
China
Prior art keywords
pipeline
module
information
marker
image
Prior art date
Legal status
Active
Application number
CN202110653302.9A
Other languages
Chinese (zh)
Other versions
CN113239445A (en)
Inventor
路亚
李腾
杨睿
张科伦
李建华
Current Assignee
Chongqing College of Electronic Engineering
Original Assignee
Chongqing College of Electronic Engineering
Priority date
Filing date
Publication date
Application filed by Chongqing College of Electronic Engineering
Priority to CN202110653302.9A
Publication of CN113239445A
Application granted
Publication of CN113239445B
Active legal status
Anticipated expiration


Classifications

    • G06F30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • G06F30/18 Network design, e.g. design based on topological or interconnect aspects of utility systems, piping, heating ventilation air conditioning [HVAC] or cabling
    • G06F30/27 Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • G06T19/006 Mixed reality
    • G06F2113/14 Pipes
    • G06F2113/16 Cables, cable trees or wire harnesses

Abstract

The scheme belongs to the technical field of augmented reality, and specifically relates to an AR-based indoor pipeline information display method and system, comprising: a detection module, which comprehensively detects the embedded pipeline by infrared, ultrasonic, and millimeter-wave sensing, the detection information including the type, position, and embedded depth of the embedded pipeline as well as the positional relationship between the marker and the wall; a processing module, which processes the detection information and generates a 3D stereogram that matches reality; a storage module, which stores the 3D stereogram and the positional relationships among the pipeline, the marker, and the wall; an AR module, which, after recognizing the marker, displays the live picture superimposed with the 3D stereogram using the marker as the origin; and the marker, which triggers the AR module to display the image of the 3D stereogram. By displaying the superimposed picture with the marker as the origin, the scheme improves positioning accuracy.

Description

AR-based indoor pipeline information display method and system
Technical Field
The scheme belongs to the technical field of augmented reality, and particularly relates to an AR-based indoor pipeline information display method and system.
Background
Whether in a newly purchased second-hand house or a house occupied for many years, residents often want a more comfortable home experience as residence time increases, and frequently choose to renovate. However, the construction requirements for renovating an old house differ greatly from those for a new house, and many problems must be considered. For example, because the conduits, water pipes, and other lines buried in the walls or floors were installed long ago, the construction drawings often cannot be obtained (as in old residential communities and second-hand houses), which makes renovation inconvenient for users.
At present, conventional mobile terminals only display the objects actually present in the real scene on the screen. For construction and display of the pipe network in an old house, however, explanatory and descriptive parameter information about pipe-network nodes is often required, so that an operator can conveniently and quickly obtain richer information about the pipe-network scene, bringing users a new augmented-reality experience. There is therefore an urgent need for a system that can display, in 3D, pipelines buried in walls or floors.
The invention patent with application number CN202010466577.7 discloses a method and system for displaying a 3D model of an urban pipe-network pipeline based on AR technology. The system includes: a pipe-network node model construction module, which combines a pipe-network node index information file, a pipe-network node index information set, and a three-dimensional city map to construct a pipe-network node model; a pipeline coordinate mapping module, which reads a two-dimensional pipeline path map file, obtains the two-dimensional pipeline path in a relative coordinate system, and maps that relative coordinate system onto the absolute coordinate system of the pipe-network node model; a three-dimensional pipe-network model construction module, which uses a pipe-network channel model library to construct a three-dimensional channel scene from the node model processed by the coordinate mapping module, obtaining a three-dimensional pipe-network model; and an AR display module, which determines the display pose of the three-dimensional model with reference to the pipe-network nodes and performs AR display.
Although that scheme can construct a 3D model of an urban pipe network, displaying the model requires explicitly inputting the corresponding pipe-network node index information files in order to obtain the geographic position of every node on every pipeline, together with the matching node attribute information. Outdoors, landmark positioning can combine several positioning systems to achieve high precision, but indoors such systems struggle to position accurately, and centimeter-level precision is hard to reach.
Disclosure of Invention
The scheme provides an AR-based indoor pipeline information display method and system to solve the problem that an indoor 3D model does not match the actual pipeline position.
In order to achieve the above object, the present invention provides an AR-based indoor pipeline information display system, comprising:
A detection module: comprehensively detects the embedded pipeline by infrared, ultrasonic, and millimeter-wave sensing; the detection information includes the type, position, and embedded depth of the embedded pipeline, as well as the positional relationship between the marker and the wall;
A processing module: processes the detection information and generates a 3D stereogram that matches reality;
A storage module: stores the 3D stereogram and the positional relationships among the pipeline, the marker, and the wall;
An AR module: after recognizing the marker, displays the live picture superimposed with the 3D stereogram, using the marker as the origin;
A marker: triggers the AR module to display the image of the 3D stereogram.
The principle of the scheme is as follows. First, the detection module detects and identifies the pipelines buried in the indoor walls or floors, obtaining information such as the pipeline type, position, and embedded depth, and the positional relationship between the marker and the wall. The AR module then captures a live picture. The detection module transmits the detection information and the scene image to the processing module, which generates a 3D stereogram registered to the actual position of the indoor pipelines and stores it in the storage module. Finally, the AR module recognizes the marker and superimposes the 3D stereogram on the live picture to obtain the augmented-reality image. Workers can then read the indoor pipeline information from the augmented-reality picture and avoid damaging pipelines when repairing an old house.
The beneficial effects of this scheme:
1. By displaying the live picture superimposed with the 3D stereogram using the marker as the origin, the scheme compensates for the inability of positioning systems to achieve accurate indoor positioning, so that the 3D stereogram coincides exactly with the real indoor pipeline position. Repair workers can thus conveniently and quickly obtain richer information about the pipeline scene of the old house, improving work efficiency.
2. The 3D stereogram can be displayed by directly scanning the marker, which is convenient and simple to operate.
Further, the detection information also includes a two-dimensional planar electronic pipeline map. The AR module further comprises a display module for this map, which shows the map and marks the current position of the mobile terminal on it according to real-time position information. During indoor repair, workers can know their own real-time position and choose different angles and locations for the work, and several workers can repair simultaneously without interfering with one another.
Further, the AR module comprises a shooting device that performs real-world object recognition using a FusionNet-style artificial-intelligence learning technique. FusionNet is a mixture of three neural networks, V-CNN I, V-CNN II, and MV-CNN, each built on the AlexNet architecture and pre-trained on the ImageNet data set. The three networks are fused at the scoring layer, and the predicted class is obtained through a linear combination of their scores. V-CNN I and V-CNN II take voxelized CAD models as input, while MV-CNN takes 2D projections. The module uses the standard pre-trained AlexNet model as the basis of the 2D network MV-CNN and warm-starts the network on 2D projections of the three-dimensional object using the large-scale ImageNet picture data set; thanks to pre-training, many features for 2D image classification need not be trained from scratch.
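The score-layer fusion described above can be sketched as follows. The per-class scores and the fusion weights are illustrative; the patent does not specify the weights of the linear combination:

```python
import numpy as np

def fuse_scores(score_v1, score_v2, score_mv, weights=(1.0, 1.0, 1.0)):
    """Score-layer fusion: linearly combine the per-class scores from the
    three networks (V-CNN I, V-CNN II, MV-CNN) and take the argmax as the
    predicted class. The weight values are an assumption of this sketch."""
    fused = (weights[0] * np.asarray(score_v1)
             + weights[1] * np.asarray(score_v2)
             + weights[2] * np.asarray(score_mv))
    return int(np.argmax(fused)), fused
```

In a real deployment each score vector would come from the softmax output of the corresponding network; here they are plain lists for illustration.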
Further, the detection module also comprises a comparison unit, a medium emission unit, and a processing unit. The comparison unit is fixed on both sides of the wall and is penetrated by the working medium; the medium emission unit emits non-parallel working medium at corresponding positions on the two sides of the wall; and the processing unit derives the pipeline layout information, including the embedded depth, from the ratio of the image sizes formed by the comparison unit and the ratio of the two pipeline image sizes.
First, a medium emission unit and a medium receiving unit are arranged on the two sides of the wall, and the pipeline is imaged from each side in turn: the acquisition made with the emission unit on its original side is image 1, and the acquisition made from the opposite side is image 2. From the known size of the comparison unit and the measured sizes of image 1 and image 2, the embedded depth and the diameter of the pipeline are obtained. When the image of the comparison unit equals image 1 in size, the pipeline lies on the side near the emission unit; when it equals image 2, the pipeline lies on the side far from the emission unit; otherwise, the embedded depth and diameter follow from the comparison-unit size and the two image sizes.
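Under a point-source (divergent-beam) model, the two shadow widths determine the pipe's depth and diameter by similar triangles. The sketch below assumes a known source-to-wall distance `s`, wall thickness `t`, and source-to-detector distance `L`; these geometric parameters are not stated in the patent and are assumptions of this sketch:

```python
def pipe_depth_and_diameter(shadow_a, shadow_b, s, t, L):
    """Recover the pipe's depth x (measured from wall side A) and its
    diameter D from the two shadow widths: shadow_a captured with the
    source on side A, shadow_b with the source on side B.

    With a point source, shadow width = D * L / (source-to-pipe distance):
        shadow_a = D * L / (s + x)
        shadow_b = D * L / (s + t - x)
    Dividing the two equations eliminates D and yields x; substituting
    back yields D."""
    r = shadow_a / shadow_b          # = (s + t - x) / (s + x)
    x = (s + t - r * s) / (1.0 + r)  # solve r * (s + x) = s + t - x
    D = shadow_a * (s + x) / L
    return x, D
```

For example, with s = 10 cm, t = 20 cm, L = 100 cm, a pipe of diameter 2 cm buried 5 cm from side A casts shadows of 200/15 cm and 200/25 cm, from which the function recovers x = 5 and D = 2.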
Further, the AR module also stores repair steps, which the processing module acquires and displays at equal time intervals. When a worker repairs the house, the AR module displays the repair steps in sequence while showing the 3D stereogram, guiding the repair work. Even a worker who does not know the repair sequence can proceed according to the displayed steps, which improves efficiency, reduces the delay of trial-and-error repair, and prevents pipelines from being damaged in the process.
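A minimal sketch of the equal-interval step display described above; the interval value and the use of `print`/`sleep` as stand-ins for the AR overlay are assumptions of this sketch:

```python
import time

def display_repair_steps(steps, interval_s=5.0, show=print, sleep=time.sleep):
    """Show the stored repair steps one by one at equal time intervals.
    `show` and `sleep` are injectable so the AR overlay (or a test) can
    replace the console output and the real timer."""
    for i, step in enumerate(steps, start=1):
        show(f"Step {i}: {step}")
        if i < len(steps):
            sleep(interval_s)  # wait before revealing the next step
```

Injecting `show` and `sleep` keeps the scheduling logic separate from the rendering backend, so the same loop can drive a console demo or an AR overlay.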
Further, there are multiple markers, each corresponding to an associated room. When placing a marker, the worker can label it with the number of the corresponding room, avoiding confusion when several markers are mixed together afterwards.
The invention also provides an AR-based indoor pipeline information display method, comprising the following steps:
Step one: shoot the indoor scene with the image-capture device in the AR module to form a captured image;
Step two: detect the type, position, and embedded depth of the pipeline with the detection module, then send the detected information to the processing module;
Step three: obtain the positional relationship between the marker and the wall with the detection module, then feed the information back to the processing module in the cloud;
Step four: the processing module processes the pipeline information and the marker-wall position information, generates a 3D stereogram that matches reality, and stores it in the storage module;
Step five: the AR module scans and recognizes the marker; once the marker is recognized, the live picture superimposed with the 3D stereogram is displayed, using the marker as the origin.
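With the marker as the origin, superimposing the 3D stereogram on the live picture amounts to transforming pipeline points stored in the marker frame into the camera frame using the marker's estimated pose. The sketch below assumes the rotation matrix and translation vector come from some marker-pose estimator, which the patent does not name:

```python
import numpy as np

def overlay_points(pipe_points_marker, R_marker_to_cam, t_marker_to_cam):
    """Map pipeline points stored relative to the marker origin into the
    camera frame, so the 3D stereogram can be drawn over the live view.

    pipe_points_marker : (N, 3) points in the marker coordinate frame
    R_marker_to_cam    : (3, 3) rotation from marker frame to camera frame
    t_marker_to_cam    : (3,)   marker origin expressed in the camera frame
    """
    P = np.asarray(pipe_points_marker, dtype=float)
    # p_cam = R @ p_marker + t, applied to every row at once
    return (np.asarray(R_marker_to_cam) @ P.T).T + np.asarray(t_marker_to_cam)
```

Projecting the resulting camera-frame points through the camera intrinsics would then give the pixel positions at which to draw the stereogram.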
Drawings
FIG. 1 is a logical framework diagram of an embodiment of the AR-based indoor pipeline information display system of the present invention;
FIG. 2 is a flowchart of an embodiment of the AR-based indoor pipeline information display method of the present invention;
FIG. 3 is a diagram of the pipeline information measurement of the indoor wall.
Detailed Description
The following is a further detailed description of the embodiments:
An embodiment is substantially as shown in FIG. 1:
an AR-based indoor pipeline information presentation system, comprising:
and a detection module: the method comprises the steps of comprehensively detecting detection information of the embedded pipeline through infrared rays, ultrasonic waves and millimeter waves, wherein the detection information comprises the type, the position and the embedded depth of the embedded pipeline; the detection information also comprises the position relation between the marker and the wall;
the detection information also comprises a two-dimensional plane electronic pipeline map; the AR module further comprises a two-dimensional plane electronic pipeline map display module, which is used for displaying the two-dimensional plane electronic pipeline map; and displaying the current position of the mobile terminal on the two-dimensional plane electronic pipeline map according to the real-time position information. When the staff carries out indoor repair, the staff can know the real-time position of the staff, can select different angles and places to carry out repair, can carry out repair simultaneously by a plurality of repair staff, and can not interfere each other to work. The two-dimensional plane electronic pipeline map is drawn by centimeter-level GNSS positioning knot pipeline data.
A processing module: processes the detection information and generates a 3D stereogram that matches reality.
A storage module: stores the 3D stereogram and the positional relationships among the pipeline, the marker, and the wall.
An AR module: after recognizing the marker, displays the live picture superimposed with the 3D stereogram, using the marker as the origin.
the AR module comprises shooting equipment, the shooting equipment completes real object identification in the real world by using an artificial intelligence learning technology of fusion Net, the fusion Net is a mixture of three types of neural networks, namely V-CNN I, V-CNN II and MV-CNN, which are respectively constructed based on AlexNet structures and are pre-trained through an image Net data set, the three types of networks are fused in a scoring layer, and finally predicted classification is found through linear combination of calculation scoring. V-CNN I and V-CNN II use voxelized CAD models, and MV-CNN uses 2D projections as inputs. The module uses a standard pre-training neural network model (AlexNet) as a basis of a 2D network MV-CNN, and performs warm-start pre-training on a network of the 2D projection of the three-dimensional object based on a large-scale 2D pixel picture data set ImageNet. Many features for 2D image classification need not be trained from scratch, subject to pre-training.
The AR module also stores repair steps, which the processing module acquires and displays at equal time intervals. When a worker repairs the house, the AR module displays the repair steps in sequence while showing the 3D stereogram, guiding the repair work. Even a worker who does not know the repair sequence can proceed according to the displayed steps, which improves efficiency, reduces the delay of trial-and-error repair, and prevents pipelines from being damaged in the process.
A marker: triggers the AR module to display the image of the 3D stereogram.
There are multiple markers, each corresponding to an associated room. When placing a marker, the worker can label it with the number of the corresponding room, avoiding confusion when several markers are mixed together afterwards.
As shown in fig. 3:
the detection module further comprises a comparison unit, a medium emission unit and a processing unit, wherein the comparison unit is used for being fixed on two sides of the wall body for a working medium to pass through, the medium emission unit is used for emitting non-parallel working mediums at corresponding positions on two sides of the wall body respectively, the processing unit is used for obtaining pipeline layout information according to the ratio of the image size formed by the comparison unit and the ratio of the image sizes of two pipelines, and the pipeline layout information comprises embedded depth information. (specifically, the working medium in this embodiment is X-ray)
Firstly, respectively arranging a medium transmitting unit and a medium receiving unit on two sides of a wall body, then, respectively imaging the two sides of the wall body by the medium transmitting unit on one side of the wall body, wherein the side close to the medium transmitting unit is an imaging 1, the side far away from the medium transmitting unit is an imaging 2, then, obtaining the embedded depth information of a pipeline and the diameter information of the pipeline according to the known values of the size of a comparison unit and the imaging 1 and the imaging 2, and when the size of an image formed by the comparison unit is equal to the value of the imaging 1, positioning the pipeline on the side close to the medium transmitting unit; when the image size formed by the contrast unit is equal to the value of the imaging 2, the position of the pipeline is positioned at the side far away from the medium emission unit; and then obtaining the embedded depth information of the pipeline and the diameter information of the pipeline according to the size of the comparison unit and the size of the image formed twice.
The specific operation is as follows: the marker 5 is fixed to the wall 1, and the medium emission unit 2 and the medium receiving unit 3 are arranged on the two sides of the wall 1. When a pipeline exists inside the wall 1, the comparison units 4 are placed on both sides of the wall 1, between the emission unit 2 and the receiving unit 3. The emission unit 2 is then started on the outer side of the wall 1 to emit X-rays; in substance, the emission unit 2 is a portable X-ray machine (a device comprising a basic circuit and an X-ray tube, the tube emitting divergent X-rays), and the receiving unit 3 completes the first acquisition on the inner side of the wall 1. The positions of the emission unit 2 and the receiving unit 3 are then exchanged to complete the second acquisition.
The processing module can then obtain the embedded depth and the diameter of the pipeline from the known size of the comparison unit and the sizes of the two acquired images (and their known ratio), completing the construction of the three-dimensional model, which is finally saved by the storage module. One account can then correspond to one three-dimensional model, and different users can be distinguished by different markers 5.
As shown in fig. 2:
the embodiment also discloses an AR-based indoor pipeline information display method, which comprises the following steps:
shooting indoor conditions by using image shooting equipment in an AR module to form a shooting image;
step two: detecting the type, the position and the embedding depth of the pipeline through a detection module, and then sending the detected information to a processing module;
step three: the position relation between the marker and the wall is obtained through the detection module, and then the information is fed back to the processing module of the cloud;
step four: the processing module processes the pipeline information and the position information of the marker and the wall, generates a 3D stereogram which is the same as reality, and then stores the 3D stereogram into the storage module;
step five: and the AR module is used for scanning and identifying the marker, namely, after the marker is identified, the superimposed pattern of the field picture and the 3D stereogram is displayed by taking the marker as an origin.
The foregoing are merely exemplary embodiments of the present invention; specific structures and features well known in the art are not described in detail here. It should be noted that those skilled in the art can make modifications and improvements without departing from the structure of the present invention, and these should also be considered within the scope of the present invention, without affecting the effect of its implementation or the utility of the patent. The protection scope of the present application is subject to the content of the claims, and the description of the specific embodiments in the specification can be used to interpret the claims.

Claims (6)

1. An AR-based indoor pipeline information display system, characterized by comprising:
a detection module: comprehensively detects the embedded pipeline by infrared, ultrasonic, and millimeter-wave sensing; the detection information includes the type, position, and embedded depth of the embedded pipeline, as well as the positional relationship between the marker and the wall;
a processing module: processes the detection information and generates a 3D stereogram that matches reality;
a storage module: stores the 3D stereogram and the positional relationships among the pipeline, the marker, and the wall;
an AR module: after recognizing the marker, displays the live picture superimposed with the 3D stereogram, using the marker as the origin;
a marker: triggers the AR module to display the image of the 3D stereogram;
the detection module further comprises a comparison unit, a medium emission unit, and a processing unit; the comparison unit is fixed on both sides of the wall and is penetrated by the working medium; the medium emission unit emits non-parallel working medium at corresponding positions on the two sides of the wall; the processing unit derives the pipeline layout information, including the embedded depth, from the ratio of the image sizes formed by the comparison unit and the ratio of the two pipeline image sizes;
the medium emission unit and a medium receiving unit are arranged on the two sides of the wall, and the pipeline is imaged from each side in turn: the acquisition made from the side near the emission unit is image 1, and the acquisition made from the far side is image 2; the embedded depth and the diameter of the pipeline are obtained from the known size of the comparison unit and the values of image 1 and image 2; when the image of the comparison unit equals image 1 in size, the pipeline lies on the side near the emission unit; when it equals image 2, the pipeline lies on the side far from the emission unit; otherwise, the embedded depth and the diameter of the pipeline are obtained from the size of the comparison unit and the sizes of the two images.
2. The AR-based indoor pipeline information display system of claim 1, characterized in that: the detection information also includes a two-dimensional planar electronic pipeline map; the AR module further comprises a display module for this map, which shows the map and marks the current position of the mobile terminal on it according to real-time position information.
3. The AR-based indoor pipeline information display system of claim 1, characterized in that: the AR module comprises a shooting device that performs real-world object recognition using a FusionNet-style artificial-intelligence learning technique; FusionNet is a mixture of three neural networks, V-CNN I, V-CNN II, and MV-CNN, each built on the AlexNet architecture and pre-trained on the ImageNet data set; the three networks are fused at the scoring layer, and the predicted class is obtained through a linear combination of their scores; V-CNN I and V-CNN II take voxelized CAD models as input, while MV-CNN takes 2D projections; the module uses the standard pre-trained AlexNet model as the basis of the 2D network MV-CNN and warm-starts the network on 2D projections of the three-dimensional object using the large-scale ImageNet picture data set.
4. The AR-based indoor pipeline information display system of claim 1, characterized in that: the AR module also stores repair steps, which the processing module acquires and displays at equal time intervals.
5. The AR-based indoor pipeline information presentation system of claim 1, wherein: there are a plurality of markers, each marker corresponding to its associated room.
6. An AR-based indoor pipeline information display method using the system of claim 3, characterized in that the method comprises the following steps:
step one: shooting the indoor scene with the image capture device in the AR module to form a captured image;
step two: detecting the type, position and embedding depth of the pipeline through the detection module, and sending the detected information to the processing module;
step three: obtaining the positional relationship between the marker and the wall through the detection module, and feeding this information back to the processing module in the cloud;
step four: the processing module processes the pipeline information and the position information of the marker and the wall, generates a 3D stereogram matching reality, and stores the 3D stereogram in the storage module;
step five: scanning and identifying the marker with the AR module; after the marker is identified, a pattern superimposing the live scene and the 3D stereogram is displayed with the marker as the origin.
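The five steps above can be followed as a single control flow. The sketch below models the claimed modules as plain callables passed in by the caller; every function name is an illustrative stand-in, not an API defined by the patent.

```python
def display_indoor_pipelines(capture_image, detect_pipelines,
                             locate_marker, build_3d_model,
                             save_model, scan_marker, overlay):
    """Hypothetical orchestration of steps one to five of the method."""
    frame = capture_image()                  # step one: shoot the scene
    pipes = detect_pipelines()               # step two: type, position, depth
    layout = locate_marker()                 # step three: marker vs. wall
    model3d = build_3d_model(pipes, layout)  # step four: build the 3D stereogram
    save_model(model3d)                      # ...and store it
    marker = scan_marker(frame)              # step five: identify the marker
    if marker is None:
        return None                          # no marker, nothing to overlay
    # Superimpose the live view and the 3D model with the marker as origin.
    return overlay(frame, model3d, origin=marker)
```

In use, each argument would be bound to the corresponding module's method (camera capture, detector, cloud processor, storage, AR renderer); stub callables are enough to exercise the flow.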
CN202110653302.9A 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system Active CN113239445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110653302.9A CN113239445B (en) 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110653302.9A CN113239445B (en) 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system

Publications (2)

Publication Number Publication Date
CN113239445A CN113239445A (en) 2021-08-10
CN113239445B true CN113239445B (en) 2023-08-01

Family

ID=77139635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110653302.9A Active CN113239445B (en) 2021-06-11 2021-06-11 AR-based indoor pipeline information display method and system

Country Status (1)

Country Link
CN (1) CN113239445B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114549801B (en) * 2022-04-25 2022-07-19 深圳市同立方科技有限公司 AR augmented reality water supply and drainage project visualization method, device and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004178308A (en) * 2002-11-27 2004-06-24 Make Softwear:Kk Image editing device, method and program
CN103687530A (en) * 2012-04-25 2014-03-26 奥林巴斯医疗株式会社 Endoscope image pickup unit and endoscope
CN110390731A (en) * 2019-07-15 2019-10-29 贝壳技术有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN111639408A (en) * 2020-05-27 2020-09-08 上海实迅网络科技有限公司 AR technology-based urban pipe network pipeline 3D model display method and system
CN111789606A (en) * 2020-08-10 2020-10-20 上海联影医疗科技有限公司 Imaging system and method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108954017A (en) * 2017-11-09 2018-12-07 北京市燃气集团有限责任公司 Fuel gas pipeline leakage detection system based on augmented reality
CN108122475A (en) * 2017-12-29 2018-06-05 安徽迈普德康信息科技有限公司 Underground utilities localization method based on body unit
CN109751986A (en) * 2019-01-25 2019-05-14 重庆予胜远升网络科技有限公司 A kind of processing system and method generating AR image according to pipe network attribute data
CN109816794B (en) * 2019-01-25 2022-11-29 重庆予胜远升网络科技有限公司 Three-dimensional visualization system and method based on pipe network attribute data

Also Published As

Publication number Publication date
CN113239445A (en) 2021-08-10

Similar Documents

Publication Publication Date Title
US10872467B2 (en) Method for data collection and model generation of house
Carozza et al. Markerless vision‐based augmented reality for urban planning
US7139685B2 (en) Video-supported planning of equipment installation and/or room design
US7398481B2 (en) Virtual environment capture
Zollmann et al. Flyar: Augmented reality supported micro aerial vehicle navigation
Rashidi et al. Generating absolute-scale point cloud data of built infrastructure scenes using a monocular camera setting
CN111080804B (en) Three-dimensional image generation method and device
US9885573B2 (en) Method, device and computer programme for extracting information about one or more spatial objects
AU2019281667A1 (en) Data collection and model generation method for house
Bae et al. Image-based localization and content authoring in structure-from-motion point cloud models for real-time field reporting applications
US10127667B2 (en) Image-based object location system and process
US20160037356A1 (en) Network planning tool support for 3d data
JP6353175B1 (en) Automatically combine images using visual features
Khoshelham et al. Indoor mapping eyewear: geometric evaluation of spatial mapping capability of HoloLens
Hübner et al. Marker-based localization of the microsoft hololens in building models
CN102867057A (en) Virtual wizard establishment method based on visual positioning
US10890447B2 (en) Device, system and method for displaying measurement gaps
CN113239445B (en) AR-based indoor pipeline information display method and system
CN112581402B (en) Road and bridge fault automatic detection method based on machine vision technology
US11395102B2 (en) Field cooperation system and management device
EP3330928A1 (en) Image generation device, image generation system, and image generation method
CN108957507A (en) Fuel gas pipeline leakage method of disposal based on augmented reality
Ishikawa et al. In-situ 3d indoor modeler with a camera and self-contained sensors
CN113391366A (en) Indoor pipeline three-dimensional model generation method and system
Pollok et al. A visual SLAM-based approach for calibration of distributed camera networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant