CN111521193A - Live-action navigation method, live-action navigation device, storage medium and processor

Live-action navigation method, live-action navigation device, storage medium and processor

Info

Publication number
CN111521193A
CN111521193A (application CN202010329237.XA)
Authority
CN
China
Prior art keywords
bim, coordinate space, information, road network, coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010329237.XA
Other languages
Chinese (zh)
Inventor
廖荣盛
舒远
曹国
姚林
王沸仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Bozhilin Robot Co Ltd
Original Assignee
Guangdong Bozhilin Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Bozhilin Robot Co Ltd filed Critical Guangdong Bozhilin Robot Co Ltd
Priority to CN202010329237.XA priority Critical patent/CN111521193A/en
Publication of CN111521193A publication Critical patent/CN111521193A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3635 Guidance using 3D or perspective road maps
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/003 Navigation within 3D models or images

Abstract

The application provides a live-action navigation method, a live-action navigation device, a storage medium and a processor, wherein the method comprises the following steps: determining matching environment information, wherein the matching environment information is the environment information in the BIM that matches the environment information currently displayed by the AR; determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space; generating BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space, wherein the BIM road network information comprises the BIM and/or road network data; and performing AR live-action navigation according to the BIM road network information. According to the scheme, the BIM, the road network and the AR are unified, and the BIM road network information is generated in the AR to conduct AR live-action navigation, which improves the precision of the AR live-action navigation.

Description

Live-action navigation method, live-action navigation device, storage medium and processor
Technical Field
The present application relates to the field of navigation, and in particular, to a live-action navigation method, a live-action navigation apparatus, a storage medium, and a processor.
Background
Map-based AR navigation has been developed in Gaode Map (Amap) and Baidu Map, which combine GIS, visual imaging, cloud computing and storage technologies to provide outdoor navigation, bringing very convenient and intuitive live-action navigation to users. For indoor navigation there are currently few mature solutions on the market, although many related technologies exist: indoor positioning technologies such as laser, UWB, WiFi and Bluetooth are applied to indoor positioning and navigation, and AR has been combined with BIM and with point clouds in positioning and navigation application scenarios. However, the existing BIM-based AR live-action navigation has low precision.
The above information disclosed in this background section is only for enhancement of understanding of the background of the technology described herein; therefore, the background may include certain information that does not constitute prior art already known in this country to a person of ordinary skill in the art.
Disclosure of Invention
The present application mainly aims to provide a live-action navigation method, a live-action navigation device, a storage medium and a processor, so as to solve the problem of low precision of the AR live-action navigation based on BIM in the prior art.
In order to achieve the above object, according to an aspect of the present application, there is provided a method of live-action navigation, including: determining matching environment information, wherein the matching environment information is environment information in the BIM matched with the environment information displayed by the current AR; determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space; generating BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space, wherein the BIM road network information comprises the BIM and/or road network data; and performing AR live-action navigation according to the BIM road network information.
Further, determining the matching environment information comprises: determining an AR coordinate space; acquiring azimuth information of a predetermined point in the BIM and coordinates of the predetermined point in the AR coordinate space, the azimuth information of the predetermined point in the BIM including direction information of the predetermined point in the BIM and coordinates in the BIM coordinate space; determining coordinates and direction information of the origin in the BIM in the AR coordinate space according to the azimuth information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space; and determining the matching environment information according to the coordinates of the origin in the BIM in the AR coordinate space, the direction information of the origin in the BIM in the AR coordinate space, and the environment information currently displayed by the AR, wherein the matching environment information is matching azimuth information, and the matching azimuth information comprises the direction information in the BIM matched with the environment information currently displayed by the AR and the coordinates in the BIM coordinate space.
Further, acquiring the azimuth information of a predetermined point in the BIM includes: acquiring coordinates and direction information of a preset position on site in the BIM coordinate space, wherein the preset position corresponds to a preset point in the BIM; and determining the azimuth information of the predetermined point in the BIM coordinate space according to the coordinate of the predetermined position in the BIM coordinate space and the direction information.
Further, acquiring coordinates and direction information of a predetermined position of a site in the BIM coordinate space, including: and scanning a mark image through AR to obtain the coordinates and the direction information of the preset position in the BIM coordinate space, wherein the mark image is arranged at the preset position.
Further, the mark image is a two-dimensional code (QR code).
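As an illustration of how a scanned two-dimensional code could be resolved to a position in the BIM coordinate space, the minimal sketch below uses a lookup table keyed by the code's payload; all payloads, coordinates and directions are hypothetical and not taken from the patent.

```python
# Hypothetical registry: QR payload -> coordinates and facing direction of
# the predetermined position in the BIM coordinate space.
MARKERS = {
    "site/window-01": {"bim_xyz": (2.0, 0.0, 5.0), "forward": (0.0, 0.0, 1.0)},
    "site/door-03": {"bim_xyz": (0.0, 0.0, 1.5), "forward": (1.0, 0.0, 0.0)},
}

def resolve_marker(payload):
    """Return (coordinates, direction) in the BIM coordinate space."""
    entry = MARKERS[payload]
    return entry["bim_xyz"], entry["forward"]
```

Because each predetermined position carries a unique code, the scanned payload alone identifies both where the device stands in the BIM and which way the marker faces.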
Further, determining a coordinate space of the BIM in the AR coordinate space and a coordinate space of a road network in the AR coordinate space includes: determining a coordinate space of the BIM in the AR coordinate space according to the azimuth information of a predetermined point in the BIM and the coordinate of the predetermined point in the AR coordinate space; and determining the coordinate space of the road network in the AR coordinate space according to the coordinate space of the BIM in the AR coordinate space.
Further, performing AR live-action navigation according to the BIM road network information, including: acquiring a current position and a target position; and generating a navigation path according to the current position, the target position and the BIM road network information, and displaying the navigation path on a display interface, wherein the navigation path is formed by connecting road network nodes.
According to an aspect of the present application, there is provided an apparatus for live-action navigation, comprising: the device comprises a first determining unit, a second determining unit and a display unit, wherein the first determining unit is used for determining matching environment information which is environment information in the BIM matched with environment information displayed by the current AR; the second determination unit is used for determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space; a generating unit, configured to generate BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space, and the coordinate space of the road network in the AR coordinate space, where the BIM road network information includes the BIM and/or road network data; and the navigation unit is used for carrying out AR live-action navigation according to the BIM road network information.
According to another aspect of the application, there is provided a storage medium comprising a stored program, wherein the program performs any one of the methods.
According to yet another aspect of the application, a processor for running a program is provided, wherein the program when running performs any of the methods.
By applying the technical scheme of the application, the environment information in the BIM matched with the environment information currently displayed by the AR is determined; the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space are determined; and the BIM road network information is generated in the AR according to the matching environment information and the two coordinate spaces, so that the BIM, the road network and the AR are unified, and the AR live-action navigation is performed according to the BIM road network information generated in the AR. In addition, the method enriches the content displayed during navigation, so that the user can more conveniently understand the correspondence between the three-dimensional model and the real scene.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
FIG. 1 is a flow chart of a method of live action navigation according to an embodiment of the present application; and
fig. 2 is a schematic diagram illustrating an apparatus for live-action navigation according to an embodiment of the present application.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, such that the embodiments of the application described herein can be implemented in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It will be understood that when an element such as a layer, film, region, or substrate is referred to as being "on" another element, it can be directly on the other element or intervening elements may also be present. Also, in the specification and claims, when an element is described as being "connected" to another element, the element may be "directly connected" to the other element or "connected" to the other element through a third element.
For convenience of description, some terms or expressions referred to in the embodiments of the present application are explained below:
BIM: the Building Information model (Building Information Modeling) is a new tool for architecture, engineering and civil engineering, is used for figuring computer aided design which mainly uses three-dimensional graphics, guides objects and relates to architecture, helps to realize the integration of Building Information, and all kinds of Information are always integrated in a three-dimensional model Information database from the design, construction and operation of the Building to the end of the whole life cycle of the Building.
AR: the Augmented Reality (Augmented Reality) technology is a technology for skillfully fusing virtual information and a real world, and is widely applied to various technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing and the like.
As introduced in the background art, the accuracy of the AR live-action navigation based on the BIM in the prior art is low, and to solve the problem of the low accuracy of the AR live-action navigation based on the BIM, embodiments of the present application provide a live-action navigation method, a live-action navigation apparatus, a storage medium, and a processor.
According to an exemplary embodiment of the present application, a method of live action navigation is provided.
Fig. 1 is a flowchart of a method of live action navigation according to an embodiment of the present application. As shown in fig. 1, the method comprises the steps of:
step S101, determining matching environment information, wherein the matching environment information is environment information in a BIM (building information modeling) matched with the environment information displayed by a current AR (augmented reality);
step S102, determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space;
step S103, generating BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space, and the coordinate space of the road network in the AR coordinate space, or directly displaying the BIM road network information on a display interface, wherein the BIM road network information includes the BIM and/or road network data;
and step S104, performing AR live-action navigation according to the BIM road network information.
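The four steps above can be sketched as plain functions to show the data flow; every name and data shape here is illustrative, not prescribed by the patent.

```python
def determine_matching_environment(bim_environments, ar_view):
    # S101: pick the BIM environment entry matching the current AR view.
    return next(e for e in bim_environments if e["tag"] == ar_view["tag"])

def determine_coordinate_spaces(bim_origin_in_ar, road_network_offset):
    # S102: locate the BIM space and, relative to it, the road network
    # space inside the AR coordinate space.
    bim_space = list(bim_origin_in_ar)
    road_space = [a + b for a, b in zip(bim_space, road_network_offset)]
    return bim_space, road_space

def generate_bim_road_network(match, bim_space, road_space):
    # S103: assemble the BIM road network information in the AR space.
    return {"match": match, "bim_space": bim_space, "road_space": road_space}

def navigate(info, current, target):
    # S104: AR live-action navigation along the road network
    # (stubbed here as a direct hop from current to target).
    return [current, target]
```

A real implementation would replace each stub with the coordinate transformations and path search described in the embodiments below; the skeleton only fixes the order and hand-offs of S101 to S104.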
In the above scheme, the environment information in the BIM matched with the environment information currently displayed by the AR is determined; the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space are determined; and the BIM road network information is generated in the AR according to the matching environment information and the two coordinate spaces. This unifies the BIM, the road network and the AR, and the AR live-action navigation is performed according to the BIM road network information generated in the AR. In addition, the method enriches the content displayed during navigation, so that the user can more conveniently understand the correspondence between the three-dimensional model and the real scene.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
In one embodiment of the present application, determining matching environment information includes: determining an AR coordinate space; acquiring azimuth information of a predetermined point in the BIM and coordinates of the predetermined point in the AR coordinate space, the azimuth information of the predetermined point in the BIM including direction information of the predetermined point in the BIM and coordinates in the BIM coordinate space; determining coordinates and direction information of the origin in the BIM in the AR coordinate space according to the azimuth information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space; and determining the matching environment information according to the coordinates of the origin in the BIM in the AR coordinate space, the direction information of the origin in the BIM in the AR coordinate space, and the environment information currently displayed by the AR, wherein the matching environment information is matching azimuth information, and the matching azimuth information includes the direction information in the BIM matching the environment information currently displayed by the AR and the coordinates in the BIM coordinate space. That is, to unify the AR and the BIM, the direction information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space are determined first; this fixes the corresponding position of the predetermined point of the BIM in the AR. The coordinates and direction information of the BIM origin in the AR coordinate space are then derived from the direction information of the predetermined point in the BIM, its coordinates in the BIM coordinate space, and its coordinates in the AR coordinate space. Finally, the environment information in the BIM matching the environment information currently displayed by the AR is determined from the coordinates and direction information of the BIM origin in the AR coordinate space; specifically, the direction information and coordinates in the BIM matching those currently displayed by the AR are determined. In this way, the matching environment information can be determined more accurately, further improving the precision of the AR live-action navigation.
In another embodiment of the present application, acquiring the azimuth information of the predetermined point of the BIM includes: acquiring coordinates and direction information, in the BIM coordinate space, of a predetermined position on site, wherein the predetermined position corresponds to the predetermined point in the BIM; and determining the azimuth information of the predetermined point in the BIM coordinate space according to the coordinates and direction information of the predetermined position in the BIM coordinate space. For example, the coordinates and direction information in the BIM coordinate space of a certain position on a house window are obtained, and the azimuth information of the predetermined point in the BIM coordinate space is then determined from them. This realizes the conversion from the predetermined position on site to the position in the BIM coordinate space, that is, the azimuth information of the predetermined point of the BIM is obtained, further improving the precision of the AR live-action navigation.
In another embodiment of the present application, acquiring the coordinates and direction information of the predetermined on-site position in the BIM coordinate space includes: obtaining the coordinates and direction information of the predetermined position in the BIM coordinate space by AR scanning of a mark image, wherein the mark image is arranged at the predetermined position. That is, the information of the predetermined position on site is marked through AR live-action scanning, and the marked information is mapped into the BIM coordinate space to obtain the coordinates and direction information of the predetermined position in the BIM coordinate space, thereby unifying the AR and the BIM and further improving the precision of the AR live-action navigation.
In another embodiment of the present application, the mark image is a two-dimensional code (QR code); that is, each predetermined position is given a unique mark through its two-dimensional code, realizing an individual marking of each predetermined position, facilitating subsequent live-action navigation, and further improving the accuracy of the AR live-action navigation.
In another embodiment of the present application, determining the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space includes: determining the coordinate space of the BIM in the AR coordinate space according to the azimuth information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space; and determining the coordinate space of the road network in the AR coordinate space according to the coordinate space of the BIM in the AR coordinate space. Determining the coordinate space of the BIM in the AR coordinate space from the azimuth information of the predetermined point in the BIM and its coordinates in the AR coordinate space unifies the BIM and the AR; since the road network is formed on the basis of the BIM, the coordinate space of the road network in the AR coordinate space can then be determined, unifying the road network and the AR as well. The BIM, the road network and the AR are thereby unified, and the AR live-action navigation is performed on the basis of the BIM road network information, greatly improving the precision of the AR live-action navigation. Specifically, a coordinate-based conversion of the AR space is performed according to the orientation of the BIM in the AR space, and AR image recognition is then used to identify the forward vector in the AR, from which the BIM orientation information can be obtained.
In a specific embodiment of the present application, the AR coordinate space, the coordinate space of the BIM in the AR, and the coordinate space of the road network nodes of the BIM in the AR are determined. The specific formulas are as follows:
[Matrix 1: the AR coordinate space]
[Matrix 2: the coordinate space of the BIM in the AR]
[Matrix 3: the coordinate space of the road network nodes of the BIM in the AR]
[Matrix 4: the origin of the BIM space in the AR coordinate space]
[Matrix 5: the orientation of the BIM origin in the AR]
Matrix 1 represents the AR coordinate space, matrix 2 the coordinate space of the BIM in the AR, matrix 3 the coordinate space of the road network nodes of the BIM in the AR, matrix 4 the origin of the BIM space in the AR coordinate space, and matrix 5 the orientation of the BIM origin in the AR. In other words, taking the AR coordinate space as the reference, the coordinate space of the BIM in the AR is obtained through one coordinate transformation, and the coordinate space of the road network nodes of the BIM in the AR is obtained through one further coordinate transformation; the orientation information of the BIM model and the BIM modeling origin information in the real-time AR coordinate space are thereby determined. When the AR is turned on, the AR coordinate space is initially established, yielding an AR coordinate system that comprises the origin of coordinates and the three coordinate bases X, Y and Z, which point in directions in reality. The BIM is placed in this AR coordinate system; before the AR and BIM coordinate transformation is performed, the origin of the AR coordinate system and the origin of the BIM may be mismatched.
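The "one coordinate transformation" described above can be sketched as a 4x4 homogeneous matrix built from the BIM origin's position in the AR space (matrix 4) and its orientation axes (matrix 5). This is a minimal pure-Python sketch under a row-vector convention; the patent text does not fix the convention, and the numbers in the usage below are illustrative.

```python
def bim_to_ar_matrix(x_axis, y_axis, z_axis, origin):
    """4x4 matrix taking BIM coordinates into the AR coordinate space
    (row-vector convention: transformed = [p 1] times M)."""
    return [
        list(x_axis) + [0.0],
        list(y_axis) + [0.0],
        list(z_axis) + [0.0],
        list(origin) + [1.0],
    ]

def transform(point, m):
    """Apply the transform to a BIM point [px, py, pz]."""
    p = list(point) + [1.0]
    out = [sum(p[i] * m[i][j] for i in range(4)) for j in range(4)]
    return out[:3]  # drop the homogeneous coordinate
```

With an axis-aligned BIM whose origin sits at (1, 2, 3) in the AR space, the BIM origin maps to exactly that AR point, and road network nodes expressed in BIM coordinates can be pushed through the same matrix, which corresponds to the second transformation in the scheme.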
Specifically, matching requires three key elements: key location points in the BIM (equivalent to the predetermined positions above) that can be found on the construction site by the AR; matching of the construction-site positions to the key location points in the BIM; and the AR coordinate system (origin of coordinates and three-dimensional coordinate basis). Coordinate systems have developed into left-handed systems (Direct3D, Unity, etc.) and right-handed systems (OpenGL, WebGL, OSG, etc.); although the representations differ, the solution approach is the same. A solution is given here, taking Unity as an example:
(1) Establishment of the Unity AR coordinate basis. The Unity AR coordinate basis can be expressed as
[1 0 0]
[0 1 0]
[0 0 1]
Unity uses a left-handed coordinate system; each row represents one dimension, in turn the X, Y and Z axes.
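Bridging the left-handed and right-handed families mentioned above usually takes a single axis flip; negating Z is one common choice, though the exact convention depends on the engine and asset pipeline, so the sketch below is an assumption rather than a fixed rule.

```python
def left_to_right_handed(p):
    """Convert a point between a left-handed (e.g. Unity) and a
    right-handed (e.g. OpenGL) frame by negating the Z component.
    The same flip works in both directions."""
    x, y, z = p
    return (x, y, -z)
```

Applying the flip twice returns the original point, which is why one function serves for both conversion directions.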
(2) Key location points in the BIM: the position of a key location point relative to the BIM origin, that is, its coordinates in the BIM space, is [px py pz]. Any position point also has a corresponding orientation in the three-dimensional coordinate space, namely the front-back, up-down and left-right directions of the position point itself, i.e. its Z, Y and X axes. The vectors corresponding to the orientation of the key location point are expressed as:
[Zx Zy Zz], [Yx Yy Yz], [Xx Xy Xz]
through the steps (1) and (2), the azimuth information of the key position points in the BIM in the AR coordinate space and the coordinates of the key position points in the BIM in the AR coordinate space are obtained.
(3) Matching the construction-site position with the key location point in the BIM: the coordinates of the key location point in the AR coordinate space are obtained through AR scanning of a two-dimensional code or through image recognition, and are expressed as [Rx Ry Rz 1]. These coordinates change as the origin of the AR coordinate system changes. The concrete conversion formula between the construction-site position and the key location point in the BIM is as follows:
[Matrix 6: the construction-site location corresponding to the key location point]
[Matrix 7: the coordinates of the key location point transformed from the BIM space into the AR coordinate space]
First, the coordinates [px py pz] of the key location point in the BIM space are acquired and converted into the AR coordinate space through a coordinate transformation, yielding matrix 7; then [Rx Ry Rz 1] is multiplied by matrix 7 to obtain, as matrix 6, the construction-site location corresponding to the key location point in the BIM.
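The multiplication in step (3) can be sketched as a 1x4 row vector times a 4x4 matrix. The contents of matrix 7 are not reproduced in the text, so the example below substitutes a hypothetical pure translation by the key point's BIM coordinates [px py pz] = [2 0 5]; only the multiplication itself is the point of the sketch.

```python
def row_times_matrix(row, m):
    """Multiply a 1x4 row vector [Rx, Ry, Rz, 1] by a 4x4 matrix."""
    return [sum(row[i] * m[i][j] for i in range(4)) for j in range(4)]

# Hypothetical stand-in for matrix 7: translation by the key point's
# BIM coordinates (the bottom row holds px, py, pz).
MATRIX_7 = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [2, 0, 5, 1],
]

# The AR scan of the two-dimensional code yields [Rx, Ry, Rz, 1]:
site = row_times_matrix([1.0, 1.0, 0.0, 1.0], MATRIX_7)  # plays the role of matrix 6
```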
In another embodiment of the present application, performing the AR live-action navigation according to the BIM road network information includes: acquiring a current position and a target position; and generating a navigation path according to the current position, the target position and the BIM road network information, and displaying the navigation path on a display interface, wherein the navigation path is formed by connecting road network nodes. That is, path planning is carried out according to the acquired current position, the target position and the BIM road network information, and the AR live-action navigation is then performed along the planned path.
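Because the navigation path is "formed by connecting road network nodes", any graph shortest-path search fits here. The sketch below runs a breadth-first search over a hypothetical indoor road-network adjacency list; the node names and topology are illustrative only.

```python
from collections import deque

def plan_path(graph, current, target):
    """Breadth-first search returning a list of road network nodes from
    the current position to the target position (None if unreachable)."""
    queue = deque([[current]])
    seen = {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

ROAD_NETWORK = {  # hypothetical adjacency: room, living room, passageway nodes
    "entrance": ["hallway"],
    "hallway": ["entrance", "living_room", "room_a"],
    "living_room": ["hallway"],
    "room_a": ["hallway"],
}
```

In the scheme the edges would come from the BIM road network information generated in the AR coordinate space, and a weighted search such as Dijkstra's algorithm would replace BFS when edge lengths matter.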
In another embodiment of the present application, the method further includes: and displaying auxiliary information on a display interface, wherein the auxiliary information comprises BIM related auxiliary information, a BIM sand table and related house type point cloud data, and realizing more accurate AR live-action navigation according to the auxiliary information and the BIM road network information.
In another embodiment of the present application, the BIM road network information includes at least one of: the method comprises the steps of obtaining road network information of a room in the current environment, road network information of a living room in the current environment and road network information of a passageway in the current environment, wherein the BIM road network information contains all important information influencing navigation path planning in the navigation environment, and more accurate AR live-action navigation is realized based on the BIM road network information.
The embodiment of the present application further provides a device for live-action navigation, and it should be noted that the device for live-action navigation according to the embodiment of the present application may be used to execute the method for live-action navigation according to the embodiment of the present application. The following describes a device for live-action navigation provided in an embodiment of the present application.
Fig. 2 is a schematic diagram of an apparatus for live action navigation according to an embodiment of the present application. As shown in fig. 2, the apparatus includes:
a first determining unit 10, configured to determine matching environment information, where the matching environment information is environment information in a BIM that matches environment information displayed by a current AR;
a second determining unit 20 configured to determine a coordinate space of the BIM in the AR coordinate space and a coordinate space of a road network in the AR coordinate space;
a generating unit 30 configured to generate BIM road network information in the AR based on the matching environment information, the coordinate space of the BIM in the AR coordinate space, and the coordinate space of the road network in the AR coordinate space, and may directly display the BIM road network information on a display interface, where the BIM road network information includes the BIM and/or road network data;
and a navigation unit 40, configured to perform AR live-action navigation according to the BIM road network information.
In the above scheme, the first determining unit determines the environment information in the BIM matched with the environment information displayed by the current AR, the second determining unit determines the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space, and the generating unit generates the BIM road network information in the AR according to the matched environment information and the two coordinate spaces, so that the BIM, the road network, and the AR are unified, and AR live-action navigation is performed according to the BIM road network information generated in the AR. Moreover, the device enriches the content displayed during navigation, making it easier for the user to understand the correspondence between the three-dimensional model and the real scene.
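The four units above form a pipeline: match the environment, determine the coordinate spaces, generate the road network information, then navigate. The following is a minimal structural sketch of that pipeline; the class name, method names, and return shapes are assumptions for illustration only, not the patent's implementation:

```python
# Illustrative sketch of the four-unit pipeline described above.
# Every name here is a hypothetical placeholder.

class LiveActionNavigator:
    def __init__(self, bim_environments, road_network):
        self.bim_environments = bim_environments
        self.road_network = road_network

    def determine_matching_environment(self, ar_view):
        # First determining unit: find the BIM environment matching the AR view.
        return {"matched": ar_view in self.bim_environments}

    def determine_coordinate_spaces(self):
        # Second determining unit: BIM-in-AR and road-network-in-AR transforms.
        return {"bim_in_ar": "T_bim", "network_in_ar": "T_net"}

    def generate_road_network_info(self, env, spaces):
        # Generating unit: combine environment, transforms, and network data.
        return {"env": env, "spaces": spaces, "network": self.road_network}

    def navigate(self, info, start, goal):
        # Navigation unit: a trivial stand-in for path planning.
        return [start, goal] if info["network"] else []
```

The point of the sketch is the data flow, not the stub bodies: each unit's output is the next unit's input.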
In an embodiment of the present application, the first determining unit includes a first determining module, a first obtaining module, a second determining module, and a third determining module. The first determining module is configured to determine an AR coordinate space. The first obtaining module is configured to obtain azimuth information of a predetermined point in the BIM and coordinates of the predetermined point in the AR coordinate space, where the azimuth information of the predetermined point in the BIM includes direction information of the predetermined point in the BIM and coordinates in the BIM coordinate space. The second determining module is configured to determine the coordinates and direction information of the origin in the BIM in the AR coordinate space according to the azimuth information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space. The third determining module is configured to determine the matching environment information according to the coordinates of the origin in the BIM in the AR coordinate space, the direction information of the origin in the BIM in the AR coordinate space, and the environment information displayed by the current AR, where the matching environment information is matching azimuth information that includes the direction information in the BIM matching the environment information displayed by the current AR and the coordinates in the BIM coordinate space. In other words, to unify the AR and the BIM, the azimuth information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space must be determined; that is, the position in the AR corresponding to the predetermined point in the BIM is determined. From the direction information of the predetermined point in the BIM, its coordinates in the BIM coordinate space, and its coordinates in the AR coordinate space, the coordinates and direction information of the origin in the BIM in the AR coordinate space are then determined, and from these, together with the environment information displayed by the current AR, the environment information in the BIM matching the current display is determined; specifically, the direction information and coordinates in the BIM matching those displayed by the current AR are determined, which also fixes the matching environment information corresponding to the origin in the AR coordinate space. The device can thus determine the matching environment information more accurately and further improve the precision of AR live-action navigation.
In another embodiment of the present application, the first obtaining module includes an obtaining submodule and a determining submodule. The obtaining submodule is configured to obtain the coordinates and direction information, in the BIM coordinate space, of a predetermined position on site, where the predetermined position corresponds to the predetermined point in the BIM; the determining submodule is configured to determine the azimuth information of the predetermined point in the BIM coordinate space according to those coordinates and direction information. For example, the coordinates and direction information of a certain position on a house window are obtained in the BIM coordinate space, and the azimuth information of the predetermined point in the BIM coordinate space is then determined from them. That is, the predetermined position on site is converted into a position in the BIM coordinate space, yielding the azimuth information of the predetermined point of the BIM, so that the accuracy of AR live-action navigation is further improved.
In another embodiment of the application, the obtaining submodule is further configured to obtain the coordinates and direction information of the predetermined position in the BIM coordinate space by scanning a marker image with the AR, where the marker image is set at the predetermined position. That is, the information of the predetermined position on the scene is marked by AR live-action scanning, and the marked information is mapped into the BIM coordinate space to obtain the coordinates and direction information of the predetermined position in the BIM coordinate space, thereby unifying the AR and the BIM and further improving the accuracy of AR live-action navigation.
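The mapping from a scanned marker to a pose in the BIM coordinate space can be pictured as a simple registry lookup. This is a hedged sketch; the registry contents, identifiers, and function name are assumptions for illustration, not part of the patent:

```python
# Illustrative sketch: each marker image placed at a predetermined
# position resolves to that position's coordinates and direction in
# the BIM coordinate space. All values are hypothetical.

MARKER_REGISTRY = {
    "marker_01": {"bim_coords": (4.2, 0.0, 7.8), "bim_direction": (0.0, 0.0, 1.0)},
    "marker_02": {"bim_coords": (1.0, 0.0, 2.5), "bim_direction": (1.0, 0.0, 0.0)},
}

def lookup_predetermined_location(marker_id):
    """Resolve a scanned marker to its BIM-space pose, if registered."""
    return MARKER_REGISTRY.get(marker_id)
```

In practice the marker decoding would come from the AR scanner; only the decoded identifier is needed to retrieve the pre-surveyed BIM pose.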
In another embodiment of the present application, the marker image is a two-dimensional code; that is, each predetermined position is given a unique mark through its two-dimensional code, realizing personalized marking of each predetermined position, facilitating subsequent live-action navigation, and further improving the accuracy of AR live-action navigation.
In another embodiment of the present application, the second determining unit includes a fourth determining module and a fifth determining module. The fourth determining module is configured to determine the coordinate space of the BIM in the AR coordinate space according to the azimuth information of the predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space; the fifth determining module is configured to determine the coordinate space of the road network in the AR coordinate space according to the coordinate space of the BIM in the AR coordinate space. Determining the coordinate space of the BIM in the AR coordinate space unifies the BIM and the AR; on that basis, determining the coordinate space of the road network in the AR coordinate space unifies the road network and the AR as well, so that the BIM, the road network, and the AR are all unified, and AR live-action navigation based on the BIM road network information then becomes much more precise. Specifically, the orientation of the BIM in the AR space is obtained through a coordinate-based conversion in the AR space, and the forward vector in the AR is then identified by AR image recognition, so that the BIM orientation information can be obtained.
In a specific embodiment of the present application, the AR coordinate space, the coordinate space of the BIM in the AR, and the coordinate space of the road network nodes in the BIM in the AR are determined by the following formulas:
[Formulas: matrices 1 to 5, shown as images in the original document.]
Matrix 1 represents the AR coordinate space, matrix 2 the coordinate space of the BIM in the AR, matrix 3 the coordinate space of a road network node in the BIM in the AR, matrix 4 the origin of the BIM space in the AR coordinate space, and matrix 5 the orientation of the BIM origin in the AR. That is, taking the AR coordinate space as the reference, the coordinate space of the BIM in the AR is obtained through one coordinate transformation, the coordinate space of a road network node in the BIM in the AR is obtained through one further coordinate transformation, and the orientation information of the BIM model and the BIM modeling origin information in the real-time AR coordinate space are determined. When the AR is started, its coordinate space is established to obtain an AR coordinate system, which comprises the coordinate origin and the three coordinate bases X, Y, and Z pointing to directions in reality. If the BIM is placed in the AR coordinate system without transforming the AR and BIM coordinates, the origin of the AR coordinate system and the origin of the BIM do not match. For example, if the AR coordinate system is established in a first room while the BIM is a three-dimensional model of a second room, simply aligning the two origins would in effect match the origin established in the first room with the origin modeled from the second room; the AR coordinate system and the BIM would still be mismatched, so the AR coordinate space and the BIM need to be matched to each other.
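The chained transformations described above can be illustrated numerically with 4x4 homogeneous matrices: one transform places the BIM in the AR coordinate space, and one further transform places a road-network node (given in BIM coordinates) in the AR space. The concrete matrix values below are illustrative assumptions, not the patent's matrices 1 to 5:

```python
# Minimal sketch of the chained coordinate transforms: AR <- BIM <- node.
# Pure-Python 4x4 homogeneous matrices; values are illustrative only.

def matmul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """Homogeneous translation matrix."""
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# Coordinate space of the BIM in the AR (here a pure translation).
bim_in_ar = translation(2.0, 0.0, 1.0)
# A road-network node expressed in BIM coordinates.
node_in_bim = translation(0.5, 0.0, 0.5)
# The node's coordinate space in the AR, by one further transformation.
node_in_ar = matmul(bim_in_ar, node_in_bim)
```

Composing the two translations places the node at (2.5, 0.0, 1.5) in the AR space, matching the "one transformation per level" structure the paragraph describes.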
Specifically, matching requires three key elements: key location points (equivalent to the predetermined positions above) created in the BIM that can be found on the construction site by the AR; matching of the construction-site positions to the key location point positions in the BIM; and the AR coordinate system (origin of coordinates and three-dimensional coordinate basis). Coordinate systems divide into left-handed (Direct3D, Unity, etc.) and right-handed (OpenGL, WebGL, OSG, etc.) conventions; although the representations differ, the solution is the same, and a solution is given here taking Unity as an example:
(1) Establishment of the Unity AR coordinate basis: the Unity AR coordinate basis is expressed as
[1 0 0]
[0 1 0]
[0 0 1]
Unity uses a left-handed coordinate system; each row represents one dimension, in turn the X, Y, and Z axes.
(2) Key location points in the BIM: the position of a key location point relative to the BIM origin, that is, its coordinates in the BIM space, is [px py pz]. Any position point also has a corresponding orientation in the three-dimensional coordinate space, namely the front-back, up-down, and left-right directions of the point itself, i.e., its Z axis, Y axis, and X axis. The vectors corresponding to the orientation of the key location point are expressed as:
[Zx Zy Zz], [Yx Yy Yz], [Xx Xy Xz]
Through steps (1) and (2), the azimuth information of the key position points in the BIM and the coordinates of the key position points in the BIM coordinate space are obtained.
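The position [px py pz] and the three orientation vectors from steps (1) and (2) can be packed into a single 4x4 pose matrix, with the axis vectors as columns and the position as the last column. This is a hedged sketch under that standard convention; the concrete values are illustrative, not from the patent:

```python
# Sketch: build a 4x4 pose matrix for a key location point from its
# position [px py pz] and its X/Y/Z orientation vectors. Illustrative only.

def pose_matrix(x_axis, y_axis, z_axis, position):
    """Columns are the point's axes; the last column is its position."""
    px, py, pz = position
    return [
        [x_axis[0], y_axis[0], z_axis[0], px],
        [x_axis[1], y_axis[1], z_axis[1], py],
        [x_axis[2], y_axis[2], z_axis[2], pz],
        [0.0,       0.0,       0.0,       1.0],
    ]

# A key point at (1, 0, 3) whose local axes coincide with the BIM axes.
key_pose_in_bim = pose_matrix((1, 0, 0), (0, 1, 0), (0, 0, 1), (1.0, 0.0, 3.0))
```

Representing the pose this way lets the matching step in (3) use ordinary matrix multiplication to move the point between coordinate spaces.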
(3) Matching the position on the construction site with the position of the key position point in the BIM: the coordinates of the key position point in the AR coordinate space are obtained by AR scanning of a two-dimensional code or by image recognition, and are expressed as [Rx Ry Rz 1]. These coordinates change as the origin of the AR coordinate system changes. The concrete conversion formula between the position on the construction site and the position of the key position point in the BIM is as follows:
[Formulas: matrices 6 and 7, shown as images in the original document.]
First, the coordinates [px py pz] of the key position point in the BIM space are acquired and converted into the AR coordinate space through the coordinate transformation of matrix 7; multiplying [Rx Ry Rz 1] by matrix 7 then yields the corresponding location of the key position point on the construction site as matrix 6.
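The conversion step can be illustrated as multiplying a homogeneous point by a BIM-to-AR transform playing the role of matrix 7. The transform and point values below are illustrative assumptions, since matrices 6 and 7 are only shown as images in the original:

```python
# Sketch of the conversion step: a homogeneous point [Rx Ry Rz 1] is
# carried between coordinate spaces by a 4x4 transform (the role of
# "matrix 7" above). The concrete transform is an illustrative translation.

def apply(matrix, point):
    """Multiply a 4x4 homogeneous matrix by an [x, y, z, 1] point."""
    return [sum(matrix[i][j] * point[j] for j in range(4)) for i in range(4)]

bim_to_ar = [[1, 0, 0,  2.0],
             [0, 1, 0,  0.0],
             [0, 0, 1, -1.0],
             [0, 0, 0,  1.0]]

p_bim = [0.5, 0.0, 0.5, 1.0]      # key point in the BIM space: [px py pz 1]
p_ar = apply(bim_to_ar, p_bim)    # the corresponding location in the AR space
```

Once one such key-point correspondence fixes the transform, every other BIM position (including road-network nodes) can be carried into the AR space the same way.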
In another embodiment of the application, the navigation unit includes a second obtaining module and a generating module. The second obtaining module is configured to obtain the current position and the target position; the generating module is configured to generate a navigation path according to the current position, the target position, and the BIM road network information, and to display the navigation path on the display interface, where the navigation path is formed by connecting road network nodes. That is, path planning is performed according to the obtained current position, target position, and BIM road network information, and AR live-action navigation then follows the planned path.
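A path formed by connecting road network nodes can be found with an ordinary graph search. The following is a hedged sketch using a shortest-hop-count search (breadth-first search) over an illustrative node graph; the patent does not specify a particular planning algorithm:

```python
# Sketch of the generating module's path planning: BFS over road-network
# nodes from the current position to the target position. The graph and
# node names are illustrative assumptions.

from collections import deque

def plan_path(adjacency, current, target):
    """Return a node path from current to target, or [] if unreachable."""
    queue, parents = deque([current]), {current: None}
    while queue:
        node = queue.popleft()
        if node == target:
            path = []
            while node is not None:      # walk parents back to the start
                path.append(node)
                node = parents[node]
            return path[::-1]
        for nxt in adjacency.get(node, ()):
            if nxt not in parents:
                parents[nxt] = node
                queue.append(nxt)
    return []

network = {"door": ["hall"], "hall": ["door", "room"], "room": ["hall"]}
```

A weighted variant (e.g. Dijkstra over edge lengths taken from the BIM) would drop in the same place if physical distance rather than hop count should be minimized.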
In another embodiment of the present application, the apparatus is further configured to display auxiliary information on the display interface, where the auxiliary information includes BIM-related auxiliary information, a BIM sand table, and related house-type point cloud data; more accurate AR live-action navigation is achieved according to the auxiliary information together with the BIM road network information.
In another embodiment of the present application, the BIM road network information includes at least one of: road network information of the rooms in the current environment, road network information of the living room in the current environment, and road network information of the passageways in the current environment. Because the BIM road network information contains all of the important information in the navigation environment that affects navigation path planning, more accurate AR live-action navigation is achieved based on the BIM road network information.
The live-action navigation device comprises a processor and a memory, wherein the first determining unit, the second determining unit, the generating unit, the navigation unit and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to realize corresponding functions.
The processor comprises a kernel, and the kernel calls the corresponding program unit from the memory. One or more kernels can be set, and the precision of the BIM-based AR live-action navigation is improved by adjusting the kernel parameters.
The memory may include volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
An embodiment of the present invention provides a storage medium, on which a program is stored, and the program, when executed by a processor, implements the method for live-action navigation.
The embodiment of the invention provides a processor, wherein the processor is used for running a program, and the method for real scene navigation is executed when the program runs.
The embodiment of the invention provides equipment, which comprises a processor, a memory and a program which is stored on the memory and can run on the processor, wherein when the processor executes the program, at least the following steps are realized:
step S101, determining matching environment information, wherein the matching environment information is environment information in a BIM (building information modeling) matched with the environment information displayed by a current AR (augmented reality);
step S102, determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space;
step S103 of generating BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space, and the coordinate space of the road network in the AR coordinate space, wherein the BIM road network information includes the BIM and/or road network data;
and step S104, performing AR live-action navigation according to the BIM road network information.
The device herein may be a server, a PC, a PAD, a mobile phone, etc.
The present application further provides a computer program product which, when executed on a data processing device, is adapted to execute a program initialized with at least the following method steps:
step S101, determining matching environment information, wherein the matching environment information is environment information in a BIM (building information modeling) matched with the environment information displayed by a current AR (augmented reality);
step S102, determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space;
step S103 of generating BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space, and the coordinate space of the road network in the AR coordinate space, wherein the BIM road network information includes the BIM and/or road network data;
and step S104, performing AR live-action navigation according to the BIM road network information.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal or a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
From the above description, it can be seen that the above-described embodiments of the present application achieve the following technical effects:
1) The method for live-action navigation determines the environment information in the BIM matched with the environment information displayed by the current AR, determines the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space, and generates the BIM road network information in the AR according to the matched environment information and the two coordinate spaces, thereby unifying the BIM, the road network, and the AR; AR live-action navigation is then performed according to the BIM road network information generated in the AR. In addition, the method enriches the content displayed during navigation, making it easier for the user to understand the correspondence between the three-dimensional model and the real scene.
2) The device for live-action navigation comprises a first determining unit, a second determining unit, a generating unit, and a navigation unit. The first determining unit determines the environment information in the BIM matched with the environment information displayed by the current AR, the second determining unit determines the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space, and the generating unit generates the BIM road network information in the AR according to the matched environment information and the two coordinate spaces, so that the BIM, the road network, and the AR are unified and the navigation unit performs AR live-action navigation according to the BIM road network information. Moreover, the device enriches the content displayed during navigation, making it easier for the user to understand the correspondence between the three-dimensional model and the real scene.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A method of live action navigation, comprising:
determining matching environment information, wherein the matching environment information is environment information in the BIM matched with the environment information displayed by the current AR;
determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space;
generating BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space, wherein the BIM road network information comprises the BIM and/or road network data;
and performing AR live-action navigation according to the BIM road network information.
2. The method of claim 1, wherein determining matching context information comprises:
determining an AR coordinate space;
acquiring azimuth information of a predetermined point in the BIM and coordinates of the predetermined point in the AR coordinate space, the azimuth information of the predetermined point in the BIM including direction information of the predetermined point in the BIM and coordinates in the BIM coordinate space;
determining coordinates and direction information of an origin in the BIM in the AR coordinate space according to the azimuth information of a predetermined point in the BIM and the coordinates of the predetermined point in the AR coordinate space;
determining the matching environment information according to the coordinates of the origin in the BIM in the AR coordinate space, the direction information of the origin in the BIM in the AR coordinate space, and the environment information displayed by the current AR, wherein the matching environment information is matching azimuth information, and the matching azimuth information comprises the direction information in the BIM matched with the environment information displayed by the current AR and the coordinates in the BIM coordinate space.
3. The method of claim 2, wherein obtaining orientation information of a predetermined point in the BIM comprises:
acquiring coordinates and direction information of a preset position on site in the BIM coordinate space, wherein the preset position corresponds to a preset point in the BIM;
and determining the azimuth information of the predetermined point in the BIM coordinate space according to the coordinate of the predetermined position in the BIM coordinate space and the direction information.
4. The method of claim 3, wherein obtaining coordinates and orientation information of a predetermined location of a site in the BIM coordinate space comprises:
and scanning a mark image through AR to obtain the coordinates and the direction information of the preset position in the BIM coordinate space, wherein the mark image is arranged at the preset position.
5. The method of claim 4, wherein the marker image is a two-dimensional code.
6. The method of claim 2, wherein determining the coordinate space of the BIM in the AR coordinate space and the coordinate space of the road network in the AR coordinate space comprises:
determining a coordinate space of the BIM in the AR coordinate space according to the azimuth information of a predetermined point in the BIM and the coordinate of the predetermined point in the AR coordinate space;
and determining the coordinate space of the road network in the AR coordinate space according to the coordinate space of the BIM in the AR coordinate space.
7. The method according to any one of claims 1 to 6, wherein performing AR live action navigation according to the BIM road network information comprises:
acquiring a current position and a target position;
and generating a navigation path according to the current position, the target position and the BIM road network information, and displaying the navigation path on a display interface, wherein the navigation path is formed by connecting road network nodes.
8. An apparatus for live action navigation, comprising:
the device comprises a first determining unit, a second determining unit and a display unit, wherein the first determining unit is used for determining matching environment information which is environment information in the BIM matched with environment information displayed by the current AR;
the second determination unit is used for determining a coordinate space of the BIM in an AR coordinate space and a coordinate space of a road network in the AR coordinate space;
a generating unit, configured to generate BIM road network information in the AR coordinate space according to the matching environment information, the coordinate space of the BIM in the AR coordinate space, and the coordinate space of the road network in the AR coordinate space, where the BIM road network information includes the BIM and/or road network data;
and the navigation unit is used for carrying out AR live-action navigation according to the BIM road network information.
9. A storage medium, characterized in that the storage medium comprises a stored program, wherein the program performs the method of any one of claims 1 to 7.
10. A processor, characterized in that the processor is configured to run a program, wherein the program when running performs the method of any of claims 1 to 7.
CN202010329237.XA 2020-04-23 2020-04-23 Live-action navigation method, live-action navigation device, storage medium and processor Pending CN111521193A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010329237.XA CN111521193A (en) 2020-04-23 2020-04-23 Live-action navigation method, live-action navigation device, storage medium and processor


Publications (1)

Publication Number Publication Date
CN111521193A true CN111521193A (en) 2020-08-11

Family

ID=71910513

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010329237.XA Pending CN111521193A (en) 2020-04-23 2020-04-23 Live-action navigation method, live-action navigation device, storage medium and processor

Country Status (1)

Country Link
CN (1) CN111521193A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112052502A (en) * 2020-09-04 2020-12-08 中国十七冶集团有限公司 Route identification method based on BIM + AR technology
CN117215305A (en) * 2023-09-12 2023-12-12 北京城建智控科技股份有限公司 Travel auxiliary system

Citations (22)

Publication number Priority date Publication date Assignee Title
CN103234547A (en) * 2013-04-18 2013-08-07 易图通科技(北京)有限公司 Method and device for displaying road scene in vacuum true three-dimensional navigation
US20150109338A1 (en) * 2013-10-17 2015-04-23 Nant Holdings Ip, Llc Wide area augmented reality location-based services
CN105069842A (en) * 2015-08-03 2015-11-18 百度在线网络技术(北京)有限公司 Modeling method and device for three-dimensional model of road
CN107037881A (en) * 2017-03-24 2017-08-11 广西七三科技有限公司 Interactive demonstration method and system for GIS and BIM augmented reality in utility-tunnel and subway construction
CN108227929A (en) * 2018-01-15 2018-06-29 廖卫东 Augmented reality setting-out system and implementation method based on BIM technology
CN108318010A (en) * 2017-11-27 2018-07-24 中建华东投资有限公司 BIM-based fast selection method for foundation-pit support monitoring points
CN108363860A (en) * 2018-02-07 2018-08-03 中交公局第二工程有限公司 BIM-based formwork assembly setting-out method for three-dimensional special-shaped bridges
US20180253900A1 (en) * 2017-03-02 2018-09-06 Daqri, Llc System and method for authoring and sharing content in augmented reality
CN108917758A (en) * 2018-02-24 2018-11-30 石化盈科信息技术有限责任公司 AR-based navigation method and system
CN108921946A (en) * 2018-06-25 2018-11-30 中国人民解放军陆军工程大学 BIM+AR-based measurement and automatic spatial-position matching method for concealed engineering pipelines
US20180349703A1 (en) * 2018-07-27 2018-12-06 Yogesh Rathod Display virtual objects in the event of receiving of augmented reality scanning or photo of real world object from particular location or within geofence and recognition of real world object
CN109031464A (en) * 2018-07-05 2018-12-18 国网福建省电力有限公司 Three-dimensional digital visualized positioning method for buried cables based on AR and BIM
CN109960717A (en) * 2019-03-22 2019-07-02 北京建筑大学 Indoor navigation road-network map model data organization method and system
CN110222137A (en) * 2019-06-11 2019-09-10 鲁东大学 Intelligent campus system based on oblique photography and augmented reality
CN110298852A (en) * 2019-06-21 2019-10-01 中国电建集团成都勘测设计研究院有限公司 Geological boundary extraction method based on unmanned aerial vehicle image tomography
CN110516386A (en) * 2019-08-30 2019-11-29 天津住总机电设备安装有限公司 Method for determining on-site coordinate positions based on mobile-phone BIM construction-drawing AR
CN110609883A (en) * 2019-09-20 2019-12-24 成都中科大旗软件股份有限公司 AR map dynamic navigation system
CN110933632A (en) * 2019-12-03 2020-03-27 北京建筑大学 Terminal indoor positioning method and system
CN110956690A (en) * 2019-11-19 2020-04-03 广东博智林机器人有限公司 Building information model generation method and system
CN110969704A (en) * 2019-11-27 2020-04-07 北京新势界科技有限公司 Marker generation tracking method and device based on AR guide
CN110990917A (en) * 2019-11-19 2020-04-10 北京长空云海科技有限公司 BIM model display method, device and system
CN111006676A (en) * 2019-11-14 2020-04-14 广东博智林机器人有限公司 Map construction method, device and system

Similar Documents

Publication Publication Date Title
CN106767827B (en) Mobile robot point cloud map creation method based on laser data
JP2019120927A (en) Method and device for creating grid map
US8390534B2 (en) Method and device for generating tracking configurations for augmented reality applications
CN108921946A (en) A kind of hidden pipeline of engineering based on BIM+AR measures and spatial position automatic matching method
Paczkowski et al. Insitu: sketching architectural designs in context.
CN104897160B (en) A kind of localization method and device based on vector quantization indoor map
KR102097416B1 (en) An augmented reality representation method for managing underground pipeline data with vertical drop and the recording medium thereof
CN113741698A (en) Method and equipment for determining and presenting target mark information
CN107577750A (en) Draw the method and system at navigation data vector crossing
CN111521193A (en) Live-action navigation method, live-action navigation device, storage medium and processor
US20210097760A1 (en) System and method for collecting geospatial object data with mediated reality
CN109857825A (en) A kind of threedimensional model methods of exhibiting and system
Rohacz et al. Concept for the comparison of intralogistics designs with real factory layout using augmented reality, SLAM and marker-based tracking
CN108534789B (en) Multipath positioning coordinate unifying method, electronic equipment and readable storage medium
CN115049811A (en) Editing method, system and storage medium of digital twin virtual three-dimensional scene
CN113626455A (en) Method and device for updating picture library in linkage manner, electronic equipment and storage medium
Wither et al. Using aerial photographs for improved mobile AR annotation
CN110058684A (en) A kind of geography information exchange method, system and storage medium based on VR technology
KR102276451B1 (en) Apparatus and method for modeling using gis
CN111127661A (en) Data processing method and device and electronic equipment
CN104680578A (en) BIM-based axis labeling method and system
CN111724485A (en) Method, device, electronic equipment and storage medium for realizing virtual-real fusion
Liu et al. A 2d and 3d indoor mapping approach for virtual navigation services
Bednarczyk The use of augmented reality in geomatics
CN115240140A (en) Equipment installation progress monitoring method and system based on image recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200811