CN112381946A - Digital scene viewing method and device, storage medium and computer equipment - Google Patents


Info

Publication number
CN112381946A
CN112381946A (application CN202011412279.6A)
Authority
CN
China
Prior art keywords
target object
position corresponding
viewing
information
navigation mark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011412279.6A
Other languages
Chinese (zh)
Inventor
尤勇敏
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiuling Jiangsu Digital Intelligent Technology Co Ltd
Original Assignee
Jiuling Jiangsu Digital Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiuling Jiangsu Digital Intelligent Technology Co Ltd filed Critical Jiuling Jiangsu Digital Intelligent Technology Co Ltd
Priority to CN202011412279.6A priority Critical patent/CN112381946A/en
Publication of CN112381946A publication Critical patent/CN112381946A/en
Withdrawn legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/003: Navigation within 3D models or images
    • G06T19/006: Mixed reality
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Architecture (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a digital scene viewing method and apparatus suitable for a computer device. The method comprises the following steps: obtaining a building information model; generating a navigation mark on the building information model, the navigation mark representing the position of an observer; judging whether the distance between the position corresponding to the navigation mark and the position corresponding to a target object drawn on the building information model is within a preset distance range; and displaying the related information of the target object when the distance is judged to be within the preset distance range. By combining BIM technology with immersive technology, the method provides three-dimensional scene creation and an immersive interactive experience, reducing the time and labor cost of on-site viewing, and allows the observer to freely control the viewing route, improving participation and enhancing the sense of immersion.

Description

Digital scene viewing method and device, storage medium and computer equipment
Technical Field
The invention relates to the technical field of building informatization, in particular to a digital scene checking method, a digital scene checking device, a storage medium and computer equipment.
Background
Building Information Modeling (BIM) has five characteristics: visualization, coordination, simulation, optimization, and drawing output. It integrates building information: based on three-dimensional digital information technology, the digitized and informatized model of a building allows building information to be shared and transmitted throughout the whole life cycle of project planning, operation, and maintenance, so that related personnel can correctly understand and efficiently respond to the various building information.
Existing immersive technologies can build digital scenes that interact with the audience and provide an immersive experience. Providing such an experience requires fusing various digital technologies, such as interactive projection, touch interaction, dynamic projection, VR/AR/MR, and holographic imaging.
Traditional factory production line inspection is primarily based on one-way viewing and explanation. A conventional immersive experience, however, differs greatly from the real on-site experience: it cannot reproduce the immersion of the real environment, so the sense of presence is low. Moreover, scene display mostly relies on roaming videos for the observer to browse, and the roaming follows a route planned in advance by the designer rather than being controlled by the observer's own intent.
In view of the above, applying existing immersive technology and BIM technology to factory production line inspection is an important research topic for the relevant technicians and researchers.
Disclosure of Invention
An object of the present invention is to provide a digital scene viewing method, apparatus, storage medium, and computer device that use BIM technology and immersive technology to provide three-dimensional scene creation and an immersive interactive experience, thereby reducing the time and labor cost of on-site viewing, while allowing viewing routes to be freely controlled to improve participation and enhance the sense of immersion. In addition, the method also provides immersive viewing interaction and therefore an immersive experience; the apparatus provides corresponding advantages.
According to a first aspect of the present invention, an embodiment provides a digital scene viewing method applied to a computer device. The method includes: obtaining a building information model; generating a navigation mark on the building information model, the navigation mark representing the position of an observer; judging whether the distance between the position corresponding to the navigation mark and the position corresponding to a target object drawn on the building information model is within a preset distance range; and displaying the related information of the target object when the distance is judged to be within the preset distance range.
On the basis of the technical scheme, the method can be further improved.
In at least some embodiments of the present invention, the step of obtaining the building information model is preceded by the steps of: receiving a digital scene viewing request, wherein the digital scene viewing request is used for requesting to view the related information of a target object; responding to the digital scene viewing request, and acquiring viewing behavior information of a target object; and analyzing the viewing behavior information to obtain the position corresponding to the navigation mark.
In at least some embodiments of the present invention, in the step of parsing the viewing behavior information to obtain a location corresponding to a navigation mark, the method further includes: acquiring navigation visual angle information; and when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within a preset distance range, correspondingly adjusting the position of the observer relative to the target object according to the navigation visual angle information, wherein the target object and the observer are positioned on the same plane.
In at least some embodiments of the present invention, when it is determined that the distance between the position corresponding to the navigation mark and the position corresponding to the target object exceeds a preset distance range, the related information of the target object is hidden.
According to a second aspect of the present invention, an embodiment provides a digital scene viewing apparatus, which includes: a building model providing unit for obtaining a building information model; a navigation mark generation unit for generating a navigation mark on the building information model, the navigation mark representing the position of the observer; a distance judging unit for judging whether the distance between the position corresponding to the navigation mark and the position corresponding to a target object drawn on the building information model is within a preset distance range; and an information display unit for displaying the related information of the target object when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within the preset distance range.
In at least some embodiments of the invention, the apparatus further comprises: a viewing request receiving unit, configured to receive a digital scene viewing request, where the digital scene viewing request is used to request to view related information of a target object; the viewing information acquisition unit is used for responding to the digital scene viewing request and acquiring viewing behavior information of a target object; and the viewing information analyzing unit is used for analyzing the viewing behavior information to obtain the position corresponding to the navigation mark.
In at least some embodiments of the invention, the apparatus further comprises: the visual angle information acquisition unit is used for acquiring navigation visual angle information; and the navigation visual angle adjusting unit is used for correspondingly adjusting the position of the observer relative to the target object according to the navigation visual angle information when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within a preset distance range, wherein the target object and the observer are positioned on the same plane.
In at least some embodiments of the invention, the apparatus further comprises: and the information hiding unit is used for hiding the related information of the target object when judging that the distance between the position corresponding to the navigation mark and the position corresponding to the target object exceeds a preset distance range.
According to a third aspect of the present invention, an embodiment of the present invention provides a computer device, which includes a processor and a memory, where the memory stores a computer program that is executable on the processor, and the processor implements the steps of the digital scene viewing method as provided in any embodiment of the present invention when executing the computer program.
According to a fourth aspect of the present invention, there is provided a storage medium having a computer program stored thereon, the computer program, when executed by a processor, implementing the steps of the digital scene viewing method as provided in any of the embodiments of the present invention.
The digital scene viewing method, apparatus, storage medium, and computer device provided by the invention use BIM technology and immersive technology to provide three-dimensional scene creation and an immersive interactive experience, reducing the time and labor cost of on-site viewing, while allowing the viewing route to be freely controlled to improve participation and enhance the sense of immersion. In addition, the method also provides immersive viewing interaction and therefore an immersive experience.
Drawings
The technical solution and other advantages of the present invention will become apparent from the following detailed description of specific embodiments of the present invention, which is to be read in connection with the accompanying drawings.
Fig. 1 is a schematic step diagram of a digital scene viewing method in an embodiment of the present invention.
Fig. 2 is a schematic step diagram of the steps preceding step S110 shown in fig. 1.
Fig. 3 is a schematic step diagram of another embodiment, showing the steps subsequent to step S130 shown in fig. 1.
Fig. 4 is a schematic structural diagram of a digital scene viewing apparatus according to an embodiment of the present invention.
Fig. 5 is a schematic internal structure diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The digital scene viewing method provided by the embodiment can be applied to computer equipment. The computer device may be an electronic device with a drawing application installed, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, or a personal digital assistant, and the specific form of the computer device is not limited in this embodiment.
It should be noted that, in the digital scene viewing method provided by the embodiment of the present invention, the execution subject may be a digital scene viewing apparatus, and the apparatus may be implemented as part or all of a computer device in software, hardware, or a combination of both. The method embodiments described below take a computer device as the execution subject, enabling three-dimensional scene creation and an immersive interactive experience in a hardware environment consisting of computer devices (clients or servers). The client may be a terminal device such as an intelligent display screen, a smart television, a smartphone, a tablet computer, a notebook computer, a touch screen, or a personal computer (PC). The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content distribution networks, big data, and artificial intelligence platforms. The computer device may implement the following method in a high-level computer language, which may include CSharp and JavaScript. The building model herein refers to a building information model, and likewise hereinafter. In addition, it should be noted that "viewing" herein may also refer to inspection or review.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions in the embodiments of the present invention are further described in detail by the following embodiments in conjunction with the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a schematic step diagram of a digital scene viewing method according to an embodiment of the present invention. Referring to fig. 1, the present invention provides a digital scene viewing method suitable for a computer device, the method including: step S110, obtaining a building information model; step S120, generating a navigation mark on the building information model, the navigation mark representing the position of an observer; step S130, judging whether the distance between the position corresponding to the navigation mark and the position corresponding to a target object drawn on the building information model is within a preset distance range; and step S140, displaying the related information of the target object when the distance is judged to be within the preset distance range. By using BIM technology and immersive technology, the method provides three-dimensional scene creation and an immersive interactive experience, reducing the time and labor cost of on-site viewing, and allows the viewing route to be freely controlled to improve participation and enhance the sense of immersion. In addition, the method also provides immersive viewing interaction and therefore an immersive experience.
Each step of the method is described further below.
Step S110, a building information model is acquired.
In this step, in the computer device, the building information model may be loaded in a browser or a presentation platform provided by the computer device through building software, including but not limited to Revit, Sketchup, and the like. It should be noted that the building software such as Revit, Sketchup, etc. may also be referred to as BIM engine tool. Through these BIM engine tools, the building information model can be loaded.
Before performing step S110, the following steps may be further performed.
Fig. 2 is a schematic step diagram of a preamble of step S110 shown in fig. 1. Referring to fig. 2, the method further includes: step S101, receiving a digital scene viewing request, wherein the digital scene viewing request is used for requesting to view the relevant information of a target object; step S102, responding to the digital scene viewing request, and acquiring viewing behavior information of a target object; step S103, analyzing the viewing behavior information to obtain the position corresponding to the navigation mark.
Specifically, step S101 receives a digital scene viewing request, where the digital scene viewing request is used to request to view the relevant information of the target object.
In this step, a user (or observer, the same applies hereinafter) may send a digital scene viewing request to the computer device through its client. After receiving the request, the computer device makes a corresponding response, as described below. In this embodiment, the client of the computer device is configured to receive the digital scene viewing request sent by the user. Here, the digital scene is a three-dimensional scene provided by the building information model, and includes the scene corresponding to the target object the user wishes to view. For example, if the target object is a production line, the digital scene is a digitally representable production line scene; if the target object is a school, a digitally representable school scene; and if the target object is a mall, a digitally representable mall scene.
And step S102, responding to the digital scene viewing request, and acquiring viewing behavior information of the target object.
In this step, the computer device responds to the digital scene viewing request sent by the user by automatically providing (or sending) a feedback information indication and displaying it on the client of the computer device. After receiving the feedback indication, the user provides viewing behavior information of the target object by operating the client (for example, by operating the navigation mark; see below for details). The computer device may thus obtain and store the viewing behavior information of the target object. In this embodiment, the viewing behavior information may be stored locally on the computer device, or sent via the computer device to another server for later use. The viewing behavior information includes at least a viewing position, a viewing path, and a viewing duration. For example, when the target object is a production line, the viewing behavior information may include the viewing position, viewing path, and viewing duration of the observer viewing the production line.
Step S103, analyzing the viewing behavior information to obtain the position corresponding to the navigation mark.
After obtaining the viewing behavior information, the computer device performs a parsing operation on it (for example, on the viewing path and viewing duration) to obtain the required information, such as, but not limited to, the position corresponding to the navigation mark. The navigation mark represents the position of the observer; the viewing path may be the movement track of the navigation mark, and the viewing duration may be the dwell time of the navigation mark near the target object.
In this step, by analyzing the viewing behavior information, navigation view angle information can be obtained in addition to the position corresponding to the navigation mark. The navigation perspective information is used to represent a viewing position of the observer relative to the target object.
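As an illustrative sketch (the patent does not define a concrete data format, so the record shape and field names below are assumptions), the parsing of viewing behavior information into a navigation-mark position and navigation view-angle information might look like this:

```typescript
// Hypothetical shapes -- the patent leaves the concrete format open.
interface Vec3 { x: number; y: number; z: number; }

interface ViewingBehaviorInfo {
  viewingPosition: Vec3;      // where the observer currently stands
  viewingPath: Vec3[];        // movement track of the navigation mark
  viewingDurationSec: number; // dwell time near the target object
  viewAngleDeg: number;       // navigation view-angle information
}

// Parse the viewing behavior information to obtain the position
// corresponding to the navigation mark and the navigation view angle.
function parseViewingBehavior(
  info: ViewingBehaviorInfo
): { markPosition: Vec3; viewAngleDeg: number } {
  // The navigation mark tracks the observer's latest position: use the
  // last point of the viewing path if present, else the stored position.
  const markPosition = info.viewingPath.length > 0
    ? info.viewingPath[info.viewingPath.length - 1]
    : info.viewingPosition;
  return { markPosition, viewAngleDeg: info.viewAngleDeg };
}
```

Taking the last point of the viewing path as the mark position is one plausible reading of "the viewing path may be the moving track of the navigation mark"; an implementation could equally track the mark separately.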
Continuing to refer to fig. 1, in step S120, a navigation mark is generated on the building information model, wherein the navigation mark is used for indicating the position of the observer.
As described above, since the computer device obtains the position corresponding to the navigation mark by analyzing the viewing behavior information, the computer device can identify the position corresponding to the navigation mark through the building information model displayed in the browser or the display platform of the client, where the navigation mark is used for representing the position of the observer.
For example, if the target object is a production line, the digital scene is a digitally representable, three-dimensional production line scene. It is established according to the data provided by the manufacturer and the real scene, static objects, and all production lines of the factory, so that the factory and its production lines can be faithfully reproduced. Since the building information model integrates building information, the fixed information of all objects in the factory scene can be obtained. In the subsequent embodiments the target object is described taking a production line as an example, but the invention is not limited thereto; for other scenes or other types of target object, the technical solution is implemented similarly.
Step S130, determining whether a distance between a position corresponding to the navigation mark and a position corresponding to a target object drawn on the building information model is within a preset distance range.
The computer device may obtain the position of the target object (e.g., a production line) in the building information model and, based on the position corresponding to the navigation mark obtained in the preceding step, determine whether the distance between the two positions is within a preset distance range. The position of the target object and the position of the navigation mark may be represented by three-dimensional coordinates, but are not limited thereto; they may also be represented in any other coordinate system expressing a positional relationship. Since the building information model can provide the position and the radiation radius of the production line, in this embodiment the computer device may treat the production line as a whole as a circle center, set the preset distance range to the radiation radius, and determine whether the position corresponding to the navigation mark is within that radius.
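The radiation-radius test of step S130 amounts to a Euclidean distance comparison. A minimal sketch, assuming three-dimensional coordinates (the patent allows other coordinate systems) and hypothetical names:

```typescript
interface Vec3 { x: number; y: number; z: number; }

// Step S130: is the navigation mark within the target object's
// radiation radius (the preset distance range), with the production
// line treated as a circle center?
function withinPresetRange(
  mark: Vec3,
  targetCenter: Vec3,
  radiationRadius: number
): boolean {
  const dx = mark.x - targetCenter.x;
  const dy = mark.y - targetCenter.y;
  const dz = mark.z - targetCenter.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz) <= radiationRadius;
}
```

An implementation that only cares about the boolean result could compare squared distances and skip the square root.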
In this embodiment, the distance between the position corresponding to the navigation mark for indicating the position of the observer and the coordinates for indicating the target object as the production line can be determined.
Step S140, when it is determined that the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within a preset distance range, displaying related information of the target object.
When the computer device judges that the distance between the position corresponding to the navigation mark and the target object coordinate is within the preset distance range, the computer device automatically displays the related information and the scene information of the target object through a browser or a display platform of the client side of the computer device. In this embodiment, the target object is a production line, and then the relevant information of the production line can be displayed, so that the user can obtain the actual information of the production line in time. The actual information includes the scale of the production line, such as the footprint of the production line; several major process conditions of a production line, such as processing, shipping, assembly, etc.
Furthermore, the computer device may also adjust the position of the observer relative to the target object according to the navigation perspective information obtained in step S103, wherein the target object and the observer are located on the same plane.
Specifically, by operating the navigation mark identified in the browser or display platform of the client of the computer device, the user may adjust the viewing angle, that is, the position of the observer relative to the target object, based on the position corresponding to the navigation mark: for example, viewing the front of the target object, its left side, its right side, or its back.
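One way to realize this view-angle adjustment is to place the observer on a circle around the target in the target's plane. The angle convention and names below are assumptions for illustration, not specified by the patent:

```typescript
interface Vec3 { x: number; y: number; z: number; }

// Place the observer on the same plane as the target object, at a
// given viewing distance, rotated around it by the navigation view
// angle (0 = front, 90 = left, 180 = back, 270 = right -- an assumed
// convention).
function observerPosition(
  target: Vec3,
  viewAngleDeg: number,
  viewDistance: number
): Vec3 {
  const rad = (viewAngleDeg * Math.PI) / 180;
  return {
    x: target.x + viewDistance * Math.cos(rad),
    y: target.y + viewDistance * Math.sin(rad),
    z: target.z, // same plane as the target object
  };
}
```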
Further, in step S110, a building information model of a virtual factory whose target object is a production line may be obtained. The building information model can be displayed through the display platform or browser of the client of the computer device. By logging into the display platform, the user can view the factory's production line in immersive three dimensions. During viewing, the user can freely control the viewing angle (such as rotating or zooming) and tour the whole production line through directional operation of the navigation mark (such as designated operations on the 'W', 'S', 'A', and 'D' keys); combined with the three-dimensional scene provided by the building information model, this gives the user the experience of being immersed in a virtual environment. Since the user can manipulate the navigation mark at will (for example, through its directional operation), the viewing route can be changed freely rather than following a single fixed path, improving the user's participation and enhancing the sense of immersion.
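The 'W'/'S'/'A'/'D' directional operation above can be sketched as updating the navigation mark's position in the model plane. Key bindings, axis directions, and step size are assumptions for illustration:

```typescript
interface Vec3 { x: number; y: number; z: number; }

// Move the navigation mark in response to a directional key:
// 'W' forward (+y), 'S' back (-y), 'A' left (-x), 'D' right (+x);
// any other key leaves the mark unchanged.
function moveNavigationMark(mark: Vec3, key: string, step = 1): Vec3 {
  switch (key.toUpperCase()) {
    case 'W': return { ...mark, y: mark.y + step };
    case 'S': return { ...mark, y: mark.y - step };
    case 'A': return { ...mark, x: mark.x - step };
    case 'D': return { ...mark, x: mark.x + step };
    default:  return mark;
  }
}
```

In a real client this function would be wired to keyboard events, and each new mark position would be fed back into the distance test of step S130.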
When the position corresponding to the navigation mark is within the production line range (i.e. the preset distance range taking the radiation radius as an example), all information and scene information of the production line can be automatically displayed in the display platform, so that a user can timely know the related information and obtain an immersive viewing interaction and immersive experience.
In addition, the user can also perform control operations such as view angle adjustment, zooming, rotation, and the like on the digital scene, thereby being able to view the actual situation of the production line in more detail.
Referring to fig. 3, in another embodiment of the digital scene viewing method according to the present invention, after step S130, the method further includes:
step S150, hiding the related information of the target object when it is determined that the distance between the position corresponding to the navigation mark and the position corresponding to the target object exceeds a preset distance range.
In other words, when it is determined that the distance between the position corresponding to the navigation mark (indicating the position of the observer) and the coordinates of the target object (the production line) exceeds the preset distance range (the radiation radius described above), that is, when the observer leaves the range of the production line, the computer device may automatically hide all relevant information of the production line.
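Steps S140 and S150 together amount to toggling the target object's related information on the distance test. A minimal sketch, with the display mechanism abstracted into a boolean flag (the patent leaves the UI layer open) and hypothetical names:

```typescript
interface Vec3 { x: number; y: number; z: number; }

interface TargetObject {
  center: Vec3;            // production line treated as a circle center
  radiationRadius: number; // the preset distance range
  infoVisible: boolean;    // whether related information is shown
}

function dist(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// S140: show related information while the navigation mark is within
// range; S150: hide it once the mark leaves the range.
function updateInfoVisibility(target: TargetObject, mark: Vec3): TargetObject {
  const inRange = dist(mark, target.center) <= target.radiationRadius;
  return { ...target, infoVisible: inRange };
}
```

Calling this on every movement of the navigation mark keeps the displayed information consistent with the observer's position.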
Therefore, through implementation of the steps, three-dimensional scene establishment and immersive interactive experience can be provided through the BIM technology and the immersive technology, waste of field viewing time and labor cost is reduced, viewing routes can be freely controlled, participation is improved, and substitution feeling is enhanced. In addition, the method can also provide immersive viewing interaction, thereby providing an immersive experience.
It should be understood that, although the steps in the flowcharts of fig. 1 and fig. 2 are shown in the sequence indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the steps are not strictly ordered and may be performed in other orders. Moreover, some of the steps in fig. 1 and fig. 2 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and not necessarily sequentially; they may be performed in turn or in alternation with other steps or with at least part of the sub-steps or stages of other steps.
For specific definition of the digital scene viewing apparatus, reference may be made to the above definition of the digital scene viewing method, which is not described herein again. The various modules in the digital scene viewing apparatus of the computer device described above may be implemented in whole or in part by software, hardware, and combinations thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
Fig. 4 is a schematic structural diagram of a digital scene viewing apparatus according to an embodiment of the present invention.
Referring to fig. 4, the apparatus includes: a building model providing unit 410, configured to obtain a building information model; a navigation mark generation unit 420, configured to generate a navigation mark on the building information model, wherein the navigation mark is used for representing the position of the observer; a distance determining unit 430, configured to determine whether the distance between the position corresponding to the navigation mark and the position corresponding to a target object drawn on the building information model is within a preset distance range; and an information display unit 440, configured to display relevant information of the target object when it is determined that the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within the preset distance range.
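As an illustration only, the four units listed above can be grouped into a single class. This is a hedged Python sketch in which the building information model is reduced to a dictionary of named target coordinates; all names and defaults are assumptions, not from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class DigitalSceneViewer:
    """Illustrative grouping of the four units described above."""
    model_targets: dict                  # building model providing unit: name -> (x, y, z)
    marker_pos: tuple = (0.0, 0.0, 0.0)  # navigation mark generation unit: observer position
    radius: float = 10.0                 # preset distance range (radiation radius)

    def in_range(self, target: str) -> bool:
        # distance determining unit: Euclidean distance vs. preset range
        return math.dist(self.marker_pos, self.model_targets[target]) <= self.radius

    def display(self, target: str, info: str):
        # information display unit: show info only while the mark is in range
        return info if self.in_range(target) else None
```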
The digital scene viewing apparatus provided in this embodiment may implement the method embodiments, and the implementation principle and technical effect are similar, which are not described herein again.
In one embodiment, the digital scene viewing apparatus may further include: a viewing request receiving unit 401, configured to receive a digital scene viewing request, where the digital scene viewing request is used to request to view relevant information of a target object; a viewing information obtaining unit 402, configured to obtain viewing behavior information of a target object in response to the digital scene viewing request; a viewing information analyzing unit 403, configured to analyze the viewing behavior information to obtain a position corresponding to the navigation mark.
The digital scene viewing apparatus provided in this embodiment may implement the method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
In one embodiment, the digital scene viewing apparatus may further include: a viewing angle information acquisition unit (not shown), configured to acquire navigation viewing angle information; and a navigation viewing angle adjusting unit (not shown), configured to correspondingly adjust the position of the observer relative to the target object according to the navigation viewing angle information when it is determined that the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within the preset distance range, wherein the target object and the observer are located on the same plane.
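The view-angle adjustment can be illustrated as a planar rotation of the observer about the target object, since both lie in the same plane as stated above. This is a minimal sketch under the assumption that the navigation viewing angle is given in degrees; it is not the patented algorithm:

```python
import math

def adjust_observer(target_xy, observer_xy, view_angle_deg):
    """Rotate the observer about the target in their shared plane by the
    navigation viewing angle (an illustrative sketch only)."""
    dx = observer_xy[0] - target_xy[0]
    dy = observer_xy[1] - target_xy[1]
    a = math.radians(view_angle_deg)
    # standard 2-D rotation of the offset vector, then translate back
    rx = dx * math.cos(a) - dy * math.sin(a)
    ry = dx * math.sin(a) + dy * math.cos(a)
    return (target_xy[0] + rx, target_xy[1] + ry)
```

Rotating an observer at (1, 0) by 90 degrees about a target at the origin moves the observer to approximately (0, 1), leaving the observer-target distance unchanged.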
Here, the viewing angle information acquisition unit may be disposed in the viewing information analyzing unit 403, and the navigation viewing angle adjusting unit may be disposed in the information display unit 440.
In one embodiment, the apparatus further comprises: an information hiding unit, configured to hide the relevant information of the target object when it is determined that the distance between the position corresponding to the navigation mark and the position corresponding to the target object exceeds the preset distance range.
The digital scene viewing apparatus provided in this embodiment may implement the method embodiments, and the implementation principle and the technical effect are similar, which are not described herein again.
In addition, for specific limitations of the digital scene viewing apparatus, reference may be made to the above limitations on the digital scene viewing method, which is not described herein again. The various modules in the digital scene viewing apparatus of the computer device described above may be implemented in whole or in part by software, hardware, and combinations thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device 500 is provided, the internal structure of which may be as shown in fig. 5. The computer device 500 includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device 500 is configured to provide computing and control capabilities. The memory of the computer device 500 includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device 500 is used to communicate with external devices through a network connection. The computer program, when executed by the processor, implements the digital scene viewing method. The display screen of the computer device 500 may be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device 500 may be a touch layer covering the display screen, a key, a trackball, or a touch pad arranged on the housing of the computer device 500, or an external keyboard, touch pad, or mouse.
Those skilled in the art will appreciate that the configuration shown in fig. 5 is a block diagram of only a portion of the configuration associated with aspects of the present invention and is not intended to limit the computer device 500 to which aspects of the present invention may be applied; a particular computer device 500 may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In an embodiment, a computer device 500 is provided, comprising a memory in which a computer program is stored and a processor, which when executing the computer program performs the steps of:
obtaining a building information model;
generating a navigation mark on the building information model, wherein the navigation mark is used for representing the position of an observer;
judging whether the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within a preset distance range, wherein the target object is drawn on the building information model;
and when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within a preset distance range, displaying the related information of the target object.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which may be stored in a non-volatile computer-readable storage medium; when executed, the computer program may include the processes of the embodiments of the digital scene viewing method described above. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The digital scene viewing method, apparatus, computer device, and storage medium provided by the embodiments of the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the description of the above embodiments is only intended to help understand the technical solution and core idea of the present invention. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A digital scene viewing method, applicable to a computer device, characterized in that the method comprises:
obtaining a building information model;
generating a navigation mark on the building information model, wherein the navigation mark is used for representing the position of an observer;
judging whether the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within a preset distance range, wherein the target object is drawn on the building information model; and
and when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within a preset distance range, displaying the related information of the target object.
2. A digital scene viewing method according to claim 1, wherein the step of obtaining a building information model is preceded by:
receiving a digital scene viewing request, wherein the digital scene viewing request is used for requesting to view the related information of a target object;
responding to the digital scene viewing request, and acquiring viewing behavior information of a target object;
and analyzing the viewing behavior information to obtain the position corresponding to the navigation mark.
3. A digital scene viewing method according to claim 2, wherein the step of parsing the viewing behavior information to obtain the position corresponding to the navigation mark further comprises:
acquiring navigation visual angle information;
and when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within a preset distance range, correspondingly adjusting the position of the observer relative to the target object according to the navigation visual angle information, wherein the target object and the observer are positioned on the same plane.
4. A digital scene viewing method according to claim 1, wherein when it is determined that the distance between the position corresponding to said navigation mark and the position corresponding to said target object exceeds a preset distance range, information related to said target object is hidden.
5. A digital scene viewing device, comprising:
a building model providing unit for acquiring a building information model;
a navigation mark generation unit for generating a navigation mark on the building information model, wherein the navigation mark is used for representing the position of the observer;
the distance judging unit is used for judging whether the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within a preset distance range or not, wherein the target object is drawn on the building information model;
and the information display unit is used for displaying the related information of the target object when judging that the distance between the position corresponding to the navigation mark and the position corresponding to the target object is within a preset distance range.
6. A digital scene viewing device according to claim 5, wherein said device further comprises:
a viewing request receiving unit, configured to receive a digital scene viewing request, where the digital scene viewing request is used to request to view related information of a target object;
the viewing information acquisition unit is used for responding to the digital scene viewing request and acquiring viewing behavior information of a target object;
and the viewing information analyzing unit is used for analyzing the viewing behavior information to obtain the position corresponding to the navigation mark.
7. A digital scene viewing device according to claim 6, wherein said device further comprises:
the visual angle information acquisition unit is used for acquiring navigation visual angle information;
and the navigation visual angle adjusting unit is used for correspondingly adjusting the position of the observer relative to the target object according to the navigation visual angle information when the distance between the position corresponding to the navigation mark and the position corresponding to the target object is judged to be within a preset distance range, wherein the target object and the observer are positioned on the same plane.
8. A digital scene viewing device according to claim 5, wherein said device further comprises:
and the information hiding unit is used for hiding the related information of the target object when judging that the distance between the position corresponding to the navigation mark and the position corresponding to the target object exceeds a preset distance range.
9. A computer device, characterized in that it comprises a processor and a memory, said memory having stored thereon a computer program executable on the processor, said processor implementing the steps of the digital scene viewing method according to any one of claims 1 to 4 when executing said computer program.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, realizes the steps of the digital scene viewing method of any one of claims 1 to 4.
CN202011412279.6A 2020-12-04 2020-12-04 Digital scene viewing method and device, storage medium and computer equipment Withdrawn CN112381946A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011412279.6A CN112381946A (en) 2020-12-04 2020-12-04 Digital scene viewing method and device, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011412279.6A CN112381946A (en) 2020-12-04 2020-12-04 Digital scene viewing method and device, storage medium and computer equipment

Publications (1)

Publication Number Publication Date
CN112381946A true CN112381946A (en) 2021-02-19

Family

ID=74591065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011412279.6A Withdrawn CN112381946A (en) 2020-12-04 2020-12-04 Digital scene viewing method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN112381946A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115718824A (en) * 2023-01-10 2023-02-28 北京飞渡科技股份有限公司 Method for judging position of equipment and automatically pushing equipment information through space distance
WO2024011338A1 (en) * 2022-07-11 2024-01-18 维沃移动通信有限公司 Display method, display apparatus, and electronic device


Similar Documents

Publication Publication Date Title
US20230075270A1 (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
Kikuchi et al. Future landscape visualization using a city digital twin: Integration of augmented reality and drones with implementation of 3D model-based occlusion handling
US7365747B2 (en) Methods and systems for controlling an image generator to define, generate, and view geometric images of an object
US11004256B2 (en) Collaboration of augmented reality content in stereoscopic view in virtualized environment
US10417833B2 (en) Automatic 3D camera alignment and object arrangment to match a 2D background image
Milosavljević et al. GIS-augmented video surveillance
CN109213945B (en) License management for cloud-based documents
CN111259545A (en) Intelligent driving virtual simulation cloud platform
CN112381946A (en) Digital scene viewing method and device, storage medium and computer equipment
US20180349367A1 (en) Systems and methods for associating virtual objects with electronic documents, and searching for a virtual object or an electronic document based on the association
CN109615707A (en) A kind of threedimensional model processing method, device, computer equipment and storage medium
KR20130063876A (en) Augmented reality system and generation method by cloud computing
US11023729B1 (en) Providing visual guidance for presenting visual content in a venue
CN114651246B (en) Method for searching for image using rotation gesture input
CN117095135B (en) Industrial three-dimensional scene modeling arrangement method and device capable of being edited online
JP2022507502A (en) Augmented Reality (AR) Imprint Method and System
US20130124155A1 (en) Controlling Features In A Software Application Based on The Status of User Subscription
CN116917842A (en) System and method for generating stable images of real environment in artificial reality
EP4266232A1 (en) Cryptographic digital assets management system
KR20230118794A (en) Method and apparatus for providing 3d contents packaging and streaming service based on web to construct and control of virtual space
CN110990106A (en) Data display method and device, computer equipment and storage medium
KR102434084B1 (en) Reality space-based content authoring device, content authoring system and method
US11030811B2 (en) Augmented reality enabled layout system and method
Wang et al. Virtual assembly method based on OSG
CN118245025A (en) CIM city information aggregation rule-built management big data platform construction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210219