CN111191309A - Real-time occlusion culling method, device and equipment suitable for BIM lightweighting - Google Patents

Real-time occlusion culling method, device and equipment suitable for BIM lightweighting

Info

Publication number
CN111191309A
Authority
CN
China
Prior art keywords
target object
selected target
real
sampling points
auxiliary line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911320176.4A
Other languages
Chinese (zh)
Other versions
CN111191309B (en)
Inventor
冯少翔
周曹俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Hornsun Information Technology Co ltd
Original Assignee
Beijing Hornsun Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Hornsun Information Technology Co ltd
Priority to CN201911320176.4A
Publication of CN111191309A
Application granted
Publication of CN111191309B
Active legal-status Current
Anticipated expiration legal-status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/04 Architectural design, interior design

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses a real-time occlusion culling method, device and equipment. The real-time occlusion culling method comprises the following steps: determining the total number of sampling points and the viewport coordinates of each sampling point according to the resolution of the display device; initializing the states of a plurality of target objects to a hidden state; generating corresponding auxiliary lines according to a given sampling-point coefficient and the position of the camera; acquiring a selected target object from the plurality of target objects, and setting the selected target object to a display state if the selected target object is a non-transparent object that intersects the preset auxiliary line and is closest to the camera; and if the selected target object is a transparent object, setting the selected target object back to the hidden state M frames after it is adjusted to the display state. The invention realizes real-time occlusion culling of dynamic models, can effectively improve real-time rendering efficiency, reduces hardware requirements, and provides technical support for real-time rendering of large BIM models.

Description

Real-time occlusion culling method, device and equipment suitable for BIM lightweighting
Technical Field
The embodiment of the invention relates to the technical field of model processing, and in particular to a real-time occlusion culling method, device and equipment suitable for lightweighting a Building Information Model (BIM).
Background
The building information model is a new tool for architecture, engineering and civil engineering that helps integrate building information: from the design, construction and operation of a building to the end of its whole life cycle, all kinds of information are kept together in a three-dimensional model information database.
Most BIM models are large, contain many model components, and consume a large amount of resources for rendering and display, which makes them hard to display on mobile platforms. Existing optimization techniques are generally used in static scenes, require that the occluded range be blocked by static objects, and place high demands on early preparation of the model, such as pre-baking.
Disclosure of Invention
The embodiment of the invention aims to provide a real-time occlusion culling method, device and equipment suitable for BIM lightweighting, which are used for solving the problems that existing BIM rendering consumes a large amount of resources and is not suitable for display on mobile platforms.
In order to achieve the above object, the embodiments of the present invention mainly provide the following technical solutions:
in a first aspect, an embodiment of the present invention provides a real-time occlusion culling method suitable for BIM lightweighting, including: determining the total number of sampling points and the viewport coordinates of each sampling point according to the resolution of the display device; assigning state types to a plurality of target objects and initializing the plurality of target objects to the hidden state; determining the number of sampling points per frame according to a given sampling-point coefficient and the total number of sampling points, wherein the sampling points of each frame are uniformly distributed; obtaining the three-dimensional space coordinates of each frame's sampling points according to the position of the camera and the viewport coordinates of those sampling points, and connecting the camera with the three-dimensional coordinates of each frame's sampling points to generate corresponding auxiliary lines; acquiring a selected target object from the plurality of target objects, and acquiring the triangular-face information and state type of the selected target object; if, according to the triangular-face information of the selected target object, the selected target object is determined to be the object that intersects a first preset auxiliary line and is closest to the camera, adjusting the selected target object to the display state; and if the selected target object is a transparent object, setting the selected target object to the hidden state M frames after it is adjusted to the display state, wherein M is a positive integer greater than 0.
According to an embodiment of the present invention, after adjusting the selected target object to the display state, the method further includes: if the selected target object is a transparent object, ignoring the selected target object when auxiliary lines other than the first preset auxiliary line intersect it.
According to an embodiment of the present invention, after adjusting the selected target object to the display state, the method further includes: if the selected target object is a non-transparent object and, within M frames from the time it was adjusted to the display state, it is the object that intersects a second preset auxiliary line and is closest to the camera, resetting the count of M at the time the selected target object intersects the second preset auxiliary line.
According to an embodiment of the present invention, the uniform distribution of the sampling points of each frame includes: calculating the sampling coordinate data of the viewport in the x direction and the y direction with a Halton sequence algorithm according to the resolution of the display device, using different prime numbers for the x direction and the y direction so that all sampling points are uniformly distributed over the whole viewport plane, and obtaining the three-dimensional space coordinates of the uniformly distributed sampling points of each frame through a coordinate conversion algorithm.
In a second aspect, an embodiment of the present invention further provides a BIM-based real-time occlusion culling apparatus, including: an initialization module, configured to set the state types of a plurality of target objects and initialize them to the hidden state; a calculation module, configured to determine the total number of sampling points according to the resolution of the display device and to calculate the viewport coordinates of the uniformly distributed sampling points with a Halton sequence algorithm; an auxiliary line generation module, configured to determine the number of sampling points per frame according to a given sampling-point coefficient and the total number of sampling points, obtain the three-dimensional space coordinates of each frame's sampling points according to the position of the camera and the viewport coordinates of those sampling points, and connect the camera with the three-dimensional coordinates of each frame's sampling points to generate corresponding auxiliary lines; a selection and acquisition module, configured to select a selected target object from the plurality of target objects and acquire the triangular-face information and state type of the selected target object; and a control processing module, configured to set the selected target object to the display state if its state type is the hidden state and, according to its triangular-face information, it is determined to be the object that intersects a first preset auxiliary line and is closest to the camera. The control processing module is further configured to, if the selected target object is a transparent object, set the selected target object to the hidden state M frames after it is adjusted to the display state, wherein M is a positive integer greater than 0.
According to an embodiment of the present invention, the control processing module is further configured to, after the selected target object is adjusted to the display state and if it is a transparent object, ignore the selected target object when auxiliary lines other than the first preset auxiliary line intersect it.
According to an embodiment of the present invention, the control processing module is further configured to, after the selected target object is adjusted to the display state and if it is a non-transparent object, reset the count of M at the time the selected target object intersects a second preset auxiliary line, provided that within M frames from the adjustment to the display state the selected target object is the object that intersects the second preset auxiliary line and is closest to the camera.
According to one embodiment of the invention, the auxiliary line generation module is configured to calculate the sampling coordinate data of the viewport in the x direction and the y direction with a Halton sequence algorithm according to the resolution of the display device, use different prime numbers for the x direction and the y direction so that all sampling points are uniformly distributed over the whole viewport plane, and obtain the three-dimensional space coordinates of each frame's sampling points through a coordinate conversion algorithm.
In a third aspect, an embodiment of the present invention further provides an electronic device, including: at least one processor and at least one memory; the memory is configured to store one or more program instructions; and the processor is configured to execute the one or more program instructions to perform the real-time occlusion culling method suitable for BIM lightweighting according to the first aspect.
In a fourth aspect, embodiments of the present invention also provide a computer-readable storage medium containing one or more program instructions for performing the real-time occlusion culling method suitable for building information model lightweighting according to the first aspect.
The technical solutions provided by the embodiments of the present invention have at least the following advantages:
the real-time occlusion culling method, device and equipment suitable for BIM lightweighting provided by the embodiments of the invention realize real-time occlusion culling of dynamic models for the situation where BIM models have a huge number of points and faces, a large construction volume, and components that are independent model data. They can effectively improve real-time rendering efficiency, reduce hardware requirements, and provide technical support for real-time rendering of large BIM models. Further, the invention places objects in two layers according to whether auxiliary lines may intersect them: the first layer contains only transparent objects in the display state, whose intersections are not considered; the second layer contains the objects whose intersections are considered, namely transparent objects in the hidden state and all non-transparent objects. The auxiliary lines only consider intersections with objects in the second layer, and the intersected object closest to the camera is set to be displayed, so objects in the first layer are effectively ignored and the operation efficiency is improved.
Drawings
Fig. 1 is a flowchart of a real-time occlusion culling method suitable for BIM lightweighting according to an embodiment of the present invention.
Fig. 2 is a block diagram of a real-time occlusion culling apparatus suitable for BIM lightweighting according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention is provided for illustrative purposes, and other advantages and effects of the present invention will become apparent to those skilled in the art from the present disclosure.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, interfaces, techniques, etc. in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In the description of the present invention, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the term "connected" is to be interpreted broadly, e.g., as meaning directly connected or indirectly connected through an intermediary. The specific meanings of the above terms in the present invention can be understood by those skilled in the art on a case-by-case basis.
Fig. 1 is a flowchart of a real-time occlusion culling method suitable for BIM lightweighting according to an embodiment of the present invention. As shown in Fig. 1, the real-time occlusion culling method for BIM lightweighting according to the embodiment of the present invention includes:
s1: and determining the total number of the sampling points and the viewport coordinate of each sampling point according to the resolution of the display equipment.
In one embodiment of the invention, the sampling coordinate data of the viewport in the x direction and the y direction are calculated with a Halton sequence algorithm according to the resolution of the display device, and different prime numbers are used when calculating the x-direction and y-direction coordinate data, so that all sampling points are uniformly distributed over the whole viewport plane.
Specifically, the number p of screen pixels is calculated from the screen resolution of the display device and taken as the total number of sampling points. The sampling coordinate data in the x direction and the y direction of the screen are then calculated with a Halton sequence algorithm. Different prime numbers are needed when calculating the x-direction and y-direction coordinate data, for example base 2 for the x direction and base 3 for the y direction, so that the sampling points are uniformly distributed.
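The Python sketch below illustrates this sampling step: one Halton-sequence sample per screen pixel, with base 2 for x and base 3 for y as in the example above. The function names and the 640 x 320 resolution are illustrative assumptions, not part of the patent text.

```python
# A minimal sketch of the sampling-point generation described above.

def halton(index: int, base: int) -> float:
    """Radical-inverse (Halton) value in [0, 1) for a 1-based index."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def generate_viewport_samples(width: int, height: int):
    """One sample per screen pixel; x uses base 2 and y uses base 3, so the points
    spread evenly over the whole viewport plane (coordinates normalized to [0, 1))."""
    total = width * height                     # total number of sampling points p
    return [(halton(i, 2), halton(i, 3)) for i in range(1, total + 1)]

samples = generate_viewport_samples(640, 320)  # 204800 points, matching the example below
```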
S2: a plurality of target objects are initialized to a hidden state.
Specifically, objects are placed in two layers according to whether auxiliary lines may pass through them: the first layer contains transparent objects in the display state, which can only be occluded, cannot occlude other objects, and are not treated as objects that auxiliary lines pass through; the second layer contains non-transparent objects and transparent objects in the hidden state, which are treated as objects that auxiliary lines pass through.
All target objects are classified into these two layers and initialized to the hidden state. Because every target object starts in the hidden state, every target object after initialization belongs to the second layer.
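As a rough illustration of this bookkeeping (the class and field names are assumptions, not the patent's terminology), the initialization might look like the following sketch:

```python
# Illustrative bookkeeping for step S2. Layer 2 holds objects that auxiliary lines may
# pass through (non-transparent objects and hidden transparent objects); layer 1 holds
# transparent objects in the display state, which the auxiliary lines ignore.

class TargetObject:
    def __init__(self, name: str, transparent: bool, triangles):
        self.name = name
        self.transparent = transparent
        self.triangles = triangles   # triangular-face data captured at model import (see step S4)
        self.visible = False         # every object starts in the hidden state ...
        self.layer = 2               # ... and therefore in the second layer
        self.frames_left = 0         # countdown used by the M-frame rule described later

scene = [TargetObject("wall_01", transparent=False, triangles=[]),
         TargetObject("window_01", transparent=True, triangles=[])]
```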
It should be noted that the present invention does not limit the execution order of step S1 and step S2: step S1 may be executed before step S2, step S2 before step S1, or both may be executed simultaneously.
S3: determining the number of sampling points of each frame according to the given sampling point coefficient and the total number of the sampling points, wherein the sampling points of each frame are uniformly distributed, obtaining the three-dimensional space coordinate of each frame of sampling points according to the position of the camera and the viewport coordinate of each frame of sampling points, and connecting the camera and the three-dimensional coordinate of each frame of sampling points to generate corresponding auxiliary lines. Wherein, the sampling point uniform distribution of each frame includes: and calculating sampling coordinate data of the viewport in the x direction and the y direction by using a Halton sequence algorithm according to the resolution of the display equipment, using different prime numbers when calculating the sampling coordinate data of the x direction and the y direction, uniformly distributing all sampling points on the whole viewport plane, and obtaining the three-dimensional space coordinates of the uniformly distributed sampling points of each frame according to a coordinate conversion algorithm.
In one example of the invention, the total number of sampling points is 204800. Because the sampling points are numerous and uniformly distributed, not all of them need to be connected in one frame; only a portion is connected per frame. The connected sampling points are chosen by cycling through the computed sequence, and the more sampling points per frame, the better the effect. When the sampling-point coefficient is 0.001, the number of sampling points per frame is the product 0.001 × 204800 rounded up, i.e., 205 sampling points per frame, and 205 auxiliary lines are generated from the camera: the first frame connects sampling points 0-204, the second frame uses 205-409, and so on.
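A small sketch of this per-frame slicing, assuming the sample list from the earlier sketch; the coefficient value follows the example above, and the helper name is an assumption:

```python
import math

def samples_for_frame(samples, frame_index: int, coefficient: float = 0.001):
    """Pick this frame's slice of sampling points: ceil(coefficient * total) points,
    cycling through the full list (frame 0 uses points 0-204, frame 1 uses 205-409, ...)."""
    per_frame = math.ceil(coefficient * len(samples))   # 205 for 204800 points
    start = (frame_index * per_frame) % len(samples)
    batch = samples[start:start + per_frame]
    if len(batch) < per_frame:                          # wrap around at the end of the list
        batch += samples[:per_frame - len(batch)]
    return batch
```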
In each frame of the run, the viewport coordinates of the sampling points are converted to three-dimensional space coordinates. For example, if the viewport coordinate of a certain sampling point is (1/2, 1/3) and its three-dimensional coordinate is (-4.8, 2.3, -5), then, given a length of the auxiliary line, for example 3000, an auxiliary line is generated by connecting the camera with the three-dimensional coordinate of this sampling point, and the end-point coordinate of the auxiliary line is (-4.8, 2.3, 2995).
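The conversion from viewport coordinates to world coordinates depends on the graphics engine's camera model, so the sketch below only shows the line construction itself: connect the camera with the sample point's world position and extend the segment to the given length (3000 in the example). The function name and the normalization convention are assumptions.

```python
def make_aux_line(camera_pos, sample_world_pos, length=3000.0):
    """Build an auxiliary line from the camera through the sample point's 3D position,
    extended to the requested length; returns (origin, end) as coordinate tuples."""
    direction = tuple(s - c for s, c in zip(sample_world_pos, camera_pos))
    norm = sum(v * v for v in direction) ** 0.5 or 1.0      # avoid division by zero
    end = tuple(c + v / norm * length for c, v in zip(camera_pos, direction))
    return tuple(camera_pos), end

line = make_aux_line(camera_pos=(0.0, 0.0, 0.0), sample_world_pos=(-4.8, 2.3, -5.0))
```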
It should be noted that all auxiliary lines only consider intersections with objects in the second layer.
S4: a selected target object is obtained from a plurality of target objects, and triangular surface information and a state type of the selected target object are obtained. The triangular surface information of the selected target object is obtained when the model is imported. The status type is used to indicate whether the selected target object is a transparent object or a non-transparent object.
S5: and if the selected target object is determined to be the object which is intersected with the preset auxiliary line and is closest to the camera according to the triangular surface information of the selected target object, adjusting the selected target object to be in a display state, namely changing the selected target object as an occlusion object into the display state.
In an embodiment of the present invention, after step S5, the method further includes: if the selected target object is a transparent object, switching the selected target object from the second layer to the first layer so that the auxiliary lines ignore it. The selected target object can be ignored with the culling-mask mechanism found in common graphics engines.
In an embodiment of the present invention, after step S5, the method further includes: if the state type of the selected target object is non-transparent and, within M frames from the time it was adjusted to the display state, it is again the object that intersects the second preset auxiliary line and is closest to the camera, the count of M is reset at the time the selected target object intersects the second preset auxiliary line. Illustratively, the selected target object is an object a, object a is a non-transparent object, the first preset auxiliary line is auxiliary line X, and the second preset auxiliary line is auxiliary line Y. When step S5 determines that object a is the target object closest to the camera among the target objects intersecting auxiliary line X, step S5 sets object a to the display state at time T1. Starting from T1, if within M frames object a is the target object closest to the camera among the target objects intersecting auxiliary line Y, the count of M is reset; for example, if object a intersects auxiliary line Y at time T2, it is determined again from T2 whether object a is intersected by some auxiliary line and is the closest such object to the camera, and if so, the count of M is reset again. If not, object a is set to the hidden state at time T2 + M frames, and hidden objects, being invisible, are removed from the scene. Once an object enters the hidden state, all its business logic is stopped to improve efficiency, and only its collision volume is kept for determining its relationship with the auxiliary lines.
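The following sketch condenses that bookkeeping. The value of M and the function names are assumptions (the patent only requires M to be a positive integer greater than 0); objects are assumed to carry the fields of the TargetObject sketch above.

```python
M = 60  # assumed frame budget

def on_closest_hit(obj):
    """Called whenever an auxiliary line hits obj and obj is the closest intersected object."""
    obj.visible = True
    obj.frames_left = M        # for non-transparent objects each new hit resets the countdown
    if obj.transparent:
        obj.layer = 1          # auxiliary lines ignore it from now on (e.g. via a culling mask),
                               # so a displayed transparent object never has its countdown reset

def end_of_frame(objects):
    """Run once per frame after all auxiliary lines have been tested."""
    for obj in objects:
        if obj.visible:
            obj.frames_left -= 1
            if obj.frames_left <= 0:
                obj.visible = False   # hidden objects are removed from the scene; their business
                obj.layer = 2         # logic stops, only the collision volume stays active, and
                                      # transparent objects rejoin the second layer
```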
In an embodiment of the present invention, after step S5, the method further includes: if the selected target object is a transparent object, then when it is set to the hidden state it needs to be switched back into the second layer so that it is again an object that auxiliary lines pass through.
In one example of the present invention, it is possible to detect whether or not the selected target object intersects the auxiliary line by a camera detection algorithm.
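The patent does not name a specific intersection test; the Möller-Trumbore ray-triangle algorithm sketched below is one common way to check an auxiliary line against an object's triangular faces and obtain the distance used to pick the object closest to the camera.

```python
def ray_triangle_t(origin, direction, v0, v1, v2, eps=1e-7):
    """Return the distance t along the ray (origin + t * direction) at which it hits
    the triangle (v0, v1, v2), or None if there is no hit."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def dot(a, b): return sum(x * y for x, y in zip(a, b))

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None                      # the ray is parallel to the triangle plane
    inv_det = 1.0 / det
    t_vec = sub(origin, v0)
    u = dot(t_vec, p) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    q = cross(t_vec, e1)
    v = dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det
    return t if t > eps else None        # t is the camera distance used to find the closest object
```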
Table 1 (provided as an image in the original publication)
Table 1 shows an actual comparison between rendering with and without the real-time occlusion culling method for BIM lightweighting of the present invention. As can be seen from Table 1, the method of the present invention achieves an obvious improvement over the prior art in indexes such as frame rate and CPU response time.
The real-time occlusion culling method suitable for BIM lightweighting provided by the embodiment of the invention realizes real-time occlusion culling of dynamic models for the situation where BIM models have a huge number of points and faces, a large construction volume, and components that are independent model data; it can effectively improve real-time rendering efficiency, reduce hardware requirements, and provide technical support for real-time rendering of large BIM models. Further, the invention classifies objects into two categories according to whether auxiliary lines may intersect them: the first category contains only transparent objects in the display state, whose intersections are not considered; the second category contains the objects whose intersections are considered, namely transparent objects in the hidden state and all other ordinary objects. The auxiliary lines only consider intersections with objects in the second category, and the intersected object closest to the camera is set to be displayed, so objects in the first category are effectively ignored and the operation efficiency is improved.
Fig. 2 is a block diagram of a real-time occlusion culling apparatus suitable for BIM lightweighting according to an embodiment of the present invention. As shown in Fig. 2, the BIM-based real-time occlusion culling apparatus according to the embodiment of the present invention includes: an initialization module 100, a calculation module 200, an auxiliary line generation module 300, a selection and acquisition module 400, and a control processing module 500.
The initialization module 100 is configured to set the state types of a plurality of target objects and initialize their states to the hidden state. The calculation module 200 is configured to determine the total number of sampling points according to the resolution of the display device and to calculate the viewport coordinates of the uniformly distributed sampling points with a Halton sequence algorithm. The auxiliary line generation module 300 is configured to determine the number of sampling points per frame according to a given sampling-point coefficient and the total number of sampling points, obtain the three-dimensional space coordinates of each frame's sampling points according to the position of the camera and the viewport coordinates of those sampling points, and connect the camera with the three-dimensional coordinates of each frame's sampling points to generate corresponding auxiliary lines. The selection and acquisition module 400 is configured to select a selected target object from the plurality of target objects and acquire the triangular-face information and state type of the selected target object. The control processing module 500 is configured to set the selected target object to the display state if its state type is the hidden state and, according to its triangular-face information, it is determined to be the object that intersects the first preset auxiliary line and is closest to the camera. The control processing module 500 is further configured to, if the selected target object is a transparent object, set the selected target object to the hidden state M frames after it is adjusted to the display state, wherein M is a positive integer greater than 0.
In one embodiment of the present invention, the control processing module 500 is further configured to, after the selected target object is adjusted to the display state and if it is a transparent object, ignore the selected target object when auxiliary lines other than the first preset auxiliary line intersect it.
In an embodiment of the present invention, the control processing module 500 is further configured to, after the selected target object is adjusted to the display state and if it is a non-transparent object, reset the count of M at the time the selected target object intersects a second preset auxiliary line, provided that within M frames from the adjustment to the display state the selected target object is the object that intersects the second preset auxiliary line and is closest to the camera.
In an embodiment of the present invention, the auxiliary line generation module 300 is configured to calculate the sampling coordinate data of the viewport in the x direction and the y direction with a Halton sequence algorithm according to the resolution of the display device, use different prime numbers for the x and y directions so that all sampling points are uniformly distributed over the whole viewport plane, and obtain the three-dimensional space coordinates of each frame's sampling points through a coordinate conversion algorithm.
It should be noted that the specific implementation of the real-time occlusion culling apparatus suitable for building information model lightweighting in the embodiment of the present invention is similar to that of the corresponding real-time occlusion culling method; reference is made to the description of that method, and the details are not repeated here in order to reduce redundancy.
An embodiment of the present invention further provides an electronic device, including: at least one processor and at least one memory; the memory is configured to store one or more program instructions; and the processor is configured to execute the one or more program instructions to perform the real-time occlusion culling method suitable for building information model lightweighting described above.
An embodiment of the invention also discloses a computer-readable storage medium in which computer program instructions are stored; when the computer program instructions are run on a computer, the computer is caused to execute the real-time occlusion culling method suitable for building information model lightweighting described above.
In an embodiment of the invention, the processor may be an integrated circuit chip having signal processing capability. The processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in connection with the embodiments of the present invention may be carried out directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The processor reads the information in the storage medium and completes the steps of the method in combination with its hardware.
The storage medium may be a memory, for example, which may be volatile memory or nonvolatile memory, or which may include both volatile and nonvolatile memory.
The nonvolatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory.
The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM).
The storage media described in connection with the embodiments of the invention are intended to comprise, without being limited to, these and any other suitable types of memory.
Those skilled in the art will appreciate that the functionality described in the present invention may be implemented in a combination of hardware and software in one or more of the examples described above. When implemented in software, the corresponding functionality may be stored on, or transmitted as one or more instructions or code over, a computer-readable medium. Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
The above describes the objects, technical solutions and advantages of the present invention in further detail. It should be understood that the above are only exemplary embodiments of the present invention and are not intended to limit its scope; any modifications, equivalent substitutions, improvements and the like made on the basis of the technical solutions of the present invention shall be included in the scope of the present invention.

Claims (10)

1. A real-time occlusion culling method suitable for BIM lightweighting, characterized by comprising the following steps:
determining the total number of sampling points and the viewport coordinates of each sampling point according to the resolution of the display device;
initializing a plurality of target objects to a hidden state;
determining the number of sampling points of each frame according to the given sampling-point coefficient and the total number of sampling points, wherein the sampling points of each frame are uniformly distributed; obtaining the three-dimensional space coordinates of each frame's sampling points according to the position of the camera and the viewport coordinates of those sampling points, and connecting the camera with the three-dimensional coordinates of each frame's sampling points to generate corresponding auxiliary lines;
acquiring a selected target object from the plurality of target objects, and acquiring the triangular-face information and state type of the selected target object;
if, according to the triangular-face information of the selected target object, the selected target object is determined to be the object that intersects a first preset auxiliary line and is closest to the camera, adjusting the selected target object to the display state;
if the selected target object is a transparent object, setting the selected target object to the hidden state M frames after it is adjusted to the display state; wherein M is a positive integer greater than 0.
2. The real-time occlusion culling method suitable for BIM lightweighting according to claim 1, further comprising, after adjusting the selected target object to the display state:
if the selected target object is a transparent object, ignoring the selected target object when auxiliary lines other than the first preset auxiliary line intersect it.
3. The real-time occlusion culling method suitable for BIM lightweighting according to claim 1, further comprising, after adjusting the selected target object to the display state:
if the selected target object is a non-transparent object and, within M frames from the time it was adjusted to the display state, it is the object that intersects a second preset auxiliary line and is closest to the camera, resetting the count of M at the time the selected target object intersects the second preset auxiliary line.
4. The real-time occlusion culling method suitable for BIM lightweighting according to any one of claims 1-3, wherein the uniform distribution of the sampling points of each frame comprises:
calculating the sampling coordinate data of the viewport in the x direction and the y direction with a Halton sequence algorithm according to the resolution of the display device, using different prime numbers for the x direction and the y direction so that all sampling points are uniformly distributed over the whole viewport plane, and obtaining the three-dimensional space coordinates of the uniformly distributed sampling points of each frame through a coordinate conversion algorithm.
5. A real-time occlusion culling apparatus suitable for BIM lightweighting, characterized by comprising:
an initialization module, configured to initialize a plurality of target objects to a hidden state;
a calculation module, configured to determine the total number of sampling points according to the resolution of the display device and to calculate the viewport coordinates of the uniformly distributed sampling points with a Halton sequence algorithm;
an auxiliary line generation module, configured to determine the number of sampling points of each frame according to a given sampling-point coefficient and the total number of sampling points, obtain the three-dimensional space coordinates of each frame's sampling points according to the position of the camera and the viewport coordinates of those sampling points, and connect the camera with the three-dimensional coordinates of each frame's sampling points to generate corresponding auxiliary lines;
a selection and acquisition module, configured to select a selected target object from the plurality of target objects and acquire the triangular-face information and state type of the selected target object; and
a control processing module, configured to set the selected target object to the display state if the state type of the selected target object is the hidden state and, according to its triangular-face information, the selected target object is determined to be the object that intersects a first preset auxiliary line and is closest to the camera; the control processing module is further configured to, if the selected target object is a transparent object, set the selected target object to the hidden state M frames after it is adjusted to the display state; wherein M is a positive integer greater than 0.
6. The real-time occlusion culling apparatus suitable for BIM lightweighting according to claim 5, wherein the control processing module is further configured to, after the selected target object is adjusted to the display state and if it is a transparent object, ignore the selected target object when auxiliary lines other than the first preset auxiliary line intersect it.
7. The real-time occlusion culling apparatus suitable for BIM lightweighting according to claim 5, wherein the control processing module is further configured to, after the selected target object is adjusted to the display state and if it is a non-transparent object, reset the count of M at the time the selected target object intersects a second preset auxiliary line, provided that within M frames from the adjustment to the display state the selected target object is the object that intersects the second preset auxiliary line and is closest to the camera.
8. The apparatus of claim 5, wherein the auxiliary line generation module is configured to calculate the sampling coordinate data of the viewport in the x direction and the y direction with a Halton sequence algorithm according to the resolution of the display device, use different prime numbers for the x direction and the y direction so that all sampling points are uniformly distributed over the whole viewport plane, and obtain the three-dimensional space coordinates of each frame's sampling points through a coordinate conversion algorithm.
9. An electronic device, characterized in that the electronic device comprises: at least one processor and at least one memory;
the memory is configured to store one or more program instructions;
the processor is configured to execute the one or more program instructions to perform the real-time occlusion culling method suitable for BIM lightweighting according to any one of claims 1 to 4.
10. A computer-readable storage medium having one or more program instructions embodied therein for performing the real-time occlusion culling method suitable for BIM lightweighting as recited in any one of claims 1-4.
CN201911320176.4A 2019-12-19 2019-12-19 Real-time occlusion culling method, device and equipment suitable for BIM lightweighting Active CN111191309B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911320176.4A CN111191309B (en) 2019-12-19 2019-12-19 Real-time occlusion culling method, device and equipment suitable for BIM lightweighting

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911320176.4A CN111191309B (en) 2019-12-19 2019-12-19 Real-time occlusion culling method, device and equipment suitable for BIM lightweighting

Publications (2)

Publication Number Publication Date
CN111191309A true CN111191309A (en) 2020-05-22
CN111191309B CN111191309B (en) 2023-01-24

Family

ID=70707432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911320176.4A Active CN111191309B (en) 2019-12-19 2019-12-19 Real-time occlusion culling method, device and equipment suitable for BIM lightweighting

Country Status (1)

Country Link
CN (1) CN111191309B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112906125A (en) * 2021-04-07 2021-06-04 中南大学 Light-weight loading method for BIM model of railway fixed facility

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6300965B1 (en) * 1998-02-17 2001-10-09 Sun Microsystems, Inc. Visible-object determination for interactive visualization
CN103632376A (en) * 2013-12-12 2014-03-12 江苏大学 Method for suppressing partial occlusion of vehicles by aid of double-level frames
CN108431736A (en) * 2015-10-30 2018-08-21 奥斯坦多科技公司 The system and method for gesture interface and Projection Display on body
CN109656313A (en) * 2017-10-10 2019-04-19 陈旭 Certain tablet computer and books or electronic data file

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Dai Meiling et al.: "Visualized continuous deformation measurement of a thin-walled spherical shell compressed by a rigid spherical surface", Acta Optica Sinica *
Wang Zhangye et al.: "Hidden-surface removal techniques for large-scale scenes", Computer Engineering and Applications *
Zhao Quanbang et al.: "Application of the Snake algorithm in moving object detection", Journal of Shenyang Ligong University *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112906125A (en) * 2021-04-07 2021-06-04 中南大学 Light-weight loading method for BIM model of railway fixed facility
CN112906125B (en) * 2021-04-07 2021-11-09 中南大学 Light-weight loading method for BIM model of railway fixed facility

Also Published As

Publication number Publication date
CN111191309B (en) 2023-01-24

Similar Documents

Publication Publication Date Title
CN111815755B (en) Method and device for determining blocked area of virtual object and terminal equipment
CN111738923A (en) Image processing method, apparatus and storage medium
CN110637461B (en) Compact optical flow handling in computer vision systems
US11694405B2 (en) Method for displaying annotation information, electronic device and storage medium
CN104851127B (en) It is a kind of based on interactive building point cloud model texture mapping method and device
CN113643414B (en) Three-dimensional image generation method and device, electronic equipment and storage medium
Li et al. High throughput hardware architecture for accurate semi-global matching
CN112348885A (en) Visual feature library construction method, visual positioning method, device and storage medium
CN113724391A (en) Three-dimensional model construction method and device, electronic equipment and computer readable medium
CN111191309B (en) 2023-01-24 Real-time occlusion culling method, device and equipment suitable for BIM lightweighting
CN113963072B (en) Binocular camera calibration method and device, computer equipment and storage medium
CN112634366B (en) Method for generating position information, related device and computer program product
CN114881841A (en) Image generation method and device
EP3770665A1 (en) Method for correcting rolling shutter phenomenon, rolling shutter phenomenon correcting apparatus, and computer-readable recording medium
WO2024002064A1 (en) Method and apparatus for constructing three-dimensional model, and electronic device and storage medium
CN115619986B (en) Scene roaming method, device, equipment and medium
DE102023105068A1 (en) Motion vector optimization for multiple refractive and reflective interfaces
CN116017129A (en) Method, device, system, equipment and medium for adjusting angle of light supplementing lamp
CN113610864B (en) Image processing method, device, electronic equipment and computer readable storage medium
CN114519764A (en) Three-dimensional model construction method and device and computer readable storage medium
CN111598992B (en) Partition removing and rendering method and system based on Unity3D body and surface model
CN106408499B (en) Method and device for acquiring reverse mapping table for image processing
CN113312979B (en) Image processing method and device, electronic equipment, road side equipment and cloud control platform
CN113470131B (en) Sea surface simulation image generation method and device, electronic equipment and storage medium
CN108062793A (en) Processing method, device, equipment and storage medium at the top of object based on elevation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant