CN113076040B - Curtain wall previewing method and system and storage medium - Google Patents

Curtain wall previewing method and system and storage medium

Info

Publication number
CN113076040B
CN113076040B (application CN202110476651.8A)
Authority
CN
China
Prior art keywords
curtain wall
target
model
initial axis
target model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110476651.8A
Other languages
Chinese (zh)
Other versions
CN113076040A (en)
Inventor
尤勇敏
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiuling Jiangsu Digital Intelligent Technology Co Ltd
Original Assignee
Jiuling Jiangsu Digital Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiuling Jiangsu Digital Intelligent Technology Co Ltd
Priority to CN202110476651.8A
Publication of CN113076040A
Application granted
Publication of CN113076040B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0486 Drag-and-drop
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tesselation
    • G06T17/205 Re-meshing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a curtain wall previewing method, system and storage medium, applied to a computer device. The previewing method includes the following steps: generating a visualization window of the computer device; acquiring a target curtain wall and initial axis network lines; generating sub-curtain wall contour lines of the curtain wall according to a preset segmentation model; constructing a target model according to the target curtain wall, the initial axis network lines and the sub-curtain wall contour lines; and displaying the target model in the visualization window. With the curtain wall previewing method, system and storage medium, the contour lines of the curtain wall are loaded directly into the visualization window without three-dimensional rendering, which greatly shortens the time needed to generate the displayed model, lets the user promptly see the actual partitioning effect of the chosen segmentation model, and improves the efficiency of selecting a segmentation model.

Description

Curtain wall previewing method and system and storage medium
Technical Field
The present application relates to the field of image display, and in particular, to a method and a system for previewing a curtain wall, and a storage medium.
Background
When a construction team builds a curtain wall, one curtain wall is generally divided into a plurality of sub-curtain walls in view of construction difficulty and construction cost, and the finished curtain wall is obtained by splicing the sub-curtain walls together. When the curtain wall is divided into sub-curtain walls by software, the process takes a long time, because the internal segmentation model needs computation time and the three-dimensional rendering also takes a certain time. If a user does not know which segmentation model to select and cannot preview its result, the user has to divide the curtain wall several times with different segmentation models before deciding which one to use for actual construction, which greatly reduces the efficiency of selecting a segmentation model.
Therefore, a method for previewing a curtain wall is needed to solve the above problems.
Disclosure of Invention
The invention aims to provide a curtain wall previewing method, system and storage medium, so as to solve the technical problem of low efficiency when a user selects a segmentation model.
In order to achieve the above object, an embodiment of the present invention provides a method for previewing a curtain wall, applied to a computer device, where the previewing method includes: generating a visualization window of the computer device; acquiring a target curtain wall and initial axis network lines; generating sub-curtain wall contour lines of the curtain wall according to a preset segmentation model; constructing a target model according to the target curtain wall, the initial axis network lines and the sub-curtain wall contour lines; and displaying the target model in the visualization window.
Further, the previewing method also includes: acquiring display parameters in real time, and adjusting at least one of the angle, the volume and the position of the target model in real time according to the display parameters.
Further, the step of acquiring display parameters in real time and adjusting at least one of the angle, the volume and the position of the target model in real time according to the display parameters includes: monitoring the movement operation of an input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window; obtaining, through the input device, a value of a volume adjustment parameter of the target model, wherein the volume adjustment parameter is used to adjust the volume of the target model; adjusting the size of the target model in the visualization window according to the current coordinate and the volume adjustment parameter, wherein the value of the volume adjustment parameter is proportional to the volume of the target model; and re-executing the step of monitoring the movement operation of the input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window.
Further, the step of acquiring display parameters in real time and adjusting at least one of the angle, the volume and the position of the target model in real time according to the display parameters includes: monitoring the click operation of an input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window as a start point coordinate; judging whether the input device performs a dragging operation; when it is determined that the input device performs a dragging operation, updating the end point coordinate of the input device according to the current coordinate after dragging; when it is determined that the input device does not perform a dragging operation, obtaining the start point coordinate and the end point coordinate of the dragging operation that was performed; adjusting the position of the target model in the visualization window according to the relative position of the start point coordinate and the end point coordinate; and re-executing the step of monitoring the click operation of the input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window.
Further, the step of acquiring display parameters in real time and adjusting at least one of the angle, the volume and the position of the target model in real time according to the display parameters includes: monitoring the click operation of an input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window as a start point coordinate; judging whether the input device performs a dragging operation; when it is determined that the input device performs a dragging operation, updating the end point coordinate of the input device according to the current coordinate after dragging; when it is determined that the input device does not perform a dragging operation, obtaining the start point coordinate and the end point coordinate of the dragging operation that was performed; adjusting the angle of the target model in the visualization window according to the relative position of the start point coordinate and the end point coordinate; and re-executing the step of monitoring the click operation of the input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window.
Further, in the step of adjusting the position of the target model in the visualization window according to the relative position of the start point coordinate and the end point coordinate, the position of the target model in the visualization window is adjusted by adjusting the observation camera coordinates according to a vector difference between the start point coordinate and the end point coordinate, wherein the observation camera is a virtual camera of the computer device used to output the captured image to the visualization window.
Further, in the step of adjusting the angle of the target model in the visualization window according to the relative position of the start point coordinate and the end point coordinate, the angle of the target model in the visualization window is adjusted by adjusting the observation camera coordinates according to a vector difference between the start point coordinate and the end point coordinate, wherein the observation camera is a virtual camera of the computer device used to output the captured image to the visualization window.
Further, the step of acquiring the target curtain wall and the initial axis network lines includes: obtaining the contour line of the target curtain wall; acquiring the initial axis network lines; judging whether the number of the initial axis network lines is greater than or equal to a preset number; and when the number of the initial axis network lines is judged to be less than the preset number, re-executing the step of acquiring the initial axis network lines.
Further, the step of obtaining the target curtain wall and the initial axis network line comprises: obtaining the contour line of a target curtain wall; generating a first image layer according to the contour line of the target curtain wall; acquiring an initial axis network line; and generating a second image layer according to the initial axis network line.
Further, the step of generating the sub-curtain wall contour lines of the curtain wall according to the preset segmentation model includes: acquiring the sub-curtain wall contour lines of the target curtain wall according to the preset segmentation model; and generating a third image layer according to the sub-curtain wall contour lines.
Further, the step of constructing the target model according to the target curtain wall, the initial axis network lines and the sub-curtain wall contour lines includes the following steps: acquiring the display state of each layer; and displaying all layers whose display state is the visible state, and re-executing the step of acquiring the display state of each layer.
The embodiment of the invention also provides a preview system of the curtain wall, which comprises the following components: the window generating unit is used for generating a visual window of the computer equipment; the reference object acquisition unit is used for acquiring a target curtain wall and an initial axis network line; the curtain wall segmentation unit is used for generating sub-curtain wall contour lines of the curtain wall according to a preset segmentation model; the model generation unit is used for constructing a target model according to the target curtain wall, the initial axis network line and the sub-curtain wall contour line; and a model display unit for displaying the target model in the visualization window.
An embodiment of the present invention further provides a storage medium, where a computer program is stored, and when the computer program is read and executed by a processor, any step in the foregoing preview method is executed.
According to the curtain wall previewing method, system and storage medium provided by the embodiments of the invention, the contour lines of the curtain wall are loaded directly into the visualization window without three-dimensional rendering, which greatly reduces the time needed to generate the displayed model, lets the user promptly see the actual partitioning effect of the chosen segmentation model, and thus improves the efficiency of selecting a segmentation model.
Drawings
The technical solutions and other advantages of the present application will become apparent from the following detailed description of specific embodiments of the present application when taken in conjunction with the accompanying drawings.
Fig. 1 is a flowchart of a curtain wall previewing method according to an embodiment of the present application.
Fig. 2 is a first sub-flowchart of step S200 shown in fig. 1.
Fig. 3 is a second sub-flowchart of step S200 shown in fig. 1.
Fig. 4 is a first sub-flowchart of step S600 shown in fig. 1.
Fig. 5 is a second sub-flowchart of step S600 shown in fig. 1.
Fig. 6 is a third sub-flowchart of step S600 shown in fig. 1.
Fig. 7 is a schematic structural diagram of a preview system of a curtain wall according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, the present embodiment provides a method for previewing a curtain wall, which includes the following steps.
Step S100, generating a visualization window of the computer device.
Illustratively, the visualization window is implemented through a MaskPanel control built into a WinForms application, so that the user can preview the finished curtain wall through the visualization window.
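As an illustration only, a minimal sketch of step S100 under WinForms assumptions is shown below; the patent's MaskPanel control is an internal control whose API is not described here, so the sketch uses an ordinary docked Panel, and the names PreviewForm and VisualizationPanel are invented for this example.

```csharp
using System.Drawing;
using System.Windows.Forms;

// Minimal sketch of step S100: a WinForms form whose docked panel acts as the
// visualization window for the curtain wall preview. "PreviewForm" and
// "VisualizationPanel" are hypothetical names, not the patent's MaskPanel API.
public class PreviewForm : Form
{
    public Panel VisualizationPanel { get; } = new Panel();

    public PreviewForm()
    {
        Text = "Curtain wall preview";
        VisualizationPanel.Dock = DockStyle.Fill;   // fill the whole client area
        VisualizationPanel.BackColor = Color.White;
        Controls.Add(VisualizationPanel);
    }
}
```

The panel can then receive the mouse events used for the zoom, pan and rotate adjustments described in steps S6101 to S6403 below.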
Step S200, acquiring a target curtain wall and initial axis network lines. The initial axis network lines are selected by the user; after the user selects them, the previewing method provided by this embodiment can insert sub-curtain wall contour lines between the selected initial axis network lines. Optionally, the initial axis network lines may also serve as references to help the user understand the spatial relationships of the displayed model. Optionally, the number of initial axis network lines is greater than or equal to two.
Referring to fig. 2, step S200 optionally includes steps S2101 to S2401.
Step S2101, obtaining the contour line of the target curtain wall.
Step S2201, an initial axis network line is acquired. In this embodiment, the initial axis network line is used to define the partition range of the subsequent segmentation model, or serves as a reference line to assist the user in understanding the spatial relationship of the target model.
Step S2301, determining whether the number of initial axis network lines is greater than or equal to a preset number. In this embodiment, if the initial axis network line is used to define the partition range of the subsequent segmentation model, the preset number of the initial axis network line may be set to two.
Step S2401, when it is determined that the number of the initial axis network lines is less than the preset number, step S2201 is executed again. When it is determined that the number of the initial axis network lines is greater than or equal to the preset number, step S300 is performed.
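A minimal sketch of the loop formed by steps S2201 to S2401 is given below, assuming the preset number is two (as in this embodiment) and assuming a caller-supplied routine that lets the user pick axis network lines; both the type and the routine are assumptions introduced for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.Drawing;

// Sketch of steps S2201-S2401: keep asking for initial axis network lines
// until at least the preset number (two in this embodiment) has been selected.
public static class AxisLineAcquisition
{
    public const int PresetNumber = 2;

    // promptUser is a placeholder for however the host application lets the
    // user select axis network lines (each line given by two end points).
    public static List<(PointF Start, PointF End)> Acquire(
        Func<List<(PointF Start, PointF End)>> promptUser)
    {
        List<(PointF Start, PointF End)> lines;
        do
        {
            lines = promptUser();                  // step S2201
        } while (lines.Count < PresetNumber);      // step S2401: too few, ask again
        return lines;                              // enough lines: continue to S300
    }
}
```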
Referring to fig. 3, step S200 optionally includes steps S2102 to S2402.
Step S2102, obtaining the contour line of the target curtain wall.
Step S2202, generating a first image layer according to the contour line of the target curtain wall.
Step S2302, obtaining an initial axis network line.
Step S2402, generating a second layer according to the initial axis network line.
In steps S2102 to S2402, by placing the contour line of the target curtain wall and the initial axis network lines on separate layers, the user may select different layers to display or modify according to actual requirements.
Continuing to refer to fig. 1, in step S300, a sub-curtain wall contour line of the curtain wall is generated according to the preset segmentation model.
In the present embodiment, the preset segmentation model divides the curtain wall into a plurality of sub-curtain walls according to a built-in segmentation algorithm. The dividing lines generated by the segmentation model are the contour lines of the sub-curtain walls. The present embodiment does not limit the segmentation model used in the previewing method. Illustratively, the segmentation model may be an unfolded-surface segmentation model (unfolding the three-dimensional curtain wall into a planar curtain wall and then dividing it into planar sub-curtain walls) or a three-dimensional segmentation model (directly dividing the three-dimensional curtain wall into three-dimensional sub-curtain walls through axis mesh surfaces). Optionally, step S300 may include the following steps: acquiring the sub-curtain wall contour lines of the target curtain wall according to the preset segmentation model, and generating a third layer according to the sub-curtain wall contour lines, so that the user can select different layers to display or modify according to actual requirements.
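The patent does not disclose the internal segmentation algorithm, so the following is only an illustrative sketch: an unfolded (planar) curtain wall outline, modeled here as a rectangle, is split into sub-panel contours at the x-positions of the selected axis network lines. The class and method names are invented for this example.

```csharp
using System.Collections.Generic;
using System.Drawing;
using System.Linq;

// Illustrative sketch only, not the patented segmentation model: split a planar
// curtain wall rectangle into sub-curtain-wall contours at vertical axis lines.
public static class CurtainWallSplitter
{
    public static List<RectangleF> SplitByVerticalAxes(RectangleF wall, IEnumerable<float> axisXs)
    {
        // Keep only axis positions that actually cross the wall, then sort them.
        var cuts = axisXs.Where(x => x > wall.Left && x < wall.Right)
                         .Concat(new[] { wall.Left, wall.Right })
                         .Distinct()
                         .OrderBy(x => x)
                         .ToList();

        var panels = new List<RectangleF>();
        for (int i = 0; i < cuts.Count - 1; i++)
            panels.Add(RectangleF.FromLTRB(cuts[i], wall.Top, cuts[i + 1], wall.Bottom));
        return panels;   // each rectangle's border is one sub-curtain wall contour line
    }
}
```

For instance, `SplitByVerticalAxes(new RectangleF(0, 0, 12, 3), new[] { 4f, 8f })` would yield three sub-panel outlines, each 4 units wide.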
Step S400, constructing a target model according to the target curtain wall, the initial axis network lines and the sub-curtain wall contour lines.
In some embodiments, step S400 includes the following steps: acquiring the display state of each layer; displaying all layers whose display state is the visible state; and executing the step of acquiring the display state of each layer again. The display state of each layer is either visible or invisible and is set by the user, so as to meet different display requirements.
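A minimal sketch of the layer bookkeeping described for step S400 is shown below, under the assumption that each layer simply holds 2D polylines; the OutlineLayer and TargetModelBuilder types are invented for illustration.

```csharp
using System.Collections.Generic;
using System.Drawing;

// Sketch of step S400: each layer (wall outline, axis lines, sub-panel outlines)
// carries a user-controlled visibility flag; only visible layers contribute
// their polylines to the target model that gets drawn in the window.
public class OutlineLayer
{
    public string Name { get; set; }                       // e.g. "wall", "axes", "sub-panels"
    public bool Visible { get; set; } = true;              // display state set by the user
    public List<PointF[]> Polylines { get; } = new List<PointF[]>();
}

public static class TargetModelBuilder
{
    public static List<PointF[]> Build(IEnumerable<OutlineLayer> layers)
    {
        var model = new List<PointF[]>();
        foreach (var layer in layers)
            if (layer.Visible)                              // skip layers the user hid
                model.AddRange(layer.Polylines);
        return model;                                       // contour lines to display in S500
    }
}
```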
Step S500, displaying the target model in the visualization window.
In some embodiments, the previewing method may further include step S600 of obtaining a display parameter in real time, and adjusting at least one of an angle, a volume, and a position of the target model in real time according to the display parameter.
In this embodiment, the user can correspondingly adjust the target model displayed in step S500 by acquiring the display parameters in real time.
Referring to fig. 4, optionally, step S600 includes steps S6101 to S6301.
Step S6101, monitoring the movement operation of the input device of the computer device in real time to obtain the current coordinates of the input device within the visualization window. For example, the input device of the computer device may be a mouse or a keyboard.
Step S6201, obtaining, through the input device, a value of a volume adjustment parameter of the target model, wherein the volume adjustment parameter is used to adjust the volume of the target model.
Step S6301, adjusting the size of the target model in the visualization window according to the current coordinates and the volume adjustment parameter, and then executing step S6101 again. The value of the volume adjustment parameter is proportional to the volume of the target model. Specifically, the displayed model grows when the wheel of the input device scrolls in one direction and shrinks when it scrolls in the other direction. In this embodiment, steps S6101 to S6301 achieve the zoom-in and zoom-out effect for the displayed model.
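Under WinForms assumptions, the wheel-driven zoom of steps S6101 to S6301 can be sketched as follows; the wheel delta plays the role of the volume adjustment parameter, and the 1.1 scale factor is an arbitrary illustrative value, not taken from the patent.

```csharp
using System.Windows.Forms;

// Sketch of steps S6101-S6301: scrolling the mouse wheel over the visualization
// window enlarges or shrinks the displayed model by changing a scale factor.
public class ZoomController
{
    public float Scale { get; private set; } = 1.0f;

    public void Attach(Control visualizationWindow)
    {
        visualizationWindow.MouseWheel += (sender, e) =>
        {
            // e.Delta > 0 when the wheel rolls one way, < 0 the other way.
            Scale *= e.Delta > 0 ? 1.1f : 1f / 1.1f;
            visualizationWindow.Invalidate();   // repaint the model at the new size
        };
    }
}
```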
As shown in fig. 5, optionally, step S600 includes steps S6102 to S6402.
Step S6102, monitoring the click operation of the input device of the computer device in real time to obtain the current coordinate of the input device in the visualization window, and using the current coordinate as the starting point coordinate. Specifically, the input device is a mouse.
Step S6202, judging whether the input device performs a dragging operation. Specifically, whether the input device is in a dragging state is determined through an ondrag function preset in the C# function library.
Step S6302, when it is determined that the input device performs a dragging operation, updating the end point coordinate of the input device according to the current coordinate after dragging; when it is determined that the input device does not perform a dragging operation, obtaining the start point coordinate and the end point coordinate of the dragging operation that was performed. Step S6302 accurately obtains the start point and end point coordinates of the dragging operation, which facilitates the subsequent adjustment of the position of the target model in the visualization window.
Step S6402, adjusting the position of the target model in the visualization window according to the relative position of the start point coordinate and the end point coordinate, and then executing step S6102 again. Optionally, the position of the target model in the visualization window is adjusted by adjusting the observation camera coordinates according to the vector difference between the start point coordinate and the end point coordinate. The observation camera is a virtual camera of the computer device used to output the captured image to the visualization window. Specifically, the vector from the start point coordinate to the end point coordinate is obtained and added to the coordinates of the observation camera (in C#, the observation camera can simulate the actual loading effect of the displayed model by capturing a picture), thereby panning the displayed model in the visualization window.
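A sketch of the drag-to-pan flow of steps S6102 to S6402 under the same WinForms assumptions: the vector between the start and end coordinates of the drag is added to a camera offset, which pans the displayed model. The PanController type is invented for this example.

```csharp
using System.Drawing;
using System.Windows.Forms;

// Sketch of steps S6102-S6402: while the mouse is dragged across the
// visualization window, the drag vector is accumulated into a camera offset.
public class PanController
{
    private Point _start;          // start point coordinate of the current drag
    private bool _dragging;
    public PointF CameraOffset { get; private set; }

    public void Attach(Control visualizationWindow)
    {
        visualizationWindow.MouseDown += (s, e) => { _start = e.Location; _dragging = true; };
        visualizationWindow.MouseUp   += (s, e) => _dragging = false;
        visualizationWindow.MouseMove += (s, e) =>
        {
            if (!_dragging) return;
            // Vector difference between the end point and start point of the drag.
            CameraOffset = new PointF(CameraOffset.X + (e.X - _start.X),
                                      CameraOffset.Y + (e.Y - _start.Y));
            _start = e.Location;                  // next segment starts here
            visualizationWindow.Invalidate();     // redraw the model at its new position
        };
    }
}
```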
As shown in fig. 6, optionally, step S600 includes steps S6103 to S6403.
Step S6103, monitoring the click operation of the input device of the computer device in real time to obtain the current coordinate of the input device in the visualization window, and using the current coordinate as the starting point coordinate. Specifically, the input device is a mouse.
Step S6203, judging whether the input device performs a dragging operation. Specifically, whether the input device is in a dragging state is determined through an ondrag function preset in the C# function library.
Step S6303, when it is determined that the input device performs a dragging operation, updating the end point coordinate of the input device according to the current coordinate after dragging; when it is determined that the input device does not perform a dragging operation, obtaining the start point coordinate and the end point coordinate of the dragging operation that was performed. Step S6303 accurately obtains the start point and end point coordinates of the dragging operation, so that the angle of the target model in the visualization window can subsequently be adjusted.
Step S6403, adjusting the angle of the target model in the visualization window according to the relative position of the start point coordinate and the end point coordinate, and then executing step S6103 again. Optionally, the angle of the target model in the visualization window is adjusted by adjusting the observation camera coordinates according to the vector difference between the start point coordinate and the end point coordinate. The observation camera is a virtual camera of the computer device used to output the captured image to the visualization window. Specifically, the x-axis and y-axis components of the vector from the start point coordinate to the end point coordinate are obtained, and the camera coordinates are set according to these components to rotate the camera, thereby adjusting the angle of the displayed model.
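Likewise, a sketch of the drag-to-rotate flow of steps S6103 to S6403 under WinForms assumptions: the x and y components of the drag vector are mapped to two rotation angles of the observation camera. The OrbitController type and the degrees-per-pixel factor are illustrative assumptions.

```csharp
using System.Drawing;
using System.Windows.Forms;

// Sketch of steps S6103-S6403: the horizontal and vertical components of the
// drag vector turn the observation camera, which rotates the displayed model.
public class OrbitController
{
    private Point _start;
    private bool _dragging;
    public float YawDegrees { get; private set; }     // driven by the x component
    public float PitchDegrees { get; private set; }   // driven by the y component

    public void Attach(Control visualizationWindow, float degreesPerPixel = 0.5f)
    {
        visualizationWindow.MouseDown += (s, e) => { _start = e.Location; _dragging = true; };
        visualizationWindow.MouseUp   += (s, e) => _dragging = false;
        visualizationWindow.MouseMove += (s, e) =>
        {
            if (!_dragging) return;
            YawDegrees   += (e.X - _start.X) * degreesPerPixel;
            PitchDegrees += (e.Y - _start.Y) * degreesPerPixel;
            _start = e.Location;
            visualizationWindow.Invalidate();          // redraw at the new viewing angle
        };
    }
}
```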
According to the curtain wall previewing method provided by this embodiment, the contour lines of the curtain wall are loaded directly into the visualization window without three-dimensional rendering, which greatly reduces the time needed to generate the displayed model, lets the user promptly see the actual partitioning effect of the chosen segmentation model, and thus improves the efficiency of selecting a segmentation model.
Referring to fig. 7, based on the same inventive concept, this embodiment further provides a curtain wall preview system 100, which includes a window generating unit 110, a reference object obtaining unit 120, a curtain wall segmentation unit 130, a model generation unit 140, and a model display unit 150.
The window generating unit 110 is used to generate a visualization window of the computer device. The reference object obtaining unit 120 is used to obtain a target curtain wall and initial axis network lines. The curtain wall segmentation unit 130 is used to generate sub-curtain wall contour lines of the curtain wall according to a preset segmentation model. The model generation unit 140 is configured to construct a target model according to the target curtain wall, the initial axis network lines and the sub-curtain wall contour lines. The model display unit 150 is used to display the target model in the visualization window.
In the curtain wall preview system provided by this embodiment, the contour lines of the curtain wall are loaded directly into the visualization window without three-dimensional rendering, which greatly reduces the time needed to generate the displayed model, lets the user promptly see the actual partitioning effect of the chosen segmentation model, and thus improves the efficiency of selecting a segmentation model.
The curtain wall previewing method, system and storage medium provided by the embodiments of the present application have been described in detail above. A specific example is used herein to explain the principle and implementation of the present application, and the description of the embodiments is only intended to help understand the technical solution and core idea of the present application. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (7)

1. A method for previewing a curtain wall, applied to a computer device, characterized by comprising the following steps:
generating a visualization window of the computer device;
acquiring a target curtain wall and an initial axis network line;
generating a sub-curtain wall contour line of the curtain wall according to a preset segmentation model;
constructing a target model according to the target curtain wall, the initial axis network line and the sub-curtain wall contour line; and
displaying the target model in the visualization window;
acquiring display parameters in real time, and adjusting at least one parameter of the angle, the volume and the position of the target model in real time according to the display parameters;
the step of acquiring display parameters in real time and adjusting at least one parameter of the angle, the volume and the position of the target model in real time according to the display parameters comprises the following steps:
monitoring the movement operation of an input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window;
obtaining, by the input device, a value of a volume adjustment parameter of a target model, wherein the volume adjustment parameter is used to adjust a volume of the target model;
adjusting the size of the target model in the visualization window according to the current coordinates and the volume adjustment parameter, wherein the value of the volume adjustment parameter is in a proportional relationship with the volume of the target model; and
re-executing the step of monitoring the movement operation of the input device of the computer device in real time to acquire the current coordinate of the input device in the visualization window.
2. The preview method of claim 1, wherein the step of obtaining the target curtain wall and the initial axis network line comprises:
obtaining the contour line of a target curtain wall;
acquiring an initial axis network line;
judging whether the number of the initial axis network lines is greater than or equal to a preset number; and
when the number of the initial axis network lines is judged to be less than the preset number, re-executing the step of acquiring the initial axis network lines.
3. The preview method of claim 1, wherein the step of obtaining the target curtain wall and the initial axis network line comprises:
obtaining the contour line of a target curtain wall;
generating a first image layer according to the contour line of the target curtain wall;
acquiring an initial axis network line;
and generating a second image layer according to the initial axis network line.
4. The preview method of claim 1, wherein the step of generating a sub-curtain wall contour of the curtain wall according to a preset segmentation model comprises:
acquiring a sub-curtain wall contour line of the target curtain wall according to a preset segmentation model;
and generating a third image layer according to the sub-curtain wall contour line.
5. The preview method of claim 3, wherein the step of constructing the target model according to the target curtain wall, the initial axis network line, and the sub-curtain wall contour line includes:
acquiring the display state of each layer; and
displaying all layers whose display state is the visible state, and executing the step of acquiring the display state of each layer again.
6. A system for previewing curtain walls, comprising:
the window generating unit is used for generating a visual window of the computer equipment;
the reference object acquisition unit is used for acquiring a target curtain wall and an initial axis network line;
the curtain wall segmentation unit is used for generating sub-curtain wall contour lines of the curtain wall according to a preset segmentation model;
the model generation unit is used for constructing a target model according to the target curtain wall, the initial axis network line and the sub-curtain wall contour line;
a model display unit to display the target model in the visualization window;
the display parameter acquisition unit is used for acquiring display parameters in real time and adjusting at least one parameter of the angle, the volume and the position of the target model in real time according to the display parameters;
the display parameter acquisition unit includes:
the coordinate acquisition unit is used for monitoring the moving operation of the input equipment of the computer equipment in real time so as to acquire the current coordinate of the input equipment in the visualization window;
a parameter adjusting unit, configured to obtain, through the input device, a value of a volume adjustment parameter of a target model, where the volume adjustment parameter is used to adjust a volume of the target model;
a window adjusting unit, configured to adjust a size of the target model in the visualization window according to the current coordinate and the volume adjustment parameter, where a value of the volume adjustment parameter is in a proportional relationship with a volume of the target model; and
and the coordinate acquisition unit is further used for re-executing the step of monitoring, in real time, the movement operation of the input device of the computer device, so as to acquire the current coordinate of the input device in the visualization window.
7. A storage medium, characterized in that it stores a computer program which, when read and executed by a processor, performs the steps of the method for previewing a curtain wall according to any one of claims 1 to 5.
CN202110476651.8A 2021-04-29 2021-04-29 Curtain wall previewing method and system and storage medium Active CN113076040B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110476651.8A CN113076040B (en) 2021-04-29 2021-04-29 Curtain wall previewing method and system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110476651.8A CN113076040B (en) 2021-04-29 2021-04-29 Curtain wall previewing method and system and storage medium

Publications (2)

Publication Number Publication Date
CN113076040A (en) 2021-07-06
CN113076040B (en) 2022-10-25

Family

ID=76616144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110476651.8A Active CN113076040B (en) 2021-04-29 2021-04-29 Curtain wall previewing method and system and storage medium

Country Status (1)

Country Link
CN (1) CN113076040B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114004930A (en) * 2021-10-28 2022-02-01 广东三维家信息科技有限公司 Three-dimensional model generation method, device, equipment and storage medium
CN115859412B (en) * 2022-12-23 2023-10-20 中国建筑西南设计研究院有限公司 Curtain wall rapid generation method, device and equipment based on BIM

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102243653B1 (en) * 2014-02-17 2021-04-23 엘지전자 주식회사 Didsplay device and Method for controlling thereof
CN108121873A (en) * 2017-12-25 2018-06-05 广州江河幕墙系统工程有限公司 A kind of application process of buildings model in curtain wall construction
CN109271654B (en) * 2018-07-19 2023-04-18 平安科技(深圳)有限公司 Model contour segmentation method and device, storage medium and terminal
CN111552917B (en) * 2020-05-21 2023-01-24 广州市第四装修有限公司 Curtain wall construction method, device and equipment based on BIM and storage medium
CN111666126B (en) * 2020-06-09 2021-06-29 中亿丰建设集团股份有限公司 Building Internet of things integration system and method based on 3d visualization code-free development technology

Also Published As

Publication number Publication date
CN113076040A (en) 2021-07-06

Similar Documents

Publication Publication Date Title
KR101613721B1 (en) Methodology for 3d scene reconstruction from 2d image sequences
AU2017213540B2 (en) 3d sculpting
CN113076040B (en) Curtain wall previewing method and system and storage medium
US11436800B2 (en) Interactive system and method providing real-time virtual reality visualization of simulation data
EP0926628A2 (en) mpparatus and method for generating progressive polygon data
JP2018537755A (en) Foveal geometry tessellation
KR101867991B1 (en) Motion edit method and apparatus for articulated object
CN110335345B (en) Curtain wall node rendering method and device, computer equipment and storage medium
WO2008051231A1 (en) System and method for recovering three-dimensional particle systems from two-dimensional images
CN114708391B (en) Three-dimensional modeling method, three-dimensional modeling device, computer equipment and storage medium
US20220375152A1 (en) Method for Efficiently Computing and Specifying Level Sets for Use in Computer Simulations, Computer Graphics and Other Purposes
CN113781642B (en) Three-dimensional model multi-level LOD generation method based on face reduction algorithm
KR20160016812A (en) Image edits propagation to underlying video sequence via dense motion fields
CN114820980A (en) Three-dimensional reconstruction method and device, electronic equipment and readable storage medium
CN104240301B (en) Geological surface reconstructing method and equipment
Moustakides et al. 3D image acquisition and NURBS based geometry modelling of natural objects
KR20190067070A (en) System and method for projection mapping in virtual space
CN114452646A (en) Virtual object perspective processing method and device and computer equipment
JP4091403B2 (en) Image simulation program, image simulation method, and image simulation apparatus
JP7119854B2 (en) Changed pixel region extraction device, image processing system, changed pixel region extraction method, image processing method and program
KR101673442B1 (en) The method and apparatus for remeshing visual hull approximation by DBSS(displaced butterfly subdivision surface)
JP2020013390A (en) Information processing apparatus, information processing program, and information processing method
Ozbay et al. A Novel Approach to Smoothing on 3D Structured Adaptive Mesh of the Kinect-Based Models
KR20120099994A (en) Method, apparatus and system for dense graph simplification, recording medium for the same
JP2017084214A (en) Information processing system, control method thereof, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant