CN117611726A - Real-scene model sunlight display method and device

Real-scene model sunlight display method and device

Info

Publication number
CN117611726A
Authority
CN
China
Prior art keywords: target, value, vertex, rate, solar
Prior art date
Legal status
Granted
Application number
CN202410095122.7A
Other languages
Chinese (zh)
Other versions
CN117611726B (en)
Inventor
张帅
平红燕
Current Assignee
Airlook Aviation Technology Beijing Co ltd
Original Assignee
Airlook Aviation Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Airlook Aviation Technology Beijing Co., Ltd.
Priority claimed from application CN202410095122.7A
Publication of CN117611726A
Application granted; publication of CN117611726B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/50 - Lighting effects
    • G06T 15/506 - Illumination models
    • G06T 15/04 - Texture mapping
    • G06T 2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T 2200/04 - Indexing scheme for image data processing or generation, in general, involving 3D image data

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

An embodiment of this specification discloses a method and a device for displaying sunlight on a live-action model, comprising the following steps: generating a bounding rectangle of a target area, and determining sampling blocks along the X, Y, and Z coordinate axes within a first spatial range formed from the bounding rectangle, a preset bottom elevation value, and a preset relative elevation value; determining the sunlight rate of each sampling block over a preset time period, and generating a 3D texture from those sunlight rates; and screening out the target vertices of the live-action three-dimensional model within a second spatial range, determining the sunlight color value of each target vertex, and displaying the sunlight information of the live-action three-dimensional model within the second spatial range according to those color values.

Description

Real-scene model sunlight display method and device
Technical Field
The application relates to the technical field of sunlight analysis, in particular to a method and a device for displaying sunlight of a live-action model.
Background
In many fields, such as architectural design and housing sales, sunlight conditions are an important consideration, so sunlight analysis of buildings and houses is an important problem.
However, existing sunlight-analysis techniques for buildings fall short of requirements: a building's illumination conditions are difficult to present intuitively, and the accuracy of illumination analysis still needs improvement.
Disclosure of Invention
The embodiments of this specification provide a method and a device for displaying sunlight on a live-action model, to solve the technical problem of how to perform sunlight analysis and display on a live-action three-dimensional model.
The embodiment of the specification provides a method for displaying sunlight of a live-action model, which comprises the following steps:
generating a surrounding rectangular frame of a target area, and determining sampling blocks in X, Y, Z three coordinate axis directions in a first space range formed based on the surrounding rectangular frame, a preset bottom elevation value and a preset relative elevation value;
determining the sunlight rate of each sampling block in a preset time period, and generating 3D textures by the sunlight rate of each sampling block;
screening out target vertexes of the live-action three-dimensional model in a second space range, determining the sunlight color value of each target vertex, and displaying sunlight information of the live-action three-dimensional model in the second space range according to the sunlight color value of each target vertex;
the second space range is formed based on the target area, a preset bottom elevation value and a preset relative elevation value;
determining the sunlight color value of each target vertex comprises: for any target vertex, determining the sunlight rate corresponding to that vertex in the 3D texture and the sunlight rates of the sampling blocks adjacent to the block containing the vertex; interpolating the vertex's corresponding sunlight rate with the rates of the adjacent blocks to obtain the sunlight rate of the target vertex; and linearly interpolating between the color value of the lowest sunlight rate and the color value of the highest sunlight rate, weighted by the vertex's sunlight rate, to obtain the sunlight color value of the target vertex.
The embodiment of the specification provides a real model sunlight display device, which comprises:
the sampling module is used for generating a surrounding rectangular frame of the target area and determining sampling blocks in X, Y, Z coordinate axis directions in a first space range formed based on the surrounding rectangular frame, a preset bottom elevation value and a preset relative elevation value;
the texture module is used for determining the sunlight rate of each sampling block in a preset time period and generating 3D textures according to the sunlight rate of each sampling block;
the coloring module is used for screening out target vertexes of the live-action three-dimensional model in the second space range, determining the sunlight color values of the target vertexes and displaying the sunlight information of the live-action three-dimensional model in the second space range according to the sunlight color values of the target vertexes;
the second space range is formed based on the target area, a preset bottom elevation value and a preset relative elevation value;
determining the sunlight color value of each target vertex comprises: for any target vertex, determining the sunlight rate corresponding to that vertex in the 3D texture and the sunlight rates of the sampling blocks adjacent to the block containing the vertex; interpolating the vertex's corresponding sunlight rate with the rates of the adjacent blocks to obtain the sunlight rate of the target vertex; and linearly interpolating between the color value of the lowest sunlight rate and the color value of the highest sunlight rate, weighted by the vertex's sunlight rate, to obtain the sunlight color value of the target vertex.
At least one of the technical solutions adopted in the embodiments of this specification can achieve the following beneficial effects:
The first spatial range is sampled, and the sunlight rate of each sampling block is determined to generate the 3D texture. For each target vertex of the live-action three-dimensional model within the second spatial range, the sunlight color value is determined by mapping or indexing into the 3D texture, so that the sunlight information of the model within the second spatial range can be displayed according to the vertices' color values. Sunlight analysis can thus be performed on a live-action three-dimensional model, and the sunlight color information can be attached directly to the model to display the analysis result, making the sunlight display of the live-action three-dimensional model clearer and more intuitive.
Drawings
In order to illustrate the embodiments of the present specification or the prior-art technical solutions more clearly, the drawings used in their description are briefly introduced below. The drawings described here depict only some embodiments of the present application; those of ordinary skill in the art may derive other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of an execution subject of a real model sunlight display method in the first embodiment of the present specification.
Fig. 2 is a flowchart of a real model sunlight display method in the first embodiment of the present specification.
Fig. 3 is a schematic view of a real model sunlight display effect in the first embodiment of the present specification.
Fig. 4 is a schematic structural view of a real model sunlight display device in the second embodiment of the present specification.
Detailed Description
In order that those skilled in the art may better understand the technical solutions in this specification, the technical solutions in its embodiments are described clearly and completely below with reference to the accompanying drawings. The embodiments described here are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive effort fall within the scope of the present application.
The first embodiment (hereinafter "embodiment one") of the present disclosure provides a sunlight display method for a live-action three-dimensional model. Its execution subject may be a terminal (including but not limited to a mobile phone, computer, tablet, or television), a server, an operating system, an application program, a live-action three-dimensional model sunlight display platform, or a sunlight display system; that is, the execution subject can vary and may be set, used, or changed as needed. A third-party application may also assist the execution subject in executing embodiment one. For example, as shown in fig. 1, the method of the first embodiment may be executed by a server, with a corresponding application program installed on a terminal held by a user; data is transmitted between the terminal or application and the server, and acquired, input, output, or processed by the terminal or application, thereby assisting the server in executing the sunlight display method of the first embodiment.
As shown in fig. 2, the live-action three-dimensional model sunlight display method provided in the first embodiment includes:
s101: generating a surrounding rectangular frame of a target area, and determining sampling blocks in X, Y, Z three coordinate axis directions in a first space range formed based on the surrounding rectangular frame, a preset bottom elevation value and a preset relative elevation value;
in the first embodiment, if a certain area (ground area) is to be subjected to solar analysis, the area may be referred to as a target area. The first embodiment is not particularly limited as to the area or shape of the target area.
In the first embodiment, a bounding rectangle of the target area may be generated, which may include: converting the longitude/latitude set of the target area's vertices (i.e., the set of each vertex's longitude and latitude coordinates) into a set of Cartesian coordinates (Earth-centered coordinates, one per vertex); multiplying each Cartesian coordinate by the inverse of the offset matrix of the live-action three-dimensional model (the live-action three-dimensional model generally refers to the whole live-action model of a preset region that contains, and is larger than, the target area) to obtain the local coordinate set of the target area's vertices; and deriving the bounding rectangle of the target area from that local coordinate set.
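The coordinate pipeline above can be sketched as follows (Python). This is a minimal illustration, not the patent's exact math: it assumes a simple spherical Earth for the longitude/latitude conversion and models the live-action model's offset as a plain translation, whereas the patent multiplies by the inverse of a full offset matrix; all function names are illustrative.

```python
import math

EARTH_RADIUS = 6378137.0  # metres; simple spherical Earth, not WGS84

def lonlat_to_cartesian(lon_deg, lat_deg, height=0.0):
    """Geodetic lon/lat (degrees) -> Earth-centred Cartesian coordinates."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    r = EARTH_RADIUS + height
    return (r * math.cos(lat) * math.cos(lon),
            r * math.cos(lat) * math.sin(lon),
            r * math.sin(lat))

def bounding_rect_local(lonlat_vertices, model_offset):
    """Convert each target-area vertex to Cartesian coordinates, shift it
    into the model's local frame (here a translation; the patent uses the
    inverse of the model's full offset matrix), and take the XY min/max as
    the bounding rectangle's start and end corners."""
    ox, oy, oz = model_offset
    local = []
    for lon, lat in lonlat_vertices:
        x, y, z = lonlat_to_cartesian(lon, lat)
        local.append((x - ox, y - oy))
    xs = [p[0] for p in local]
    ys = [p[1] for p in local]
    return (min(xs), min(ys)), (max(xs), max(ys))
```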
Based on the bounding rectangle and the predetermined bottom elevation values and relative elevation values, a spatial range, referred to as a first spatial range, may be formed. The bottom surface of the first space range is an area surrounded by the rectangular frame, the position or the height of the bottom surface of the first space range is determined according to the bottom elevation value, and the height of the first space range is determined according to the relative height value.
In the first embodiment, the sampling may be performed according to the three coordinate axis directions X, Y, Z, so as to divide the first spatial range into a plurality of sampling blocks, i.e. the sampling blocks in the three coordinate axis directions X, Y, Z in the first spatial range are determined. Wherein determining the sample blocks in the directions of the three coordinate axes of X, Y, Z in the first spatial range may comprise: and determining the number of sampling blocks in the directions of three coordinate axes in the first space range X, Y, Z according to the preset interval value, and dividing the first space range according to the number of the sampling blocks in the directions of the three coordinate axes X, Y, Z to obtain each sampling block in the directions of the three coordinate axes in the first space range X, Y, Z. For example, if the length of the first spatial range along the X-axis direction is a, the length along the Y-axis direction is b, the length along the Z-axis direction is c, and the preset interval value is 1, the number of sampling blocks along the X-axis direction is a, the number of sampling blocks along the Y-axis direction is b, and the number of sampling blocks along the Z-axis direction is c, so that the first spatial range is divided into c layers, and each layer of sampling blocks has a×b sampling blocks, that is, the first spatial range is divided into a×b×c sampling blocks.
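The division into sampling blocks can be sketched as follows (Python; `sample_block_counts` and `block_centers` are illustrative names, not from the patent):

```python
import math

def sample_block_counts(size_x, size_y, size_z, interval):
    """Number of sampling blocks along X, Y, Z for a first spatial range
    of the given extents, divided at the preset interval value."""
    return (math.ceil(size_x / interval),
            math.ceil(size_y / interval),
            math.ceil(size_z / interval))

def block_centers(origin, counts, interval):
    """Centre point of every sampling block, keyed by its (i, j, k) index."""
    ox, oy, oz = origin
    nx, ny, nz = counts
    return {(i, j, k): (ox + (i + 0.5) * interval,
                        oy + (j + 0.5) * interval,
                        oz + (k + 0.5) * interval)
            for i in range(nx) for j in range(ny) for k in range(nz)}
```

With lengths a = 4, b = 3, c = 2 and a preset interval value of 1, this yields 4 * 3 * 2 = 24 sampling blocks, matching the a*b*c count in the example above.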
S103: determining the sunlight rate of each sampling block in a preset time period, and generating 3D textures by the sunlight rate of each sampling block;
In the first embodiment, the preset time period and the time sampling points within it may be determined. Specifically, parameters such as a date, a start time, an end time, and a sampling interval may be input, determining the time period between the start and end times on that date (i.e., the preset time period) and each time sampling point within it. For example, with a start time of 8:00, an end time of 11:00, and a sampling interval of 30 minutes, the time sampling points are 8:00, 8:30, 9:00, 9:30, 10:00, 10:30, and 11:00.
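Enumerating the time sampling points can be sketched as follows (Python; a minimal standard-library sketch, with an inclusive end time as in the example above):

```python
from datetime import datetime, timedelta

def time_sample_points(date, start, end, interval_minutes):
    """Every time sampling point from the start time to the end time
    (inclusive) on the given date, stepped by the sampling interval."""
    t = datetime.fromisoformat(f"{date}T{start}")
    stop = datetime.fromisoformat(f"{date}T{end}")
    step = timedelta(minutes=interval_minutes)
    points = []
    while t <= stop:
        points.append(t)
        t += step
    return points
```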
For any sampling block, the sunlight rate of that block over the preset time period can be determined. Specifically, at each time sampling point it is judged whether the block can be irradiated by sunlight, and the proportion of time sampling points at which it can be irradiated, out of the total number of time sampling points, is taken as the block's sunlight rate over the preset period. For any sampling block and any time sampling point, judging whether the block can be irradiated by sunlight at that point may include: calculating the light-source position at that time sampling point (the light-source position is the position of the sun, with the Earth's center as origin) and judging whether the sampling block and the light-source position are mutually visible, i.e., whether the line between them is unobstructed. If visible, the block can be irradiated by sunlight at that time sampling point; if not visible, it cannot.
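Under these definitions, the sunlight rate of one sampling block reduces to a visible-fraction count. The sketch below (Python) leaves the sun-position calculation and the occlusion test as caller-supplied functions, since the patent does not fix particular algorithms for them:

```python
def sunlight_rate(block_center, sample_times, sun_position_at, is_visible):
    """Sunlight rate of one sampling block over the preset period: the
    fraction of time sampling points at which the block and the sun's
    position are mutually visible. `sun_position_at` and `is_visible`
    (an occlusion test against the scene) are caller-supplied stand-ins."""
    lit = sum(1 for t in sample_times
              if is_visible(block_center, sun_position_at(t)))
    return lit / len(sample_times)
```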
In this way, the sunlight rate of each sampling block over the preset time period is determined. In the first embodiment, the 3D texture may be generated from these sunlight rates, which may include: setting the length, width, and height pixel sizes of the 3D texture to a reference value, and establishing the correspondence between each sampling block and the pixels of the 3D texture. The alpha channel of each 3D texture pixel is used to store the sunlight rate of the corresponding sampling block, i.e., for any pixel of the 3D texture, its alpha channel stores the sunlight rate of the sampling block corresponding to that pixel. That is, the data source of the 3D texture's pixels is the sunlight rate of each sampling block over the preset time period.
Specifically, the reference value may be based on the number of sampling blocks in the directions of three coordinate axes of X, Y, Z in the first spatial range. For example, the reference value is not smaller than the maximum value of the number of sampling blocks in the directions of three coordinate axes of X, Y, Z in the first spatial range. In particular, a power of 2 that is equal to or greater than "the maximum value of the number of sampling blocks in the three coordinate axis directions of X, Y, Z in the first spatial range" may be used as the reference value.
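Choosing the reference value as a power of two not smaller than the largest per-axis block count can be sketched as follows (Python; illustrative name):

```python
def texture_reference_value(counts):
    """Smallest power of two >= the largest sampling-block count along any
    of the X, Y, Z axes; used as the 3D texture's length/width/height in
    pixels so that every sampling block maps to a pixel."""
    largest = max(counts)
    ref = 1
    while ref < largest:
        ref *= 2
    return ref
```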
As described above, the 3D texture created in the first embodiment is a genuinely three-dimensional texture (a 3D picture formed of pixels), and the first spatial range is likewise a three-dimensional range containing the sampling blocks, so a correspondence or index relationship can be created between 3D texture pixels and sampling blocks, including but not limited to a correspondence through coordinates. Specifically, each sampling block can be located by its coordinates along the X, Y, Z axes, and each 3D texture pixel can likewise be located by its coordinates along the length, width, and height directions (corresponding to the X, Y, Z axes), so blocks and pixels with the same coordinates correspond to and index one another, establishing the correspondence between 3D texture pixels and sampling blocks. For example, if a sampling block has coordinates (A1, B1, C1), i.e., it is the A1-th block along the X axis, the B1-th along the Y axis, and the C1-th along the Z axis, then the pixel that is A1-th along the texture's length, B1-th along its width, and C1-th along its height corresponds to that block, and the two can be indexed to each other.
Since the 3D texture has at least the reference value of pixels along each of its length, width, and height, and the reference value is not smaller than the maximum number of sampling blocks along any of the X, Y, Z axes in the first spatial range, every sampling block is guaranteed a corresponding 3D texture pixel. Thus for any pixel of the 3D texture, its alpha channel can store the sunlight rate of its corresponding sampling block (where such a block exists).
S105: screening out target vertexes of the live-action three-dimensional model in a second space range, determining the sunlight color value of each target vertex, and displaying sunlight information of the live-action three-dimensional model in the second space range according to the sunlight color value of each target vertex; the second space range is formed based on the target area, a preset bottom elevation value and a preset relative elevation value; determining the solar color value of each target vertex comprises: for any target vertex, determining the solar rate corresponding to the target vertex in the 3D texture and the solar rate of each adjacent sampling block of the sampling block where the target vertex is located; interpolation is carried out on the solar rate corresponding to the target vertex and the solar rate of each adjacent sampling block to obtain the solar rate of the target vertex, and linear interpolation is carried out on the color value of the lowest solar rate and the color value of the highest solar rate through the solar rate of the target vertex to obtain the solar color value corresponding to the target vertex.
In the first embodiment, since the solar condition in the target area is to be analyzed, a spatial range may be formed based on the target area and the preset bottom elevation value and the relative elevation value, which is referred to as a second spatial range. The bottom surface of the second space range is the target area, the position or the height of the bottom surface of the second space range is determined according to the bottom elevation value, and the height of the second space range is determined according to the relative height value.
In the first embodiment, vertices (hereinafter referred to as "target vertices") of the live-action three-dimensional model located in the second spatial range may be screened out.
Before screening out the target vertices of the live-action three-dimensional model in the second spatial range, a camera view frustum can be constructed over the extent of the bounding rectangle, and the depth map of the bounding rectangle computed. For the part of the bounding rectangle lying outside the target area, the depth may be set to 0 or a negative value.
For each vertex of the real three-dimensional model in the first spatial range, a correspondence or index relationship between the vertex and the depth map pixel may be established, including but not limited to establishing a correspondence or index relationship between the real three-dimensional model vertex and the depth map pixel by coordinates, so that for any vertex of the real three-dimensional model in the first spatial range, the pixel corresponding to the vertex in the depth map may be determined.
Wherein, for any vertex of the live-action three-dimensional model in the first space range, determining the pixel corresponding to the vertex in the depth map may include:
For any vertex of the live-action three-dimensional model within the first spatial range, subtract the X coordinate of the bounding rectangle's start point from the vertex's X coordinate, and subtract the start point's Y coordinate from the vertex's Y coordinate. Here the plane of the bounding rectangle is taken as the plane formed by the X and Y coordinate axes (the axes parallel to the rectangle's two pairs of perpendicular sides), so the subtraction in effect operates on the vertex's projection onto that plane: the start point's X coordinate is subtracted from the projection's X coordinate, and the start point's Y coordinate from the projection's Y coordinate. The two differences give the position of the vertex's projection relative to the bounding rectangle's start point, and hence relative to the bounding rectangle itself.
The subtraction results are then normalized, i.e., the X difference and the Y difference are normalized separately. Specifically, the X difference is divided by the length of the bounding rectangle's side parallel to the X axis, and the Y difference by the length of the side parallel to the Y axis; both quotients fall between 0 and 1 and are thus normalized to the [0,1] interval.
And locating pixels in the depth map according to the normalized result (coordinates of the pixels in the depth map are between 0 and 1). Assuming that the normalization result of the "subtraction result of X coordinate values" is p1 and the normalization result of the "subtraction result of Y coordinate values" is q1, a depth map pixel having the X coordinate value of p1 and the Y coordinate value of q1 can be located as a pixel corresponding to or indexed to the vertex.
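The depth-map lookup described in the last three paragraphs (subtract the rectangle's start point, normalize by the side lengths, locate the pixel) can be sketched as follows (Python; clamping u = 1 into the last pixel row/column is an assumption, since the patent does not specify edge handling):

```python
def depth_map_pixel(vertex_xy, rect_start, rect_size, depth_width, depth_height):
    """Normalise a vertex's XY position against the bounding rectangle and
    locate the corresponding depth-map pixel indices."""
    # relative position of the vertex's projection, normalised to [0, 1]
    u = (vertex_xy[0] - rect_start[0]) / rect_size[0]
    v = (vertex_xy[1] - rect_start[1]) / rect_size[1]
    # map [0, 1] to integer pixel indices, clamped to the map's bounds
    px = min(int(u * depth_width), depth_width - 1)
    py = min(int(v * depth_height), depth_height - 1)
    return px, py
```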
It can be seen that the foregoing correspondence or index relationship between 3D texture pixels and sampling blocks is a three-dimensional-to-three-dimensional mapping, whereas the correspondence between depth-map pixels and vertices is a three-dimensional-to-two-dimensional mapping.
In a first embodiment, screening out vertices of the live-action three-dimensional model located in the second spatial range may include:
S1051: screening out the vertices of the live-action three-dimensional model within the first spatial range (i.e., judging for each vertex of the model whether it lies within the first spatial range, and keeping those that do);
s1053: determining a pixel corresponding to any vertex of the live-action three-dimensional model in the first space range in the depth map, and determining the maximum value of the depth values of the pixel and each adjacent pixel of the pixel;
For any vertex of the live-action three-dimensional model within the first spatial range, the pixel corresponding to the vertex in the depth map may be determined as described above. The depth value of that pixel is determined, as are the depth values of its adjacent pixels, and the maximum of all these depth values is taken. Adjacent pixels generally means the four neighbors above, below, left, and right (equivalently front, back, left, and right). If the pixel corresponding to the vertex lies on the edge of the depth map, it may have fewer than four neighbors.
S1055: and judging whether the vertex is positioned in the second space range according to whether the maximum value meets a preset condition.
For example, if the maximum value is non-zero, or greater than or equal to a preset value (the preset value is generally small and may be set as required, e.g., 0.00001), the vertex is judged to be within the second spatial range. Conversely, if the maximum value is 0, or smaller than the preset value, the vertex is judged not to be within the second spatial range.
The above determines whether the vertex is located in the second spatial range according to whether the maximum value of the depth values of the pixel corresponding to the vertex and the adjacent pixels meets the preset condition, instead of considering only the depth value of the pixel corresponding to the vertex, so as to improve the accuracy of determining whether the vertex is located in the second spatial range.
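Steps S1053 and S1055 together can be sketched as follows (Python; the depth map is a plain 2-D list here, and the default threshold is the 0.00001 example above):

```python
def vertex_in_second_range(depth_map, px, py, threshold=1e-5):
    """Decide whether a vertex lies in the second spatial range: take the
    maximum depth over the pixel and its four (up/down/left/right)
    neighbours, fewer at the map's edge, and compare it with a small
    preset threshold. depth_map is indexed as depth_map[y][x]."""
    h, w = len(depth_map), len(depth_map[0])
    candidates = [(px, py), (px - 1, py), (px + 1, py), (px, py - 1), (px, py + 1)]
    depths = [depth_map[y][x] for x, y in candidates if 0 <= x < w and 0 <= y < h]
    return max(depths) >= threshold
```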
In a first embodiment, for each target vertex in the second spatial range, its sunlight color value may be determined. Wherein determining the sunlight color value of each target vertex may include:
s1052: for any target vertex, determining the sunshine rate corresponding to the target vertex in the 3D texture and the sunshine rate of each adjacent sampling block of the sampling block where the target vertex is positioned;
for any target vertex, the solar rate corresponding to the target vertex in the 3D texture refers to the solar rate stored in the alpha channel of the pixel corresponding to the target vertex in the 3D texture.
For any target vertex, determining the sunlight rate corresponding to that vertex in the 3D texture may include: subtracting the X coordinate of the bounding rectangle's start point from the vertex's X coordinate, subtracting the start point's Y coordinate from the vertex's Y coordinate, and subtracting the bottom elevation value from the vertex's Z coordinate. As described above, the X and Y subtractions operate on the vertex's projection onto the bounding rectangle's plane, taken as the plane formed by the X and Y coordinate axes. Since the target vertex is a spatial coordinate and the Z coordinate of the rectangle's start point is the bottom elevation value, the bottom elevation value is subtracted from the vertex's Z coordinate. This yields the position of the target vertex relative to the bounding rectangle's start point in three-dimensional space, and hence relative to the bounding rectangle itself.
The subtraction results are then normalized, that is, the subtraction result of the X coordinate values, the subtraction result of the Y coordinate values, and the subtraction result of the Z coordinate value and the bottom elevation value are each normalized. Specifically, the X subtraction result is divided by the preset interval value, mapping it to [0, reference value], and then divided by the reference value to normalize it to the [0, 1] interval; the Y subtraction result and the Z subtraction result are processed in the same way.
Pixels in the 3D texture are then located according to the normalized results. Assuming the normalized X subtraction result is p2, the normalized Y subtraction result is q2, and the normalized Z subtraction result is r2, the 3D texture pixel whose X, Y, and Z coordinates are p2, q2, and r2 can be located, and the solar rate stored in the alpha channel of that pixel can be used as the solar rate corresponding to the target vertex in the 3D texture.
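The locating procedure above can be sketched as follows; the function name, the parameter names, and the dense RGBA array layout (a stand-in for a GPU 3D texture whose alpha channel holds each sampling block's solar rate) are illustrative assumptions, not part of the patent:

```python
import numpy as np

def locate_solar_rate(vertex, box_start, bottom_elev, interval, ref, texture):
    """Look up the solar rate stored for `vertex` in the 3D texture.

    `texture` is assumed to be an RGBA array of shape (ref, ref, ref, 4)
    whose alpha channel (index 3) holds each sampling block's solar rate.
    """
    # Relative position w.r.t. the bounding rectangle's starting point.
    dx = vertex[0] - box_start[0]
    dy = vertex[1] - box_start[1]
    dz = vertex[2] - bottom_elev
    # Divide by the preset interval value (maps to [0, ref]), then by the
    # reference value to normalize to the [0, 1] interval.
    p2 = dx / interval / ref
    q2 = dy / interval / ref
    r2 = dz / interval / ref
    # Turn the normalized coordinates back into integer texel indices.
    ix = min(int(p2 * ref), ref - 1)
    iy = min(int(q2 * ref), ref - 1)
    iz = min(int(r2 * ref), ref - 1)
    return texture[ix, iy, iz, 3]
```

In a shader this lookup would simply be a `sampler3D` fetch at (p2, q2, r2); the index arithmetic here makes the normalization explicit on the CPU side.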
In addition, the solar rate of each adjacent sampling block of the sampling block where the target vertex is located can also be determined. Here, the adjacent sampling blocks generally refer to the six sampling blocks above, below, in front of, behind, to the left of, and to the right of that block. If the sampling block where the target vertex is located lies on the edge of the second spatial range, there may be fewer than six adjacent sampling blocks.
S1054: interpolate the solar rate corresponding to the target vertex and the solar rate of each adjacent sampling block of the sampling block where the target vertex is located to obtain the solar rate of the target vertex.
In the first embodiment, the solar rate corresponding to the target vertex is the solar rate of the sampling block corresponding to the located pixel in the 3D texture. The solar rate of the target vertex can be obtained accurately by interpolating the solar rate corresponding to the target vertex with the solar rate of each adjacent sampling block. Specifically, if the sampling block where the target vertex is located has six adjacent sampling blocks, the seven solar rates are interpolated and the result is taken as the solar rate of the target vertex. If that sampling block has fewer than six adjacent sampling blocks, the solar rate corresponding to the target vertex and the solar rates of the available adjacent sampling blocks are interpolated in the same way, and the result is taken as the solar rate of the target vertex.
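A minimal sketch of the neighbour interpolation described above. The patent does not fix the interpolation weights, so equal weighting (a plain average of the block's rate with its existing neighbours) is an assumption, as are the names:

```python
import numpy as np

# Offsets of the six neighbours: up, down, front, back, left, right.
NEIGHBOUR_OFFSETS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                     (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def interpolate_solar_rate(block_index, rates):
    """Average the solar rate of a sampling block with the rates of its
    existing neighbours; blocks on the boundary simply have fewer samples."""
    nx, ny, nz = rates.shape
    x, y, z = block_index
    samples = [rates[x, y, z]]
    for dx, dy, dz in NEIGHBOUR_OFFSETS:
        i, j, k = x + dx, y + dy, z + dz
        if 0 <= i < nx and 0 <= j < ny and 0 <= k < nz:
            samples.append(rates[i, j, k])
    return sum(samples) / len(samples)
```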
S1056: linearly interpolate between the lowest solar rate color value and the highest solar rate color value by the solar rate of the target vertex to obtain the sunlight color value corresponding to the target vertex.
In the first embodiment, a solar rate color value interval, or a lowest solar rate color value and a highest solar rate color value, may be set, so that the sunlight color value corresponding to the target vertex can be obtained by linearly interpolating between the lowest and highest solar rate color values by the solar rate of the target vertex. That is, the color that the solar rate of the target vertex should correspond to is determined.
Specifically, linear interpolation is performed for each of the R, G, and B channels by the solar rate of the target vertex. If the solar rate of the target vertex is m, the R channels of the lowest and highest solar rate color values are linearly interpolated by m to obtain the R channel value corresponding to the target vertex; for example, if the R channel value of the lowest solar rate color value is R1 and the R channel value of the highest solar rate color value is R2, then R1 + m × (R2 - R1) is taken as the R channel value corresponding to the target vertex.
Correspondingly, m is used to linearly interpolate the G channels and the B channels of the lowest and highest solar rate color values, yielding the G channel value and the B channel value corresponding to the target vertex. The color corresponding to the target vertex, i.e., its sunlight color value, is thus obtained. The closer the sunlight color value corresponding to the target vertex is to the highest solar rate color value, the higher the solar rate of that vertex.
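The per-channel linear interpolation can be sketched as below; the function name and the tuple representation of the color values are assumptions:

```python
def sunlight_color(m, lowest_rgb, highest_rgb):
    """Per-channel linear interpolation between the lowest- and
    highest-solar-rate color values by the vertex's solar rate m in [0, 1]."""
    return tuple(lo + m * (hi - lo) for lo, hi in zip(lowest_rgb, highest_rgb))
```

For example, with a lowest-rate color of black and a highest-rate color of white, m = 0.5 yields mid-gray, and m close to 1 yields a color close to the highest solar rate color value.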
Through the above, the sunlight color value corresponding to each vertex of the live-action three-dimensional model within the second spatial range is obtained, so that these sunlight color values can be displayed, thereby presenting the sunlight information or sunlight condition of the live-action three-dimensional model in the second spatial range (including the solar rate of each target vertex within the preset time period). For example, in fig. 3, each vertex of the lower half of the building's live-action three-dimensional model within the second spatial range can be irradiated by sunlight, and each vertex is given a color according to its corresponding sunlight color value, so as to display the sunlight information or sunlight condition.
In the first embodiment, screening out the target vertices of the live-action three-dimensional model located in the second spatial range, determining the sunlight color value of each target vertex, and displaying the sunlight information of the live-action three-dimensional model in the second spatial range according to those sunlight color values can be executed by a shader. The shader may acquire the 3D texture; taking a power of 2 that is greater than or equal to the maximum number of sampling blocks along the X, Y, and Z coordinate axes in the first spatial range as the reference value also makes it convenient to transmit the 3D texture to the shader. In addition, the shader may be deployed on the execution body of the first embodiment.
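Choosing the reference value as the smallest power of 2 not less than the maximum block count can be sketched as follows (the helper name is illustrative):

```python
def reference_value(nx, ny, nz):
    """Smallest power of 2 that is >= the maximum number of sampling
    blocks along the X, Y, Z coordinate axes."""
    n = max(nx, ny, nz)
    ref = 1
    while ref < n:
        ref *= 2
    return ref
```

Power-of-2 texture dimensions are the conventional choice for GPU texture uploads, which is presumably why the patent ties the reference value to them.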
The first embodiment can obtain the following beneficial effects:
in the first embodiment, a 3D texture is generated by sampling the first spatial range and determining the solar rate of each sampling block. The target vertices of the live-action three-dimensional model located in the second spatial range are screened out, and the sunlight color value of each target vertex is determined by mapping or indexing into the 3D texture, so that the sunlight information or sunlight condition of the live-action three-dimensional model in the second spatial range can be displayed according to the color value of each target vertex. Thus, on one hand, sunshine analysis and sunshine color display of the live-action three-dimensional model are realized; on the other hand, the sunlight information or sunlight condition is displayed through colors attached to the live-action three-dimensional model itself, making the sunlight display clearer and more intuitive; in yet another aspect, by interpolating the solar rate colors, the sunlight color value corresponding to each target vertex can be determined accurately, so that the sunlight conditions of different vertices (or of the regions delimited by the vertices) can be displayed and distinguished by different colors, further improving the display effect, display granularity, and accuracy.
In the first embodiment, the solar rate of the target vertex is obtained by interpolating the solar rate of the sampling block where the target vertex is located with the solar rates of the adjacent sampling blocks. In this way, the solar rate of each target vertex is more accurate and more balanced; the sunlight color value obtained when that solar rate is used to linearly interpolate between the lowest and highest solar rate color values is correspondingly more accurate; and possible blurring or aliasing when displaying the sunlight information or sunlight condition of the live-action three-dimensional model according to those sunlight color values is reduced, further optimizing the display effect and clarity.
In particular, a live-action three-dimensional model has many vertices, and its appearance changes as the viewing distance or angle changes. In the first embodiment, the color information is attached directly to the live-action three-dimensional model to reflect the sunlight information or sunlight condition, and this attachment is maintained however the model's presentation changes, so the first embodiment is particularly suitable for sunshine analysis and display of live-action three-dimensional models.
As shown in fig. 4, a second embodiment of the present disclosure provides a live-action model sunlight display apparatus corresponding to the method of the first embodiment, including:
the sampling module 202 is configured to generate a bounding rectangle of the target area, and determine the sampling blocks in the X, Y, and Z coordinate axis directions within a first spatial range formed based on the bounding rectangle, a preset bottom elevation value, and a preset relative elevation value;
the texture module 204 is configured to determine the solar rate of each sampling block within a preset time period, and generate a 3D texture from the solar rate of each sampling block;
the coloring module 206 is configured to screen out target vertices of the live-action three-dimensional model located in the second spatial range, determine a sunlight color value of each target vertex, and display sunlight information of the live-action three-dimensional model in the second spatial range according to the sunlight color value of each target vertex;
the second space range is formed based on the target area, a preset bottom elevation value and a preset relative elevation value;
determining the sunlight color value of each target vertex comprises: for any target vertex, determining the solar rate corresponding to the target vertex in the 3D texture and the solar rate of each adjacent sampling block of the sampling block where the target vertex is located; interpolating the solar rate corresponding to the target vertex and the solar rate of each adjacent sampling block to obtain the solar rate of the target vertex; and linearly interpolating between the lowest solar rate color value and the highest solar rate color value by the solar rate of the target vertex to obtain the sunlight color value corresponding to the target vertex.
Optionally, generating the bounding rectangle of the target area includes:
converting the longitude and latitude set of the vertices of the target area into a Cartesian coordinate set, multiplying the Cartesian coordinate set by the inverse matrix of the migration matrix of the live-action three-dimensional model to obtain a local coordinate set of the target area, and obtaining the bounding rectangle of the target area from the local coordinate set.
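A rough sketch of this conversion, under the assumptions of a spherical Earth model and a 4x4 homogeneous migration matrix (the patent fixes neither the geodetic datum nor the matrix form, and all names here are illustrative):

```python
import math

import numpy as np

def bounding_rectangle(lonlats, migration_matrix, radius=6378137.0):
    """Sketch: convert (lon, lat) pairs to Cartesian coordinates, transform
    them by the inverse of the model's migration matrix, and take the
    axis-aligned extent of the resulting local coordinates."""
    inv = np.linalg.inv(migration_matrix)
    local = []
    for lon, lat in lonlats:
        lam, phi = math.radians(lon), math.radians(lat)
        # Illustrative spherical-Earth conversion (no ellipsoid, no height).
        x = radius * math.cos(phi) * math.cos(lam)
        y = radius * math.cos(phi) * math.sin(lam)
        z = radius * math.sin(phi)
        local.append((inv @ np.array([x, y, z, 1.0]))[:3])
    local = np.array(local)
    # Starting point and opposite corner of the bounding rectangle (XY plane).
    return local.min(axis=0)[:2], local.max(axis=0)[:2]
```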
Optionally, determining the sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range formed based on the bounding rectangle, the preset bottom elevation value, and the relative elevation value includes:
determining, according to a preset interval value, the number of sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range formed based on the bounding rectangle, the preset bottom elevation value, and the relative elevation value, and dividing the first spatial range according to the number of sampling blocks to obtain the sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range.
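The block-count computation can be sketched as follows; rounding up with `ceil` is an assumption, since the patent does not state how a range that is not an exact multiple of the interval value is handled:

```python
import math

def block_counts(box_start, box_end, relative_elev, interval):
    """Number of sampling blocks along X, Y, Z for the first spatial range.
    `box_start`/`box_end` are opposite corners of the bounding rectangle;
    the Z extent is the relative elevation value above the bottom elevation."""
    nx = math.ceil((box_end[0] - box_start[0]) / interval)
    ny = math.ceil((box_end[1] - box_start[1]) / interval)
    nz = math.ceil(relative_elev / interval)
    return nx, ny, nz
```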
Optionally, generating the 3D texture from the solar rate of each sampling block includes:
setting the length, width, and height pixel size values of the 3D texture to a reference value, and establishing a correspondence between each sampling block and a pixel of the 3D texture, wherein the alpha channel of each pixel is used to store the solar rate of the corresponding sampling block; the reference value is not smaller than the maximum number of sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range.
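A minimal sketch of building such a 3D texture; the dense NumPy RGBA layout is an illustrative stand-in for a GPU texture, and the function name is assumed:

```python
import numpy as np

def build_3d_texture(rates, ref):
    """Create an RGBA 3D texture of size ref x ref x ref and store each
    sampling block's solar rate in the alpha channel of its pixel."""
    nx, ny, nz = rates.shape
    assert ref >= max(nx, ny, nz)  # reference value must cover the block counts
    tex = np.zeros((ref, ref, ref, 4), dtype=np.float32)
    tex[:nx, :ny, :nz, 3] = rates  # alpha channel holds the solar rates
    return tex
```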
Optionally, the texture module 204 is further configured to construct a camera view cone from the bounding rectangle range before screening out the target vertices of the live-action three-dimensional model located in the second spatial range, so as to obtain a depth map of the bounding rectangle.
Optionally, screening the target vertices of the live-action three-dimensional model located in the second spatial range includes:
screening out the vertices of the live-action three-dimensional model located in the first spatial range;
for any vertex of the live-action three-dimensional model in the first spatial range, determining the pixel corresponding to the vertex in the depth map, and determining the maximum of the depth values of that pixel and each of its adjacent pixels;
and judging whether the vertex is located in the second spatial range according to whether the maximum value meets a preset condition.
Optionally, for any vertex of the live-action three-dimensional model in the first spatial range, determining a pixel corresponding to the vertex in the depth map includes:
subtracting the X coordinate value of the starting point of the bounding rectangle from the X coordinate value of any vertex of the live-action three-dimensional model in the first spatial range, subtracting the Y coordinate value of the starting point of the bounding rectangle from the Y coordinate value of the vertex, and normalizing the obtained results; wherein the plane formed by the X and Y coordinate axes is parallel to the plane of the bounding rectangle;
and locating the pixel in the depth map according to the normalized results, as the pixel corresponding to the vertex.
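The depth-map pixel location can be sketched as below; the parameter names and the clamping of edge coordinates are assumptions:

```python
def depth_pixel(vertex, box_start, box_size, width, height):
    """Map a vertex's XY offset from the bounding rectangle's starting
    point to a pixel of the depth map (coordinates clamped at the edge)."""
    u = (vertex[0] - box_start[0]) / box_size[0]
    v = (vertex[1] - box_start[1]) / box_size[1]
    px = min(int(u * width), width - 1)
    py = min(int(v * height), height - 1)
    return px, py
```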
Optionally, for any one of the target vertices, the solar rate corresponding to the target vertex in the 3D texture refers to the solar rate stored in the alpha channel of the pixel corresponding to the target vertex in the 3D texture.
Optionally, for any one of the target vertices, determining the solar rate corresponding to the target vertex in the 3D texture includes:
subtracting the X coordinate value of the starting point of the bounding rectangle from the X coordinate value of any target vertex, subtracting the Y coordinate value of the starting point of the bounding rectangle from the Y coordinate value of the target vertex, subtracting the bottom elevation value from the Z coordinate value of the target vertex, and normalizing the obtained results;
and locating the pixel in the 3D texture according to the normalized results, and taking the solar rate stored in the alpha channel of the located pixel as the solar rate corresponding to the target vertex in the 3D texture.
The second embodiment can achieve the same advantageous effects as the first embodiment.
The embodiments described above may be combined and modules of the same name may be the same or different modules between different embodiments or within the same embodiment.
The foregoing describes certain embodiments of the present disclosure, other embodiments being within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. Furthermore, the processes depicted in the accompanying drawings do not necessarily have to be in the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for apparatus, devices, non-transitory computer readable storage medium embodiments, the description is relatively simple, as it is substantially similar to method embodiments, with reference to portions of the description of method embodiments being relevant.
The apparatus, the device, the nonvolatile computer readable storage medium and the method provided in the embodiments of the present disclosure correspond to each other, and therefore, the apparatus, the device, and the nonvolatile computer storage medium also have similar advantageous technical effects as those of the corresponding method, and since the advantageous technical effects of the method have been described in detail above, the advantageous technical effects of the corresponding apparatus, device, and nonvolatile computer storage medium are not described herein again.
In the 1990s, an improvement to a technology could clearly be distinguished as an improvement in hardware (e.g., an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). However, with the development of technology, many improvements of method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD) (e.g., a field programmable gate array (FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD without requiring the chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, such programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code before compiling must also be written in a specific programming language, called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing the logic method flow can be readily obtained by merely slightly programming the method flow into an integrated circuit using several of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely in computer-readable program code, it is entirely possible to implement the same functionality by logically programming the method steps such that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be regarded as a hardware component, and the means included therein for performing various functions may also be regarded as structures within the hardware component. Or even the means for performing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present specification.
It will be appreciated by those skilled in the art that the present description may be provided as a method, system, or computer program product. Accordingly, the present specification embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description embodiments may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM) and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The foregoing is merely exemplary embodiments of the present disclosure and is not intended to limit the present disclosure. Various modifications and changes may be made to the present application by those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. which are within the spirit and principles of the present application are intended to be included within the scope of the claims of the present application.

Claims (10)

1. A method for displaying sunlight on a live-action model, the method comprising:
generating a bounding rectangle of a target area, and determining sampling blocks in the X, Y, and Z coordinate axis directions within a first spatial range formed based on the bounding rectangle, a preset bottom elevation value, and a preset relative elevation value;
determining the solar rate of each sampling block within a preset time period, and generating a 3D texture from the solar rate of each sampling block;
screening out target vertices of a live-action three-dimensional model located in a second spatial range, determining the sunlight color value of each target vertex, and displaying sunlight information of the live-action three-dimensional model in the second spatial range according to the sunlight color value of each target vertex;
wherein the second spatial range is formed based on the target area, the preset bottom elevation value, and the preset relative elevation value;
wherein determining the sunlight color value of each target vertex comprises: for any target vertex, determining the solar rate corresponding to the target vertex in the 3D texture and the solar rate of each adjacent sampling block of the sampling block where the target vertex is located; interpolating the solar rate corresponding to the target vertex and the solar rate of each adjacent sampling block to obtain the solar rate of the target vertex; and linearly interpolating between the lowest solar rate color value and the highest solar rate color value by the solar rate of the target vertex to obtain the sunlight color value corresponding to the target vertex.
2. The method of claim 1, wherein generating the bounding rectangle of the target area comprises:
converting the longitude and latitude set of the vertices of the target area into a Cartesian coordinate set, multiplying the Cartesian coordinate set by the inverse matrix of the migration matrix of the live-action three-dimensional model to obtain a local coordinate set of the target area, and obtaining the bounding rectangle of the target area from the local coordinate set.
3. The method of claim 1, wherein determining the sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range formed based on the bounding rectangle, the preset bottom elevation value, and the relative elevation value comprises:
determining, according to a preset interval value, the number of sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range formed based on the bounding rectangle, the preset bottom elevation value, and the relative elevation value, and dividing the first spatial range according to the number of sampling blocks to obtain the sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range.
4. The method of claim 1, wherein generating the 3D texture from the solar rate of each sampling block comprises:
setting the length, width, and height pixel size values of the 3D texture to a reference value, and establishing a correspondence between each sampling block and a pixel of the 3D texture, wherein the alpha channel of each pixel is used to store the solar rate of the corresponding sampling block; the reference value is not smaller than the maximum number of sampling blocks in the X, Y, and Z coordinate axis directions within the first spatial range.
5. The method of claim 2, wherein before screening out the target vertices of the live-action three-dimensional model that lie within the second spatial range, the method further comprises:
constructing a camera view frustum over the extent of the bounding rectangle to obtain a depth map of the bounding rectangle.
6. The method of claim 5, wherein screening out the target vertices of the live-action three-dimensional model that lie within the second spatial range comprises:
screening out the vertices of the live-action three-dimensional model that lie within the first spatial range;
for any such vertex, determining the pixel corresponding to the vertex in the depth map, and determining the maximum of the depth values of that pixel and each of its adjacent pixels;
and judging whether the vertex lies within the second spatial range according to whether that maximum satisfies a preset condition.
7. The method of claim 6, wherein, for any vertex of the live-action three-dimensional model within the first spatial range, determining the pixel in the depth map corresponding to the vertex comprises:
subtracting the X coordinate of the starting point of the bounding rectangle from the X coordinate of the vertex, subtracting the Y coordinate of the starting point of the bounding rectangle from the Y coordinate of the vertex, and normalizing the results, wherein the plane formed by the X and Y coordinate axes is parallel to the plane of the bounding rectangle;
and locating the pixel in the depth map according to the normalized results, the located pixel serving as the pixel corresponding to the vertex.
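The offset-and-normalize lookup of claim 7 amounts to mapping the vertex's XY position inside the bounding rectangle onto depth-map texel coordinates. A sketch, with illustrative names and a clamp so vertices on the far edge still land on a valid pixel:

```python
def depth_map_pixel(vertex_xy, rect_min, rect_max, width, height):
    """Map a vertex's XY position inside the bounding rectangle to a
    depth-map pixel: subtract the rectangle start, normalize by the
    rectangle extent, then scale to the pixel grid."""
    u = (vertex_xy[0] - rect_min[0]) / (rect_max[0] - rect_min[0])
    v = (vertex_xy[1] - rect_min[1]) / (rect_max[1] - rect_min[1])
    px = min(int(u * width), width - 1)   # clamp the far edge
    py = min(int(v * height), height - 1)
    return px, py
```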
8. The method of claim 1, wherein, for any target vertex, the insolation rate corresponding to the target vertex in the 3D texture is the insolation rate stored in the alpha channel of the pixel corresponding to the target vertex in the 3D texture.
9. The method of any one of claims 1 to 8, wherein, for any target vertex, determining the insolation rate corresponding to the target vertex in the 3D texture comprises:
subtracting the X coordinate of the starting point of the bounding rectangle from the X coordinate of the target vertex, subtracting the Y coordinate of the starting point of the bounding rectangle from the Y coordinate of the target vertex, subtracting the bottom elevation value from the Z coordinate of the target vertex, and normalizing the results;
and locating the pixel in the 3D texture according to the normalized results, the insolation rate stored in the alpha channel of the located pixel serving as the insolation rate corresponding to the target vertex in the 3D texture.
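The 3D-texture lookup of claim 9 is the three-axis counterpart of the depth-map lookup: subtract the rectangle start (and the bottom elevation for Z), normalize, and scale to texel indices. A sketch under the assumption that the texture is cubic with side length equal to the reference value and that the Z extent equals the relative elevation; names are illustrative:

```python
def texture_texel(vertex, rect_min, rect_max, bottom_elev,
                  relative_elev, ref):
    """Map a target vertex to a 3D-texture texel index."""
    nx = (vertex[0] - rect_min[0]) / (rect_max[0] - rect_min[0])
    ny = (vertex[1] - rect_min[1]) / (rect_max[1] - rect_min[1])
    nz = (vertex[2] - bottom_elev) / relative_elev
    clamp = lambda t: min(int(t * ref), ref - 1)  # keep far edge in range
    return clamp(nx), clamp(ny), clamp(nz)

def texel_insolation_rate(tex, idx):
    """Read the insolation rate from the alpha channel at that texel."""
    return tex[idx][3]
```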
10. A live-action model insolation display device, comprising:
a sampling module configured to generate a bounding rectangle of a target area and determine sampling blocks along the X, Y, and Z coordinate axes within a first spatial range formed from the bounding rectangle, a preset bottom elevation value, and a preset relative elevation value;
a texture module configured to determine the insolation rate of each sampling block over a preset time period and generate a 3D texture from the insolation rates of the sampling blocks;
a coloring module configured to screen out target vertices of the live-action three-dimensional model within a second spatial range, determine the insolation color value of each target vertex, and display insolation information of the live-action three-dimensional model within the second spatial range according to the insolation color values of the target vertices;
wherein the second spatial range is formed from the target area, the preset bottom elevation value, and the preset relative elevation value;
and determining the insolation color value of each target vertex comprises: for any target vertex, determining the insolation rate corresponding to the target vertex in the 3D texture and the insolation rate of each sampling block adjacent to the sampling block containing the target vertex; interpolating the insolation rate corresponding to the target vertex with the insolation rates of the adjacent sampling blocks to obtain the insolation rate of the target vertex; and linearly interpolating, by the insolation rate of the target vertex, between the color value assigned to the lowest insolation rate and the color value assigned to the highest insolation rate to obtain the insolation color value of the target vertex.
CN202410095122.7A 2024-01-24 2024-01-24 Real model sunlight display method and device Active CN117611726B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410095122.7A CN117611726B (en) 2024-01-24 2024-01-24 Real model sunlight display method and device

Publications (2)

Publication Number Publication Date
CN117611726A 2024-02-27 (application publication)
CN117611726B 2024-05-14 (granted publication)

Family

ID=89953906

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120249539A1 (en) * 2011-03-16 2012-10-04 Daipayan Bhattacharya System and method for modeling buildings and building products
US20170345208A1 (en) * 2011-04-14 2017-11-30 Suntracker Technologies Ltd. System and method for real time dynamic lighting simulation
CN107798201A (en) * 2017-11-15 2018-03-13 苏州联讯图创软件有限责任公司 The Sunlight Analysis method of BUILDINGS MODELS
CN110110445A (en) * 2019-05-09 2019-08-09 洛阳众智软件科技股份有限公司 A kind of Sunlight Analysis method, apparatus, equipment and storage medium
CN112164142A (en) * 2020-10-21 2021-01-01 江苏科技大学 Building lighting simulation method based on smart phone
CN112347547A (en) * 2020-11-30 2021-02-09 久瓴(江苏)数字智能科技有限公司 Sunshine duration simulation method and device and electronic equipment
CN115115479A (en) * 2022-05-30 2022-09-27 北京五八信息技术有限公司 Method and device for acquiring house lighting information, electronic equipment and storage medium
CN115222910A (en) * 2022-06-27 2022-10-21 北京城市网邻信息技术有限公司 Method and device for acquiring lighting information, electronic equipment and storage medium
CN115713548A (en) * 2022-09-06 2023-02-24 中国电建集团西北勘测设计研究院有限公司 Automatic registration method for multi-stage live-action three-dimensional model
CN116266361A (en) * 2021-12-14 2023-06-20 星际空间(天津)科技发展有限公司 High-definition sunlight analysis model modeling method based on point cloud

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Z. A. AL-MOSTAFA: "Sunshine-based global radiation models: A review and case study", ENERGY CONVERSION AND MANAGEMENT, vol. 84, 31 August 2014 (2014-08-31), pages 209 - 216 *
姜立; 王会一; 任燕翔; 张雷: "Research on Principles and Calculation Methods of Building Sunlight Analysis" [建筑日照分析原理与计算方法的研究], Journal of Information Technology in Civil Engineering and Architecture [土木建筑工程信息技术], no. 02, 15 December 2009 (2009-12-15), pages 63 - 69 *
韦铖; 王福丽; 纪海英; 崔现勇; 沙海龙: "Application of Low-Altitude UAV Oblique Photogrammetry in Sunlight Analysis Modeling" [低空无人机倾斜摄影测量在日照分析建模中的应用], Urban Geotechnical Investigation & Surveying [城市勘测], no. 01, 30 April 2020 (2020-04-30), pages 103 - 106 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant