CN111968247B - Method and device for constructing three-dimensional house space, electronic equipment and storage medium - Google Patents
Method and device for constructing three-dimensional house space, electronic equipment and storage medium
- Publication number
- CN111968247B (application CN202010665225.4A)
- Authority
- CN
- China
- Prior art keywords
- data
- space
- target
- house
- functional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Geometry (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Civil Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Architecture (AREA)
- Remote Sensing (AREA)
- Structural Engineering (AREA)
- Computational Mathematics (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Pure & Applied Mathematics (AREA)
- Evolutionary Computation (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention provides a method and a device for constructing a three-dimensional house space, an electronic device, and a storage medium. The method comprises the following steps: acquiring initial data of a target house, wherein the initial data comprises at least one of 3D (three-dimensional) house type data and 2D (two-dimensional) house type data of the target house; acquiring a three-dimensional space of each functional area in the target house according to the initial data, so as to set decoration data of each functional area, wherein the decoration data comprises at least one of hard decoration data and soft decoration data; and combining the three-dimensional spaces of the specific functional areas in the target house into one three-dimensional space, and combining it with the three-dimensional spaces of the non-specific functional areas to obtain the three-dimensional house space of the target house, in which a house decoration result is rendered according to the decoration data of each functional area, wherein the non-specific functional areas are the functional areas other than the specific functional areas. The method improves the accuracy and completeness of the three-dimensional house space, and thereby improves the visual effect of VR decoration.
Description
Technical Field
The present invention relates to the field of three-dimensional space technology, and in particular, to a method and an apparatus for constructing a three-dimensional house space, an electronic device, and a storage medium.
Background
With the rapid development of the home decoration industry, house decoration requirements are increasingly diversified. Ordinary consumers want an intuitive feel for different decoration styles, while businesses want to display the in-place effect of various furniture quickly and intuitively.
However, it is difficult for a non-professional to read a flat floor plan when buying a house or planning a renovation, and a floor plan also hardly conveys the overall spatial relationships of a property. The 3D house type data used in scenes such as VR house viewing can only show the interior view of each functional area; it conveys neither the true visual effect in three-dimensional space nor the spatial relationships between the functional areas of the property. Therefore, in application fields such as home decoration, constructing an accurate and complete house model is particularly important.
Disclosure of Invention
Embodiments of the present invention provide a method and an apparatus for constructing a three-dimensional house space, an electronic device, and a storage medium, to address the problem that existing 3D house type data or floor plan data struggles to represent the overall space of a property in three dimensions, which degrades the visual effect of subsequent VR decoration.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a method for constructing a three-dimensional room space, including:
acquiring initial data of a target house, wherein the initial data comprises at least one of 3D (three-dimensional) house type data and 2D house type data of the target house;
acquiring a three-dimensional space of each functional area in the target house according to the initial data, so as to set decoration data of each functional area, wherein the decoration data comprises at least one of hard decoration data and soft decoration data;
combining the three-dimensional spaces of the specific functional areas in the target house into one three-dimensional space, and combining it with the three-dimensional spaces of the non-specific functional areas to obtain the three-dimensional house space of the target house, so as to render a house decoration result in the three-dimensional house space according to the decoration data of each functional area, wherein the non-specific functional areas are the functional areas other than the specific functional areas.
Optionally, the step of obtaining a three-dimensional space of each functional area in the target house according to the initial data includes:
when the initial data comprises 3D house type data, analyzing the 3D house type data of the functional area aiming at each functional area in the target house, and extracting first target data of the functional area from the analyzed 3D house type data;
constructing a three-dimensional space of the functional area according to the first target data;
the first target data at least comprises wall data, door data, window data and observation point data.
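As an illustration of how the first target data can drive the construction of a three-dimensional space, the following is a minimal Python sketch, in which all type and field names are hypothetical (not taken from the patent), that extrudes a 2D wall segment into a 3D wall panel:

```python
from dataclasses import dataclass, field

@dataclass
class Wall:
    start: tuple   # 2D endpoint (x, y) of the wall's floor line, in metres
    end: tuple     # other 2D endpoint
    height: float  # wall height

@dataclass
class FunctionalArea:
    name: str
    walls: list = field(default_factory=list)
    doors: list = field(default_factory=list)
    windows: list = field(default_factory=list)
    observation_points: list = field(default_factory=list)

def wall_to_panel(wall):
    """Extrude a 2D wall segment into four 3D corner vertices (x, y, z)."""
    (x0, y0), (x1, y1) = wall.start, wall.end
    return [(x0, y0, 0.0), (x1, y1, 0.0),
            (x1, y1, wall.height), (x0, y0, wall.height)]

bedroom = FunctionalArea("bedroom", walls=[Wall(start=(0, 0), end=(4, 0), height=2.8)])
panel = wall_to_panel(bedroom.walls[0])
```

Applying the same extrusion to every wall, door, and window record of an area yields the set of 3D surfaces making up its three-dimensional space.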
Optionally, the step of obtaining a three-dimensional space of each functional area in the target house according to the initial data includes:
under the condition that the initial data comprises 2D house type data, analyzing the 2D house type data of the functional area aiming at each functional area in the target house, and extracting second target data of the functional area from the analyzed 2D house type data, wherein the second target data at least comprises 2D wall data, 2D door data and 2D window data;
generating first target data of the functional area according to second target data of the functional area, wherein the first target data at least comprises wall data, door data, window data and observation point data;
and constructing a three-dimensional space of the functional region according to the first target data.
Optionally, the step of generating the first target data of the functional area according to the second target data of the functional area includes:
generating wall data of the functional area according to the 2D wall data of the functional area and a preset wall height;
generating door data and window data of the functional area according to the 2D door data and the 2D window data of the functional area and a preset door and window height;
and generating observation point data of the functional area according to the second target data of the functional area.
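A minimal sketch of this 2D-to-3D lifting step, assuming hypothetical field names and illustrative preset heights:

```python
# Hypothetical preset heights for lifting 2D plan data into 3D data (metres).
DEFAULT_WALL_HEIGHT = 2.8
DEFAULT_DOOR_HEIGHT = 2.0
DEFAULT_WINDOW_HEIGHT = 1.5

def lift_2d_to_3d(second_target_data):
    """Attach preset heights to 2D wall/door/window segments (keys assumed)."""
    return {
        "walls": [dict(seg, height=DEFAULT_WALL_HEIGHT)
                  for seg in second_target_data["walls_2d"]],
        "doors": [dict(seg, height=DEFAULT_DOOR_HEIGHT)
                  for seg in second_target_data["doors_2d"]],
        "windows": [dict(seg, height=DEFAULT_WINDOW_HEIGHT)
                    for seg in second_target_data["windows_2d"]],
    }

plan = {"walls_2d": [{"start": (0, 0), "end": (4, 0)}],
        "doors_2d": [{"start": (1, 0), "end": (2, 0)}],
        "windows_2d": []}
first_target = lift_2d_to_3d(plan)
```

The observation point data is then derived from the 2D geometry itself, as described next.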
Optionally, the step of generating the viewpoint data of the functional area according to the second target data of the functional area includes:
acquiring the plane shape type of the functional area according to the second target data of the functional area;
in response to the plane shape type of the functional area being a rectangle, taking the position data of the center point of the largest rectangle in the functional area as the observation point data of the functional area;
in response to the plane shape type of the functional area being L-shaped, taking the position data of the center point of the intersection region of the two rectangles in the functional area as the observation point data of the functional area;
in response to the plane shape type of the functional area being U-shaped, taking the position data of the center points of the three rectangles in the functional area, respectively, as the observation point data of the functional area;
and in response to the functional area containing more than three rectangles, taking the position data of the center point of each rectangle, respectively, as the observation point data of the functional area.
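The four shape-type rules above can be sketched as follows; the shape labels and rectangle representation are assumptions for illustration, not the patent's actual data format:

```python
def rect_center(rect):
    """Center of an axis-aligned rectangle given as ((x0, y0), (x1, y1))."""
    (x0, y0), (x1, y1) = rect
    return ((x0 + x1) / 2, (y0 + y1) / 2)

def rect_intersection(a, b):
    """Axis-aligned intersection rectangle of two rectangles (assumed to overlap)."""
    (ax0, ay0), (ax1, ay1) = a
    (bx0, by0), (bx1, by1) = b
    return ((max(ax0, bx0), max(ay0, by0)), (min(ax1, bx1), min(ay1, by1)))

def observation_points(shape_type, rects):
    """Pick observation points per plane shape type (shape names assumed)."""
    if shape_type == "rectangle":
        # single viewpoint: centre of the largest rectangle
        largest = max(rects, key=lambda r: (r[1][0] - r[0][0]) * (r[1][1] - r[0][1]))
        return [rect_center(largest)]
    if shape_type == "L":
        # centre of the intersection region of the two rectangles
        return [rect_center(rect_intersection(rects[0], rects[1]))]
    # U-shaped areas (three rectangles) and areas with more than three
    # rectangles both use the centre of every rectangle.
    return [rect_center(r) for r in rects]

pts = observation_points("rectangle", [((0, 0), (4, 3)), ((0, 0), (2, 1))])
```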
Optionally, the step of combining the three-dimensional spaces of the specific functional areas in the target house into one three-dimensional space and combining the three-dimensional space of the non-specific functional area to obtain the three-dimensional house space of the target house includes:
acquiring a specific function area of a target type from the function areas of the target house, wherein the target type comprises at least two of a living room, a restaurant, an aisle and an entrance;
acquiring any two specific function areas with a connection relation, combining three-dimensional spaces of the two specific function areas, and removing a wall body connected between the three-dimensional spaces of the two specific function areas to obtain a combined space;
in response to the existence of specific functional areas that have not yet been merged, merging into the current combined space the three-dimensional space of any not-yet-merged specific functional area that has a connection relationship with the current combined space, and removing the wall connecting that three-dimensional space and the combined space, until the three-dimensional spaces of all specific functional areas have been merged, or until no not-yet-merged specific functional area has a connection relationship with the current combined space;
and generating the three-dimensional house space of the target house from the finally obtained combined space, combined with the three-dimensional spaces of any specific functional areas not merged into the combined space and the three-dimensional spaces of the non-specific functional areas.
Optionally, the step of merging the three-dimensional spaces of the two specific functional regions includes:
merging the first target data of the two specific function areas to obtain the first target data of the current merging space, and removing the wall data of the wall body connected between the two specific function areas from the currently merged first target data;
and generating three-dimensional spaces of the two specific functional areas based on the merged first target data.
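A minimal sketch of merging the first target data of two specific functional areas and dropping the shared wall; the data layout and identifiers here are hypothetical:

```python
def merge_specific_areas(a, b, shared_walls):
    """Merge first target data of two connected specific functional areas,
    removing the wall(s) shared between them (data layout assumed)."""
    return {
        "walls": [w for w in a["walls"] + b["walls"] if w not in shared_walls],
        "doors": a["doors"] + b["doors"],
        "windows": a["windows"] + b["windows"],
        "observation_points": a["observation_points"] + b["observation_points"],
    }

living = {"walls": ["w1", "w2", "w3"], "doors": [],
          "windows": ["win1"], "observation_points": ["p1"]}
dining = {"walls": ["w3", "w4"], "doors": ["d1"],
          "windows": [], "observation_points": ["p2"]}
combined = merge_specific_areas(living, dining, shared_walls=["w3"])
```

The three-dimensional space of the combined area is then regenerated from the merged first target data, exactly as for a single functional area.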
In a second aspect, an embodiment of the present invention provides an apparatus for constructing a three-dimensional room space, including:
the system comprises an initial data acquisition module, a data acquisition module and a data acquisition module, wherein the initial data acquisition module is used for acquiring initial data of a target house, and the initial data comprises at least one of 3D (three-dimensional) house type data and 2D house type data of the target house;
a three-dimensional space construction module, configured to obtain a three-dimensional space of each functional area in the target house according to the initial data, so as to set decoration data of each functional area, wherein the decoration data comprises at least one of hard decoration data and soft decoration data;
and the house space construction module is used for combining the three-dimensional space of the specific function area in the target house into a three-dimensional space, combining the three-dimensional space of the non-specific function area to obtain the three-dimensional house space of the target house, and rendering a house decoration result in the three-dimensional house space according to decoration data of each function area, wherein the non-specific function area is other function areas except the specific function area.
Optionally, the three-dimensional space building module includes:
a first target data obtaining sub-module, configured to, when the initial data includes 3D house type data, analyze, for each functional area in the target house, the 3D house type data of the functional area, and extract, from the analyzed 3D house type data, first target data of the functional area;
the first three-dimensional space construction submodule is used for constructing a three-dimensional space of the functional area according to the first target data;
the first target data at least comprises wall data, door data, window data and observation point data.
Optionally, the three-dimensional space building module includes:
a second target data obtaining sub-module, configured to, when the initial data includes 2D house type data, analyze, for each functional area in the target house, the 2D house type data of the functional area, and extract, from the analyzed 2D house type data, second target data of the functional area, where the second target data at least includes 2D wall data, 2D door data, and 2D window data;
the third target data acquisition submodule is used for generating first target data of the functional area according to second target data of the functional area, wherein the first target data at least comprises wall data, door data, window data and observation point data;
and the second three-dimensional space construction submodule is used for constructing the three-dimensional space of the functional area according to the first target data.
Optionally, the third target data obtaining sub-module includes:
the wall data generating unit is used for generating wall data of the functional area according to the 2D wall data of the functional area and a preset wall height;
the door and window data generation unit is used for generating door data and window data of the functional area according to the 2D door data and the 2D window data of the functional area and the preset door and window height;
and the observation point data generating unit is used for generating observation point data of the functional area according to the second target data of the functional area.
Optionally, the observation point data generating unit is specifically configured to:
acquiring the plane shape type of the functional area according to the second target data of the functional area;
in response to the plane shape type of the functional area being a rectangle, taking the position data of the center point of the largest rectangle in the functional area as the observation point data of the functional area;
in response to the plane shape type of the functional area being L-shaped, taking the position data of the center point of the intersection region of the two rectangles in the functional area as the observation point data of the functional area;
in response to the plane shape type of the functional area being U-shaped, taking the position data of the center points of the three rectangles in the functional area, respectively, as the observation point data of the functional area;
and in response to the functional area containing more than three rectangles, taking the position data of the center point of each rectangle, respectively, as the observation point data of the functional area.
Optionally, the housing space constructing module is specifically configured to:
acquiring a specific function area of a target type from the function areas of the target house, wherein the target type comprises at least two of a living room, a restaurant, an aisle and an entrance;
acquiring any two specific function areas with a connection relation, combining three-dimensional spaces of the two specific function areas, and removing a wall body connected between the three-dimensional spaces of the two specific function areas to obtain a combined space;
in response to the existence of specific functional areas that have not yet been merged, merging into the current combined space the three-dimensional space of any not-yet-merged specific functional area that has a connection relationship with the current combined space, and removing the wall connecting that three-dimensional space and the combined space, until the three-dimensional spaces of all specific functional areas have been merged, or until no not-yet-merged specific functional area has a connection relationship with the current combined space;
and generating the three-dimensional house space of the target house from the finally obtained combined space, combined with the three-dimensional spaces of any specific functional areas not merged into the combined space and the three-dimensional spaces of the non-specific functional areas.
Optionally, the first specific function region merging sub-module is specifically configured to:
merging the first target data of the two specific function areas to obtain the first target data of the current merging space, and removing the wall data of the wall body connected between the two specific function areas from the currently merged first target data;
and generating three-dimensional spaces of the two specific functional areas based on the merged first target data.
In a third aspect, an embodiment of the present invention additionally provides an electronic device, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the method of constructing a three-dimensional room space according to the first aspect.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when executed by a processor, the computer program implements the steps of the method for constructing a three-dimensional room space according to the first aspect.
In the embodiments of the present invention, to simulate online the decoration effects of different houses under different decoration styles, the data required for decoration can be extracted from original data such as the 3D house type data and 2D house type data of a target house; the original data is computed and processed to construct complete modeling data for VR decoration. This improves the accuracy and completeness of the house modeling data, and thereby improves the visual effect of subsequent VR decoration.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained based on these drawings without inventive labor.
FIG. 1 is a flow chart of steps of a method for constructing a three-dimensional housing space according to an embodiment of the present invention;
FIG. 2 is a flow chart of steps in another method of constructing a three-dimensional room space in an embodiment of the present invention;
FIG. 3A is a schematic diagram of a portion of functional areas in a house layout according to an embodiment of the present invention;
FIG. 3B is a diagram of a merge space in an embodiment of the invention;
FIG. 4 is a schematic view of a viewpoint within a functional area in an embodiment of the present invention;
fig. 5 is a schematic structural view of a three-dimensional housing space constructing apparatus according to an embodiment of the present invention;
fig. 6 is a schematic configuration diagram of another three-dimensional housing space construction apparatus according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of an electronic device in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flow chart of steps of a method for constructing a three-dimensional room space according to an embodiment of the present invention is shown.
To simulate the decoration effects of different houses, a house can be modeled for VR decoration: a three-dimensional model of the house is built in 3D modeling software; materials and maps are assigned to the wall surfaces, floors, ceilings, and skirting lines based on that model; furniture is arranged according to the different functional areas; lighting is arranged according to the desired decoration effect; and finally the result is rendered as a virtual decoration effect.
Before the decoration effect of a target house can be simulated, this process first requires constructing a complete 3D (three-dimensional) house space of the target house in which to render the decoration effect in scenes such as VR decoration. In practice, the initial data available for building the 3D house space may differ across houses and application scenarios. For example, in a VR house-viewing scenario, 3D house type data for each functional area of the house may be available, whereas for a newly built house or one under construction, only 2D (two-dimensional) house type data (e.g., a 2D floor plan or a hand-drawn floor plan) may be available; and so on.
The 3D house type data may include any 3D data related to the house type of the target house. In a VR house-viewing scene, for example, the 3D house type data may be the data within the VR panoramic data of the target house that represents the house structure, excluding the map (texture) data of each functional area; combined with that map data, it is used to show the VR effect of each functional area on electronic devices such as a PC (personal computer), an APP (application), or a VR device. The 3D house type data may include information about each functional area, such as its wall surfaces, doors, windows, observation points, entrance door, floor area, and house orientation. In the embodiments of the present invention, the 3D house type data may be obtained in any available way, which is not limited by the embodiments; for example, it may be obtained from an application scene such as VR house viewing.
Therefore, in the embodiment of the present invention, in order to obtain modeling data of a 3D house space of a target house, initial data of the target house may be obtained, where the initial data includes at least one of 3D house type data and 2D house type data of the target house; and then, the three-dimensional space of each functional area in the target house can be obtained according to the initial data of the target house.
For example, 3D house type data may contain the data needed to construct the three-dimensional space of each functional area, as well as other data applicable only to VR scenes, such as an indoor map, a cover picture, a roaming scale, and default roaming point data; the data needed for constructing the three-dimensional space can then be extracted and processed to generate the three-dimensional space of each functional area in the target house. 2D house type data, by contrast, contains only plane information of the target house, so the available plane information is extracted and processed into three-dimensional space data, thereby generating the three-dimensional space of each functional area; and so on.
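For illustration, this extraction step might look like the following sketch, where the field names are assumptions rather than the patent's actual data schema:

```python
# Fields assumed to appear in VR-scene 3D house type data; only the ones
# needed to build the three-dimensional space are kept (names hypothetical).
SPACE_FIELDS = {"walls", "doors", "windows", "observation_points"}

def extract_space_data(house_type_data):
    """Drop VR-only fields (indoor map, cover picture, roaming scale, ...),
    keeping just what the 3D space construction needs."""
    return {k: v for k, v in house_type_data.items() if k in SPACE_FIELDS}

raw = {"walls": ["w1"], "doors": [], "cover_picture": "cover.jpg",
       "indoor_map": "map.png", "observation_points": ["p1"]}
space_data = extract_space_data(raw)
```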
In addition, in practice a house may be a combination of several functional areas, and the way the target house is divided into functional areas may be customized as required. For example, the functional areas may be divided according to the function of each area, so that the divided functional areas may include bedrooms, living rooms, dining rooms, bathrooms, hallways, corridors, kitchens, balconies, and so on; alternatively, the functional areas may be divided according to the rectangular regions identified in the target house; and so on.
Moreover, the floor plans of different houses may differ, as may the connections between spaces or functional areas: some spaces (e.g., a bedroom and a living room) are separated by a wall and connected through a door, while others (e.g., a living room and a dining room) have an open boundary, i.e., no solid wall, and are connected through open space. Two spaces separated by a wall and connected by a door can be identified as two functional areas via the wall; for two spaces with no wall between them, the functional areas they contain can be identified in any available way, such as by rectangular regions, and the embodiments of the present invention are not limited in this respect.
For example, a living room space and a dining room space connected by an open space may be divided into two functional areas (living room and dining room) by identifying rectangular regions, or they may be identified directly as a single living room functional area, and so on.
In addition, after the three-dimensional space of each functional area is obtained, the decoration data of each functional area may be set according to the three-dimensional space of each functional area, where the decoration data may include at least one of hard-pack data and soft-pack data.
Specifically, the decoration data of each functional area may be obtained according to the spatial attribute of the three-dimensional space of each functional area, so as to display a decoration model object corresponding to the decoration data in the three-dimensional space of the functional area, and obtain an on-line decoration result of the target house; the content specifically included in the spatial attribute may be set by user according to a requirement, and the embodiment of the present invention is not limited thereto.
For example, in practice the decoration approach generally differs across spaces of different types; therefore the spatial attributes of a functional area may at least include its space type. The decoration data may include at least one of hard-pack data and soft-pack data; the soft-pack data may include furniture layout data, which in turn may include at least one furniture model object and the position information of that object.
The hard-pack data may include any data related to the hard decoration of the house, such as wall decoration data, door and window decoration data, floor decoration data, and ceiling decoration data. The soft-pack data may include any data related to the soft decoration of the house, such as furniture layout data, which covers movable interior items such as furniture, home appliances, curtains and fabrics, and green plants. The furniture layout data may include at least one furniture model object and the position information corresponding to each furniture model object, where a furniture model object may be any of the above interior decoration model objects.
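As an illustration only, the decoration data described above could be organized as follows; all class and field names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class FurnitureItem:
    model_id: str    # the furniture model object
    position: tuple  # (x, y, z) placement inside the functional area

@dataclass
class DecorationData:
    # hard-pack data: wall, door/window, floor, and ceiling finishes
    hard: dict = field(default_factory=dict)
    # soft-pack data: furniture layout, i.e. model objects plus positions
    furniture: list = field(default_factory=list)

bedroom_deco = DecorationData(
    hard={"floor": "oak_parquet", "wall": "warm_white_paint"},
    furniture=[FurnitureItem("double_bed_01", (2.0, 1.5, 0.0))],
)
```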
Furthermore, to distinguish the different functional areas of a house in 2D house type data, the identified functional areas are generally marked as separate closed regions, as shown in FIG. 3A. Similarly, in 3D house type data each functional area may be treated as a closed region so that each area can be displayed and roamed individually in application scenes such as VR house viewing. As a result, when the three-dimensional space of each functional area is constructed, the three-dimensional spaces end up closed off from one another. When constructing a three-dimensional house space, however, accuracy requires merging the three-dimensional spaces of connected functional areas (e.g., living room, dining room, hallway, aisle) into one three-dimensional space and ensuring that no wall remains at the junctions between the merged areas, so that the relative spatial relationships between the three-dimensional spaces of the functional areas in the final three-dimensional house space correspond one-to-one to those of the corresponding functional areas in the target house.
Therefore, in the embodiment of the present invention, in order to improve the accuracy of modeling data, the three-dimensional spaces of specific functional areas in the target house may be merged into one three-dimensional space, and the three-dimensional space of a non-specific functional area, which is a functional area other than the specific functional area, is combined to obtain the three-dimensional house space of the target house, so as to render a house decoration result in the three-dimensional house space according to the decoration data of each functional area.
The specific functional areas may be set by the user according to requirements or the actual situation of the target house, which is not limited in this embodiment of the present invention. For example, the specific functional areas may include the living room, the dining room, the hallway, the aisle, and the like. After the merged three-dimensional space is obtained, the three-dimensional house space of the target house may be constructed based on the merged three-dimensional space and the three-dimensional spaces of the other, non-specific functional areas; specifically, the three-dimensional spaces of the functional areas may be combined according to the relative spatial relationship of the functional areas in the target house, so as to obtain the three-dimensional house space of the target house.
And after the three-dimensional house space of the target house is constructed, the house decoration result can be rendered in the three-dimensional house space according to the decoration data of each functional area in the subsequent VR decoration process, so that the decoration data under different decoration styles can be modeled in the three-dimensional house space, and the decoration effect under different decoration styles can be conveniently previewed by a user or displayed by a merchant.
Referring to fig. 2, in an embodiment, the step 120 may further include:
step A121, in a case that the initial data includes 3D house type data, analyzing the 3D house type data of the functional area for each functional area in the target house, and extracting first target data of the functional area from the analyzed 3D house type data;
step A122, constructing a three-dimensional space of the functional region according to the first target data; the first target data at least comprises wall data, door data, window data and observation point data.
In the embodiment of the present invention, the 3D house type data may be parsed and the first target data therein may be extracted in any available manner, which is not limited in the embodiment of the present invention.
For example, the 3D house type data of each functional area in the corresponding target house may be traversed, and the first target data, such as wall data, door data, window data, viewpoint data, etc., of each functional area may be extracted and processed.
Assume the 3D house type data of each functional area is a data packet named room. Different processing functions may be defined for different data dimensions to extract the first target data under each dimension, so that the data packet of each functional area may be processed by calling the corresponding processing functions to obtain the first target data of that functional area, and the obtained first target data of each functional area may be stored in a designated data packet, to be called when the three-dimensional space and the three-dimensional house space are subsequently constructed.
For example, functions for acquiring wall (walls) data, door (doors) data, window (windows) data, and viewpoint (hotspots) data may be set and defined as get_wall_from_vr(), get_door_from_vr(), get_window_from_vr(), and get_hotspots_from_vr(), respectively. Then, to obtain the wall data of any functional area, the 3D house type data of the corresponding functional area can be parsed by walls = get_wall_from_vr(room) and the wall data read from it, and so on.
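As a minimal sketch of the extraction helpers named above, assuming each functional area's 3D house type packet is a dictionary keyed by data dimension (the key names and record fields here are illustrative assumptions, not the actual schema):

```python
# Hedged sketch of the extraction helpers; the packet layout
# (keys "walls", "doors", "windows", "hotspots") is an assumed example.
def get_wall_from_vr(room):
    # Read the wall records from one functional area's 3D house type packet.
    return room.get("walls", [])

def get_door_from_vr(room):
    return room.get("doors", [])

def get_window_from_vr(room):
    return room.get("windows", [])

def get_hotspots_from_vr(room):
    # Observation (roaming) points recorded in the 3D data.
    return room.get("hotspots", [])

room = {
    "walls": [{"start": (0, 0), "end": (4, 0), "height": 2.8}],
    "doors": [], "windows": [],
    "hotspots": [{"position": (2.0, 1.5), "height": 1.2}],
}
walls = get_wall_from_vr(room)
hotspots = get_hotspots_from_vr(room)
```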
Further, the three-dimensional space of each functional region may be constructed based on the first target data. Optionally, in an embodiment, a data packet of each functional area in the target house may be generated based on the first target data of that functional area. When generating the data packet of a functional area, the data format required for a given data dimension when constructing the three-dimensional space may differ from the format of that data in the 3D house type data, so the first target data may also be format-adjusted to meet the data format required for constructing the three-dimensional space; the data dimensions corresponding to this format (for example, a wall data dimension, a door data dimension, a window data dimension, an observation point data dimension, and the like) may be set in a customized manner according to requirements, which is not limited in this embodiment of the present invention. Moreover, to facilitate subsequently generating the three-dimensional house model of the target house at any time, the data packet of the target house can be constructed based on the data packets of the functional areas. For example, for any functional area, the extracted first target data in each data dimension may be recorded into a designated data packet of that functional area (e.g., a decorate_room data packet), and the designated data packet of each functional area may then be recorded into a designated data packet for the target house (e.g., a decorate_data data packet), so that the three-dimensional house model of the target house can be obtained through rendering at any time based on the designated data packet corresponding to the target house.
For example, for the data dimensions described above, in order to construct decorate_data[], the first target data of any functional region may be processed in the following manner to generate a decorate_room data packet based on each piece of first target data.
decorate_room={}
decorate_room['walls']=walls
decorate_room['doors']=doors
decorate_room['windows']=windows
decorate_room['hotspots']=hotspots
decorate_room['area']=area
Furthermore, the decorate_data packet of the target house can be generated through the following function, that is, the decorate_room data packet is stored in the decorate_data packet:
decorate_data.append(decorate_room)
Here, decorate_data.append may be understood as a function for storing a decorate_room packet into the decorate_data packet.
It should be noted that the wall data may include any data related to the wall, such as coordinates of the inner and outer walls of each wall, the size of the wall, the position of the wall, the functional area to which the wall belongs, and the like; door data may also include any data related to the doors, such as the coordinates, size, location, orientation, functional area, etc. of each door; the window data may also include any data related to the windows, such as coordinates, dimensions, position, orientation, functional area, etc. of each window; the viewpoint data may also include any data relating to the viewpoint, such as position coordinates of the viewpoint, height data from the ground, and the like; and so on. Where the observation point may also be referred to as a roaming point, a camera point location, etc.
In addition, in practical application, the heights of different functional areas of the same house may differ, or deviate from one another, in the 3D house type data. The wall heights in the first target data of different functional areas would then differ, making the three-dimensional spaces of different functional areas of the same house different in height, which easily affects the accuracy or attractiveness of the three-dimensional house space obtained by subsequent construction. Therefore, when the first target data is extracted based on the 3D house type data, any data with special requirements may be given corresponding special processing; for example, a specified height (e.g., 2.8m) may be uniformly set for the initially extracted wall height data, and the reset wall height data used as the wall data in the first target data, and so on.
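This height normalization might be sketched as follows (the record layout and field names are assumptions; the 2.8 m default is the example value from the text):

```python
# Reset every extracted wall height to one specified value so that all
# functional areas of the house end up with the same ceiling height.
DEFAULT_WALL_HEIGHT = 2.8  # metres, the example value given in the text

def normalize_wall_heights(walls, height=DEFAULT_WALL_HEIGHT):
    # Return copies of the wall records with the height field overwritten,
    # leaving the originally extracted records untouched.
    return [dict(wall, height=height) for wall in walls]

walls = [{"id": 1, "height": 2.75}, {"id": 2, "height": 2.83}]
normalized = normalize_wall_heights(walls)
```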
Referring to fig. 2, in an embodiment, the step 120 may further include:
step B121, under the condition that the initial data includes 2D house type data, analyzing the 2D house type data of the functional area for each functional area in the target house, and extracting second target data of the functional area from the analyzed 2D house type data, wherein the second target data at least includes 2D wall data, 2D door data and 2D window data;
step B122, generating first target data of the functional area according to second target data of the functional area, wherein the first target data at least comprises wall data, door data, window data and observation point data;
and step B123, constructing a three-dimensional space of the functional area according to the first target data.
And in case that the initial data includes 2D house type data, the 2D house type data of the functional area may be analyzed for each functional area in the target house, and second target data of the functional area may be extracted from the analyzed 2D house type data.
The 2D house type data may be parsed and the second target data extracted in any available manner, which is not limited in the embodiments of the present invention. For example, the wall segment lines in the 2D house type data may be recognized as 2D wall data, the door and window pictures may be recognized as 2D door data and 2D window data, and other necessary second target data, such as the entrance door and the orientation, may be recognized from the 2D house type picture by any available algorithm, such as an AI (artificial intelligence) recognition algorithm, an OCR (optical character recognition) algorithm, and the like.
However, in the 2D house type data, the wall surface is a line segment, the position information of the door and the window is generally represented by two points, and the data of the entrance door is also represented by two points, so that the data needs to be converted into stereo data to accurately and completely construct the three-dimensional house space. Furthermore, since the viewpoint data generally exists in the three-dimensional data, the viewpoint data cannot be directly extracted from the 2D house type data.
Therefore, in the embodiment of the present invention, in order to construct a three-dimensional space of a functional region, first target data of the functional region may be generated according to second target data of the functional region, where the first target data may include at least wall data, door data, window data, and viewpoint data, so as to convert the two-dimensional second target data into the three-dimensional first target data. Furthermore, in the embodiment of the present invention, the second target data in two dimensions may be converted into the first target data in three dimensions in any available manner, which is not limited in the embodiment of the present invention. Further, a three-dimensional space of the functional region may be constructed based on the first target data.
Optionally, for 2D house type data, in an embodiment, for each functional area in the target house, a data packet of the functional area may be generated based on the first target data of the functional area, so as to generate a three-dimensional space of the functional area; furthermore, the data packet of the target house may be constructed based on the data packet of each functional area so as to generate a three-dimensional house space of the target house.
For example, the 2D house type data of each functional area in the corresponding target house may be traversed, and the first target data such as wall data, door data, window data, viewpoint data, etc. of each functional area may be processed.
Assume the 2D house type data of a certain functional area is a data packet named room within a 2d_data data packet. In the process of analyzing the 2D house type data and extracting the second target data, different processing functions may be defined for different data dimensions, so that the data packet of each functional area may be processed by calling the corresponding processing functions to obtain the second target data of that functional area; further processing functions may then be called to obtain the first target data, and the obtained first target data of each functional area may be stored in a designated data packet, to be called when the three-dimensional space or the three-dimensional house space is subsequently constructed.
For example, a processing function for extracting 2D wall data and converting it into wall (walls) data, a processing function for extracting 2D door data and converting it into door (doors) data, and a processing function for extracting 2D window data and converting it into window (windows) data may be set and defined as get_wall_from_2d(), get_door_from_2d(), and get_window_from_2d(), respectively; a processing function for extracting inner windows, a processing function for acquiring viewpoint (hotspots) data, and a processing function for acquiring functional area (area) data may be set and defined as calculate_inner_window(), calculate_hotspot(), and calculate_area(), respectively. Then, to obtain the viewpoint data of any functional region, the viewpoint data of the corresponding functional region may be obtained by hotspot = calculate_hotspot(room), and so on.
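A hedged sketch of one such conversion function, turning each 2D wall segment (two endpoints) into wall data by attaching a preset height (the input format and field names are assumptions for illustration):

```python
PRESET_WALL_HEIGHT = 2.8  # metres; a preset value as described in the text

def get_wall_from_2d(room_2d):
    # Each 2D wall is a line segment given by two endpoints; extrude it
    # to the preset height to obtain three-dimensional wall data.
    return [{"start": seg[0], "end": seg[1], "height": PRESET_WALL_HEIGHT}
            for seg in room_2d["wall_segments"]]

room_2d = {"wall_segments": [((0, 0), (4, 0)), ((4, 0), (4, 3))]}
walls = get_wall_from_2d(room_2d)
```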
Further, a three-dimensional space of the functional region may be constructed based on the first target data. Specifically, for each functional area in the target house, a three-dimensional space of the functional area may be generated based on the first target data of the functional area. The specific process may refer to the specific process of constructing the three-dimensional space of the functional region according to the first target data under the condition of the 3D house type data, which is not described herein again.
For example, for each data dimension under the first target data, the third target data of any functional area may be processed in the following manner to generate functional area model data of the functional area.
Optionally, in an embodiment, the step B122 may further include:
step B1221, generating wall data of the functional area according to the 2D wall data of the functional area and a preset wall height;
step B1222, generating door data and window data of the functional area according to the 2D door data and 2D window data of the functional area and a preset door and window height;
and step B1223, generating the observation point data of the functional area according to the second target data of the functional area.
As described above, in the 2D house type data, the wall surface is a line segment, the position information of the door and the window is generally represented by two points, and the data of the entrance door is also represented by two points, so that it is necessary to convert the data into the stereoscopic data. Optionally, in order to convert the second target data in the two-dimensional plane into the first target data in the three-dimensional space, data such as a door, a window, a wall and the like are mainly converted into stereo data, and viewpoint data is obtained.
Specifically, in order to quickly complete the conversion of the second target data, data such as a wall height, a door and window height and the like may be preset, so that the wall data of the functional area may be generated according to the 2D wall data of the functional area and the preset wall height; generating door data and window data of the functional area according to the 2D door data and the 2D window data of the functional area and a preset door and window height; and generating viewpoint data of the functional region based on the second target data of the functional region.
The wall height, the door and window height and the like can be set in a user-defined mode according to requirements and specific application scenes, and the embodiment of the invention is not limited.
For example, wall heights of 2.8m (meters), 2.5 meters, etc. may be set, door heights of 2.2m, 2.5m, etc. may be set, window heights of 900mm (millimeters) from the bottom of the window to the floor of the house, 300mm from the top of the window to the top of the house, etc. may be set.
In addition, in practical applications, the types of windows in the house can be divided into a plurality of types, such as a French window, a bay window, a normal window, and the like, and then the heights of the windows of different types can be set respectively.
For example, the window height settings for a bay window may include the bottom of the window being 400mm (millimeters) from the floor of the house and the top of the window being 300mm from the top of the house; the window height settings for a French window may include the bottom of the window being 300mm or 100mm from the floor of the house and the top of the window being 300mm from the top of the house; and so on.
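The per-type presets above can be sketched as a small lookup table (the dictionary layout is an assumption; the clearance values are taken from the examples in the text, using 0.3 m for the French window bottom):

```python
WALL_HEIGHT = 2.8  # metres; preset wall height

# Bottom clearance from the floor and top gap from the ceiling, in metres.
WINDOW_PRESETS = {
    "normal": {"bottom": 0.9, "top_gap": 0.3},
    "bay":    {"bottom": 0.4, "top_gap": 0.3},
    "french": {"bottom": 0.3, "top_gap": 0.3},
}

def window_2d_to_3d(p1, p2, kind="normal"):
    # Convert a 2D window (two endpoints) into 3D window data using the
    # preset bottom height and top gap for its window type.
    preset = WINDOW_PRESETS[kind]
    return {"p1": p1, "p2": p2,
            "bottom": preset["bottom"],
            "top": WALL_HEIGHT - preset["top_gap"]}

bay = window_2d_to_3d((0, 0), (1.5, 0), kind="bay")
```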
In addition, when the observation point data is obtained, the relationship between the observation point data and the second target data may be set by the user according to requirements, and the embodiment of the present invention is not limited. For example, for any functional area, the coordinate information of a position at a specified height above the floor, on the center line of the smallest rectangle that can contain the planar area of the corresponding functional area, may be acquired as the viewpoint data of that functional area based on its second target data, and so on.
Optionally, in an embodiment, the step B1223 further includes:
step S1, acquiring the plane shape type of the functional area according to the second target data of the functional area;
step S2, in response to the plane shape type of the functional region being a rectangle, taking the position data of the center point of the largest rectangle within the functional region as the viewpoint data of the functional region.
Step S3, in response to the planar shape type of the functional region being an L shape, taking position data of a center point of an intersection region of two rectangles within the functional region as viewpoint data of the functional region.
Step S4, in response to the planar shape type of the functional region being a U shape, taking position data of center points of three rectangles in the functional region as viewpoint data of the functional region, respectively.
Step S5, in response to the functional region containing more than three rectangles, using the position data of the center point of each rectangle as the observation point data of the functional region.
In practical applications, a plurality of observation points may be included in one functional area, and in order to ensure that each corner in the three-dimensional space of each functional area can be previewed in the process of subsequently constructing and navigating the three-dimensional room space, it is required to ensure that the observation points are located so as to facilitate previewing the whole content of the three-dimensional room space.
Specifically, the plane shape type of the functional region may be acquired from second target data of the functional region. For example, the plane shape type of the functional area may be obtained from wall data such as the number of walls, the length of each wall, the connection relationship between the walls, and the positions of the walls included in the second target data. If the planar shape type of the functional area is a rectangle, the position data of the center point of the largest rectangle in the functional area can be used as the observation point data of the functional area, as shown in fig. 4(a), where the position of the ring can be an observation point; if the planar shape type of the functional region is an L shape, the position data of the center point of the intersection region of two rectangles in the functional region may be used as the viewpoint data of the functional region, as shown in fig. 4 (b); if the planar shape type of the functional region is a U shape, the position data of the center points of three rectangles in the functional region may be respectively used as the viewpoint data of the functional region, as shown in fig. 4 (c); if the functional region includes three or more rectangles, the position data of the center point of each rectangle may be used as the viewpoint data of the functional region.
The observation point can be located on the ground in the functional area or a certain height from the ground, wherein the height of the observation point relative to the ground in the functional area can be set in a user-defined mode according to requirements, and the embodiment of the invention is not limited.
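The shape-type branching in steps S1 to S5 can be sketched as follows for axis-aligned rectangles, assuming each functional area has already been decomposed into rectangles (the decomposition itself and the input format are assumptions):

```python
def rect_center(rect):
    (x1, y1), (x2, y2) = rect
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def rect_area(rect):
    (x1, y1), (x2, y2) = rect
    return abs(x2 - x1) * abs(y2 - y1)

def rect_intersection(r1, r2):
    # Axis-aligned intersection; assumes the two rectangles overlap.
    (ax1, ay1), (ax2, ay2) = r1
    (bx1, by1), (bx2, by2) = r2
    return ((max(ax1, bx1), max(ay1, by1)), (min(ax2, bx2), min(ay2, by2)))

def hotspots_for_area(shape_type, rects):
    if shape_type == "rectangle":
        # Centre of the largest rectangle in the area (step S2).
        return [rect_center(max(rects, key=rect_area))]
    if shape_type == "L":
        # Centre of the overlap region of the two rectangles (step S3).
        return [rect_center(rect_intersection(rects[0], rects[1]))]
    # U shape, or more than three rectangles: one viewpoint per rectangle
    # (steps S4 and S5).
    return [rect_center(r) for r in rects]

points = hotspots_for_area("rectangle", [((0, 0), (4, 3))])
```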
Referring to fig. 2, in an embodiment, the step 130 may further include:
and step 134, generating the three-dimensional house space of the target house according to the finally obtained combined space and by combining the three-dimensional space of the specific functional area which is not combined with the combined space and the three-dimensional space of the non-specific functional area.
As described above, in practical applications, the merging process is only performed on the three-dimensional space of a part of the functional regions, and the merging process is not performed on the three-dimensional space of the functional regions partitioned by the solid walls, such as bedrooms and toilets. Therefore, in the embodiment of the present invention, a specific functional area of a target type, which may include, but is not limited to, at least two of a living room, a restaurant, an aisle, and a hallway, may be first obtained from the functional areas of the target house. The type of the functional area may be obtained by parsing and extracting from the original data, for example, for 2D house type data, the type of each functional area may be identified, for 3D house type data, the type of each functional area may be identified through mapping data in each functional area included in corresponding 3D house type data in a VR viewing room, and the like, and the type of the functional area may also be included in the first target data, which is not limited in this embodiment of the present invention.
Taking the living room, the dining room, the aisle, and the hallway shown in fig. 3A as an example, any two connected functional areas, such as the living room and the dining room, may be merged first: the three-dimensional space of the living room and the three-dimensional space of the dining room are merged into one three-dimensional space, i.e., the current merged space, and the wall connecting the three-dimensional space of the living room and the three-dimensional space of the dining room is removed from the merged space, so that the two spaces communicate. Furthermore, if a connection relationship exists between the aisle and the current merged space, the current merged space (the living-dining area) and the three-dimensional space of the aisle can be merged, and the walls connecting them deleted from the resulting merged space. Next, if the hallway and the current merged space have a connection relationship, the current merged space and the three-dimensional space of the hallway can be merged, and the wall connecting the three-dimensional space obtained in the previous step and the three-dimensional space of the hallway deleted from the resulting merged space. The merged space obtained at this point can be as shown in fig. 3B.
Of course, in the embodiment of the present invention, if there is at least one specific functional area that is not adjacent to each of the other specific functional areas (i.e., there is no connecting wall), then it is not necessary to merge the specific functional area that is not adjacent to each of the other specific functional areas with each of the other specific functional areas.
Therefore, the three-dimensional house space of the target house can be generated according to the finally obtained merged space and by combining the three-dimensional space of the specific functional area which is not merged with the merged space and the three-dimensional space of the non-specific functional area. Of course, in the embodiment of the present invention, if the three-dimensional spaces of all the specific function areas are merged into the merged space, when the three-dimensional house space of the target house is generated, the three-dimensional spaces of the specific function areas that are not merged with the merged space do not need to be merged.
Optionally, in an embodiment, the step 132 may further include:
step 1321, merging the first target data of the two specific function areas to obtain first target data of a current merging space, and removing wall data of a connected wall between the two specific function areas from the currently merged first target data;
step 1322, generating a three-dimensional space of the two specific functional regions based on the merged first target data.
In the embodiment of the invention, in order to facilitate the automatic combination of the three-dimensional spaces of the specific functional areas when the target house three-dimensional house model is subsequently constructed, the first target data of each specific functional area can be combined while the spaces are combined.
Taking three-dimensional space merging any two specific function areas with a connection relation as an example, merging the first target data of the two specific function areas to obtain the first target data of the current merging space, and removing the wall data of the wall body connected between the two specific function areas from the current merged first target data; and generating three-dimensional spaces of the two specific functional regions based on the merged first target data. For example, when the living room and the restaurant are merged, the wall data of the wall connected to the living room and the restaurant in the first target data of the living room and the restaurant may be removed, and then the first target data of the current merging space may be merged based on the remaining first target data, that is, the wall data of the wall connected between the two specific function areas is removed from the first target data after the current merging.
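This merge step might be sketched as follows (walls are matched here by an assumed "id" field, and the packet layout mirrors the decorate_room example above; both are illustrative assumptions):

```python
def merge_rooms(room_a, room_b, shared_wall_ids):
    # Combine the first target data of two connected specific functional
    # areas and drop the walls at their junction.
    merged = {"walls": [w for w in room_a["walls"] + room_b["walls"]
                        if w["id"] not in shared_wall_ids]}
    for key in ("doors", "windows", "hotspots"):
        merged[key] = room_a[key] + room_b[key]
    return merged

living = {"walls": [{"id": 1}, {"id": 2}], "doors": [], "windows": [],
          "hotspots": ["h1"]}
dining = {"walls": [{"id": 2}, {"id": 3}], "doors": [], "windows": [],
          "hotspots": ["h2"]}
combined = merge_rooms(living, dining, shared_wall_ids={2})
```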
It should be noted that, in the embodiment of the present invention, when merging the three-dimensional space of another specific functional region into the current merged space, referring to steps 1321 and 1322 described above, the first target data of the corresponding specific functional region and the first target data of the current merged space are first merged to obtain merged first target data, and the wall data of the wall connecting the corresponding specific functional region and the current merged space is removed from the merged first target data; the three-dimensional space combining the corresponding specific functional region and the current merged space is then generated based on the merged first target data, so as to obtain the latest merged three-dimensional space. For details, reference may be made to the merging process of the three-dimensional spaces of two specific functional regions, which is not limited in the embodiment of the present invention.
Taking the living room, dining room, aisle, and hallway shown in fig. 3A as an example: first, the first target data of two specific functional areas, such as the living room and the dining room, are merged, the wall data of the wall connecting the living room and the dining room is deleted, and the remaining wall data is merged into the first target data of the merged space. Further, when the living-dining area and the aisle are merged, the wall data of the wall connecting these two specific functional areas is deleted, and the remaining wall data is merged into the first target data of the merged space. Next, the current merged space and the hallway are merged, the wall data of the wall connecting the two is deleted, and the remaining wall data is merged into the first target data of the current merged space. Finally, all the data other than wall data, such as the door data and window data contained in the living room, dining room, aisle, and hallway, is added to the first target data of the current merged space, thereby obtaining the final merged space and its first target data.
Moreover, in the merging process, the merged space obtained after the merging process may be an independent space, and the corresponding first target data may be separately generated, and the merged space and the first target data thereof may not affect each specific function area and the first target data of each specific function area, that is, the data for constructing the three-dimensional house model of the target house may simultaneously include the first target data of the merged space, the first target data of each specific function area, and the first target data of each non-specific function area. Of course, in the embodiment of the present invention, the first target data for generating the specific functional area of the merged space may be deleted from the data for constructing the three-dimensional house model of the target house according to the requirement, and the embodiment of the present invention is not limited thereto.
For example, the specific functional areas may be identified by executing pseudo code, and the merging process may then be carried out through the following pseudo code:
# merging first target data of a particular functional area
combined_walls=combine_walls(need_combined_rooms)
combined_doors=combine_doors(need_combined_rooms)
combined_windows=combine_windows(need_combined_rooms)
combined_hotspots=combine_hotspots(need_combined_rooms)
combine_room={}
combine_room['walls']=combined_walls
combine_room['doors']=combined_doors
combine_room['windows']=combined_windows
combine_room['hotspots']=combined_hotspots
decorate_data.append(combine_room)
Here, "combine _ room" may be understood as a data packet containing first target data of the merge space, and the first target data of the merge space may be placed in the decode _ data packet through the above-mentioned decode _ data.
It should be noted that, in the embodiment of the present invention, if the original data includes both 3D house type data and 2D house type data, any one of the 3D house type data and the 2D house type data may be randomly adopted to construct a three-dimensional house space of the target house; or the priority of two kinds of original data under different target houses can be set, and the original data with higher priority is selected to construct the three-dimensional house space of the target house; or the three-dimensional house space of the target house may be constructed based on the 3D house type data and the 2D house type data at the same time, for example, the first target data obtained based on the 3D house type data and the 2D house type data may be considered comprehensively, and for any functional area, the first target data in the same data dimension obtained based on the 3D house type data and the 2D house type data may be merged and deduplicated to generate the first target data of the corresponding functional area, and so on.
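The merge-and-deduplicate step for first target data obtained from both sources might be sketched as follows (the canonical key function and the preference for 3D-derived records are illustrative assumptions):

```python
def merge_dedup(items_from_3d, items_from_2d, key):
    # Keep the first occurrence of each record, preferring the 3D-derived
    # data, and drop duplicates found in the 2D-derived data.
    seen, merged = set(), []
    for item in items_from_3d + items_from_2d:
        k = key(item)
        if k not in seen:
            seen.add(k)
            merged.append(item)
    return merged

walls_3d = [{"start": (0, 0), "end": (4, 0)}]
walls_2d = [{"start": (0, 0), "end": (4, 0)},
            {"start": (4, 0), "end": (4, 3)}]
walls = merge_dedup(walls_3d, walls_2d,
                    key=lambda w: (w["start"], w["end"]))
```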
In the embodiment of the invention, in order to realize the on-line simulation of decoration effects of different houses under different decoration styles, the data required in scenes of decoration and the like can be extracted from original data such as 3D house type data, 2D house type data and the like of a target house, the original data is calculated and processed, a complete three-dimensional house model for VR decoration is constructed, the accuracy and the integrity of the three-dimensional house model are improved, and the visual effect of subsequent VR decoration is further improved.
Referring to fig. 5, a schematic structural diagram of an apparatus for constructing a three-dimensional house space according to an embodiment of the present invention is shown.
The construction device of the three-dimensional house space of the embodiment of the invention comprises: an initial data acquisition module 210, a three-dimensional space construction module 220, and a house space construction module 230.
The functions of the modules and the interaction relationship between the modules are described in detail below.
An initial data acquisition module 210, configured to obtain initial data of a target house, where the initial data includes at least one of 3D house type data and 2D house type data of the target house;
a three-dimensional space construction module 220, configured to obtain a three-dimensional space of each functional area in the target house according to the initial data;
a house space construction module 230, configured to combine the three-dimensional spaces of the specific function areas in the target house into one three-dimensional space, and obtain a three-dimensional house space of the target house by combining the three-dimensional space of each non-specific function area, so as to render a house decoration result in the three-dimensional house space according to decoration data of each function area, where the non-specific function areas are the function areas other than the specific function areas.
Referring to fig. 6, in an embodiment of the present invention, the three-dimensional space constructing module 220 may include:
a first target data obtaining sub-module 221, configured to, when the initial data includes 3D house type data, analyze, for each functional area in the target house, the 3D house type data of the functional area, and extract, from the analyzed 3D house type data, first target data of the functional area;
a first three-dimensional space construction sub-module 222, configured to construct a three-dimensional space of the functional region according to the first target data;
the first target data at least comprises wall data, door data, window data and observation point data.
Referring to fig. 6, in an embodiment of the present invention, the three-dimensional space constructing module 220 may include:
a second target data obtaining sub-module 223, configured to, when the initial data includes 2D house type data, analyze, for each functional area in the target house, the 2D house type data of the functional area, and extract, from the analyzed 2D house type data, second target data of the functional area, where the second target data at least includes 2D wall data, 2D door data, and 2D window data;
a third target data obtaining sub-module 224, configured to generate first target data of the functional area according to the second target data of the functional area, where the first target data at least includes wall data, door data, window data, and observation point data;
and a second three-dimensional space constructing sub-module 225, configured to construct a three-dimensional space of the functional region according to the first target data.
Optionally, in an embodiment, the third target data obtaining sub-module 224 may include:
the wall data generating unit is used for generating wall data of the functional area according to the 2D wall data of the functional area and a preset wall height;
the door and window data generation unit is used for generating door data and window data of the functional area according to the 2D door data and the 2D window data of the functional area and the preset door and window height;
and the observation point data generating unit is used for generating observation point data of the functional area according to the second target data of the functional area.
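The wall and door/window generation described above can be sketched as lifting each 2D segment to 3D by applying a preset height. The segment representation, the field names, and the preset values below are assumptions for illustration, not values from the embodiment:

```python
# Hypothetical sketch: lifting 2D wall/door/window segments to 3D using preset
# heights. A 2D segment is a pair of endpoints (x1, y1, x2, y2); the 3D record
# adds a base elevation and a height. All defaults are illustrative.

PRESET_WALL_HEIGHT = 2.8     # metres, assumed default
PRESET_DOOR_HEIGHT = 2.0
PRESET_WINDOW_SILL = 0.9     # assumed elevation of the window bottom
PRESET_WINDOW_HEIGHT = 1.5

def lift_segment(segment_2d, base, height):
    """Turn a 2D segment into a 3D record with a base elevation and height."""
    x1, y1, x2, y2 = segment_2d
    return {'start': (x1, y1, base), 'end': (x2, y2, base), 'height': height}

def generate_3d_data(room_2d):
    return {
        'walls':   [lift_segment(s, 0.0, PRESET_WALL_HEIGHT) for s in room_2d['walls']],
        'doors':   [lift_segment(s, 0.0, PRESET_DOOR_HEIGHT) for s in room_2d['doors']],
        'windows': [lift_segment(s, PRESET_WINDOW_SILL, PRESET_WINDOW_HEIGHT)
                    for s in room_2d['windows']],
    }

room = {'walls': [(0, 0, 4, 0)], 'doors': [(1, 0, 2, 0)], 'windows': [(3, 0, 3.8, 0)]}
data_3d = generate_3d_data(room)
```

Note that walls and doors sit on the floor (base 0.0) while a window starts at an assumed sill elevation; this matches the idea that only heights, not footprints, are added when going from 2D house type data to 3D.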
Optionally, in an embodiment, the observation point data generating unit may be specifically configured to:
acquiring the plane shape type of the functional area according to the second target data of the functional area;
and in response to the plane shape type of the functional area being a rectangle, taking the position data of the center point of the largest rectangle in the functional area as the observation point data of the functional area.
And in response to the planar shape type of the functional region being L-shaped, taking the position data of the center point of the intersection region of the two rectangles in the functional region as the observation point data of the functional region.
And in response to the planar shape type of the functional area being a U shape, respectively taking the position data of the central points of the three rectangles in the functional area as the observation point data of the functional area.
And in response to the functional area containing more than three rectangles, respectively taking the position data of the central point of each rectangle as the observation point data of the functional area.
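The four shape rules above can be sketched compactly if the functional area is represented as a list of axis-aligned rectangles (x1, y1, x2, y2); that representation, and the helper names, are assumptions for this example:

```python
# Hypothetical sketch of the observation point rules: one rectangle yields its
# centre; an L shape (two rectangles) yields the centre of their intersection
# region; a U shape (three rectangles) or more yields one centre per rectangle.

def rect_center(r):
    x1, y1, x2, y2 = r
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def rect_area(r):
    x1, y1, x2, y2 = r
    return abs(x2 - x1) * abs(y2 - y1)

def intersection(a, b):
    """Axis-aligned intersection rectangle of two rectangles."""
    return (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))

def observation_points(rectangles):
    if len(rectangles) == 1:                       # rectangular area
        return [rect_center(max(rectangles, key=rect_area))]
    if len(rectangles) == 2:                       # L-shaped area
        return [rect_center(intersection(*rectangles))]
    return [rect_center(r) for r in rectangles]    # U shape or more rectangles

# An L-shaped room made of two overlapping rectangles:
points = observation_points([(0, 0, 4, 2), (2, 0, 6, 4)])
```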
Optionally, in an embodiment, the housing space constructing module 230 may further include:
a specific function area obtaining sub-module 231, configured to obtain specific function areas of a target type from the function areas of the target house, where the target type includes at least two of a living room, a dining room, an aisle, and a hallway;
the first specific function region merging submodule 232 is configured to obtain any two specific function regions having a connection relationship, merge three-dimensional spaces of the two specific function regions, and remove a wall connected between the three-dimensional spaces of the two specific function regions to obtain a merged space;
a second specific functional region merging submodule 233, configured to, in response to there still being a specific functional region that has not been merged, merge the three-dimensional space of any unmerged specific functional region having a connection relationship with the current merged space into the current merged space, and remove the wall connected between the three-dimensional space of that specific functional region and the merged space, until the three-dimensional spaces of all the specific functional regions have been merged, or no unmerged specific functional region has a connection relationship with the current merged space;
and the house space construction sub-module 234 is configured to generate a three-dimensional house space of the target house according to the finally obtained merged space and by combining the three-dimensional space of the specific functional area that is not merged with the merged space and the three-dimensional space of the non-specific functional area.
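The merge loop performed by the two merging submodules can be sketched as follows. The representation of areas as names and the `connected` adjacency predicate are assumptions for illustration; the sketch omits the geometric removal of the shared wall:

```python
# Hypothetical sketch of the merge loop: seed a merged space from any two
# connected specific functional areas, then keep absorbing unmerged areas
# connected to the current merged space until none remain or none connect.

def merge_specific_areas(areas, connected):
    """areas: list of area names; connected(a, b) -> bool.
    Returns (members_of_merged_space, areas_left_unmerged)."""
    remaining = list(areas)
    merged = []
    # seed the merged space with the first connected pair, if any
    for i, a in enumerate(remaining):
        for b in remaining[i + 1:]:
            if connected(a, b):
                merged = [a, b]
                remaining.remove(a)
                remaining.remove(b)
                break
        if merged:
            break
    if not merged:
        return [], remaining
    # absorb any unmerged area connected to the current merged space
    changed = True
    while changed and remaining:
        changed = False
        for area in list(remaining):
            if any(connected(area, m) for m in merged):
                merged.append(area)   # the shared open wall would be removed here
                remaining.remove(area)
                changed = True
    return merged, remaining

adjacency = {('living_room', 'dining_room'), ('dining_room', 'aisle')}
def connected(a, b):
    return (a, b) in adjacency or (b, a) in adjacency

merged, left = merge_specific_areas(
    ['living_room', 'dining_room', 'aisle', 'hallway'], connected)
```

In this example the hallway has no connection relationship to the merged space, so it remains a separate three-dimensional space, matching the termination condition described above.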
Optionally, in an embodiment, the first specific function region merging sub-module 232 may be specifically configured to:
merging the first target data of the two specific function areas to obtain the first target data of the current merging space, and removing the wall data of the wall body connected between the two specific function areas from the currently merged first target data;
and generating three-dimensional spaces of the two specific functional areas based on the merged first target data.
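At the data level, the merge performed by the first specific function region merging sub-module 232 may be sketched as concatenating the first target data of the two areas and dropping the shared wall. The data layout and the shared-wall identification by id are illustrative assumptions:

```python
# Hypothetical sketch: merging the first target data of two specific functional
# areas into one merged-space record and removing the wall connected between
# them. A real system would identify the shared wall geometrically.

def merge_first_target(area_a, area_b, shared_wall_ids):
    merged = {}
    for dim in ('walls', 'doors', 'windows', 'hotspots'):
        merged[dim] = area_a.get(dim, []) + area_b.get(dim, [])
    # remove the wall(s) connecting the two areas from the merged wall data
    merged['walls'] = [w for w in merged['walls'] if w['id'] not in shared_wall_ids]
    return merged

living = {'walls': [{'id': 'w1'}, {'id': 'w_shared'}], 'doors': [{'id': 'd1'}]}
dining = {'walls': [{'id': 'w2'}, {'id': 'w_shared'}], 'doors': []}
merged = merge_first_target(living, dining, {'w_shared'})
```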
The device for constructing a three-dimensional house space provided by the embodiment of the present invention can implement each process implemented in the method embodiments of fig. 1 to 2, and is not described herein again to avoid repetition.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements each process of the above-mentioned embodiment of the method for constructing a three-dimensional house space and can achieve the same technical effect, which is not described herein again to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements each process of the above-mentioned method for constructing a three-dimensional house space and can achieve the same technical effect, which is not described herein again to avoid repetition. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 7 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used to receive and send signals during a message sending/receiving process or a call process; specifically, it receives downlink data from a base station and sends the downlink data to the processor 510 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 502, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502, or stored in the memory 509, into an audio signal and output it as sound. Moreover, the audio output unit 503 may provide audio output related to a specific function performed by the electronic apparatus 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and may be capable of processing such sounds into audio data. In the case of the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 501 for output.
The electronic device 500 also includes at least one sensor 505, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 5061 and/or a backlight when the electronic device 500 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 506 is used to display information input by the user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 5071 using a finger, a stylus, or any suitable object or attachment). The touch panel 5071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 510 to determine the type of the touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 7, the touch panel 5071 and the display panel 5061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 508 is an interface for connecting an external device to the electronic apparatus 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The processor 510 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the electronic device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The electronic device 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system.
In addition, the electronic device 500 includes some functional modules that are not shown, and are not described in detail herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (16)
1. A method for constructing a three-dimensional housing space, comprising:
acquiring initial data of a target house, wherein the initial data comprises at least one of 3D (three-dimensional) house type data and 2D house type data of the target house, and in the 3D house type data and the 2D house type data, different functional areas are closed areas;
according to the initial data, obtaining a three-dimensional space of each functional area in the target house to set decoration data of each functional area, wherein the decoration data comprises at least one of hard decoration data and soft decoration data, and under the condition that no solid wall exists between two specific functional areas and the two specific functional areas are connected through an open space, the three-dimensional space of each specific functional area comprises an open wall;
combining the three-dimensional space of a specific function area in the target house into a three-dimensional space, and combining the three-dimensional space of a non-specific function area to obtain the three-dimensional house space of the target house, so as to render a house decoration result in the three-dimensional house space according to decoration data of each function area, wherein the non-specific function area is other function areas except the specific function area;
combining the three-dimensional space of the specific function area in the target house into a three-dimensional space, and combining the three-dimensional space of the non-specific function area to obtain the three-dimensional house space of the target house, wherein the method comprises the following steps: combining the three-dimensional spaces of any two specific functional areas connected through an open wall, and removing the open wall to obtain a combined space; and generating the three-dimensional house space of the target house according to the combined space and combining the three-dimensional space of the specific functional area which is not combined with the combined space and the three-dimensional space of the non-specific functional area.
2. The method of claim 1, wherein said step of obtaining a three-dimensional space for each functional area in said target premises based on said initial data comprises:
when the initial data comprises 3D house type data, analyzing the 3D house type data of the functional area aiming at each functional area in the target house, and extracting first target data of the functional area from the analyzed 3D house type data;
constructing a three-dimensional space of the functional area according to the first target data;
the first target data at least comprises wall data, door data, window data and observation point data.
3. The method of claim 1, wherein said step of obtaining a three-dimensional space for each functional area in said target premises based on said initial data comprises:
under the condition that the initial data comprises 2D house type data, analyzing the 2D house type data of the functional area aiming at each functional area in the target house, and extracting second target data of the functional area from the analyzed 2D house type data, wherein the second target data at least comprises 2D wall data, 2D door data and 2D window data;
generating first target data of the functional area according to second target data of the functional area, wherein the first target data at least comprises wall data, door data, window data and observation point data;
and constructing a three-dimensional space of the functional region according to the first target data.
4. The method according to claim 3, wherein the step of generating the first target data of the functional area according to the second target data of the functional area comprises:
generating wall data of the functional area according to the 2D wall data of the functional area and a preset wall height;
generating door data and window data of the functional area according to the 2D door data and the 2D window data of the functional area and a preset door and window height;
and generating observation point data of the functional area according to the second target data of the functional area.
5. The method according to claim 4, wherein the step of generating viewpoint data of the functional area based on the second target data of the functional area comprises:
acquiring the plane shape type of the functional area according to the second target data of the functional area;
in response to the planar shape type of the functional region being a rectangle, taking position data of a center point of a largest rectangle within the functional region as viewpoint data of the functional region;
in response to that the planar shape type of the functional region is L-shaped, taking the position data of the central point of the intersection region of two rectangles in the functional region as the observation point data of the functional region;
in response to the fact that the plane shape type of the functional area is U-shaped, respectively taking position data of central points of three rectangles in the functional area as observation point data of the functional area;
and responding to the fact that more than three rectangles are contained in the functional area, and respectively taking the position data of the central point of each rectangle as the observation point data of the functional area.
6. The method according to any one of claims 1 to 5, wherein the merging the three-dimensional spaces of any two specific functional areas connected by an open wall and removing the open wall to obtain a merged space comprises:
acquiring a specific function area of a target type from the function areas of the target house, wherein the target type comprises at least two of a living room, a dining room, an aisle, and an entrance;
acquiring any two specific function areas with a connection relation, combining three-dimensional spaces of the two specific function areas, and removing an open wall connected between the three-dimensional spaces of the two specific function areas to obtain a combined space;
and in response to the existence of the specific functional regions which are not combined, combining the three-dimensional space of any one of the specific functional regions which are not combined and have a connection relationship with the current combination space into the current combination space, and removing the open wall connected between the three-dimensional space of the specific functional regions which are not combined and the combination space until the combination of the three-dimensional spaces of all the specific functional regions is completed, or the three-dimensional space of the specific functional regions which are not combined and the current combination space do not have a connection relationship, so as to obtain the combination space.
7. The method according to claim 2 or 3, wherein said step of merging the three-dimensional spaces of said two specific functional areas comprises:
merging the first target data of the two specific function areas to obtain the first target data of the current merging space, and removing the wall data of the open wall connected between the two specific function areas from the currently merged first target data;
and generating three-dimensional spaces of the two specific functional areas based on the merged first target data.
8. A device for constructing a three-dimensional room space, comprising:
the system comprises an initial data acquisition module, a data processing module and a data processing module, wherein the initial data acquisition module is used for acquiring initial data of a target house, the initial data comprises at least one of 3D (three-dimensional) house type data and 2D house type data of the target house, and in the 3D house type data and the 2D house type data, different functional areas are closed areas;
a three-dimensional space construction module, configured to obtain, according to the initial data, a three-dimensional space of each functional area in the target house, where model data of the functional area is data used to construct a 3D functional area model of at least one functional area, where, under a condition that no solid wall exists between two specific functional areas and the two specific functional areas are connected by an open space, the three-dimensional space of each of the two specific functional areas includes an open wall;
a house space construction module, configured to merge the three-dimensional spaces of the specific functional areas in the target house into one three-dimensional space, merge it with the three-dimensional spaces of the non-specific functional areas to obtain the three-dimensional house space of the target house, and render a house decoration result in the three-dimensional house space according to the decoration data of each functional area, where the non-specific functional areas are the functional areas other than the specific functional areas;
wherein the house space construction module at least comprises: a house space construction submodule, configured to merge the three-dimensional spaces of any two specific functional areas connected through an open wall and remove the open wall to obtain a merged space; and to generate the three-dimensional house space of the target house according to the merged space, the three-dimensional spaces of the specific functional areas not merged into the merged space, and the three-dimensional spaces of the non-specific functional areas.
9. The apparatus of claim 8, wherein the three-dimensional space construction module comprises:
a first target data obtaining sub-module, configured to, when the initial data includes 3D house type data, analyze, for each functional area in the target house, the 3D house type data of the functional area, and extract, from the analyzed 3D house type data, first target data of the functional area;
the first three-dimensional space construction submodule is used for constructing a three-dimensional space of the functional area according to the first target data;
the first target data at least comprises wall data, door data, window data and observation point data.
10. The apparatus of claim 8, wherein the three-dimensional space construction module comprises:
a second target data obtaining sub-module, configured to, when the initial data includes 2D house type data, analyze, for each functional area in the target house, the 2D house type data of the functional area, and extract, from the analyzed 2D house type data, second target data of the functional area, where the second target data at least includes 2D wall data, 2D door data, and 2D window data;
the third target data acquisition submodule is used for generating first target data of the functional area according to second target data of the functional area, wherein the first target data at least comprises wall data, door data, window data and observation point data;
and the second three-dimensional space construction submodule is used for constructing the three-dimensional space of the functional area according to the first target data.
11. The apparatus of claim 10, wherein the third target data acquisition sub-module comprises:
the wall data generating unit is used for generating wall data of the functional area according to the 2D wall data of the functional area and a preset wall height;
the door and window data generation unit is used for generating door data and window data of the functional area according to the 2D door data and the 2D window data of the functional area and the preset door and window height;
and the observation point data generating unit is used for generating observation point data of the functional area according to the second target data of the functional area.
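The extrusion described in claim 11 (2D segments lifted to 3D by preset heights) can be sketched as below. The preset height values and the `(x1, y1, x2, y2)` segment format are assumptions for illustration; the patent only states that preset wall, door, and window heights are used.

```python
# Illustrative sketch of claim 11: 2D wall, door and window segments are lifted
# into 3D panels/openings by preset heights. All numeric presets are assumed.

WALL_HEIGHT = 2.8                      # preset wall height in metres (assumed)
DOOR_HEIGHT = 2.0                      # preset door height (assumed)
WINDOW_BOTTOM, WINDOW_TOP = 0.9, 2.4   # preset window band (assumed)

def extrude_walls(segments_2d, height=WALL_HEIGHT):
    """Turn each 2D wall segment into a 3D wall panel from z=0 to z=height."""
    return [{"base": seg, "z0": 0.0, "z1": height} for seg in segments_2d]

def extrude_openings(segments_2d, z0, z1):
    """Turn 2D door/window segments into 3D openings spanning [z0, z1]."""
    return [{"base": seg, "z0": z0, "z1": z1} for seg in segments_2d]
```

A door would then be extruded with `extrude_openings(doors_2d, 0.0, DOOR_HEIGHT)` and a window with `extrude_openings(windows_2d, WINDOW_BOTTOM, WINDOW_TOP)`.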
12. The apparatus according to claim 11, wherein the observation point data generating unit is specifically configured to:
acquiring the plane shape type of the functional area according to the second target data of the functional area;
in response to the planar shape type of the functional area being a rectangle, taking the position data of the center point of the largest rectangle within the functional area as the observation point data of the functional area;
in response to the planar shape type of the functional area being L-shaped, taking the position data of the center point of the intersection region of the two rectangles in the functional area as the observation point data of the functional area;
in response to the planar shape type of the functional area being U-shaped, taking the position data of the center points of the three rectangles in the functional area, respectively, as the observation point data of the functional area;
and in response to the functional area containing more than three rectangles, taking the position data of the center point of each rectangle, respectively, as the observation point data of the functional area.
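The observation point rules of claim 12 amount to a dispatch on the plane shape type. A minimal sketch, assuming rectangles are given as `(x_min, y_min, x_max, y_max)` tuples and that shape classification has already been done elsewhere:

```python
# Sketch of the observation point rules in claim 12. The rectangle tuple
# format and the shape-type labels are illustrative assumptions.

def rect_center(r):
    x0, y0, x1, y1 = r
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def rect_intersection(a, b):
    # axis-aligned intersection of two rectangles
    return (max(a[0], b[0]), max(a[1], b[1]),
            min(a[2], b[2]), min(a[3], b[3]))

def observation_points(shape_type, rects):
    """Return observation point coordinates for a functional area."""
    if shape_type == "rectangle":
        # center of the largest rectangle within the area
        largest = max(rects, key=lambda r: (r[2] - r[0]) * (r[3] - r[1]))
        return [rect_center(largest)]
    if shape_type == "L":
        # center of the intersection region of the two rectangles
        return [rect_center(rect_intersection(rects[0], rects[1]))]
    # U-shape (three rectangles) or more than three rectangles:
    # one observation point per rectangle
    return [rect_center(r) for r in rects]
```

Multiple observation points per area allow a renderer to offer a panorama anchor in each recoverable rectangular sub-region.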
13. The apparatus of any one of claims 8-12, wherein the housing space construction module further comprises:
a specific function area obtaining submodule, configured to obtain a specific function area of a target type from the function area of the target house, where the target type includes at least two of a living room, a restaurant, an aisle, and a hallway;
a first specific functional area merging submodule, configured to acquire any two specific functional areas having a connection relationship, merge the three-dimensional spaces of the two specific functional areas, and remove the open wall connecting the three-dimensional spaces of the two specific functional areas, to obtain a merged space;
and a second specific functional area merging submodule, configured to, in response to the existence of an unmerged specific functional area, merge into the current merged space the three-dimensional space of any unmerged specific functional area that has a connection relationship with the current merged space, and remove the open wall connecting that three-dimensional space to the merged space, until the three-dimensional spaces of all the specific functional areas have been merged, or no unmerged specific functional area has a connection relationship with the current merged space, so as to obtain the merged space.
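The two-submodule loop above (seed a merged space from one connected pair, then keep absorbing unmerged areas that connect to it) is essentially a greedy connected-component merge. A hedged sketch, where `connections` is an assumed adjacency set of open-wall links between area identifiers:

```python
# Sketch of the merging loop in claims 6 and 13: seed the merged space from any
# connected pair of specific functional areas, then repeatedly absorb unmerged
# areas connected to it until none remain or none connect. The id-based
# representation and the `connections` set are illustrative assumptions.

def merge_specific_areas(areas, connections):
    """Greedily merge specific functional areas linked by open walls.

    areas: iterable of area ids; connections: set of frozenset({a, b}) pairs.
    Returns (merged_set, unmerged_set).
    """
    areas = set(areas)
    # first submodule: pick any connected pair as the initial merged space
    seed = next((pair for pair in connections if pair <= areas), None)
    if seed is None:
        return set(), areas
    merged = set(seed)
    remaining = areas - merged
    # second submodule: absorb any unmerged area connected to the merged space
    changed = True
    while changed and remaining:
        changed = False
        for a in list(remaining):
            if any(frozenset({a, m}) in connections for m in merged):
                merged.add(a)        # merge its 3D space into the merged space
                remaining.remove(a)  # the connecting open wall is removed here
                changed = True
    return merged, remaining
```

Areas left in the returned `unmerged_set` keep their own three-dimensional spaces and are combined with the merged space when the final house space is assembled.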
14. The apparatus according to claim 9 or 10, wherein the first specific functional area merging submodule is configured to:
merge the first target data of the two specific functional areas to obtain the first target data of the current merged space, and remove, from the merged first target data, the wall data of the open wall connecting the two specific functional areas;
and generate the merged three-dimensional space of the two specific functional areas based on the merged first target data.
15. An electronic device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for constructing a three-dimensional house space according to any one of claims 1 to 7.
16. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method for constructing a three-dimensional house space according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010665225.4A CN111968247B (en) | 2020-07-10 | 2020-07-10 | Method and device for constructing three-dimensional house space, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010665225.4A CN111968247B (en) | 2020-07-10 | 2020-07-10 | Method and device for constructing three-dimensional house space, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111968247A (en) | 2020-11-20
CN111968247B (en) | 2021-10-19
Family
ID=73360538
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010665225.4A Active CN111968247B (en) | 2020-07-10 | 2020-07-10 | Method and device for constructing three-dimensional house space, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111968247B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113779661B (en) * | 2021-07-26 | 2024-06-14 | 贝壳找房(北京)科技有限公司 | Automatic wiring method for lighting circuit, electronic device, storage medium and apparatus |
CN113901540B (en) * | 2021-09-08 | 2024-09-27 | 长沙泛一参数信息技术有限公司 | Automatic identification method for type of door and window table building drawing and door and window |
CN114332428B (en) * | 2021-12-30 | 2022-08-26 | 北京发现角科技有限公司 | Method and device for realizing virtual house room segmentation effect |
CN114358836A (en) * | 2021-12-30 | 2022-04-15 | 北京有竹居网络技术有限公司 | Display method and device of house type report, readable storage medium and electronic equipment |
CN114596417B (en) * | 2022-02-22 | 2023-04-07 | 北京城市网邻信息技术有限公司 | Data processing method and device for house decoration, electronic equipment and storage medium |
CN115329420B (en) * | 2022-07-18 | 2023-10-20 | 北京五八信息技术有限公司 | Marking generation method and device, terminal equipment and storage medium |
CN116186842A (en) * | 2022-12-29 | 2023-05-30 | 杭州群核信息技术有限公司 | Decoration design restoration method, device and storage medium |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101930627A (en) * | 2010-09-10 | 2010-12-29 | 西安新视角信息科技有限公司 | Three-dimensional dwelling size modeling method based on two-dimensional dwelling size diagram |
CN105279787A (en) * | 2015-04-03 | 2016-01-27 | 北京明兰网络科技有限公司 | Method for generating three-dimensional (3D) building model based on photographed house type image identification |
CN106023305A (en) * | 2016-05-10 | 2016-10-12 | 曹屹 | Modeling method and apparatus for three-dimensional space |
CN106600709A (en) * | 2016-12-15 | 2017-04-26 | 苏州酷外文化传媒有限公司 | Decoration information model-based VR virtual decoration method |
CN107742319A (en) * | 2017-10-27 | 2018-02-27 | 北京小米移动软件有限公司 | Model data processing method and processing device |
CN109960850A (en) * | 2019-02-20 | 2019-07-02 | 江苏艾佳家居用品有限公司 | A kind of method and system calculating indoor panorama sketch collection point and roof lamps and lanterns layout |
CN110136244A (en) * | 2019-04-12 | 2019-08-16 | 平安城市建设科技(深圳)有限公司 | Three-dimensional house type model generating method, device, equipment and storage medium |
CN110210377A (en) * | 2019-05-30 | 2019-09-06 | 南京维狸家智能科技有限公司 | A kind of wall and door and window information acquisition method rebuild for three-dimensional house type |
CN110274602A (en) * | 2018-03-15 | 2019-09-24 | 奥孛睿斯有限责任公司 | Indoor map method for auto constructing and system |
CN110634100A (en) * | 2019-08-07 | 2019-12-31 | 贝壳技术有限公司 | Household type graph generation method and device, electronic equipment and storage medium |
CN110781541A (en) * | 2019-10-08 | 2020-02-11 | 江苏艾佳家居用品有限公司 | Region merging method and system for home decoration design drawing |
CN111008416A (en) * | 2019-11-12 | 2020-04-14 | 江苏艾佳家居用品有限公司 | Method and system for generating illumination effect of house type scene |
US10645275B1 (en) * | 2019-03-11 | 2020-05-05 | Amazon Technologies, Inc. | Three-dimensional room measurement process with augmented reality guidance |
CN111125807A (en) * | 2019-11-06 | 2020-05-08 | 贝壳技术有限公司 | Decoration three-dimensional model rendering display method and system |
CN111145352A (en) * | 2019-12-20 | 2020-05-12 | 北京乐新创展科技有限公司 | House live-action picture display method and device, terminal equipment and storage medium |
CN111369424A (en) * | 2020-02-10 | 2020-07-03 | 北京城市网邻信息技术有限公司 | Method, device, equipment and storage medium for generating three-dimensional space of target house |
CN111369664A (en) * | 2020-02-10 | 2020-07-03 | 北京城市网邻信息技术有限公司 | Method, device, equipment and storage medium for displaying house type scene |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106815440A (en) * | 2017-01-19 | 2017-06-09 | 深圳市彬讯科技有限公司 | The structural commodities intelligent distribution method and system of a kind of household Size Dwelling Design |
CN108564645B (en) * | 2018-03-09 | 2020-07-03 | 平安科技(深圳)有限公司 | Rendering method of house model, terminal device and medium |
CN110781539B (en) * | 2019-09-29 | 2022-07-08 | 江苏艾佳家居用品有限公司 | Automatic design method and system for house type graph |
CN110826121B (en) * | 2019-10-10 | 2022-07-08 | 江苏艾佳家居用品有限公司 | Method and system for automatically positioning house type corridor and entrance |
CN111191306A (en) * | 2019-12-12 | 2020-05-22 | 江苏艾佳家居用品有限公司 | Room design effect display method and system |
CN111199577A (en) * | 2019-12-31 | 2020-05-26 | 上海简家信息技术有限公司 | Virtual house decoration method |
- 2020-07-10: CN application CN202010665225.4A granted as CN111968247B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN111968247A (en) | 2020-11-20 |
Similar Documents
Publication | Title
---|---
CN111968247B (en) | Method and device for constructing three-dimensional house space, electronic equipment and storage medium
CN111951374B (en) | House decoration data processing method and device, electronic equipment and storage medium
CN111417028B (en) | Information processing method, information processing device, storage medium and electronic equipment
US9983592B2 (en) | Moving robot, user terminal apparatus and control method thereof
US9398413B1 (en) | Mapping electronic devices within an area
KR101638378B1 (en) | Method and program for modeling 3-dimension structure by 2-dimension floor plan
US20190041972A1 (en) | Method for providing indoor virtual experience based on a panorama and a 3D building floor plan, a portable terminal using the same, and an operation method thereof
CN111145352A (en) | House live-action picture display method and device, terminal equipment and storage medium
CN109409244B (en) | Output method of object placement scheme and mobile terminal
CN108564274B (en) | Guest room booking method and device and mobile terminal
JPWO2019069575A1 (en) | Information processing equipment, information processing methods and programs
CN112068752A (en) | Space display method and device, electronic equipment and storage medium
WO2024193610A1 (en) | Method and apparatus for generating environment model, display method and apparatus, device and storage medium
CN109472825B (en) | Object searching method and terminal equipment
CN113269877B (en) | Method and electronic equipment for acquiring room layout plan
CN115713616B (en) | House source space model generation method and device, terminal equipment and storage medium
CN113963108A (en) | Medical image cooperation method and device based on mixed reality and electronic equipment
CN111882650A (en) | Spatial light processing method and device, electronic equipment and storage medium
CN115830280A (en) | Data processing method and device, electronic equipment and storage medium
CN115731349A (en) | Method and device for displaying house type graph, electronic equipment and storage medium
CN111079032A (en) | Information recommending method and electronic equipment
CN116943216A (en) | Laminating relation detection method, laminating relation detection device, laminating relation detection equipment and storage medium
CN115729393A (en) | Prompting method and device in information processing process, electronic equipment and storage medium
KR20200041877A (en) | Information processing device, information processing method and program
CN111882651B (en) | Spatial light processing method and device, electronic equipment and storage medium
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |