EP3844724A1 - Automatic generation of a virtual reality walkthrough - Google Patents

Automatic generation of a virtual reality walkthrough

Info

Publication number
EP3844724A1
Authority
EP
European Patent Office
Prior art keywords
bim
interest
point
points
space
Prior art date
Legal status
Withdrawn
Application number
EP19766058.2A
Other languages
German (de)
French (fr)
Inventor
Juhani KORPINEN
Current Assignee
Tridify Oy
Original Assignee
Tridify Oy
Application filed by Tridify Oy filed Critical Tridify Oy
Publication of EP3844724A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/003 — Navigation within 3D models or images
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 — Instruments for performing navigational calculations
    • G01C21/206 — Instruments for performing navigational calculations specially adapted for indoor navigation
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 — Indexing scheme for image generation or computer graphics
    • G06T2210/04 — Architectural design, interior design

Definitions

  • The present invention relates to a method, a system and a computer program product related to virtual reality. More particularly, the present invention relates to automated generation of a virtual reality walkthrough based on building information model data.
  • Building Information Modeling, also referred to as a Building Information Model (BIM), is a digital representation of physical and functional characteristics of a facility.
  • A BIM is a shared knowledge resource for information about a facility, forming a reliable basis for decisions during its life-cycle.
  • A BIM involves representing a design as combinations of objects that carry their geometry, relations and attributes. However, BIM goes beyond planning and design, extending throughout the building life cycle.
  • IFC Industry Foundation Classes
  • IfcProduct is an IFC base class that represents occurrences in space, such as physical building elements and spatial locations.
  • IFC relationships (IfcRelationship) capture relationships among objects. IFC relationships may indicate a whole-part relationship having exclusive containment, such as subdividing a building into floors and rooms.
  • IfcProduct is a base class for any physical objects, and may be subdivided into spatial elements, physical elements, structural analysis items and other concepts. Products may have associated materials, shape representations, and placement in space. Spatial elements comprise for example sites (IfcSite), buildings (IfcBuilding), building storeys (IfcBuildingStorey) and spaces (IfcSpace) such as apartments or rooms. Physical building elements comprise for example walls (IfcWall), beams (IfcBeam), doors (IfcDoor), windows (IfcWindow), stairs (IfcStair) and so on. Various connectivity relationships may be used for IFC products, such as walls having openings filled by doors or windows.
  • IFC products may also have placement, which can be defined locally (IfcLocalPlacement) in relation to an enclosing element, or in a grid (IfcGridPlacement) relative to a grid with user-defined axes.
  • Physical elements may have defined materials, with optional properties such as mechanical, thermal properties and styles, such as colors and textures.
  • BIM standards comprise for example various national standards such as the CIS/2 standard by the American Institute of Steel Construction, buildingSMART Hong Kong, Virtual Design and Construction (VDC) in India, and a variety of national BIM implementations such as ÖNORM A6241 in Austria and the building transition digital plan (PTNB) in France, to name a few.
  • A virtual walkthrough, also referred to as a virtual tour, is a simulation of a location or facility, usually composed of a sequence of videos or still images.
  • A virtual walkthrough enables for example a buyer of a house or a flat to virtually experience the property even if it only exists on a drawing table or is still in an early phase of construction.
  • A modern version of a virtual walkthrough comprises 360° images, which enable the viewer to explore the facility without being at the location, for example online.
  • Patent application US 20160300392 A1 discloses systems, media, and methods for providing virtual reality tours.
  • A user selects one or more vantage points on a 2D floorplan, and the system generates a virtual reality tour based on the 2D floorplan, vantage point coordinates and associated virtual reality content.
  • An object of the invention is to provide a method and apparatus that solve the problem of automating the creation of virtual walkthroughs.
  • The objects of the present invention are achieved with a method according to the characterizing portion of claim 1.
  • The objects of the present invention are further achieved with a computer program product according to the characterizing portion of claim 10, a computer readable medium according to the characterizing portion of claim 11 and a virtual reality walkthrough generation system according to the characterizing portion of claim 12.
  • The preferred embodiments of the invention are disclosed in the dependent claims.
  • The present invention is based on the idea of automating the calculation of the navigation points and the rendering of the 360° images needed for generating the virtual walkthrough.
  • The present invention has the advantage that the virtual walkthrough generation is fully automated, which allows providing the entire generation process for example as a web service.
  • The customer simply uploads the relevant BIM data to the service, and in response receives an automatically generated full virtual walkthrough.
  • A computer-implemented method for automatically generating a virtual reality walkthrough based on Building Information Model (BIM) data comprises obtaining serialized BIM data, finding a plurality of BIM space elements among the serialized BIM data and automatically selecting a plurality of points of interest, wherein each point of interest represents a location in one of the plurality of BIM space elements.
  • The method also comprises calculating a viewpoint for each point of interest, rendering a 360° image from each of the viewpoints and associating the rendered 360° image with the respective one of the plurality of points of interest.
  • The method also comprises locating one or more openings associated with each of the plurality of BIM space elements, identifying adjacent BIM space elements that are connected to the respective BIM space element via the openings and placing a navigation point at each opening connecting adjacent BIM space elements.
  • A navigation mesh is calculated, the navigation mesh comprising the plurality of points of interest and the plurality of navigation points placed at the locations of the openings connecting BIM space elements.
  • A virtual reality walkthrough data object is combined for the plurality of BIM space elements, the virtual reality walkthrough data object comprising the navigation mesh and either the 360° images or links to the 360° images. The virtual reality walkthrough data object is stored.
  • The selecting of the points of interest for a BIM space element comprises: if a size measure of the BIM space element is above a first threshold value but below a second threshold value, calculating a geometrical center of the floor of the BIM space element, and placing the respective point of interest at the calculated geometrical center of the respective BIM space element; if the size measure of the BIM space element is above the second threshold value, defining a grid with a predefined cell size, calculating a geometrical center for a plurality of cells of the grid overlapping the BIM space element, and placing a plurality of points of interest at the calculated geometrical centers of the overlapping cells; and if the size measure of the BIM space element is below the first threshold value, not placing any point of interest.
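The three-way size test above can be sketched in code. The following is a minimal illustration only; the function name, the parameter layout and the default threshold values are hypothetical and not taken from the claims:

```python
def select_points_of_interest(floor_area, centroid, grid_centers,
                              first_threshold=2.0, second_threshold=40.0):
    """Return point-of-interest locations for one BIM space element.

    floor_area    -- size measure of the space element, here its floor area
    centroid      -- geometrical center (x, y) of the floor
    grid_centers  -- centers of grid cells overlapping the space element,
                     used only for large spaces
    """
    if floor_area < first_threshold:
        return []                  # below first threshold: no point of interest
    if floor_area < second_threshold:
        return [centroid]          # a normal room: a single point of interest
    return list(grid_centers)      # a large space: one point per overlapping cell
```

With floor area as the size measure, a small closet yields no points, an ordinary room yields one point at its centroid, and a hall yields one point per qualifying grid cell.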
  • A point of interest is placed at a cell if the size measure of the overlapping section of the BIM space element and the respective cell of the grid exceeds a third threshold value.
  • the third threshold value is equal to the first threshold value.
  • the size measure is any one of volume, area, diameter and length of at least one side of an area.
  • Rendering the 360° image comprises placing a 360° image probe at a predetermined height above the point of interest, and calculating at least one of a 2D 360° image and a 3D 360° image on the basis of BIM data of physical building elements connected to the BIM space element associated with the point of interest.
  • The locating of one or more openings comprises identifying a BIM product associated with the respective BIM space element that has an opening associated with it and obtaining coordinates of the opening connecting BIM space elements from the BIM data.
  • The method further comprises identifying a plurality of information points based on selected BIM elements comprised in the serialized BIM data, associating each information point with a location in the space, including the information points in the navigation mesh, and storing the plurality of information points in the virtual reality walkthrough data object.
  • a computer program product for automatically generating a virtual reality walkthrough from Building Information Model (BIM) data is provided, wherein the computer program product is configured to perform the computer-implemented method steps according to any one of the first to eighth aspect.
  • A computer readable medium is provided having stored thereon a computer program product according to the above aspect.
  • A virtual reality walkthrough generation system is provided that generates a virtual reality walkthrough automatically based on Building Information Model (BIM) data by carrying out the computer-implemented method according to any one of the first to eighth aspect.
  • A computer program product is provided having instructions which, when executed by a computing device or system, cause the computing device or system to perform a method according to any one of the first to eighth aspect.
  • Figure 1 shows an exemplary BIM space element diagram of an apartment as a 2D floor plan.
  • Figure 2 illustrates a high-level flow chart of an automatic virtual reality (VR) walkthrough generation service.
  • Figure 3 illustrates first phases of the method of automatic generation of a virtual walkthrough.
  • Figure 4 illustrates an example of a BIM space element for which a plurality of points of interest are to be defined.
  • Figure 5 is a flow chart illustrating phases of a method related to handling relationships of the BIM space elements via openings.
  • Figure 6 illustrates phases of the method of automatic generation of a virtual walkthrough.
  • Figure 7 illustrates features of a navigation mesh on top of a perspective view of a flat.
  • Figure 8 illustrates a system, in particular a computer system.

Detailed description
  • A navigation point is a point or node of a navigation mesh which a user can point at in order to navigate to that node.
  • A navigation point also refers to the coordinates of a navigation button leading to a neighboring point of interest that is connected via the navigation mesh. Coordinates of the navigation points may be automatically calculated from IFC data.
  • a navigation mesh is a network that declares all possible navigation routes.
  • a navigation mesh may comprise navigation points, points of interest and information points.
  • a navigation route refers to a virtual walk through path from node to node.
  • An information point refers to an IFC element about which information is to be shown, and also to the coordinates of the information button.
  • space element and BIM space element refer to a space element defined in any type of BIM data.
  • a space element or a BIM space element may be an IFC Space Element (IfcSpace).
  • a BIM space element may refer for example to an apartment or a room.
  • the terms space element and BIM space element may be used alternatively.
  • Figure 1 shows an exemplary, simplified space element diagram of an apartment shown as a 2D floor plan.
  • the exemplary space comprises an apartment space element (100) that comprises three room space elements (101).
  • the space elements (100, 101) may be associated with walls that have openings (103).
  • the space elements (100, 101) are spatial elements, and they are associated with physical elements such as walls, beams, doors, windows, stairs and so on.
  • the openings define connectivity relationships between space elements (100, 101), and represent for example doorways, doors and/or windows. Some of the openings are associated with two adjacent room space elements, thus connecting the rooms, while others may connect a room towards outside.
  • At least one point of interest (104) may be defined for at least some of the space elements (100, 101).
  • the point of interest (104) represents coordinates of a navigation point in the space element.
  • The point of interest is disposed on the floor of the respective space element. If the point of interest is on the floor, a view point is defined above the point of interest (104), at which a 360° image of the respective space is to be rendered.
  • The rendered 360° image is, as the name implies, a 360-degree rendered image from the view point associated with the respective point of interest, which is wrapped around a sphere to enable the viewer to virtually look in any direction around him/her from the view point.
  • the location of a point of interest may be expressed in cartesian (XYZ) coordinates.
  • the x- and y-coordinates of the point of interest are preferably defined so that the point of interest (104) resides within the space element (101).
  • the z-coordinate, in other words the vertical height, of the point of interest may be defined so that it resides at the floor level of the space element.
  • the associated view point is preferably placed on a predefined height above the point of interest.
  • the point of interest (104) may be defined to be disposed at a predefined height, in other words at a vertical distance above the floor level of the space element (101). In the latter case, the point of interest (104) and the respective associated view point may be collocated.
  • FIG. 2 illustrates a high-level flow chart of a service that is enabled by the disclosed method of automatic generation of a virtual reality (VR) walkthrough.
  • the service may be entirely web based, in which a user uploads BIM data to the service and, in return, receives an automatically generated VR walkthrough, which may be provided as a single data object.
  • the VR walkthrough may also comprise a plurality of data objects, that also comprise 360° images and for example audiovisual elements. Such additional elements may be originally linked to BIM data, and these may also become linked or included in the VR walkthrough.
  • the service may begin with receiving BIM data from the user in the phase 201.
  • the BIM data may be provided in various forms. If the BIM data received from the user is not in serial form, it may be first serialized in the phase 202. If the BIM data is received in serial form, there is no need for serializing, and phase 202 may be omitted.
  • the VR walkthrough is automatically generated in the method described in short as phase 203, which method will be described in more detail in relation to following figures 3, 5 and 6.
  • the automatically generated VR walkthrough is stored in the phase 204 to a web server, at which the VR walkthrough is available for the user and may be used for example by a user application running either on the web server or at a user device.
  • Figure 3 illustrates first phases of the method of automatic generation of a virtual walkthrough.
  • the automated method utilizes serialized BIM data.
  • Serialized BIM data (300) is then received as input to the method.
  • Serializing BIM data refers to a process of translating data structures or object state into a format comprising a series of bytes that can be stored or transmitted and reconstructed later.
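As a minimal illustration of that serialization step, a simplified BIM fragment (the data layout below is hypothetical; real BIM/IFC data is far richer) can be turned into a byte sequence and reconstructed with Python's standard json module:

```python
import json

# Hypothetical, heavily simplified BIM fragment.
bim_fragment = {
    "spaces": [
        {"name": "Kitchen", "floor_polygon": [[0, 0], [4, 0], [4, 3], [0, 3]]}
    ]
}

# Translate the data structure into a series of bytes that can be
# stored or transmitted ...
serialized = json.dumps(bim_fragment).encode("utf-8")

# ... and reconstructed later.
restored = json.loads(serialized.decode("utf-8"))
assert restored == bim_fragment
```

In practice BIM tools use their own serialization formats (for example IFC STEP files); JSON is used here only to make the round trip concrete.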
  • a repeating process is then performed to identify a plurality of space elements in the serialized BIM data. Each round of the process starts with finding a space element in the serialized BIM data in the phase 301. If the serialized BIM data comprises IFC data, the found space element may be an IfcSpace element.
  • the process ends in step 307. This may occur when all space elements in the data have been handled in the process, or if no space element is found in the BIM data.
  • the found space element is tested against at least one pre-defined selection rule in the phase 303.
  • a selection rule may define that the plurality of space elements simply comprises all space elements comprised in the serialized BIM data, or the selection rule or a combination of selection rules may cause only part of all space elements to be selected. If the currently identified space element does not fulfill the selection rule(s), the process returns to step 301, and next space element is looked for in the serialized BIM data.
  • Testing the space element against selection rules in the phase 303 and selection of space elements in the phase 304 can be fully automatic, while predefined rules for the selection of space elements may be defined to control the automated process of selecting the space elements. For example, only space elements of a certain type or types of space, such as room space elements, may be selected. Size of the space may be used as a selection rule. For example, space element having a size measure below a first threshold value may be excluded from selection. The size measure and the first threshold value may be defined by area, in particular the floor area, of the space element, so that the first threshold defines a minimum area of a space element to be selected to be subject to next steps of the method.
  • the size measure may comprise volume of the space element, diameter of the space element (100, 101) via a computed middle point of the space or between two opposite corners of the floor of the space element, or the size measure may be defined as a length of at least one side of the floor of the space element (100, 101).
  • more complex selection criteria may be defined as any combination of the above-mentioned size measures and the respective first threshold values.
  • At least one point of interest (104) is calculated automatically for the space element in the phase 305.
  • a geometrical center of the floor of the space element is calculated, and the point of interest (104) is placed at the calculated geometrical center.
  • The point of interest, above which a 360° image probe is to be disposed, is located on the floor of the space element, and a view point is defined at a predefined height above the floor level of the space element so that the 360° image view corresponds to that of an average person actually standing on the floor of the respective space.
  • each point of interest is defined at the respective view point.
  • the point of interest is disposed above the floor level of the space element at a predefined height.
  • the point of interest and the respective view point are located in the same location in the cartesian XYZ-coordinates.
  • The space element information may comprise information on the area of a room represented by the space element. If the size of the space element exceeds a predefined first threshold value, at least one point of interest is calculated for the space element.
  • a plurality of points of interest may be calculated for a single space element. This will be described later in connection with figure 5.
  • Each automatically calculated point of interest (104) has a defined location in the space defined by the space elements that may be defined with three-dimensional coordinates, for example cartesian coordinates. All calculated points of interest and their coordinates are stored, so that they can be later used for example in generation of a navigation mesh.
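The geometrical center of a floor can be computed with the standard shoelace centroid formula. A sketch, assuming the floor boundary is available as a simple 2D polygon (the function name and input layout are hypothetical):

```python
def floor_centroid(polygon):
    """Geometrical center of a simple (non-self-intersecting) 2D polygon,
    given as a list of (x, y) vertices in order."""
    area2 = 0.0          # twice the signed area of the polygon
    cx = cy = 0.0
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    return (cx / (3.0 * area2), cy / (3.0 * area2))

# The centroid of a 4 m x 3 m rectangular room lies in its middle:
# floor_centroid([(0, 0), (4, 0), (4, 3), (0, 3)]) -> (2.0, 1.5)
```

For convex rooms the centroid always lies inside the floor; for strongly concave floor plans an additional containment check would be needed before placing the point of interest there.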
  • the points of interest are stored in a database.
  • For each selected point of interest, a 360° image is rendered in the phase 306.
  • Rendering a 360° image is, as such, known technology.
  • The 360° image associated with a point of interest may be rendered from a view point vertically above the respective point of interest. If the point of interest is readily defined at a convenient view height, the 360° image may be rendered at the point of interest.
  • the height of the view point and/or the height of the point of interest above the floor plane of the space element may be adjusted via setting a parameter value for the automatic calculation algorithm.
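Placing the view point a parameterized height above a point of interest lying on the floor reduces to a simple vertical offset. The default height of 1.6 m below is a hypothetical parameter value, not one given in the description:

```python
def view_point(point_of_interest, eye_height=1.6):
    """View point for the 360° image probe, placed a predefined height
    above a point of interest (x, y, z) located on the floor."""
    x, y, z = point_of_interest
    return (x, y, z + eye_height)

# view_point((2.0, 1.5, 0.0)) -> (2.0, 1.5, 1.6)
```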
  • any IFC data related to physical objects associated with the space element may be utilized for rendering the 360° image.
  • the physical elements, their placement, shape and materials may be utilized for 360° image rendering.
  • the rendered 360° image is stored.
  • the rendered 360° image is stored in a database.
  • The database may be a dedicated image database, or the same database may be used that is used for storing all data related to the virtual reality walkthrough, including but not limited to the points of interest.
  • the location of the rendered 360° image in the database is also stored to enable establishment of a link between the 360° image and the respective point of interest.
  • Some identified space elements may be left without any points of interest if they do not fulfill selection criteria in the phase 303. Further, some space elements that initially fulfill the selection criteria in the phase 303 may fail in calculation of point of interest for various reasons. In such cases, no point of interest is calculated for the space element, and no 360° image is rendered.
  • the process of finding space components in the serialized BIM data is repeated until no more new space elements are found in the serialized BIM data. This is illustrated with the decision box phase 302 in which it is checked whether there are more space elements in the serialized BIM data to be processed. If, based on the checking in the phase 302, there are more space elements found, the process will continue with performing the phase 303. If there are no more space elements in the serialized BIM data, the process stops in the phase 307.
  • a plurality of points of interest (104) may be calculated, each representing a location in one of the plurality of space elements comprised in the serialized BIM data, and a respective 360° image has been rendered and stored for each of the plurality of points of interest.
  • All points of interest (104) may be calculated automatically on the basis of the spaces defined by the BIM data, and a 360° image is automatically rendered for all of them, also using the BIM data. All this can be done on the basis of the serialized BIM data without requiring any manual steps in the process.
  • Various parameters of the BIM data may be utilized in the rendering process. If BIM data comprises IFC data, a non-limiting list of IFC parameters that may be used in the rendering comprises IfcMaterial Properties, IfcLightFixtureType, IfcSite and/or IfcGeometricRepresentationContext. Any method for 360° image rendering known in the art may be utilized.
  • Figure 4 illustrates an example of a space element for which a plurality of points of interest are to be defined.
  • A space element may be selected for defining a plurality of points of interest if it fulfills a size-based selection criterion.
  • size-based selection criterion may comprise a size measure of the space element.
  • A space element is selected for placement of multiple points of interest if the relevant size measure of the space element exceeds a second threshold value.
  • the size measure may comprise any one of volume of the space element, the area (floor area), diameter and length of at least one side of the floor of the space element.
  • A grid (120), illustrated with dash-dot lines, may be used for defining the locations of the plurality of points of interest.
  • Side length of the cells (121) and thus area of a cell of the grid (120) may be freely selected.
  • the side length of the cells (121) is defined by a parameter.
  • The size of each cell (121) of the grid (120) equals the minimum size of a space element that may be selected in the phases 303 and 304 to be included in the process steps for defining a point of interest.
  • the selected space element (101) that fulfills the size-based selection criteria for multiple points of interest is placed in the grid (120).
  • At least one criterion is defined for selecting which cells (121) of the grid are used for defining points of interest.
  • the criterion may comprise a selected size measure of the overlapping section of a cell (121) of the grid and the space element (101). Similar to selection rules defined for selecting space elements, the size measure may comprise for example area, diameter or length of at least one side, and the size measure may be correspondingly compared to a third threshold value.
  • the third threshold value may thus comprise a minimum area, minimum diameter or minimum length of at least one side of the overlapping section of a cell (121) of the grid and the space element (101).
  • The third threshold value is equal to the first threshold value, so that the minimum overlapping section defined by the grid for which a point of interest is to be generated has the same minimum size limit as a space element that is selected for receiving a single point of interest.
  • Further criteria may be defined for example on the basis of walls within the section. For example, if the geometrical center of the cell (121) of the grid (120) falls outside the space element (101), the section may be omitted as a point of interest location. In the exemplary case of figure 4, six cells (121) of the grid with overlapping sections with the space element (101) have been selected on the basis of the selection criteria, and thus six points of interest are defined at the geometrical centers of the cells (121) for this particular space element (101).
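The grid-based placement for large spaces can be sketched as follows. To keep the example self-contained, the room is simplified to an axis-aligned rectangle; real BIM floor shapes would require polygon clipping. All names and default values are hypothetical:

```python
def grid_points_of_interest(room, cell=2.0, min_overlap=1.0):
    """Points of interest for a large space element, one per grid cell
    whose overlap with the room exceeds the third threshold value.

    room        -- floor rectangle (xmin, ymin, xmax, ymax), a deliberate
                   simplification of a real BIM space element shape
    cell        -- side length of a grid cell (a freely selectable parameter)
    min_overlap -- third threshold: minimum overlapping area for a cell
                   to receive a point of interest
    """
    xmin, ymin, xmax, ymax = room
    points = []
    y = ymin
    while y < ymax:
        x = xmin
        while x < xmax:
            # Overlapping section of this grid cell and the room.
            w = min(x + cell, xmax) - x
            h = min(y + cell, ymax) - y
            if w * h >= min_overlap:
                # Point of interest at the geometrical center of the
                # overlapping section.
                points.append((x + w / 2.0, y + h / 2.0))
            x += cell
        y += cell
    return points

# A 4 m x 3 m room with a 2 m grid yields four points of interest:
# grid_points_of_interest((0.0, 0.0, 4.0, 3.0))
#   -> [(1.0, 1.0), (3.0, 1.0), (1.0, 2.5), (3.0, 2.5)]
```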
  • the process also continues to another sub-process as illustrated with the connector "A" (400) in the figure 3.
  • the second sub-process is disclosed in more detail in connection with figure 5.
  • a second part of the method may also be started that defines relationships between found space elements. Phases of the method related to handling relationships of the space elements by openings are illustrated in the figure 5.
  • Each space element may have one or more openings (103) associated with it.
  • one or more openings associated with the space element are automatically found.
  • the opening(s) may be found in the BIM data.
  • an IFC space element may have an IfcOpeningElement defined for it.
  • The algorithm checks whether the opening represents a type of opening that is to be used. Typically, for virtual reality walkthrough purposes, openings that represent doors are selected for further processing, while other types of openings, such as windows or technical inlets intended for example for plumbing, air conditioning or the like, are omitted from further processing. Coordinates of the found opening are obtained in the phase 402. Coordinates may be obtained from BIM data either directly or by calculating a coordinate for the opening. For example, the calculation of the coordinates of the opening may receive as input data known as the Box Geometry of an IFC object, which represents the dimensions of the smallest box bounding the object. The obtained coordinates are preferably stored so that they may later be used.
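Deriving navigation point coordinates from an opening's bounding box, as described above, can be sketched by taking the center of that box. The input layout is a hypothetical simplification of IFC Box Geometry data:

```python
def opening_coordinates(box_min, box_max):
    """Coordinates of an opening, taken as the center of the smallest box
    bounding it (two opposite corners given as (x, y, z) tuples)."""
    return tuple((lo + hi) / 2.0 for lo, hi in zip(box_min, box_max))

# A doorway spanning x = 2..3, y = 0..0.2, z = 0..2.1:
# opening_coordinates((2.0, 0.0, 0.0), (3.0, 0.2, 2.1)) -> (2.5, 0.1, 1.05)
```

The center of the bounding box is only one plausible choice; for a walkthrough, the point could also be dropped to floor height so that navigation buttons appear at a consistent level.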
  • any space elements connected to the opening are automatically found.
  • the opening connects two adjacent space elements.
  • the opening connects two space elements and the opening, more particularly the coordinates of the opening obtained in the phase 402, may thus be used as a navigation point between the two space elements. Openings which do not provide any connection between space elements may be omitted from further processing.
  • Handling of the openings acting as connections between space elements may be facilitated by naming the openings so that the particular opening has a name that indicates the names of the space element and the neighboring space element connected via the opening.
  • data on the openings including their names and coordinates may be stored in a database.
  • a name of the opening may be defined on basis of information found in the respective IfcSpaceElement.
  • the name of the opening may comprise the name of the space element into which the opening leads or names of the adjacent space elements that are connected by the opening.
  • The name of the opening is stored together with its coordinates. Similar to the coordinates of the points of interest, coordinates of the openings may be defined with three-dimensional coordinates, for example cartesian coordinates.
  • The method of finding openings is repeated as long as new openings are found that relate to the space element that is currently being handled. This is illustrated with the decision phase 405, in which it is checked whether there are more openings associated with the current space element. If there are more openings to be found, the process is repeated from the phase 401.
  • Figure 6 illustrates final phases of the method of automatic generation of a virtual walkthrough.
  • the final stages of the automated VR walkthrough generation process receive as input all the points of interest representing the plurality of space elements, coordinates of the points of interest and the respective rendered 360° images, the found openings, coordinates of the openings and information on which space elements are connected via the openings as illustrated by the connector "B" (500).
  • information points may be defined on basis of the BIM data and stored in the phase 501, which information points are to be shown in the appropriate VR view at a place that corresponds to coordinates of an IFC element.
  • Location of an information point may be defined based on location of the respective IFC element.
  • a rule or a ruleset may be defined for selecting which IFC elements are selected for generation of information points.
  • the rule or ruleset may comprise for example type of IFC element.
  • Data related to information points that will be shown to the user is stored in the database together with the coordinates of the information point, preferably cartesian coordinates, and all stored information points are included in the navigation mesh.
  • Information points may for example provide additional information on particular features of an IFC element.
  • an information point may give more technical or commercial details of a particular appliance, more detailed material information, information on the possibility to make individual choices relating to this particular IFC element, and so on.
  • the additional information comprised in the information point may be directly and automatically derived from the respective BIM data.
  • the user may receive the information available at the information point. Examples of the information provided by an information point are various details of household appliances, or surface materials and/or colors of any item.
  • the phase of the method at which information points are defined may vary.
  • an information point may be defined and stored whenever an IFC element fulfilling a selection criterion for defining an information point is detected in the serialized BIM data. The only critical rule is that information points should be defined and stored, and thus available, before the navigation mesh is calculated.
  • the navigation mesh is calculated.
  • the navigation mesh is a mesh network that connects all points of interest, the openings connecting the respective space elements and the information points into a navigation network that enables navigating within the spaces defined by the space elements by moving between points of interest within the spaces.
  • a user of the VR walkthrough may point at any neighboring node of the navigation mesh to move between spaces in the VR walkthrough. For example, by pointing at a via node (a navigation point) residing at an opening in the 360° image of a space, the user may move into a point of interest in an adjacent space connected to the current space via the via node representing the opening.
  • a starting point is defined for the navigation mesh at the phase 503.
  • the starting point refers to a preselected node of the navigation mesh at which the user virtually resides when the VR walkthrough is initiated.
  • the starting point may be, for example, a space representing an entry hall of the apartment.
  • a virtual reality walkthrough data object is generated that comprises all VR walkthrough data found and calculated during the above method phases.
  • the virtual reality walkthrough data object thus combines all data created during the process into a VR walkthrough.
  • the virtual reality walkthrough data object comprises information on the navigation mesh, including points of interest and navigation points, as well as on information points, if such were defined, and links to the rendered 360° images associated with the respective points of interest.
  • the virtual reality walkthrough data object may further have a connection to the BIM data or to an external database, from which information may be obtained on the basis of identifiers, such as a globally unique identifier (GUID).
  • the virtual reality walkthrough data object, which comprises the data needed for running the VR walkthrough, is preferably stored in JavaScript Object Notation (JSON) format.
  • JSON is a lightweight data-interchange format that uses human-readable text to transmit data objects consisting of attribute-value pairs and array data types or any other serializable values, and it is broadly utilized in web services.
  • Figure 7 illustrates features of a navigation mesh on top of a perspective view of an apartment.
  • the navigation mesh preferably comprises a plurality of points of interest (104) and a plurality of navigation points (605) that represent openings connecting spaces.
  • Two of the space elements (101-A, 101-B), representing two rooms of the apartment are marked in the figure.
  • Black two-ended arrows (608) between the points of interest (104) and navigation points (605) illustrate navigation mesh connections between the points of interest and navigation points over which the user of the VR walkthrough may virtually move around in and between the spaces.
  • the figure shows two information points (606).
  • Location of an information point (606) is defined by coordinates, such as cartesian coordinates, so that it may be placed in the navigation mesh and shown in the 360° image; any defined information point may then be pointed at and/or clicked in any 360° image view in which it is visible. Pointing at and/or clicking an information point in the 360° image does not cause movement along the navigation mesh but provides additional information. For example, selecting an information point (606) may open a window in the virtual reality view that presents further information to the user.
  • a view point (610) illustrates a location from which a 360° image has been rendered for the associated point of interest (104).
  • a double-ended arrow (611) illustrates height of the view point above the floor level.
  • Similar view points are defined for all points of interest in the navigation mesh as disclosed above. Navigation between points of interest is possible either via navigation points or directly between points of interest. In particular, if a space includes multiple points of interest, it is possible to move between them directly by selecting, for example by pointing at and/or clicking, another point of interest visualized in the current 360° image.
  • a BIM server (701) may store BIM data (700).
  • BIM data (700) may be stored in any applicable form.
  • the BIM data (700) may comprise serialized BIM data. If the BIM data is not serialized, the BIM server may be configured to serialize BIM data, or the BIM data may be serialized by the VR walkthrough generation apparatus (702).
  • the BIM server is communicatively connected to a VR walkthrough generation apparatus (702), which is configured to perform the steps of the method of automatically creating a VR walkthrough from the serialized BIM data as disclosed above.
  • the VR walkthrough generation apparatus (702) may be implemented in a physical or virtual server or a network of physical or virtual servers running computer software that performs the method steps.
  • the VR walkthrough generation apparatus (702) has a defined application programming interface (API) towards a web server (703).
  • the virtual reality walkthrough data object may be provided to the web server over the API.
  • the virtual reality walkthrough data object is a JSON data object and it comprises the navigation mesh and all the 360° images of the VR walkthrough, or links to the 360° images.
  • a client application (704) residing at a client computer capable of reading JSON data objects and representing 360° images in one way or another may then contact the web server (703) for running the VR walkthrough.
  • the client application may be a browser application that shows a 2D view of the 360° image at its display.
  • the client application may be a virtual reality application that is capable of showing 360° images as 3D images for example via a 3D display.
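The shape of the virtual reality walkthrough data object described above can be sketched as a JSON document. This is a minimal illustration only, under assumed naming: the field names, node identifiers and image paths below are invented for the sketch and are not the data model defined in this application.

```python
import json

# Illustrative sketch of a VR walkthrough data object. All field names,
# identifiers and image paths are assumptions, not the patented format.
walkthrough = {
    "startingPoint": "poi-hall",                        # preselected start node
    "navigationMesh": {
        "nodes": [
            {"id": "poi-hall", "type": "pointOfInterest",
             "coordinates": [2.0, 1.5, 0.0],
             "image": "renders/poi-hall.jpg"},          # link to a 360° image
            {"id": "nav-hall-kitchen", "type": "navigationPoint",
             "coordinates": [3.5, 1.5, 0.0]},           # placed at an opening
            {"id": "poi-kitchen", "type": "pointOfInterest",
             "coordinates": [5.0, 1.5, 0.0],
             "image": "renders/poi-kitchen.jpg"},
            {"id": "info-oven", "type": "informationPoint",
             "coordinates": [5.5, 0.5, 1.0]},           # extra BIM-derived details
        ],
        "edges": [["poi-hall", "nav-hall-kitchen"],
                  ["nav-hall-kitchen", "poi-kitchen"]],
    },
}

serialized = json.dumps(walkthrough)   # human-readable attribute-value pairs
restored = json.loads(serialized)      # a client application can parse it back
```

A browser or VR client application would fetch such an object from the web server and move along the edges of the navigation mesh between the listed nodes.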

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method, a system and a computer program product for automatic generation of a virtual reality walkthrough based on building information model data. A plurality of space elements is found among serialized BIM data. A plurality of points of interest are automatically selected, each point of interest representing a location in one of the plurality of space elements. A viewpoint is calculated for each point of interest and a 360-degree image is rendered from the viewpoint that is associated with a respective one of the plurality of points of interest. One or more openings associated with each of the plurality of space elements are located, and adjacent space elements that are connected to the respective space element through the openings are identified. A navigation point is placed at each opening connecting adjacent space elements. A navigation mesh comprising the plurality of points of interest and the plurality of navigation points is calculated. A virtual reality walkthrough data object is combined, the data object comprising the navigation mesh and links to the 360° images, and the data object is stored for future use.

Description

Automatic generation of a virtual reality walkthrough
Field
The present invention relates to a method, a system and a computer program product related to virtual reality. More particularly, the present invention relates to automated generation of a virtual reality walkthrough based on building information model data.
Background
Building Information Modeling (BIM), also referred to as Building Information Model, is a digital representation of physical and functional characteristics of a facility. A BIM is a shared knowledge resource for information about a facility forming a reliable basis for decisions during its life-cycle.
A BIM involves representing a design as combinations of objects that carry their geometry, relations and attributes. However, BIM goes beyond planning and design, extending throughout the building life cycle.
One standardized and commonly used way to describe building and construction data in BIM is to use Industry Foundation Classes (IFC). It is a platform neutral, open file format specification that is not controlled by a single vendor or group of vendors. IFC defines an entity-relationship model of entities organized into an object-based inheritance hierarchy. Examples of entities include building elements, geometry and basic constructs.
IfcProduct is an IFC base class that represents occurrences in space, such as physical building elements and spatial locations. IFC relationships (IfcRelationship) capture relationships among objects. IFC relationships may indicate a whole-part relationship having exclusive containment, such as subdividing a building into floors and rooms.
IFC product (IfcProduct) is a base class for any physical objects, and may be subdivided into spatial elements, physical elements, structural analysis items and other concepts. Products may have associated materials, shape representations, and placement in space. Spatial elements comprise for example sites (IfcSite), buildings (IfcBuilding), building storeys (IfcBuildingStorey) and spaces (IfcSpace) such as apartments or rooms. Physical building elements comprise for example walls (IfcWall), beams (IfcBeam), doors (IfcDoor), windows (IfcWindow), stairs (IfcStair) and so on. Various connectivity relationships may be used for IFC products, such as walls having openings filled by doors or windows. IFC products may also have placement, which can be defined locally (IfcLocalPlacement) in relation to an enclosing element, or in a grid (IfcGridPlacement) relative to a grid with user-defined axes. Physical elements may have defined materials, with optional properties such as mechanical and thermal properties, and styles such as colors and textures.
Other examples of BIM standards are for example various national standards such as the CIS/2 standard by the American Institute of Steel Construction, BuildingSmart Hong Kong, Virtual Design and Construction (VDC) in India, and a variety of national BIM implementations such as ONORM A6241 in Austria and the building transition digital plan (PTNB) in France, to name a few. National and regional BIM promoting authorities and associations have been established. Green Building XML (gbXML) is an emerging schema focused on green building design and operation.
A virtual walkthrough, also referred to as a virtual tour, is a simulation of a location or facility, usually composed of a sequence of videos or still images. A virtual walkthrough enables for example a buyer of a house or a flat to virtually experience the property even if it only exists on a drawing table or is still in early phase of construction. A modern version of a virtual walkthrough comprises 360° images, which enable the viewer to explore the facility without being at the location, for example online.
Description of the related art
Patent application US 20160300392 A1 discloses systems, media, and methods for providing virtual reality tours. A user selects one or more vantage points on a 2D floorplan, and the system generates a virtual reality tour based on the 2D floorplan, vantage point coordinates and associated virtual reality content.
Even if prior art systems are capable of creating virtual walkthroughs, the process of generating the virtual walkthrough requires manual work, which makes the process laborious, slow and expensive.
Summary
An object is to provide a method and apparatus so as to solve the problem of automating creation of virtual walkthroughs. The objects of the present invention are achieved with a method according to the characterizing portion of claim 1. The objects of the present invention are further achieved with a computer program product according to the characterizing portion of claim 10, a computer readable medium according to the characterizing portion of claim 11 and a virtual reality walkthrough generation system according to the characterizing portion of claim 12. The preferred embodiments of the invention are disclosed in the dependent claims. The present invention is based on the idea of automating calculation of navigation points needed for the virtual walkthrough and 360° images needed for generation of the virtual walkthrough.
The present invention has the advantage that the automatic virtual walkthrough generation is fully automated, which allows providing the entire generation of a virtual walkthrough for example as a web service. The customer simply uploads relevant BIM data for the service, and in response receives an automatically generated full virtual walkthrough.
According to a first aspect, a computer-implemented method for automatically generating a virtual reality walkthrough based on Building Information Model (BIM) data is provided. The method comprises obtaining serialized BIM data, finding a plurality of BIM space elements among the serialized BIM data and automatically selecting a plurality of points of interest, wherein each point of interest represents a location in one of the plurality of BIM space elements. The method also comprises calculating a viewpoint for each point of interest, rendering a 360° image from each of the viewpoints and associating the rendered 360° image with the respective one of the plurality of points of interest. The method also comprises locating one or more openings associated with each of the plurality of BIM space elements, identifying adjacent BIM space elements that are connected to the respective BIM space element through the openings and placing a navigation point at each opening connecting adjacent BIM space elements. A navigation mesh is calculated, the navigation mesh comprising the plurality of points of interest and the plurality of navigation points placed at the locations of the openings connecting BIM space elements. A virtual reality walkthrough data object is combined for the plurality of BIM space elements, the virtual reality walkthrough data object comprising the navigation mesh and any one of the 360° images and links to the 360° images. The virtual reality walkthrough data object is stored.
According to a second aspect, the selecting of the points of interest for a BIM space element comprises, if a size measure of the BIM space element is above a first threshold value but below a second threshold value, calculating a geometrical center of a floor of the BIM space element, and placing the respective point of interest at the calculated geometrical center of the respective BIM space element; if the size measure of the BIM space element is above the second threshold value, defining a grid with a predefined cell size, calculating a geometrical center for a plurality of cells of the grid overlapping the BIM space element, and placing a plurality of points of interest at the calculated geometrical centers of the overlapping cells; and if the size measure of the BIM space element is below the first threshold value, not placing any point of interest. According to a third aspect, a point of interest is placed at a cell if a size measure of an overlapping section of the BIM space element and the respective cell of the grid exceeds a third threshold value.
According to a fourth aspect, the third threshold value is equal to the first threshold value. According to a fifth aspect, the size measure is any one of volume, area, diameter and length of at least one side of an area.
According to a sixth aspect, the rendering the 360° image comprises placing a 360° image probe at a predetermined height above the point of interest, and calculating at least one of a 2D 360° image and a 3D 360° image on the basis of BIM data of physical building elements connected to the BIM space element associated with the point of interest. According to a seventh aspect, the locating one or more openings comprises identifying a BIM product associated with the respective BIM space element that has an opening associated with it and obtaining coordinates of the opening connecting BIM space elements from BIM data. According to an eighth aspect, the method further comprises identifying a plurality of information points based on selected BIM elements comprised in the serialized BIM data, associating each information point with a location in the space, including the information points in the navigation mesh, and storing the plurality of information points in the virtual reality walkthrough data object.
According to another aspect, a computer program product for automatically generating a virtual reality walkthrough from Building Information Model (BIM) data is provided, wherein the computer program product is configured to perform the computer-implemented method steps according to any one of the first to eighth aspect.
According to a further aspect, a computer readable medium is provided, having stored thereon a computer program product according to the above aspect.
According to yet another aspect, a virtual reality walkthrough generation system is provided, which generates a virtual reality walkthrough automatically based on Building Information Model (BIM) data by carrying out the computer-implemented method according to any one of the first to eighth aspect.
According to another aspect, a computer program product having instructions which, when executed by a computing device or system, cause the computing device or system to perform a method according to any one of the first to eighth aspect.
Brief description of the drawings
In the following the invention will be described in greater detail, in connection with preferred embodiments, with reference to the attached drawings, in which
Figure 1 shows an exemplary BIM space element diagram of an apartment as a 2D floor plan.
Figure 2 illustrates a high-level flow chart of an automatic virtual reality (VR) walkthrough generation service.
Figure 3 illustrates first phases of the method of automatic generation of a virtual walkthrough.
Figure 4 illustrates an example of a BIM space element for which a plurality of points of interest are to be defined.
Figure 5 is a flow chart illustrating phases of a method related to handling relationships of the BIM space elements via openings.
Figure 6 illustrates phases of the method of automatic generation of a virtual walkthrough.
Figure 7 illustrates features of a navigation mesh on top of a perspective view of a flat.
Figure 8 illustrates a system, in particular a computer system.
Detailed description
A navigation point is a point or node which the user can point at in order to navigate to a node of a navigation mesh. A navigation point also refers to coordinates of a navigation button to a neighboring point of interest that is connected via the navigation mesh. Coordinates of the navigation points may be automatically calculated from IFC data. A navigation mesh is a network that declares all possible navigation routes. A navigation mesh may comprise navigation points, points of interest and information points. A navigation route refers to a virtual walkthrough path from node to node. An information point refers to an IFC element about which information is to be shown, and also to the coordinates of the information button.
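As a rough illustration of these definitions, a navigation mesh can be modeled as an undirected graph whose nodes are points of interest and navigation points, and a navigation route as a node-to-node path through that graph. The node names and the breadth-first search below are an invented sketch, not taken from this application.

```python
from collections import deque

# Illustrative navigation mesh: adjacency lists keyed by invented node names.
# "poi_*" nodes are points of interest, "nav_*" nodes sit at openings.
mesh = {
    "poi_hall":         ["nav_hall_kitchen", "nav_hall_bedroom"],
    "nav_hall_kitchen": ["poi_hall", "poi_kitchen"],
    "poi_kitchen":      ["nav_hall_kitchen"],
    "nav_hall_bedroom": ["poi_hall", "poi_bedroom"],
    "poi_bedroom":      ["nav_hall_bedroom"],
}

def navigation_route(mesh, start, goal):
    """Breadth-first search for a node-to-node navigation route."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbor in mesh[path[-1]]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route: the spaces are not connected in the mesh

route = navigation_route(mesh, "poi_kitchen", "poi_bedroom")
```

Pointing at a neighboring node in the VR view would correspond to moving one step along such a route.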
Both terms space element and BIM space element refer to a space element defined in any type of BIM data. For example, a space element or a BIM space element may be an IFC Space Element (IfcSpace). A BIM space element may refer for example to an apartment or a room. The terms space element and BIM space element may be used interchangeably.
Figure 1 shows an exemplary, simplified space element diagram of an apartment shown as a 2D floor plan. The exemplary space comprises an apartment space element (100) that comprises three room space elements (101). The space elements (100, 101) may be associated with walls that have openings (103). When IFC is used, the space elements (100, 101) are spatial elements, and they are associated with physical elements such as walls, beams, doors, windows, stairs and so on. The openings define connectivity relationships between space elements (100, 101), and represent for example doorways, doors and/or windows. Some of the openings are associated with two adjacent room space elements, thus connecting the rooms, while others may connect a room towards the outside. At least one point of interest (104) may be defined for at least some of the space elements (100, 101). The point of interest (104) represents coordinates of a navigation point in the space element. Preferably, the point of interest is disposed on the floor of the respective space element. If the point of interest is on the floor, a view point is defined above the point of interest (104), at which a 360° image of the respective space is to be rendered. The rendered 360° image is, as the name suggests, a 360-degree rendered image from the view point associated with the respective point of interest, which is wrapped around a sphere to enable the viewer to virtually look in any direction from the view point. The location of a point of interest may be expressed in cartesian (XYZ) coordinates. The x- and y-coordinates of the point of interest, which correspond to the x- and y-coordinates of the respective view point, are preferably defined so that the point of interest (104) resides within the space element (101). The z-coordinate, in other words the vertical height, of the point of interest may be defined so that it resides at the floor level of the space element.
In such case, the associated view point is preferably placed at a predefined height above the point of interest. Alternatively, the point of interest (104) may be defined to be disposed at a predefined height, in other words at a vertical distance above the floor level of the space element (101). In the latter case, the point of interest (104) and the respective associated view point may be collocated. Both the point of interest and the associated view point should be located within the volume of the space element.
Figure 2 illustrates a high-level flow chart of a service that is enabled by the disclosed method of automatic generation of a virtual reality (VR) walkthrough. The service may be entirely web based, in which case a user uploads BIM data to the service and, in return, receives an automatically generated VR walkthrough, which may be provided as a single data object. The VR walkthrough may also comprise a plurality of data objects that also comprise 360° images and for example audiovisual elements. Such additional elements may be originally linked to BIM data, and these may also become linked or included in the VR walkthrough. The service may begin with receiving BIM data from the user in the phase 201. The BIM data may be provided in various forms. If the BIM data received from the user is not in serial form, it may be first serialized in the phase 202. If the BIM data is received in serial form, there is no need for serializing, and phase 202 may be omitted.
Based on the serialized BIM data, the VR walkthrough is automatically generated in the method described in short as phase 203, which method will be described in more detail in relation to following figures 3, 5 and 6.
The automatically generated VR walkthrough is stored in the phase 204 to a web server, at which the VR walkthrough is available for the user and may be used for example by a user application running either on the web server or at a user device.
Figure 3 illustrates first phases of the method of automatic generation of a virtual walkthrough. The automated method utilizes serialized BIM data. Thus, if the BIM data is not serial to begin with, it needs to be serialized first. Serialized BIM data (300) is then received as input to the method. Serializing BIM data refers to a process of translating data structures or object state into a format comprising a series of bytes that can be stored or transmitted and reconstructed later. A repeating process is then performed to identify a plurality of space elements in the serialized BIM data. Each round of the process starts with finding a space element in the serialized BIM data in the phase 301. If the serialized BIM data comprises IFC data, the found space element may be an IfcSpace element. In the phase 302 it is tested whether a new space element was identified. If not, the process ends in the phase 307. This may occur when all space elements in the data have been handled in the process, or if no space element is found in the BIM data. After a space element has been found in the phases 301 and 302, the found space element is tested against at least one pre-defined selection rule in the phase 303. A selection rule may define that the plurality of space elements simply comprises all space elements comprised in the serialized BIM data, or the selection rule or a combination of selection rules may cause only part of all space elements to be selected. If the currently identified space element does not fulfill the selection rule(s), the process returns to the phase 301, and the next space element is looked for in the serialized BIM data. Testing the space element against selection rules in the phase 303 and selection of space elements in the phase 304 can be fully automatic, while predefined rules for the selection of space elements may be defined to control the automated process of selecting the space elements.
For example, only space elements of a certain type or types of space, such as room space elements, may be selected. Size of the space may be used as a selection rule. For example, space elements having a size measure below a first threshold value may be excluded from selection. The size measure and the first threshold value may be defined by area, in particular the floor area, of the space element, so that the first threshold defines a minimum area of a space element to be selected to be subject to next steps of the method. Using area as a selection parameter facilitates a simple and straightforward selection criterion, since area, typically floor area, of the space element is a parameter regularly included in the BIM data for each space element. Alternatively, the size measure may comprise volume of the space element, diameter of the space element (100, 101) via a computed middle point of the space or between two opposite corners of the floor of the space element, or the size measure may be defined as a length of at least one side of the floor of the space element (100, 101). Further, more complex selection criteria may be defined as any combination of the above-mentioned size measures and the respective first threshold values.
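The selection phases 303-304 can be sketched as a filter over the serialized space elements. This is a minimal sketch under assumed representations: the dictionary structure and the 4.0 m² first threshold are invented for illustration.

```python
# Illustrative first threshold (minimum floor area in m²); an assumption,
# not a value given in the application.
FIRST_THRESHOLD_M2 = 4.0

def select_space_elements(elements, min_area=FIRST_THRESHOLD_M2):
    """Keep IfcSpace-type elements whose floor area meets the first threshold."""
    selected = []
    for element in elements:
        if element.get("type") != "IfcSpace":
            continue                     # not a space element at all
        if element.get("floor_area", 0.0) < min_area:
            continue                     # too small: excluded from selection
        selected.append(element)
    return selected

elements = [
    {"type": "IfcSpace", "name": "Living room", "floor_area": 22.5},
    {"type": "IfcSpace", "name": "Closet", "floor_area": 1.2},   # excluded
    {"type": "IfcWall",  "name": "Wall-12"},                     # not a space
]
rooms = select_space_elements(elements)
```

More complex rules, such as combinations of size measures, would add further conditions inside the loop in the same way.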
At least one point of interest (104) is calculated automatically for the space element in the phase 305. In one embodiment, a geometrical center of the floor of the space element is calculated, and the point of interest (104) is placed at the calculated geometrical center. Preferably, the point of interest, above which a 360° image probe is to be disposed, is located on the floor of the space element, and a view point is defined at a predefined height above the floor level of the space element so that the 360° image view corresponds to that of an average person actually standing on the floor of the respective space.
In an alternative embodiment, each point of interest is defined at the respective view point. In other words, the point of interest is disposed above the floor level of the space element at a predefined height. In such case, the point of interest and the respective view point are located in the same location in the cartesian XYZ-coordinates.
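The geometric-center placement of phase 305 can be illustrated with the standard shoelace centroid of the floor polygon. The rectangular floor in the example is invented; the application does not prescribe this particular formula.

```python
def floor_centroid(vertices):
    """Centroid of a simple 2D polygon given counter-clockwise (x, y) vertices,
    computed with the shoelace formula."""
    area2 = 0.0          # twice the signed area
    cx = cy = 0.0
    n = len(vertices)
    for i in range(n):
        x0, y0 = vertices[i]
        x1, y1 = vertices[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    return (cx / (3.0 * area2), cy / (3.0 * area2))

# A 4 m x 3 m rectangular room floor: the point of interest would be placed
# at the centroid (2.0, 1.5), with z at the floor level.
center = floor_centroid([(0, 0), (4, 0), (4, 3), (0, 3)])
```

For non-convex rooms the centroid may fall outside the floor polygon, which is one practical reason a real implementation might need additional checks before placing the point of interest.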
Various parameters may be used for guiding the automatic point of interest calculation into the wanted end result. For example, the space element information may comprise information on the area of a room represented by the space element. If the size of the space element exceeds a predefined first threshold value, at least one point of interest is calculated for the space element.
In another embodiment, a plurality of points of interest may be calculated for a single space element. This will be described later in connection with figure 4. Each automatically calculated point of interest (104) has a location in the space defined by the space elements, which may be expressed with three-dimensional coordinates, for example cartesian coordinates. All calculated points of interest and their coordinates are stored, so that they can later be used for example in generation of a navigation mesh. Preferably, the points of interest are stored in a database.
After calculating a point of interest (104) in the phase 305, a 360° image is rendered for it in the phase 306. Rendering a 360° image is, as such, known technology. To ensure a convenient and natural height for the view point, the 360° image associated with a point of interest may be rendered from a view point vertically above the respective point of interest. If the point of interest is readily defined at a convenient view height, the 360° image may be rendered at the point of interest. The height of the view point and/or the height of the point of interest above the floor plane of the space element may be adjusted via setting a parameter value for the automatic calculation algorithm.
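The view-point placement described above can be sketched as follows. The 1.6 m default eye height is an illustrative parameter value assumed for the sketch, not one given in the text.

```python
from dataclasses import dataclass

@dataclass
class PointOfInterest:
    x: float  # cartesian coordinates within the space element
    y: float
    z: float  # here assumed to be at the floor level of the space element

def view_point(poi: PointOfInterest, eye_height: float = 1.6):
    """View point for 360° rendering: vertically above the point of interest,
    at a parameterized height resembling an average person's eye level."""
    return (poi.x, poi.y, poi.z + eye_height)

poi = PointOfInterest(x=2.4, y=3.1, z=0.0)
vp = view_point(poi)
```

Changing the `eye_height` parameter corresponds to the adjustable-height parameter of the automatic calculation algorithm mentioned above.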
Any IFC data related to physical objects associated with the space element may be utilized for rendering the 360° image. In particular, the physical elements, their placement, shape and materials may be utilized for 360° image rendering. The rendered 360° image is stored. Preferably, the rendered 360° image is stored in a database. The database may be a dedicated image database, or the same database may be used that is used for storing all data related to the virtual reality walkthrough, including but not limited to the points of interest. The location of the rendered 360° image in the database is also stored to enable establishment of a link between the 360° image and the respective point of interest. Some identified space elements may be left without any points of interest if they do not fulfill the selection criteria in the phase 303. Further, some space elements that initially fulfill the selection criteria in the phase 303 may fail in calculation of the point of interest for various reasons. In such cases, no point of interest is calculated for the space element, and no 360° image is rendered.
The process of finding space elements in the serialized BIM data is repeated until no more new space elements are found. This is illustrated with the decision box phase 302, in which it is checked whether there are more space elements in the serialized BIM data to be processed. If, based on the checking in the phase 302, more space elements are found, the process continues with the phase 303. If there are no more space elements in the serialized BIM data, the process stops in the phase 307. As a result of this part of the method, a plurality of points of interest (104) may be calculated, each representing a location in one of the plurality of space elements comprised in the serialized BIM data, and a respective 360° image has been rendered and stored for each of the plurality of points of interest. All points of interest (104) may be calculated automatically on the basis of the spaces defined by the BIM data, and a 360° image is automatically rendered for all of them, also using the BIM data. All this can be done on the basis of the serialized BIM data without requiring any manual steps in the process. Various parameters of the BIM data may be utilized in the rendering process. If the BIM data comprises IFC data, a non-limiting list of IFC parameters that may be used in the rendering comprises IfcMaterialProperties, IfcLightFixtureType, IfcSite and/or IfcGeometricRepresentationContext. Any method for 360° image rendering known in the art may be utilized.
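The per-space-element loop of phases 302 to 306 can be sketched as follows. This is a minimal illustration, assuming each space element is reduced to an axis-aligned floor rectangle; the names (`SpaceElement`, `point_of_interest`, `view_point`) and the concrete threshold and view-height values are illustrative assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

FIRST_THRESHOLD_M2 = 1.5   # assumed minimum floor area for a point of interest
VIEW_HEIGHT_M = 1.6        # assumed view-point height above the point of interest

@dataclass
class SpaceElement:
    name: str
    x: float; y: float      # corner of the floor rectangle
    w: float; d: float      # width and depth of the floor rectangle
    z: float = 0.0          # floor level

def point_of_interest(space: SpaceElement):
    """Return a POI at the geometric center of the floor, or None
    if the space is too small to receive one (phase 303)."""
    if space.w * space.d <= FIRST_THRESHOLD_M2:
        return None
    return (space.x + space.w / 2, space.y + space.d / 2, space.z)

def view_point(poi):
    """View point for 360-degree rendering, vertically above the POI (phase 306)."""
    x, y, z = poi
    return (x, y, z + VIEW_HEIGHT_M)

spaces = [SpaceElement("closet", 0, 0, 1.0, 1.0),
          SpaceElement("bedroom", 2, 0, 3.0, 4.0)]
# the closet is filtered out; the bedroom gets a POI at its floor center
pois = {s.name: point_of_interest(s) for s in spaces}
```

A real implementation would read the space geometry from the serialized IFC data rather than from hard-coded rectangles, and would hand each view point to the 360° renderer.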
Figure 4 illustrates an example of a space element for which a plurality of points of interest are to be defined. A space element may be selected for defining a plurality of points of interest if it fulfills a size-based selection criterion. Such a size-based selection criterion may comprise a size measure of the space element: a space element is selected for placement of multiple points of interest if the relevant size measure of the space element exceeds a second threshold value. The size measure may comprise any one of the volume of the space element, the floor area, the diameter, and the length of at least one side of the floor of the space element. If the space element (101) is selected for placing a plurality of points of interest, a grid (120), illustrated with dash-dot lines, may be used for defining the locations of the plurality of points of interest. The side length of the cells (121), and thus the area of a cell of the grid (120), may be freely selected. Preferably, the side length of the cells (121) is defined by a parameter. In one embodiment, the size of each cell (121) of the grid (120) equals the minimum size of a space element that may be selected in the phases 303 and 304 to be included in the process steps for defining a point of interest.
The selected space element (101) that fulfills the size-based selection criterion for multiple points of interest is placed in the grid (120). At least one criterion is defined for selecting which cells (121) of the grid are used for defining points of interest. The criterion may comprise a selected size measure of the overlapping section of a cell (121) of the grid and the space element (101). Similar to the selection rules defined for selecting space elements, the size measure may comprise for example area, diameter or length of at least one side, and the size measure may be correspondingly compared to a third threshold value. The third threshold value may thus comprise a minimum area, minimum diameter or minimum length of at least one side of the overlapping section of a cell (121) of the grid and the space element (101). In one embodiment, the third threshold value is equal to the first threshold value, so that the minimum size of a section of a space element for which a point of interest is to be generated equals the minimum size of a space element that is selected for receiving a single point of interest. Further criteria may be defined, for example on the basis of walls within the section. For example, if the geometrical center of a cell (121) of the grid (120) falls outside the space element (101), the cell may be omitted from point of interest placement. In the exemplary case of figure 4, six cells (121) of the grid with overlapping sections with the space element (101) have been selected on the basis of the selection criteria, and thus six points of interest are defined at the geometrical centers of these cells (121) for this particular space element (101). Consequently, six 360° images will also be rendered for this space element (101), each associated with the respective one of the plurality of points of interest (104).
The other cells (121) have been omitted, because the section of the space element (101) within the cell is too small, or because the geometrical center of the cell (121) falls outside the respective section of the space element (101). Each point of interest is stored with its coordinates, together with the associated 360° image.
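The grid-based placement of figure 4 can be sketched as follows: the floor of the space element is covered by a square grid, and a point of interest is placed at the center of each cell whose overlap with the space exceeds a threshold and whose center does not fall outside the space. The cell side length, the threshold value, and the reduction of the floor to an axis-aligned rectangle are illustrative assumptions.

```python
import math

CELL_SIDE = 2.0            # assumed grid cell side length in meters
THIRD_THRESHOLD_M2 = 1.5   # assumed minimum overlap area per cell

def grid_points_of_interest(floor, cell=CELL_SIDE, min_overlap=THIRD_THRESHOLD_M2):
    """floor is an axis-aligned rectangle (x, y, w, d); returns the 2D
    centers of the grid cells selected as point of interest locations."""
    x0, y0, w, d = floor
    pois = []
    for i in range(math.ceil(w / cell)):
        for j in range(math.ceil(d / cell)):
            cx0, cy0 = x0 + i * cell, y0 + j * cell
            # overlap of this cell with the floor rectangle
            ow = min(cx0 + cell, x0 + w) - cx0
            od = min(cy0 + cell, y0 + d) - cy0
            center = (cx0 + cell / 2, cy0 + cell / 2)
            # criteria: enough overlap, and the cell center not outside the floor
            if ow * od >= min_overlap and center[0] <= x0 + w and center[1] <= y0 + d:
                pois.append(center)
    return pois

# a 5 m x 4 m room covered by 2 m cells → 6 points of interest,
# matching the six selected cells of the figure 4 example
print(grid_points_of_interest((0, 0, 5, 4)))
```

For non-rectangular spaces the overlap and center-inside tests would be evaluated against the actual floor polygon from the BIM data instead of a bounding rectangle.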
When a space element has been found in the phase 301, the process also continues to another sub-process, as illustrated with the connector "A" (400) in the figure 3. This second sub-process, which defines relationships between found space elements, is disclosed in more detail in connection with figure 5. Phases of the method related to handling relationships of the space elements via openings are illustrated in the figure 5. Each space element may have one or more openings (103) associated with it. In the phase 401, one or more openings associated with the space element are automatically found. The opening(s) may be found in the BIM data. For example, an IFC space element may have an IfcOpeningElement defined for it. If an opening (103) is found, the algorithm checks whether the opening represents a type of opening that is to be used. Typically, for virtual reality walkthrough purposes, openings that represent doors are selected for further processing, while other types of openings, such as windows or technical inlets intended for example for plumbing, air conditioning or the like, are omitted from further processing. Coordinates of the found opening are obtained in the phase 402. Coordinates may be obtained from the BIM data either directly or by calculating coordinates for the opening. For example, the calculation of the coordinates of the opening may receive as input the data known as the Box Geometry of an IFC object, which represents the dimensions of the smallest box bounding the object. The obtained coordinates are preferably stored so that they may later be used.
In the phase 403, any space elements connected to the opening are automatically found. When two space elements are connected to the same opening, the opening connects two adjacent space elements, and the opening, more particularly the coordinates of the opening obtained in the phase 402, may thus be used as a navigation point between the two space elements. Openings which do not provide any connection between space elements may be omitted from further processing.
Handling of the openings acting as connections between space elements may be facilitated by naming the openings, so that each opening has a name that indicates the names of the space element and the neighboring space element connected via the opening. Like any other data obtained during the process, data on the openings, including their names and coordinates, may be stored in a database. For example, in BIM data according to the IFC standard, a name of the opening may be defined on the basis of information found in the respective IfcSpaceElement. For example, the name of the opening may comprise the name of the space element into which the opening leads, or the names of the adjacent space elements that are connected by the opening. The name of the opening is stored together with its coordinates. Similar to the coordinates of the points of interest, coordinates of the openings may be defined with three-dimensional coordinates, for example Cartesian coordinates.
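The naming and filtering of openings in phases 401 to 404 can be sketched as follows. The input tuples and the `"a<->b"` naming convention are illustrative assumptions; the patent only requires that the name indicates the connected space elements.

```python
def navigation_points(openings):
    """openings: iterable of (space_a, space_b, (x, y, z)) tuples, where a
    missing neighbor is None. Returns a dict mapping an opening name that
    encodes both adjacent space names to the opening's coordinates.
    Openings that do not connect two space elements are omitted."""
    points = {}
    for space_a, space_b, coords in openings:
        if space_a is None or space_b is None:
            continue  # e.g. a window to the outside: no connection, omitted
        name = f"{space_a}<->{space_b}"
        points[name] = coords
    return points

found = [("hall", "bedroom", (2.0, 1.5, 1.0)),   # a door: kept
         ("bedroom", None, (4.0, 3.0, 1.2))]     # a window: omitted
print(navigation_points(found))
```

In a full implementation the coordinates would come from the opening's IFC Box Geometry and the adjacent space names from the IFC relationships, as described above.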
The method of finding openings is repeated as long as new openings are found that relate to the space element that is currently being handled. This is illustrated with the decision phase 405, in which it is checked whether there are more openings associated with the current space element. If there are more openings to be found, the process is repeated from the phase 401.
When all space elements fulfilling the selection rules have been identified, 360° images for them have been rendered and all openings connecting the space elements have been found and stored, the method may enter into a third phase as illustrated with connector "B" (500). Figure 6 illustrates final phases of the method of automatic generation of a virtual walkthrough.
The final stages of the automated VR walkthrough generation process receive as input all the points of interest representing the plurality of space elements, coordinates of the points of interest and the respective rendered 360° images, the found openings, coordinates of the openings and information on which space elements are connected via the openings as illustrated by the connector "B" (500).
In addition to points of interest and navigation points, information points may be defined on the basis of the BIM data and stored in the phase 501, which information points are to be shown in the appropriate VR view at a place that corresponds to the coordinates of an IFC element. The location of an information point may be defined based on the location of the respective IFC element. A rule or a ruleset may be defined for selecting which IFC elements are selected for the generation of information points. The rule or ruleset may comprise, for example, the type of the IFC element. Data related to information points that will be shown to the user is stored in the database together with the coordinates of the information point, preferably Cartesian coordinates, and all stored information points are included in the navigation mesh. Information points may, for example, provide additional information on particular features of an IFC element, such as more technical or commercial details of a particular appliance, more detailed material information, or information on the possibility of making individual choices relating to this particular IFC element. The additional information comprised in the information point may be directly and automatically derived from the respective BIM data. By pointing at an information point shown at a particular location in the 360° view of the space, the user receives the information available at the information point. Examples of the information provided by an information point are various details of household appliances, or surface materials and/or colors of any item. The phase of the method at which information points are defined may vary. For example, instead of the disclosed phase just before calculating the navigation mesh, an information point may be defined and stored whenever an IFC element fulfilling a selection criterion for defining an information point is detected in the serialized BIM data.
The only critical rule for defining information points is that they should be defined and stored before the navigation mesh is calculated, so that they are available for inclusion in the navigation mesh.
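A type-based ruleset for selecting IFC elements as information points (phase 501) could be sketched as follows. The ruleset contents, the record layout, and the function name are illustrative assumptions; an actual ruleset would be configured per project.

```python
# assumed IFC element types that should receive an information point
INFO_POINT_TYPES = {"IfcFlowTerminal", "IfcFurnishingElement"}

def information_points(ifc_elements, ruleset=INFO_POINT_TYPES):
    """ifc_elements: iterable of (ifc_type, coords, description) tuples.
    Returns the (coords, description) pairs whose IFC type matches the
    ruleset; the coordinates place the point in the navigation mesh."""
    return [(coords, desc)
            for ifc_type, coords, desc in ifc_elements
            if ifc_type in ruleset]

elements = [("IfcFlowTerminal", (0.5, 0.5, 1.2), "Kitchen sink"),
            ("IfcWall", (1.0, 0.0, 1.5), "Load-bearing wall")]
print(information_points(elements))  # only the sink becomes an information point
```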
In the phase 502, the navigation mesh is calculated. The navigation mesh is a mesh network that connects all points of interest, the openings connecting the respective space elements, and the information points into a navigation network that enables navigating within the spaces defined by the space elements by moving between the points of interest. When at a specific navigation point of the navigation mesh, a user of the VR walkthrough may point at any neighboring node of the navigation mesh to move between spaces in the VR walkthrough. For example, by pointing at a via node (a navigation point) residing at an opening in the 360° image of a space, the user is virtually moved to a point of interest in the adjacent space connected to the current space via the via node representing the opening, and is shown the respective 360° image associated with that point of interest. When a space has multiple points of interest defined in it, the user can virtually move between these simply by pointing at and/or clicking any of the nodes representing other points of interest shown in the current 360° image. After the navigation mesh has been calculated in the phase 502, a starting point is defined for the navigation mesh in the phase 503. The starting point refers to a preselected node of the navigation mesh at which the user virtually resides when the VR walkthrough is initiated. The starting point may be, for example, in a space representing the entry hall of the apartment.
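The navigation-mesh calculation of phase 502 can be sketched as an edge set over named nodes: points of interest within the same space are mutually connected, and each navigation point at an opening is connected to the points of interest of both spaces it joins. The dict-of-names data layout is an illustrative assumption.

```python
def navigation_mesh(pois_by_space, opening_spaces):
    """pois_by_space: space name -> list of point-of-interest node names.
    opening_spaces: navigation point name -> (space_a, space_b).
    Returns a set of undirected edges as sorted name pairs."""
    edges = set()
    # points of interest within the same space are mutually connected
    for pois in pois_by_space.values():
        for a in pois:
            for b in pois:
                if a < b:
                    edges.add((a, b))
    # each opening's navigation point connects to the POIs of both spaces
    for nav_point, (space_a, space_b) in opening_spaces.items():
        for space in (space_a, space_b):
            for poi in pois_by_space.get(space, []):
                edges.add(tuple(sorted((nav_point, poi))))
    return edges

pois = {"hall": ["poi_hall"], "bedroom": ["poi_bed1", "poi_bed2"]}
mesh = navigation_mesh(pois, {"nav_door1": ("hall", "bedroom")})
# → 4 edges: one between the two bedroom POIs, three via the door
```

Information points would be added as further nodes placed by their coordinates; they are shown in the views but, as described below, selecting them does not cause movement along the mesh.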
In the phase 504, a virtual reality walkthrough data object is generated that combines all VR walkthrough data found and calculated during the above method phases into a VR walkthrough. The virtual reality walkthrough data object comprises information on the navigation mesh, including the points of interest and navigation points, as well as on information points, if such were defined, and links to the rendered 360° images associated with the respective points of interest. The virtual reality walkthrough data object may further have a connection to the BIM data or to an external database, from which information may be obtained on the basis of identifiers, such as a globally unique identifier (GUID).
The virtual reality walkthrough data object, which comprises the data needed for running the VR walkthrough, is preferably stored in JavaScript Object Notation (JSON) format. JSON is a lightweight data-interchange format that uses human-readable text to transmit data objects consisting of attribute-value pairs, array data types and other serializable values, and is broadly utilized in web services.
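A minimal sketch of such a JSON walkthrough data object is shown below. The attribute names are illustrative assumptions; the text only requires that the object carries the navigation mesh, the points of interest with links to their rendered 360° images, the navigation points, and any information points.

```python
import json

walkthrough = {
    "start": "poi_hall",                       # starting node (phase 503)
    "points_of_interest": {
        "poi_hall": {"coords": [1.0, 1.0, 0.0],
                     "image": "images/poi_hall.png"},  # link, not inline data
    },
    "navigation_points": {
        "nav_door1": {"coords": [2.0, 1.5, 1.0],
                      "connects": ["hall", "bedroom"]},
    },
    "information_points": [
        {"coords": [0.5, 0.5, 1.2], "text": "Oak parquet, natural finish"},
    ],
    "mesh": [["poi_hall", "nav_door1"]],       # navigation mesh edges
}

# serialize to human-readable JSON text for delivery to the web server
payload = json.dumps(walkthrough, indent=2)
```

A client application capable of reading JSON data objects can reconstruct the walkthrough from this payload with `json.loads`.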
Figure 7 illustrates features of a navigation mesh on top of a perspective view of an apartment. The navigation mesh preferably comprises a plurality of points of interest (104) and a plurality of navigation points (605) that represent openings connecting spaces. Two of the space elements (101-A, 101-B), representing two rooms of the apartment, are marked in the figure. An opening (103), in this case a door opening, connects one of the space elements (101-A) with another space element (101-B), and a navigation point (605) has been placed at the opening to allow navigation between the points of interest (104) corresponding to the two adjacent space elements. Black two-ended arrows (608) between the points of interest (104) and navigation points (605) illustrate navigation mesh connections over which the user of the VR walkthrough may virtually move around in and between the spaces. Furthermore, the figure shows two information points (606). The location of an information point (606) is defined by coordinates, such as Cartesian coordinates, so that it may be placed in the navigation mesh and shown in any 360° image view in which it is visible, where it may be pointed at and/or clicked. Pointing at and/or clicking an information point in the 360° image does not cause movement along the navigation mesh but provides additional information. For example, selecting an information point (606) may open a window in the virtual reality view that presents further information to the user. A view point (610) illustrates a location from which a 360° image has been rendered for the associated point of interest (104). A double-ended arrow (611) illustrates the height of the view point above the floor level. Although not shown in the figure, similar view points are defined for all points of interest in the navigation mesh, as disclosed above.
Navigation between points of interest is possible either via navigation points or directly between points of interest. In particular, if a space includes multiple points of interest, it is possible to move directly between them by selecting, for example by pointing at and/or clicking, another point of interest visualized in the current 360° image.
Figure 8 illustrates a system. A BIM server (701) may store BIM data (700). The BIM data (700) may be stored in any applicable form. For example, the BIM data (700) may comprise serialized BIM data. If the BIM data is not serialized, the BIM server may be configured to serialize the BIM data, or the BIM data may be serialized by the VR walkthrough generation apparatus (702). The BIM server is communicatively connected to a VR walkthrough generation apparatus (702), which is configured to perform the steps of the method of automatically creating a VR walkthrough from the serialized BIM data as disclosed above. The VR walkthrough generation apparatus (702) may be implemented in a physical or virtual server, or in a network of physical or virtual servers, running computer software that performs the method steps. The VR walkthrough generation apparatus (702) has a defined application programming interface (API) towards a web server (703). Upon generating, in the phase 504, a virtual reality walkthrough data object that comprises the data needed for the VR walkthrough, the virtual reality walkthrough data object may be provided to the web server over the API. Preferably, the virtual reality walkthrough data object is a JSON data object, and it comprises the navigation mesh and all the 360° images of the VR walkthrough, or links to the 360° images. A client application (704), residing at a client computer and capable of reading JSON data objects and representing 360° images in one way or another, may then contact the web server (703) for running the VR walkthrough. In one embodiment, the client application may be a browser application that shows a 2D view of the 360° image on its display. Alternatively, the client application may be a virtual reality application that is capable of showing 360° images as 3D images, for example via a 3D display.
It is apparent to a person skilled in the art that as technology advances, the basic idea of the invention can be implemented in various ways. The invention and its embodiments are therefore not restricted to the above examples, but may vary within the scope of the claims.

Claims
1. A computer-implemented method for automatically generating a virtual reality walkthrough based on Building Information Model (BIM) data, wherein the method comprises
- obtaining serialized BIM data;
- finding a plurality of BIM space elements among the serialized BIM data; characterized in that the method further comprises:
- automatically selecting a plurality of points of interest for each space element with a size that exceeds a first threshold value, wherein each point of interest represents a location in one of the plurality of BIM space elements;
- calculating a viewpoint for each point of interest;
- rendering a 360° image from each of the viewpoints and associating the rendered 360° image with the respective one of the plurality of points of interest;
- locating one or more openings associated to each of the plurality of BIM space elements;
- identifying adjacent BIM space elements that are connected to each other via the openings;
- automatically placing a navigation point at each opening connecting adjacent BIM space elements to define a plurality of navigation points;
- calculating a navigation mesh comprising the plurality of points of interest and the plurality of navigation points;
- combining a virtual reality walkthrough data object for the plurality of BIM space elements, the virtual reality walkthrough data object comprising the navigation mesh and either the 360° images or links to the 360° images associated with the plurality of points of interest; and
- storing the virtual reality walkthrough data object.
2. The computer-implemented method according to claim 1, wherein selecting a point of interest for a BIM space element comprises:
- if a size measure of the BIM space element is above a first threshold value but below a second threshold value, calculating a geometrical center of a floor of the BIM space element, and placing the respective point of interest at the calculated geometrical center of the respective BIM space element,
- if the size measure of the BIM space element is above the second threshold value, defining a grid with a predefined cell size, calculating a geometrical center for a plurality of cells of the grid overlapping the BIM space element, and placing a plurality of points of interest at the calculated geometrical centers of the overlapping cells,
- if the size measure of the BIM space element is below the first threshold value, not placing any point of interest.
3. The computer-implemented method according to claim 2, wherein a point of interest is placed at a cell if a size measure of an overlapping section of the BIM space element and the respective cell of the grid exceeds a third threshold value.
4. The computer-implemented method according to claim 3, wherein the third threshold value is equal to the first threshold value.
5. The computer-implemented method according to any one of claims 2 to 4, wherein the size measure is any one of volume, area, diameter and length of at least one side of an area.
6. The computer-implemented method according to any of claims 1 to 5, wherein the rendering the 360° image comprises:
- placing the viewpoint at a predetermined height above the respective point of interest, and
- calculating at least one of a 2D 360° image and a 3D 360° image on basis of BIM data of physical building elements connected to the BIM space element associated with the point of interest.
7. The computer-implemented method according to any of claims 1 to 6, wherein the locating one or more openings comprises:
- identifying a BIM element associated with the respective BIM space element that has an opening associated with it, and
- obtaining coordinates of the opening connecting BIM space elements from BIM data.
8. The computer-implemented method according to any of claims 1 to 7, further comprising:
- identifying a plurality of information points based on selected BIM elements comprised in the serialized BIM data,
- associating each information point with a location,
- including the information points in the navigation mesh, and
- storing the plurality of information points in the virtual reality walkthrough data object.
9. A computer program product for automatically generating a virtual reality walkthrough from BIM data, wherein the computer program product is configured to perform the computer-implemented method steps according to any one of claims 1 to 8.
10. A computer readable medium having stored thereon a computer program product according to claim 9.
11. A virtual reality walkthrough generation system that generates a virtual reality walkthrough automatically based on BIM data by carrying out the computer-implemented method according to any one of claims 1 to 8.
EP19766058.2A 2018-08-30 2019-08-27 Automatic generation of a virtual reality walkthrough Withdrawn EP3844724A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20185717A FI20185717A1 (en) 2018-08-30 2018-08-30 Automatic generation of a virtual reality walkthrough
PCT/FI2019/050604 WO2020043942A1 (en) 2018-08-30 2019-08-27 Automatic generation of a virtual reality walkthrough

Publications (1)

Publication Number Publication Date
EP3844724A1 true EP3844724A1 (en) 2021-07-07

Family

ID=67909414

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19766058.2A Withdrawn EP3844724A1 (en) 2018-08-30 2019-08-27 Automatic generation of a virtual reality walkthrough

Country Status (3)

Country Link
EP (1) EP3844724A1 (en)
FI (1) FI20185717A1 (en)
WO (1) WO2020043942A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111460542B (en) * 2020-04-03 2023-08-01 湖南翰坤实业有限公司 Building design drawing processing method and system based on BIM and VR
CN112507427B (en) * 2020-11-30 2022-12-06 同济大学建筑设计研究院(集团)有限公司 Automatic conversion method, medium and system for EICAD and IFC route data

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6633317B2 (en) * 2001-01-02 2003-10-14 Microsoft Corporation Image-based walkthrough system and process employing spatial video streaming
JP3967343B2 (en) * 2004-08-03 2007-08-29 福井コンピュータ株式会社 Moving path setting device, moving path setting program
WO2010052550A2 (en) * 2008-11-05 2010-05-14 Easywalk Capital S.A. System and method for creating and broadcasting interactive panoramic walk-through applications
US20160300392A1 (en) 2015-04-10 2016-10-13 VR Global, Inc. Systems, media, and methods for providing improved virtual reality tours and associated analytics

Also Published As

Publication number Publication date
WO2020043942A1 (en) 2020-03-05
FI20185717A1 (en) 2020-03-01


Legal Events

- STAA (status information): STATUS: UNKNOWN
- STAA (status information): STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
- STAA (status information): STATUS: REQUEST FOR EXAMINATION WAS MADE
- PUAI: Public reference made under article 153(3) EPC to a published international application that has entered the european phase (original code: 0009012)
- 17P: Request for examination filed, effective date 20210210
- AK: Designated contracting states (kind code of ref document: A1): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
- DAV: Request for validation of the european patent (deleted)
- DAX: Request for extension of the european patent (deleted)
- STAA (status information): STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN
- 18D: Application deemed to be withdrawn, effective date 20230301