CN115546453A - Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall - Google Patents

Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall

Info

Publication number
CN115546453A
CN115546453A CN202211523425.1A
Authority
CN
China
Prior art keywords
virtual
exhibition hall
offline
elements
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211523425.1A
Other languages
Chinese (zh)
Other versions
CN115546453B (en)
Inventor
叶颂洪
石蕊
陈玉荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tus Digital Technology Shenzhen Co ltd
Original Assignee
Tus Digital Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tus Digital Technology Shenzhen Co ltd filed Critical Tus Digital Technology Shenzhen Co ltd
Priority to CN202211523425.1A priority Critical patent/CN115546453B/en
Publication of CN115546453A publication Critical patent/CN115546453A/en
Application granted granted Critical
Publication of CN115546453B publication Critical patent/CN115546453B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/023Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/33Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a virtual exhibition hall information synchronization method, device, and system based on an offline exhibition hall. The method comprises: step S100, obtaining the user's offline position information; step S200, determining a virtual position in the virtual exhibition hall according to the offline position information; step S300, extracting model elements corresponding to the virtual position to obtain target model elements, wherein the model elements are pre-constructed in the virtual exhibition hall and correspond one to one to exhibition elements in the offline exhibition hall; step S400, orienting the virtual visual angle of a virtual camera toward the target model element in the virtual exhibition hall; step S500, visually presenting the model elements within the virtual visual angle range. The method achieves follow-along roaming of the virtual exhibition hall, reduces the amount of data transmitted, improves real-time performance and data transmission efficiency, displays the virtual scene as completely as possible, and realizes synchronous collaboration between the exhibition information of the offline exhibition hall and that of the online virtual exhibition hall.

Description

Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall
Technical Field
The invention relates to the technical field of virtual exhibition hall data processing, and in particular to a method, device, and system for synchronizing virtual exhibition hall information based on an offline exhibition hall.
Background
An exhibition hall is used for displaying enterprise information and products, hosting meetings and exchanges, transmitting information, and conducting economic and trade activities. To meet the demand for online exhibitions, virtual exhibition halls have emerged, allowing users to attend exhibitions online. A virtual exhibition hall is typically built as a three-dimensional model scene based on a Building Information Model (BIM), which uses digital technology to provide a complete building-engineering information base consistent with the actual building; such a scene can reproduce the real building and be applied to the daily operation, management, and online services of the physical venue.
To synchronize online and offline information, two technical routes exist in the prior art. In the first, offline video is streamed online so that users can watch a live broadcast and thus attend the exhibition remotely; this approach is widely used in online exhibition scenarios. In the second, a number of sensors are deployed in the offline physical venue, user behaviour is collected by the sensors and transmitted to the online virtual scene, where the various behaviours are then displayed; this approach is mostly used for user-behaviour analysis. However, both the live-broadcast approach and the sensor-acquisition approach synchronize the online and offline viewing angles poorly, causing stuttering, and the online information lags behind the offline behaviour. In particular, when multiple users browse the online information separately, their pictures cannot be kept synchronized because of differing network environments, which degrades collaboration and interactivity.
Therefore, on the premise of keeping the virtual exhibition hall information complete, how to achieve synchronous collaboration between the offline exhibition and the online virtual exhibition hall has become an urgent technical problem.
Disclosure of Invention
Based on the above situation, a primary objective of the present invention is to provide a method, an apparatus, and a system for synchronizing virtual exhibition hall information based on an offline exhibition hall, so as to achieve synchronous collaboration between the offline exhibition hall and the exhibition information of the online virtual exhibition hall while keeping the virtual exhibition hall information as complete as possible.
In order to achieve this objective, the technical solution adopted by the invention is as follows:
according to a first aspect, an embodiment of the present invention discloses a method for synchronizing information of a virtual exhibition hall based on an offline exhibition hall, including:
step S100, obtaining the user's offline position information, wherein the user's offline position is the actual coordinate position collected while the user moves through the offline exhibition hall;
step S200, determining a virtual position in a virtual exhibition hall according to the offline position information, wherein coordinates in a virtual exhibition hall coordinate system correspond to coordinates in an offline exhibition hall coordinate system one to one; the virtual exhibition hall is pre-established;
step S300, extracting model elements corresponding to the virtual positions based on the virtual positions to obtain target model elements, wherein the model elements are elements pre-constructed in the virtual exhibition hall, and the model elements correspond to exhibition elements in the off-line exhibition hall one to one;
step S400, orienting the virtual visual angle of the virtual camera toward a target model element in the virtual exhibition hall;
step S500, visually presenting model elements within a virtual perspective range, wherein the virtual perspective range at least includes a part of the target model elements.
Optionally, step S300 includes:
searching according to a preset radius by taking the virtual position as a center to obtain a plurality of virtual elements;
and taking the virtual element closest to the virtual position in the plurality of virtual elements as the target model element.
Optionally, in step S100, the user's offline position information is obtained at preset time intervals;
between step S200 and step S300, further comprising:
after the virtual position of the virtual exhibition hall is determined for the ith time, comparing the ith virtual position with the (i-1) th virtual position, wherein i is more than or equal to 2;
when the ith virtual position changes relative to the (i-1) th virtual position, driving the virtual visual angle of the virtual camera to move from the (i-1) th virtual position to the ith virtual position;
in step S300, extracting model elements within the virtual view range in order according to the sequence of the virtual view movement of the virtual camera;
in step S500, model elements within the virtual visual angle range are sequentially presented in a visual manner, so that while the user moves through the offline exhibition hall, the virtual exhibition hall roams along with the offline user's movement.
Optionally, between step S300 and step S500, the method further includes:
determining a virtual distance between the virtual camera and the model element;
adaptively adjusting the length and/or posture of a virtual spring arm of the virtual camera according to the virtual distance, so as to prevent the virtual camera from colliding with model elements and/or so that the virtual visual angle of the virtual camera captures the model elements according to a preset rule.
According to a second aspect, an embodiment of the present invention discloses a virtual exhibition hall information synchronization apparatus implemented based on an offline exhibition hall, including:
the position acquisition module, configured to acquire the user's offline position information, wherein the user's offline position is the actual coordinate position collected while the user moves through the offline exhibition hall;
the position determining module is used for determining a virtual position in the virtual exhibition hall according to the offline position information, wherein coordinates in a coordinate system of the virtual exhibition hall correspond to coordinates in a coordinate system of the offline exhibition hall one to one; the virtual exhibition hall is pre-established;
the element extraction module is used for extracting model elements corresponding to the virtual positions based on the virtual positions to obtain target model elements, wherein the model elements are elements pre-constructed in a virtual exhibition hall, and the model elements correspond to exhibition elements in an off-line exhibition hall one to one;
the visual angle adjusting module is used for enabling the virtual visual angle of the virtual camera to face the target model element in the virtual exhibition hall;
and the element presenting module is used for presenting the model elements in the virtual visual angle range in a visual mode, wherein the virtual visual angle range at least comprises part of the target model elements.
Optionally, the element extraction module comprises:
the searching unit is used for searching and obtaining a plurality of virtual elements according to a preset radius by taking the virtual position as a center;
and the target determining unit is used for taking the virtual element which is closest to the virtual position in the plurality of virtual elements as the target model element.
Optionally, in the position acquisition module, the user's offline position information is obtained at preset time intervals;
the virtual exhibition hall information synchronization apparatus further comprises:
the position comparison module is used for comparing the ith virtual position with the (i-1) th virtual position after the ith virtual position of the virtual exhibition hall is determined, wherein i is more than or equal to 2;
the driving module is used for driving the virtual visual angle of the virtual camera to move from the (i-1)-th virtual position to the i-th virtual position when the i-th virtual position changes relative to the (i-1)-th virtual position;
in an element extraction module, extracting model elements in a virtual visual angle range in sequence according to the sequence of the movement of the virtual visual angle of a virtual camera;
and in the element presentation module, model elements within the virtual visual angle range are presented in sequence in a visual manner, so that while the user moves through the offline exhibition hall, the virtual exhibition hall roams along with the offline user's movement.
Optionally, the method further comprises:
a distance determination module for determining a virtual distance between the virtual camera and the model element;
and the spring arm adjusting module is used for adaptively adjusting the length and/or the posture of a virtual spring arm of the virtual camera according to the virtual distance so as to prevent the virtual camera from colliding with the model elements and/or enable the virtual visual angle of the virtual camera to capture the model elements according to a preset rule.
According to a third aspect, an embodiment of the present invention discloses a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed, the computer program can implement the method disclosed in the first aspect.
According to a fourth aspect, an embodiment of the present invention discloses a virtual exhibition hall information synchronization system implemented based on an offline exhibition hall, including:
the positioning device is used for collecting the user's position offline to obtain the user's offline position information;
the visualization device comprises the virtual exhibition hall information synchronization device based on the offline exhibition hall disclosed by the second aspect.
According to the method, device, and system for synchronizing virtual exhibition hall information based on an offline exhibition hall disclosed by the embodiments of the invention, after the user's offline position information is obtained, the virtual position in the virtual exhibition hall is determined from that information; because the virtual exhibition hall is pre-established and its coordinates correspond one to one to those of the offline exhibition hall, the model elements corresponding to the virtual position can be extracted to obtain the target model elements, the virtual visual angle of the virtual camera is then directed toward the target model elements, and the model elements within the virtual visual angle range are presented visually. Since the model elements are pre-constructed in the virtual exhibition hall and correspond one to one to the exhibition elements in the offline exhibition hall, the whole synchronization process only needs to transmit the user's position information: the model elements are prestored and can be retrieved based on the user's offline position without being transmitted to the virtual exhibition hall. This achieves follow-along roaming of the virtual exhibition hall, reduces the amount of data transmitted, improves real-time performance and data transmission efficiency, displays the virtual scene as completely as possible, and realizes synchronous collaboration between the offline exhibition and the online virtual exhibition hall.
Other advantages of the present invention will be described in the detailed description, and those skilled in the art will understand the technical features and technical solutions presented in the description.
Drawings
Embodiments of the present invention will be described below with reference to the accompanying drawings. In the figure:
fig. 1 is a flowchart of a method for synchronizing information of a virtual exhibition hall implemented based on an offline exhibition hall disclosed in this embodiment;
fig. 2 is a schematic diagram illustrating a principle of obtaining an offline position according to the embodiment;
fig. 3 is a schematic structural diagram of a virtual exhibition hall information synchronization device implemented based on an offline exhibition hall disclosed in this embodiment.
Detailed Description
The present invention is described below on the basis of embodiments, but it is not limited to these embodiments. In the following detailed description, certain specific details are set forth in order to provide a thorough understanding of the invention; to avoid obscuring its substance, well-known methods, procedures, and components are not described in detail.
Furthermore, those of ordinary skill in the art will appreciate that the drawings provided herein are for illustrative purposes and are not necessarily drawn to scale.
Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise", "comprising", and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, what is meant is "including but not limited to".
In the description of the present invention, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
In order to achieve synchronization and coordination between the offline exhibition and the online virtual exhibition hall while keeping the virtual exhibition hall information as complete as possible, this embodiment discloses a method for synchronizing virtual exhibition hall information based on an offline exhibition hall. Referring to fig. 1, which is a flowchart of the method disclosed in this embodiment, the method comprises the following steps: step S100, step S200, step S300, step S400, and step S500, wherein:
step S100, obtaining the user's offline position information. In this embodiment, the user's offline position is the actual coordinate position collected while the user moves through the offline exhibition hall. Specifically, the collection of the user's offline position can be implemented with UWB positioning technology: a UWB positioning system may include UWB base stations, a UWB tag, and a positioning server, the UWB base stations being arranged in the offline exhibition hall and the UWB tag being worn by the user. The working principle is as follows: the UWB tag worn by the user transmits signals, the several UWB base stations receive the signals sent by the tag and submit them to the positioning server, and the positioning server analyses the signal data submitted by the base stations to compute the user's current offline position and sends the result to the BIM model scene visualization device, so that the visualization device obtains the user's offline position information. As an example, referring to fig. 2, which is a schematic diagram of the principle of obtaining the offline position disclosed in this embodiment, a real-time position detection system is formed by 4 UWB base stations and 1 UWB tag: a first base station 1, a second base station 2, a third base station 3, a fourth base station 4, and a UWB tag 5. The UWB tag 5 transmits a signal at a frequency of, for example, once every 0.5 second, and the first base station 1, the second base station 2, the third base station 3, and the fourth base station 4 receive the signal and submit the signal data to the position server 6. The position server 6 is configured with the coordinate positions of the four base stations in the real scene; the base station positions can be configured in a file, which the position server 6 reads at start-up to load the base station position data used in computing the position of the UWB tag. Thereafter, the position server 6 may actively push the coordinate position to the BIM model scene visualization device 7, or may transmit it in response to a request from the BIM model scene visualization device 7.
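As an illustrative sketch only: the embodiment describes the position server computing the tag position with a time-difference-of-arrival algorithm, whereas the Python snippet below shows a simplified least-squares multilateration from already-derived tag-to-base-station distances; the coordinates, base-station layout, and function names are assumptions for the example, not values taken from the embodiment.

```python
import numpy as np

def estimate_tag_position(anchors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """anchors: (N, 3) base-station coordinates; distances: (N,) ranges to the tag, in metres."""
    a0, d0 = anchors[0], distances[0]
    # Linearise |p - a_i|^2 = d_i^2 by subtracting the first base station's equation.
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - distances[1:] ** 2 + d0 ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position  # (x, y, z) in the offline exhibition-hall coordinate system

# Assumed base-station layout (roughly the four corners of the hall, at two heights
# so the vertical coordinate is observable) and an assumed true tag position.
anchors = np.array([[0.0, 0.0, 3.0], [10.0, 0.0, 2.6], [10.0, 8.0, 3.0], [0.0, 8.0, 2.6]])
tag = np.array([4.0, 3.0, 1.2])
distances = np.linalg.norm(anchors - tag, axis=1)
print(estimate_tag_position(anchors, distances))  # ~[4. 3. 1.2]
```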
Step S200, determining the virtual position in the virtual exhibition hall according to the offline position information. In this embodiment, the virtual exhibition hall is pre-established; the offline exhibition hall has its own coordinate system (for example, a three-dimensional coordinate system), and the virtual exhibition hall likewise has its own coordinate system. Specifically, a virtual exhibition hall corresponding to the offline exhibition hall can be built in advance from a BIM model. Generally, the BIM model scene client (the visualization device) is an application developed with Unreal Engine, and its internal logic is as follows: the BIM model data are imported into a three-dimensional model scene using Unreal Engine; in this embodiment, a virtual camera simulates the viewing angle of a physical camera to capture the model elements, so that the visualization device can present them from the camera's viewpoint, and the virtual camera can determine viewing angles at different distances and angles according to preset rules; a view-angle control adaptation class is responsible for receiving the offline user-position update data pushed by the position server and completing the view-angle coordination task according to those data. In this embodiment, the coordinates of the virtual exhibition hall correspond one to one to the coordinates of the offline exhibition hall, so that once the offline position information is acquired, the virtual position in the virtual exhibition hall can be determined directly from it.
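As a non-limiting sketch of the one-to-one coordinate correspondence between the offline hall and the pre-built virtual hall: the scale factor and origin offset below are assumptions (e.g. metres in the hall versus centimetres in an Unreal Engine scene); the embodiment only requires that the mapping be one to one.

```python
from dataclasses import dataclass

@dataclass
class CoordinateMapping:
    scale: float = 100.0                      # offline metres -> virtual centimetres (assumed)
    origin_offset: tuple = (0.0, 0.0, 0.0)    # virtual-scene coordinates of the hall's origin

    def offline_to_virtual(self, offline_pos):
        ox, oy, oz = self.origin_offset
        x, y, z = offline_pos
        return (ox + x * self.scale, oy + y * self.scale, oz + z * self.scale)

mapping = CoordinateMapping()
print(mapping.offline_to_virtual((4.0, 3.0, 1.2)))  # -> (400.0, 300.0, 120.0)
```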
Step S300, extracting model elements corresponding to the virtual position based on the virtual position to obtain target model elements. In a specific embodiment, the model elements are elements pre-constructed in the virtual exhibition hall and correspond one to one to the exhibition elements in the offline exhibition hall. Specifically, in building the virtual exhibition hall, a BIM model can be imported to form a three-dimensional model scene whose elements are divided into two types, environment elements and view elements, and view-attribute tagging can be applied to the relevant model elements of the BIM scene; for example, in the Unreal Engine application, a scene element is selected and its "Tag" attribute is set to MoT-Target, making it a view target element, while untagged elements are environment elements. In practice, the user can construct the model elements as needed, and their number or type may be increased or decreased relative to the actual exhibition elements; besides the exhibits to be displayed, the model elements may include one or any combination of walls, ceilings, floors, doors and windows, lighting lamps, and the like. In this embodiment, since the model elements and the virtual exhibition hall are pre-established and correspond one to one to the offline exhibition hall, once the virtual position in the virtual exhibition hall is determined, the model elements associated with that position can be extracted, and the model elements within the view-angle range taken as the target model elements.
In order to determine the target model element more accurately, in an alternative embodiment, step S300 includes: searching around the virtual position within a preset radius to obtain a plurality of virtual elements; and taking the virtual element closest to the virtual position as the target model element. Specifically, after the virtual position in the virtual exhibition hall is determined, a spherical range search with a preset radius can be performed in the virtual exhibition hall; if view model elements exist (i.e., elements whose "Tag" attribute is set to MoT-Target), the closest one is selected as the target model element, the virtual camera is roamed, and its view angle is directed toward that element.
In this embodiment, taking the virtual element closest to the virtual position as the target model element means that the target can be determined accurately from the position data alone, without additional sensors.
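As an illustrative sketch of the target-element selection described above (spherical search within a preset radius, then nearest element whose tag marks it as a view target): the element records, helper names, radius, and coordinates below are assumptions for the example and are not the Unreal Engine data structures used in the embodiment.

```python
import math

def find_target_element(virtual_pos, elements, radius=500.0, target_tag="MoT-Target"):
    """elements: iterable of dicts like {"name": ..., "tag": ..., "position": (x, y, z)}."""
    candidates = [e for e in elements
                  if e.get("tag") == target_tag
                  and math.dist(e["position"], virtual_pos) <= radius]
    if not candidates:
        return None  # only environment elements nearby; keep the previous view target
    return min(candidates, key=lambda e: math.dist(e["position"], virtual_pos))

elements = [
    {"name": "ExhibitA", "tag": "MoT-Target", "position": (420.0, 350.0, 100.0)},
    {"name": "ExhibitB", "tag": "MoT-Target", "position": (900.0, 100.0, 100.0)},
    {"name": "Wall01",   "tag": "",           "position": (410.0, 310.0, 100.0)},
]
print(find_target_element((400.0, 300.0, 120.0), elements)["name"])  # -> ExhibitA
```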
In order to determine whether the user is moving, in an alternative embodiment, when the interval between the update time of the user's offline position data and the current time exceeds, for example, 2 seconds, the UWB tag is considered stationary, i.e. the offline user is not moving, and the BIM model scene application performs roaming view-angle coordination in the static roaming mode.
Step S400, orienting the virtual visual angle of the virtual camera toward the target model element in the virtual exhibition hall. Specifically, after the target model element is extracted, the virtual visual angle of the virtual camera can be directed toward it so that the element is "shot and recorded" (i.e., extracted at the current view angle); the shooting angle of the virtual camera is then essentially the same as the offline user's view angle, achieving synchronous collaboration between the exhibition information of the online virtual exhibition hall and the offline exhibition.
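As a minimal sketch of step S400, assuming a Z-up coordinate convention and angles in degrees (both assumptions): the camera's yaw and pitch are computed so that its view axis points at the target model element.

```python
import math

def look_at_angles(camera_pos, target_pos):
    """Yaw/pitch (degrees) that point the camera's view axis at the target element."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))                    # rotation about the vertical axis
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation toward the target
    return yaw, pitch

print(look_at_angles((400.0, 300.0, 120.0), (420.0, 350.0, 100.0)))  # ~ (68.2, -20.4)
```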
Step S500, visually presenting the model elements within the virtual visual angle range, wherein the range includes at least part of the target model elements. In this embodiment, since the virtual visual angle of the virtual camera faces the target model element, visually presenting the model elements within that range is equivalent to showing the actual elements seen from the offline viewpoint; that is, the virtual view angle roams together with the offline user's view angle. During this joint roaming, the offline scene does not need to be captured and no scene elements need to be transmitted to the virtual exhibition hall, which reduces the amount of data transmitted and improves real-time performance. Of course, while the exhibition is being narrated, the virtual exhibition hall can also retrieve the introduction of the corresponding elements and play it through audio equipment, so that the view angle and the audio stay coordinated.
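As a minimal sketch of the view-range test implied by step S500: only elements within the camera's field of view are presented. The 90-degree horizontal field of view and the helper names are assumptions; the embodiment leaves the exact view-angle range to a preset rule.

```python
import math

def in_view(camera_pos, camera_yaw_deg, element_pos, fov_deg=90.0):
    """True if the element lies within the camera's horizontal field of view."""
    dx = element_pos[0] - camera_pos[0]
    dy = element_pos[1] - camera_pos[1]
    angle_to_element = math.degrees(math.atan2(dy, dx))
    delta = (angle_to_element - camera_yaw_deg + 180.0) % 360.0 - 180.0  # signed offset
    return abs(delta) <= fov_deg / 2.0

# Camera at the virtual position, already turned toward the target (yaw ~68.2 degrees).
print(in_view((400.0, 300.0, 120.0), 68.2, (420.0, 350.0, 100.0)))  # -> True
print(in_view((400.0, 300.0, 120.0), 68.2, (380.0, 250.0, 100.0)))  # -> False (behind)
```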
In order to achieve synchronous roaming between the virtual exhibition hall and the offline exhibition hall and improve the roaming experience, in an optional embodiment, in step S100 the user's offline position information is acquired at preset time intervals; between step S200 and step S300, the method further comprises: after the virtual position in the virtual exhibition hall is determined for the i-th time, comparing the i-th virtual position with the (i-1)-th virtual position, wherein i is more than or equal to 2; when the i-th virtual position has changed relative to the (i-1)-th virtual position, driving the virtual visual angle of the virtual camera to move from the (i-1)-th virtual position to the i-th virtual position. In step S300, model elements within the virtual visual angle range are extracted in the order in which the virtual visual angle of the virtual camera moves; in step S500, the model elements within the virtual visual angle range are presented visually in the same order, so that while the user moves through the offline exhibition hall, the virtual exhibition hall roams along with the offline user's movement. Specifically, whether the i-th virtual position has changed relative to the (i-1)-th can be determined from the position data acquired by the position server. Referring to fig. 2, the position server 6 receives the signal data submitted by the 4 base stations in real time, calculates the tag's current position XYZcurrent in the real scene using a time-difference-of-arrival algorithm, and compares it against the previous position XYZold (whose initial value may be, for example, x = 0, y = 0, z = 0) to compute whether the UWB tag has undergone an effective displacement.
Whether the x, y, or z coordinate of XYZcurrent has changed by more than a threshold relative to XYZold is checked; the threshold is chosen according to the indoor positioning accuracy of UWB and is not limited to 0.3, in metres:
DX = Xcurrent - Xold
DY = Ycurrent - Yold
DZ = Zcurrent - Zold
When the absolute value of any one of DX, DY, or DZ is greater than or equal to 0.3, the UWB tag has undergone an effective displacement. In this case, the position server updates XYZold with XYZcurrent and pushes a new position message, whose value is XYZcurrent, to the BIM model scene client.
When the absolute values of DX, DY, and DZ are all less than 0.3, i.e. the UWB tag has not undergone an effective displacement, the position server discards the XYZcurrent position data and ends the current processing flow.
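A minimal sketch of the effective-displacement test just described, using the 0.3 m threshold from the description; the helper function and the (commented-out) push call are illustrative names, not part of the embodiment.

```python
THRESHOLD_M = 0.3  # taken from the UWB indoor-positioning accuracy used above

def effective_displacement(xyz_current, xyz_old, threshold=THRESHOLD_M):
    """True when any axis has moved by at least the threshold (metres)."""
    return any(abs(c - o) >= threshold for c, o in zip(xyz_current, xyz_old))

xyz_old = (0.0, 0.0, 0.0)            # initial value, as in the description
xyz_current = (0.4, 0.1, 0.0)
if effective_displacement(xyz_current, xyz_old):
    xyz_old = xyz_current            # update the stored position with XYZcurrent
    # push_position_update(xyz_current)   # hypothetical push to the BIM scene client
else:
    pass                             # discard XYZcurrent and end the processing flow
```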
In this embodiment, the time interval between the i-th and the (i-1)-th virtual positions may be, for example, 2 seconds. When the i-th virtual position differs from the (i-1)-th and the interval between them is less than or equal to 2 seconds, the UWB tag is considered to be moving, and the BIM model scene performs roaming view-angle coordination in the moving-follow roaming mode. In this mode, the view-angle control adaptation class moves the virtual camera to the new position and automatically adjusts its orientation along the direction from the (i-1)-th virtual position to the i-th virtual position, i.e. the virtual camera faces the direction of travel, which keeps it consistent with the offline user's forward direction and achieves roaming coordination between the virtual exhibition hall and the offline exhibition hall.
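A minimal sketch of the roaming-mode decision and the camera drive, assuming the 2-second idle threshold mentioned above; the linear-interpolation step and all names are illustrative choices rather than the embodiment's view-angle control adaptation class.

```python
import time

def choose_mode(last_update_time, now=None, idle_threshold_s=2.0):
    """Static roaming when no position update has arrived within the idle threshold."""
    now = time.time() if now is None else now
    return "static" if (now - last_update_time) > idle_threshold_s else "follow"

def step_camera(prev_pos, new_pos, alpha=0.2):
    """Move a fraction of the way from the (i-1)-th to the i-th virtual position."""
    return tuple(p + alpha * (n - p) for p, n in zip(prev_pos, new_pos))

print(choose_mode(last_update_time=time.time() - 0.5))    # -> "follow"
print(step_camera((0.0, 0.0, 0.0), (100.0, 200.0, 0.0)))  # -> (20.0, 40.0, 0.0)
```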
In order to avoid view-angle occlusion, to preserve as far as possible the validity of the preset optimal view-angle range, and to better simulate the offline exhibition, in an optional embodiment the method further includes, between step S300 and step S500: determining a virtual distance between the virtual camera and the model element; and adaptively adjusting the length and/or posture of a virtual spring arm of the virtual camera according to that distance, so as to prevent the virtual camera from colliding with the model element and/or so that the virtual visual angle of the virtual camera captures the model element according to a preset rule. In this embodiment, since the virtual camera simulates the viewing angle of a physical camera, the adjustment of the spring arm's length and/or posture by a physical camera can likewise be simulated while the virtual camera captures the model elements. In a specific implementation, the relationship between the virtual camera's imaging and the spring arm's length and/or posture may be trained by machine learning; for example, when the spring arm is shortened so that the virtual camera moves away from a model element, the captured image is reduced, and when the spring arm rotates, the picture presented by the virtual camera rotates. The adjustment of the spring arm's length and/or posture can be trained on preset samples; for example, when the virtual camera comes within a certain distance of an obstacle, the spring arm can be retracted so as not to touch it, and its posture can then be adjusted when the camera needs to pass over the obstacle. In this embodiment, the posture of the virtual spring arm includes its rotation angle, orientation, and the like. When the virtual distance between the virtual camera and a model element is smaller than a preset distance, the camera may collide with the element (similar to an actual camera colliding with an exhibition element in reality); at that point, the length and/or posture of the spring arm can be adjusted automatically while the virtual camera roams or rotates, so that the camera avoids the collision, or so that its virtual visual angle captures all or part of the model element according to a preset rule. This avoids view-angle occlusion and preserves the validity of the preset optimal view-angle range as far as possible.
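A minimal sketch of one possible spring-arm adjustment rule, assuming a simple retract-and-clamp policy; the distances and the clamping behaviour are illustrative, and the embodiment also allows adjusting the arm's posture (rotation and orientation) or training the adjustment by machine learning.

```python
def adjust_spring_arm(arm_length, distance_to_element,
                      safety_distance=50.0, min_length=30.0, max_length=300.0):
    """Retract the virtual spring arm when a model element enters the safety zone."""
    if distance_to_element < safety_distance:
        # Pull the camera in by the amount the element intrudes into the safety zone.
        arm_length -= (safety_distance - distance_to_element)
    return max(min_length, min(max_length, arm_length))

print(adjust_spring_arm(arm_length=200.0, distance_to_element=35.0))  # -> 185.0
print(adjust_spring_arm(arm_length=200.0, distance_to_element=80.0))  # -> 200.0 (no change)
```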
The present embodiment further discloses a virtual exhibition hall information synchronization device implemented based on an offline exhibition hall, please refer to fig. 3, which is a schematic structural diagram of the virtual exhibition hall information synchronization device implemented based on the offline exhibition hall disclosed in the present embodiment, and the device includes: a position acquisition module 100, a position determination module 200, an element extraction module 300, a perspective adjustment module 400, and an element presentation module 500, wherein:
the position obtaining module 100 is configured to obtain information about an offline user position, where the offline user position is an actual coordinate position obtained by collecting a position of a user when the user moves in an offline exhibition hall;
the position determining module 200 is configured to determine a virtual position in a virtual exhibition hall according to the offline position information, wherein coordinates in a coordinate system of the virtual exhibition hall correspond to coordinates in a coordinate system of the offline exhibition hall one to one; the virtual exhibition hall is pre-established;
the element extraction module 300 is configured to extract model elements corresponding to the virtual positions based on the virtual positions to obtain target model elements, where the model elements are pre-constructed elements in a virtual exhibition hall, and the model elements correspond to exhibition elements in an offline exhibition hall one to one;
the perspective adjustment module 400 is used to orient the virtual perspective of the virtual camera toward the target model element in the virtual exhibition hall;
the element rendering module 500 is configured to render model elements in a virtual perspective range in a visual manner, wherein the virtual perspective range includes at least a portion of the target model elements.
In an alternative embodiment, the element extraction module 300 includes: the searching unit is used for searching and obtaining a plurality of virtual elements according to a preset radius by taking the virtual position as a center; and the target determining unit is used for taking the virtual element which is closest to the virtual position in the plurality of virtual elements as the target model element.
In an optional embodiment, in the position acquisition module 100, the user's offline position information is acquired at preset time intervals; the virtual exhibition hall information synchronization apparatus further comprises: a position comparison module for comparing the i-th virtual position with the (i-1)-th virtual position after the i-th virtual position in the virtual exhibition hall is determined, wherein i is more than or equal to 2; and a driving module for driving the virtual visual angle of the virtual camera to move from the (i-1)-th virtual position to the i-th virtual position when the i-th virtual position changes relative to the (i-1)-th. In the element extraction module 300, model elements within the virtual visual angle range are extracted in the order in which the virtual visual angle of the virtual camera moves; in the element presentation module 500, the model elements within the virtual visual angle range are presented visually in the same order, so that while the user moves through the offline exhibition hall, the virtual exhibition hall roams along with the offline user's movement.
In an optional embodiment, the apparatus further comprises: a distance determination module for determining a virtual distance between the virtual camera and the model element; and the spring arm adjusting module is used for adaptively adjusting the length and/or the posture of a virtual spring arm of the virtual camera according to the virtual distance so as to prevent the virtual camera from colliding with the model elements and/or enable the virtual visual angle of the virtual camera to capture the model elements according to a preset rule.
The embodiment also discloses a computer readable storage medium, such as a chip, an optical disc, etc., on which a computer program is stored, and the computer program can implement the method disclosed in the above embodiment when executed.
It should be noted that the computer-readable storage medium according to the embodiments of the present disclosure is not limited to the above embodiments, and may be, for example, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
This embodiment also discloses a virtual exhibition hall information synchronization system based on off-line exhibition hall realization, includes:
the positioning device is used for collecting the user's position offline to obtain the user's offline position information;
the visualization device comprises the virtual exhibition hall information synchronization device based on the offline exhibition hall disclosed by the embodiment.
According to the method, device, and system for synchronizing virtual exhibition hall information based on an offline exhibition hall disclosed by the embodiments of the invention, after the user's offline position information is obtained, the virtual position in the virtual exhibition hall is determined from that information; the virtual exhibition hall is pre-established and its coordinates correspond one to one to those of the offline exhibition hall, so the model elements corresponding to the virtual position can be extracted to obtain the target model elements, the virtual visual angle of the virtual camera is then directed toward the target model elements, and the model elements within the virtual visual angle range are presented visually. Since the model elements are pre-constructed in the virtual exhibition hall and correspond one to one to the exhibition elements in the offline exhibition hall, the whole synchronization process only needs to transmit the user's position information: the model elements are prestored and can be retrieved based on the user's offline position without being transmitted to the virtual exhibition hall. This achieves follow-along roaming of the virtual exhibition hall, reduces the amount of data transmitted, improves real-time performance and data transmission efficiency, displays the virtual scene as completely as possible, and realizes synchronous collaboration between the offline exhibition and the online virtual exhibition hall.
It will be appreciated by those skilled in the art that the various preferences described above can be freely combined, superimposed without conflict. The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures, for example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. The numbering of the steps herein is for convenience of description and reference only and is not intended to limit the order of execution, the specific order of execution being determined by the technology itself, and one skilled in the art can determine various permissible and reasonable orders based on the technology itself.
It should be noted that step numbers (letter or number numbers) are used to refer to some specific method steps in the present invention only for the purpose of convenience and brevity of description, and the order of the method steps is not limited by letters or numbers in any way. It will be clear to a person skilled in the art that the order of the steps of the method concerned, which is to be determined by the technique itself, should not be unduly limited by the presence of step numbers, and that a person skilled in the art can determine various permissible and reasonable orders of steps in accordance with the technique itself.
It will be understood that the embodiments described above are illustrative only and not restrictive, and that various obvious and equivalent modifications and substitutions for details described herein may be made by those skilled in the art without departing from the basic principles of the invention.

Claims (10)

1. A virtual exhibition hall information synchronization method realized based on an offline exhibition hall is characterized by comprising the following steps:
step S100, obtaining the user's offline position information, wherein the user's offline position is the actual coordinate position collected while the user moves through the offline exhibition hall;
step S200, determining a virtual position in a virtual exhibition hall according to the offline position information, wherein coordinates in a virtual exhibition hall coordinate system correspond to coordinates in an offline exhibition hall coordinate system one to one; the virtual exhibition hall is pre-established;
step S300, extracting model elements corresponding to the virtual positions based on the virtual positions to obtain target model elements, wherein the model elements are elements pre-constructed in the virtual exhibition hall, and the model elements correspond to exhibition elements in an offline exhibition hall one to one;
step S400, orienting the virtual visual angle of a virtual camera to the target model element in the virtual exhibition hall;
step S500, presenting model elements in the virtual viewing angle range in a visual manner, where the virtual viewing angle range at least includes a part of the target model elements.
2. The method for synchronizing the information of the virtual exhibition hall based on the offline exhibition hall as claimed in claim 1, wherein said step S300 comprises:
searching according to a preset radius by taking the virtual position as a center to obtain a plurality of virtual elements;
and taking the virtual element which is closest to the virtual position in the plurality of virtual elements as a target model element.
3. The method for synchronizing the information of the virtual exhibition hall based on the offline exhibition hall as claimed in claim 1,
in step S100, the user's offline position information is obtained at preset time intervals;
between step S200 and step S300, further comprising:
after the virtual position of the virtual exhibition hall is determined for the ith time, comparing the ith virtual position with the (i-1) th virtual position, wherein i is more than or equal to 2;
when the ith virtual position changes relative to the (i-1) th virtual position, driving the virtual visual angle of the virtual camera to move from the (i-1) th virtual position to the ith virtual position;
in step S300, sequentially extracting model elements within the virtual view range according to the moving sequence of the virtual view of the virtual camera;
in the step S500, model elements within the virtual viewing angle range are sequentially presented in a visualized manner, so that while the user moves through the offline exhibition hall, the virtual exhibition hall roams along with the offline user's movement.
4. The method for synchronizing the information of the virtual exhibition hall implemented based on the offline exhibition hall as claimed in any one of claims 1-3, further comprising, between the step S300 and the step S500:
determining a virtual distance between the virtual camera and the model element;
and adaptively adjusting the length and/or the posture of a virtual spring arm of the virtual camera according to the virtual distance so as to prevent the virtual camera from colliding with the model element, and/or enabling a virtual visual angle of the virtual camera to capture the model element according to a preset rule.
5. A virtual exhibition hall information synchronization apparatus implemented based on an offline exhibition hall, characterized by comprising:
a position acquisition module (100) for acquiring the user's offline position information, wherein the user's offline position is the actual coordinate position collected while the user moves through the offline exhibition hall;
the position determining module (200) is used for determining a virtual position in the virtual exhibition hall according to the offline position information, wherein the coordinates in the virtual exhibition hall coordinate system correspond to the coordinates in the offline exhibition hall coordinate system one by one; the virtual exhibition hall is pre-established;
an element extraction module (300) configured to extract, based on the virtual position, a model element corresponding to the virtual position to obtain a target model element, where the model element is an element pre-constructed in the virtual exhibition hall, and the model element corresponds to an exhibition element in an offline exhibition hall one to one;
a perspective adjustment module (400) for orienting a virtual perspective of a virtual camera towards the target model element in the virtual exhibition hall;
an element rendering module (500) for rendering model elements within the virtual perspective range in a visualized manner, wherein the virtual perspective range contains at least part of the target model elements.
6. The virtual exhibition hall information synchronization apparatus implemented based on an offline exhibition hall according to claim 5, wherein said element extraction module (300) comprises:
the searching unit is used for searching and obtaining a plurality of virtual elements according to a preset radius by taking the virtual position as a center;
and the target determining unit is used for taking the virtual element which is closest to the virtual position in the plurality of virtual elements as a target model element.
7. The virtual exhibition hall information synchronization apparatus implemented based on an offline exhibition hall according to claim 5, wherein,
in the position acquisition module (100), the user's offline position information is acquired at preset time intervals;
the virtual exhibition hall information synchronization device further comprises:
the position comparison module is used for comparing the ith virtual position with the (i-1) th virtual position after the ith virtual position of the virtual exhibition hall is determined, wherein i is more than or equal to 2;
the driving module is used for driving the virtual visual angle of the virtual camera to move from the (i-1)-th virtual position to the i-th virtual position when the i-th virtual position changes relative to the (i-1)-th virtual position;
in the element extraction module (300), extracting model elements in the virtual visual angle range in sequence according to the moving sequence of the virtual visual angle of the virtual camera;
in the element presentation module (500), model elements within the virtual visual angle range are sequentially presented in a visual manner, so that while the user moves through the offline exhibition hall, the virtual exhibition hall roams along with the offline user's movement.
8. The offline exhibition hall-implemented virtual exhibition hall information synchronization apparatus according to any one of claims 5 to 7, further comprising:
a distance determination module to determine a virtual distance between the virtual camera and the model element;
and the spring arm adjusting module is used for adaptively adjusting the length and/or the posture of a virtual spring arm of the virtual camera according to the virtual distance so as to prevent the virtual camera from colliding with the model element and/or enable a virtual visual angle of the virtual camera to capture the model element according to a preset rule.
9. A computer-readable storage medium, on which a computer program is stored, which computer program, when executed, is capable of carrying out the method according to any one of claims 1-4.
10. A virtual exhibition hall information synchronization system implemented based on an offline exhibition hall, characterized by comprising:
a positioning device for collecting the user's position offline to obtain the user's offline position information;
visualization device comprising the offline exhibition hall-implemented virtual exhibition hall information synchronization apparatus according to any one of claims 5-8.
CN202211523425.1A 2022-12-01 2022-12-01 Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall Active CN115546453B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211523425.1A CN115546453B (en) 2022-12-01 2022-12-01 Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211523425.1A CN115546453B (en) 2022-12-01 2022-12-01 Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall

Publications (2)

Publication Number Publication Date
CN115546453A 2022-12-30
CN115546453B CN115546453B (en) 2023-03-14

Family

ID=84722000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211523425.1A Active CN115546453B (en) 2022-12-01 2022-12-01 Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall

Country Status (1)

Country Link
CN (1) CN115546453B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080111832A1 (en) * 2006-10-23 2008-05-15 International Business Machines Corporation System and method for generating virtual images according to position of viewers
CN105338483A (en) * 2014-08-12 2016-02-17 中国电信股份有限公司 Method, device and system for realizing exhibition hall tour guide based on augmented reality technology
CN106803283A (en) * 2016-12-29 2017-06-06 东莞新吉凯氏测量技术有限公司 Interactive three-dimensional panorama multimedium virtual exhibiting method based on entity museum
WO2018086224A1 (en) * 2016-11-11 2018-05-17 歌尔科技有限公司 Method and apparatus for generating virtual reality scene, and virtual reality system
CN208172877U (en) * 2018-05-18 2018-11-30 镇江启迪数字天下科技有限公司 A kind of Museum guiding device based on the identification of AR cloud


Also Published As

Publication number Publication date
CN115546453B (en) 2023-03-14

Similar Documents

Publication Publication Date Title
CN110199321B (en) Positioning determination for mixed reality systems
US8803992B2 (en) Augmented reality navigation for repeat photography and difference extraction
US10692288B1 (en) Compositing images for augmented reality
WO2019233445A1 (en) Data collection and model generation method for house
US20140375684A1 (en) Augmented Reality Technology
CN101390130B (en) Method and apparatus for making a virtual movie for use in exploring a site
US20120192088A1 (en) Method and system for physical mapping in a virtual world
WO2023093217A1 (en) Data labeling method and apparatus, and computer device, storage medium and program
CN110246235B (en) Power distribution room on-site inspection method and system based on Hololens mixed reality technology
EP3190581B1 (en) Interior map establishment device and method using cloud point
CN105338369A (en) Method and apparatus for synthetizing animations in videos in real time
CN102157011A (en) Method for carrying out dynamic texture acquisition and virtuality-reality fusion by using mobile shooting equipment
CN105635669A (en) Movement contrast system based on three-dimensional motion capture data and actually photographed videos and method thereof
CN105704507A (en) Method and device for synthesizing animation in video in real time
CN107038949A (en) A kind of generation method of live-action map, method for pushing and its device
CN109120901B (en) Method for switching pictures among cameras
JP2938845B1 (en) 3D CG live-action image fusion device
KR102464271B1 (en) Pose acquisition method, apparatus, electronic device, storage medium and program
TWI764366B (en) Interactive method and system based on optical communication device
CN115546453B (en) Virtual exhibition hall information synchronization method, device and system based on offline exhibition hall
CN110751616B (en) Indoor and outdoor panoramic house-watching video fusion method
CN109840948B (en) Target object throwing method and device based on augmented reality
CN111242107B (en) Method and electronic device for setting virtual object in space
CN110427936B (en) Wine storage management method and system for wine cellar
CN108168555B (en) Operation guiding method and system based on coordinate positioning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant