CN117221633A - Virtual reality live broadcast system based on meta universe and digital twin technology

Info

Publication number: CN117221633A
Application number: CN202311483894.XA
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: virtual reality, live broadcast, meta, broadcast system, module
Inventors: 郭可才 (Guo Kecai), 高楚天 (Gao Chutian)
Assignee: Beijing Shenxin Dacheng Technology Co., Ltd. (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application filed 2023-11-09 by Beijing Shenxin Dacheng Technology Co., Ltd.
Priority to CN202311483894.XA; published as CN117221633A

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application discloses a virtual reality live broadcast system based on metaverse and digital twin technology, comprising a hardware subsystem and a terminal subsystem. The hardware subsystem comprises a laser radar, a camera device, a tracking device, a metaverse server and a communication module; the terminal subsystem comprises a virtual reality device, a data processing module and an interaction module. By combining metaverse and digital twin technology, the application brings users a brand-new viewing experience: a user enters the metaverse scene under a virtual identity, watches in real time, and interacts synchronously with the real scene. Compared with the traditional live broadcast mode, the virtual reality live broadcast system greatly improves user participation and immersion; the user is no longer limited to a fixed viewing angle, can freely select different viewing angles, and can interact through body movements, enhancing the interactivity and immersion of the viewing experience.

Description

Virtual reality live broadcast system based on meta universe and digital twin technology
Technical Field
The application relates to the technical field of virtual reality and live broadcast, and in particular to a virtual reality live broadcast system based on metaverse and digital twin technology.
Background
In today's era of rapid development of digital technology, virtual reality and metaverse technologies are receiving increasingly widespread attention. Virtual reality technology, by simulating real scenes and giving users an immersive experience, has attracted a large number of users in fields such as gaming, entertainment and training. Meanwhile, the metaverse, as a digital space existing in parallel with the real world, provides users with a broader space for virtual social interaction and creation. In the field of sports, competitive events attract the attention of large audiences. However, traditional live sports broadcasts are often limited by the constraints of on-site viewing, and it is difficult for remote viewers to personally feel the intensity and excitement of the games. Meanwhile, existing virtual reality sports broadcasts cannot be synchronized with real games in real time, which limits users' sense of participation.
Therefore, a live broadcast system combining virtual reality and metaverse technology is needed, one that can provide a real-time virtual reality viewing experience, allow users to enter the scene under a virtual identity, and improve user participation and experience.
Disclosure of Invention
In view of the above technical problems, the present application provides a virtual reality live broadcast system based on metaverse and digital twin technology, which solves at least some of the above problems: it uses virtual reality technology to bring viewers into a metaverse scene, gives users a virtual identity in the virtual scene, enables immersive real-time viewing with synchronous interaction with the real scene, and thereby helps improve user participation and experience.
In order to achieve the above purpose, the technical scheme adopted by the application is as follows:
the application provides a virtual reality live broadcast system based on meta universe and digital twin technology, which comprises: hardware subsystem and terminal subsystem, wherein:
the hardware subsystem includes: laser radar, camera device, tracking equipment, metauniverse server and communication module, wherein:
the laser radar is used for scanning and acquiring three-dimensional point cloud information of a target;
the camera device is used for acquiring a live real-time image;
the tracking equipment is used for tracking action data of the capture target and feeding the action data back to a corresponding model of the virtual reality scene in real time;
the meta space server is used for storing and processing preset scene and model data;
the communication module is used for transmitting three-dimensional point cloud information, on-site real-time images and action data to the terminal system in real time;
the terminal subsystem comprises: virtual reality equipment, data processing module and interactive module, wherein:
the virtual reality device is used for enabling a user to enter a virtual reality scene and watch live broadcast of virtual reality in real time;
the data processing module is used for carrying out point cloud registration and format conversion on the action data transmitted in real time;
the interaction module is used for enabling a user to select to watch different visual angles in a virtual identity in a virtual reality scene and performing body action interaction.
Preferably, the hardware subsystem further comprises: the embedded microcontroller and the time synchronization module are used for synchronizing time information among the components.
Preferably, the terminal subsystem further comprises: and the acousto-optic electric simulation device is used for increasing the environmental sound and light and shadow effect in the virtual reality scene.
Preferably, the laser radar is integrally provided with: a multi-line solid-state three-dimensional laser scanner, a rotary platform, a motor module, a dual bubble level and a compass, wherein:
the multi-line solid-state three-dimensional laser scanner is used for scanning a target and acquiring three-dimensional point cloud information;
the rotary platform is connected with the motor module and used for controlling the movement of the multi-line solid-state three-dimensional laser scanner;
the dual bubble level and the compass are used for ensuring the initial north orientation of the multi-line solid-state three-dimensional laser scanner.
Preferably, the camera device is a high-definition camera or a high-definition industrial camera.
Preferably, the communication module is a 5G communication module or a WIFI module.
Preferably, the virtual reality device is a VR head-mounted display device.
Preferably, the tracking device adopts a Kalman filter tracking algorithm to track motion changes of the target in real time.
Compared with the prior art, the technical scheme of the application has at least the following beneficial technical effects:
1. The application provides a virtual reality live broadcast system based on metaverse and digital twin technology. Using virtual reality technology, a user can enter the virtual reality scene under a virtual identity, watch in real time, and interact synchronously with the real scene. Compared with the traditional live broadcast mode, the system greatly improves user participation and immersion: the user is no longer limited to a fixed viewing angle, can freely select different viewing angles, and can interact through body movements such as head rotation and hand gestures, enhancing the interactivity and immersion of the viewing experience.
2. The application uses 5G communication technology to solve the problem of remote transmission of massive point cloud information, helping to ensure the smoothness and stability of real-time live broadcast.
3. The application has broad application prospects: it can be applied to sports and entertainment, education, digital protection of cultural heritage, high-precision real-time building monitoring, smart city planning, digital twinning and other fields, providing strong data support for digital development.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
The technical scheme of the application is further described in detail through the drawings and the embodiments.
Drawings
In order to more clearly illustrate the embodiments of the present application and the technical solutions of the prior art, the drawings used in the embodiments and in the description of the prior art are briefly introduced below. It is obvious that the drawings in the following description show some embodiments of the present application, and that a person skilled in the art can obtain other drawings from them without inventive effort.
The accompanying drawings are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate the application and together with the embodiments of the application, serve to explain the application.
Fig. 1 is a schematic diagram of the working principle of the virtual reality live broadcast system based on metaverse and digital twin technology provided by the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. Moreover, various numbers and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The embodiment of the application provides a virtual reality live broadcast system based on metaverse and digital twin technology, which comprises a hardware subsystem and a terminal subsystem, wherein:
the hardware subsystem comprises: a laser radar, a camera device, a tracking device, a metaverse server and a communication module, wherein:
the laser radar is used for scanning and acquiring three-dimensional point cloud information of a target; the camera device is used for acquiring live real-time images; the tracking device is used for tracking and capturing motion data of the target and feeding the motion data back in real time to the corresponding model in the virtual reality scene; the metaverse server is used for storing and processing preset scene and model data; the communication module is used for transmitting the three-dimensional point cloud information, live real-time images and motion data to the terminal subsystem in real time;
the terminal subsystem comprises: a virtual reality device, a data processing module and an interaction module, wherein:
the virtual reality device is used for enabling a user to enter the virtual reality scene and watch the virtual reality live broadcast in real time; the data processing module is used for carrying out point cloud registration and format conversion on the motion data transmitted in real time; the interaction module is used for enabling the user, under a virtual identity, to select different viewing angles in the virtual reality scene and to interact through body movements.
The following describes in detail a specific embodiment of the virtual reality live broadcast system of the present application, taking as an example its application to a sports game, as shown in Fig. 1:
In this embodiment, the virtual reality live broadcast system based on metaverse and digital twin technology is applied to a sports game. Digital twinning of the game is realized through digital twin processing technology, and high-quality 3D models of the various objects involved, including the venue, game props, athletes and spectators, are built in advance. Through careful design and modeling, these models give the virtual game scene an appearance and characteristics consistent with the real scene, ensuring the authenticity and fidelity of the virtual scene, so that the 3D models of game props and athletes can accurately reproduce the same actions as in the real scene. In use, the corresponding model can be called directly from the database, or a model can be built in real time from photography or three-dimensional scanning.
In this embodiment, the system mainly consists of a hardware subsystem and a terminal subsystem. The hardware subsystem comprises the following main components: laser radar, camera device, tracking device, metaverse server and communication module, wherein:
In a specific embodiment, the laser radar is used for scanning the real environment of the stadium and the game props. A multi-line solid-state three-dimensional laser scanner is preferably used as the laser radar sensor, so that the stadium and the game props can be scanned efficiently and the point cloud information of the court acquired in real time. The laser radar is also integrally provided with a rotary platform, a motor module, a dual bubble level and a compass. The rotary platform is connected with the motor module, preferably a high-precision motor, ensuring that the laser radar sensor rotates stably during scanning and acquires omnidirectional point cloud information. The dual bubble level and the compass ensure the initial north orientation of the laser radar scan and the accuracy and stability of the scanning data.
The laser radar acquires the point cloud through the emission and reflection of laser beams. The radar emits a laser pulse toward the target object; when the beam strikes the object's surface, part of the laser energy is reflected back. By receiving the reflected signal, the radar measures the time elapsed from emission to return, and from the relationship between this time and the speed of light calculates the distance to the target. Distance information in different directions is obtained by rotating the laser radar sensor or changing the scanning angle of the beam, finally forming three-dimensional point cloud data. The distance is calculated as:
d = (t * c) / 2
wherein d represents the distance between the laser radar and the target object; t represents the time elapsed by the laser beam from emission to reception of the reflection; c represents the speed of light in vacuum, approximately 3 × 10^8 m/s.
By continuously changing the scanning angle and position of the laser beam, the laser radar acquires three-dimensional coordinate data of the target object, which together constitute the point cloud.
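As a concrete illustration of this calculation, the following minimal Python sketch computes the time-of-flight distance and converts one range measurement at known beam angles into a Cartesian point; repeating this over all beam directions yields the point cloud. The names and sample values are illustrative, not taken from the patent:

import numpy as np

C = 3.0e8  # speed of light in vacuum, m/s

def tof_distance(t_round_trip):
    # d = (t * c) / 2 -- the pulse travels to the target and back
    return t_round_trip * C / 2.0

def to_cartesian(d, azimuth, elevation):
    # Convert one range measurement at known beam angles (radians) to (x, y, z)
    x = d * np.cos(elevation) * np.cos(azimuth)
    y = d * np.cos(elevation) * np.sin(azimuth)
    z = d * np.sin(elevation)
    return np.array([x, y, z])

# Example: a pulse returning after 200 ns, beam at azimuth 30 deg, elevation 5 deg
d = tof_distance(200e-9)  # -> 30.0 m
point = to_cartesian(d, np.radians(30.0), np.radians(5.0))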
In a specific embodiment, the camera device is a high-definition camera or a high-definition industrial camera, used to shoot and record the competition scene in real time, including the conditions of the stadium, the actions of athletes and spectators, and the movement trajectories of the game props.
In a specific embodiment, a tracking device (target detector) is used to track and capture motion data of targets (including the movements of athletes and spectators, the game props, and the like), and the motion data is fed back in real time to the corresponding models in the virtual reality scene. Preferably, the tracking device is integrated with the camera device, so that the relevant motion data is captured and tracked while the real-time images are shot.
In one embodiment, moving-target tracking preferably uses a Kalman filter tracking algorithm. The Kalman filter is the classic moving-target tracking algorithm, based on the ideas of state estimation and state prediction. It is typically used to estimate and predict continuous-time linear systems and is suitable for moving targets whose position and velocity change relatively continuously. The basic formulas of the Kalman filter are as follows:
1) Prediction step (prediction):
predicting the state and covariance of the target at the next time step:
prediction state:
x_k = F * x_{k-1} + u
prediction covariance:
P_k = F * P_{k-1} * F^T + Q
where x_k represents the state vector of the target at the k-th time step, F is the state transition matrix, u is the external control input, P_k is the state covariance matrix, and Q is the process noise covariance matrix.
2) Update step (Update):
The predicted value is updated based on the measurement at the current time (typically the position information provided by the target detector):
measurement residual:
y_k = z_k - H * x_k
covariance residual:
S_k = H * P_k * H^T + R
kalman gain:
K_k = P_k * H^T * S_k^-1
updating the state:
x_k = x_k + K_k * y_k
updating covariance:
P_k = (I - K_k * H) * P_k
where z_k represents the measurement at the current time (typically the position information provided by the target detector), H is the measurement matrix, and R is the measurement noise covariance matrix.
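A minimal Python sketch of these prediction and update equations is given below, assuming a constant-velocity 2D motion model with position-only measurements from the target detector; the matrix values are illustrative, not taken from the patent:

import numpy as np

class KalmanTracker:
    # Minimal constant-velocity Kalman filter for 2D target tracking
    def __init__(self, dt=1.0 / 30):
        self.x = np.zeros(4)                  # state [x, y, vx, vy]
        self.F = np.array([[1, 0, dt, 0],     # state transition matrix
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],      # measurement matrix: position only
                           [0, 1, 0, 0]], dtype=float)
        self.P = np.eye(4) * 10.0             # state covariance P_k
        self.Q = np.eye(4) * 0.01             # process noise covariance
        self.R = np.eye(2) * 1.0              # measurement noise covariance

    def predict(self, u=0.0):
        self.x = self.F @ self.x + u                      # x_k = F x_{k-1} + u
        self.P = self.F @ self.P @ self.F.T + self.Q      # P_k = F P F^T + Q
        return self.x

    def update(self, z):
        y = z - self.H @ self.x                           # measurement residual
        S = self.H @ self.P @ self.H.T + self.R           # covariance residual
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y                           # updated state
        self.P = (np.eye(4) - K @ self.H) @ self.P        # updated covariance
        return self.x

For each frame, predict() is called first, and update() is then called with the detector's position measurement z_k; the updated state is what drives the corresponding 3D model.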
The motion data acquired in real time is fed back to the corresponding 3D models, so that the ball and the athletes in the virtual reality scene perform the same actions as in the real scene.
In one embodiment, the hardware subsystem further comprises a positioning module, preferably using BeiDou/GNSS positioning technology, to ensure that the system obtains accurate position information within the stadium and provides accurate positioning data for the 3D models of athletes and game props (e.g., a football or basketball).
In a specific embodiment, the hardware subsystem further comprises an embedded microcontroller and a time synchronization module. The embedded microcontroller is responsible for decoding the time information into a timestamp, which is synchronized to the laser radar's scanner, motor module and other components through the time synchronization module, ensuring time synchronization among the components.
For example, assume the time information received by the embedded microcontroller is t, and the time synchronization module is used to synchronize t to the scanner and motor module of the laser radar. The method comprises the following steps:
First, the embedded microcontroller decodes the time information t into a timestamp. A timestamp generally represents the time elapsed, typically in seconds, from some reference time point (e.g., the system start-up time). The timestamp t_stamp can be expressed as:
t_stamp = t - t_0
where t_0 is the time of the reference point, typically the system start-up time.
Next, the timestamp t_stamp must be transmitted to the scanner and motor module of the laser radar. During transmission there may be a transmission delay (Transmission Delay), i.e., the time required for the timestamp to reach the scanner and motor module. To ensure time synchronization, the effect of this delay on the timestamp must be considered.
Let the transmission delay be Δt; the timestamp after transmission, t_stamp_transmitted, is then:
t_stamp_transmitted = t_stamp + Δt
At the receiving end, after the scanner and motor module of the laser radar receive the timestamp t_stamp_transmitted, it must be corrected by the transmission delay Δt to recover the actual timestamp t_received:
t_received = t_stamp_transmitted - Δt
           = t_stamp + Δt - Δt
           = t_stamp
According to this derivation, after the transmission delay is corrected, the timestamp t_received actually obtained by the laser radar's scanner and motor module equals the timestamp t_stamp decoded by the embedded microcontroller at the transmitting end, ensuring time synchronization among the components.
It should be noted that varying transmission delays and clock jitter may exist in an actual system, so various factors must be considered comprehensively when designing and implementing a time synchronization mechanism to ensure high-precision synchronization. The specific implementation and algorithm may vary with the application scenario and hardware configuration; the above is only a simple illustration of the time synchronization principle, and a practical system must account for more complex factors.
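The following Python sketch mirrors the derivation above. In a real system this logic would run on the embedded microcontroller, and the delay Δt would have to be measured or estimated (for example with a dedicated synchronization protocol); the fixed value here is purely illustrative:

import time

T0 = time.monotonic()  # reference time point t_0 (e.g., system start-up)

def make_timestamp():
    # Sender side: t_stamp = t - t_0, seconds since the reference point
    return time.monotonic() - T0

def recover_timestamp(t_stamp_transmitted, delta_t):
    # Receiver side: correct for the transmission delay to recover t_received
    return t_stamp_transmitted - delta_t

t_stamp = make_timestamp()                # microcontroller stamps a packet
delta_t = 0.002                           # illustrative 2 ms transmission delay
t_stamp_transmitted = t_stamp + delta_t   # value observed at the scanner/motor module
t_received = recover_timestamp(t_stamp_transmitted, delta_t)
assert abs(t_received - t_stamp) < 1e-12  # t_received == t_stamp, as derived above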
In one embodiment, a metaverse server is preferably employed to store and process scene and character model data, providing support for the virtual reality scene.
In a specific embodiment, the communication module uses a 5G communication module or a WIFI module, preferably a 5G communication module, using 5G communication technology to transmit the data of athletes, game props and the like to the terminal subsystem in real time, ensuring the smoothness and stability of the live broadcast.
In this embodiment, the terminal subsystem is the main tool by which the audience participates in the virtual game scene, and mainly includes the following components:
Virtual reality device: preferably a VR head-mounted display device, enabling viewers to enter the virtual game scene under a virtual identity and watch the sports game live in virtual reality in real time.
Data processing module: this module performs point cloud registration and format conversion on the motion data of athletes and game props transmitted in real time, ensuring that the movements of the characters and the ball in the virtual reality scene are synchronized with the real game scene. Specifically:
1. The point cloud registration achieves accurate matching mainly through coarse registration followed by fine registration. Coarse registration preferably uses a method based on point-feature matching, which registers two point clouds using a set of predetermined feature points. It first extracts feature points from the reference point cloud and the source point cloud; these are points with distinctive geometric or structural features, such as corners or edges, which are less affected by noise and outliers. The corresponding homonymous feature points of the two point clouds can be expressed as P = {p_1, p_2, p_3, ..., p_n} and Q = {q_1, q_2, q_3, ..., q_n}. The matched feature points establish a point correspondence between the reference point cloud and the source point cloud, which is used to calculate the transformation that aligns the two point clouds. The correspondence between the two point clouds can be represented by a rotation matrix R and a translation parameter t. If P, after rotation and translation, aligns with Q, then the following relationship exists between P and Q:
q_i = λ * R * p_i + t, i = 1, ..., n
where λ is the transformation scale; in ordinary point cloud registration λ = 1. Calculating the rotation matrix directly by the orientation-element method involves a large workload and is suitable only for small rotation angles, so the angular parameters of large-rotation spatial transformations are solved by constructing a Rodrigues matrix. An antisymmetric matrix S is first constructed from the three independent elements a, b, c of the Rodrigues matrix:
S = [ 0 -c -b ; c 0 -a ; b a 0 ]
The rotation matrix R is then formed from S:
R = (I + S) * (I - S)^-1
where I is the 3×3 identity matrix; expanding this product expresses the rotation matrix in terms of a, b and c. Substituting the rotation matrix R into the relation between P and Q, the scale parameter is calculated first, then the rotation matrix, and finally the translation parameters. These parameters have a unique solution when the two coordinate systems share 3 pairs of common points.
In practice, the two coordinate systems often share more than three common points. Under such redundant observations the parameters are preferably estimated according to the least squares principle, and a point error equation can be constructed:
V_1 = A_1 * r + B * t_1 - L_1
where V_1 is the correction to the observed values, A_1 is the coefficient matrix obtained by linearizing the rotation matrix, r is the correction to the rotation parameters, B is the coefficient matrix of the points to be determined, t_1 is the correction to the points to be determined, and L_1 is the observation residual. The computation involves the common feature points of the two point clouds to be matched: feature points (x_0, y_0, z_0) of the reference point cloud are taken as true values, and feature points (x, y, z) of the point cloud to be matched as observed values. Since exact absolute coordinates of the TLS point cloud are not required, either point cloud can be used as the source point cloud.
For fine registration, the ICP algorithm is likewise computed on the relation between P and Q; its core is to minimize an objective function, namely:
E(R, t) = (1/n) * Σ_{i=1..n} || q_i - (R * p_i + t) ||^2
whose optimal solution is the rigid transformation (R, t) minimizing E(R, t).
2. Matching motion capture (MoCap) data with video data is a multi-step process, specifically:
First, motion data is collected by a motion capture system. This typically involves capturing the motion of a person or object with a set of cameras and/or motion capture sensors. At the same time, video data is recorded, ensuring that the timestamps of the video and the motion capture data correspond.
Data preprocessing: the motion capture data is cleaned and processed to eliminate noise and inconsistencies. Similar preprocessing of the video data, such as stabilization, cropping or color adjustment, is also required.
Data synchronization: using timestamps, or manually, the correspondence between the motion capture data and the video data is found to ensure they are synchronized (a minimal sketch of this step follows the list).
Three-dimensional reconstruction and matching: using specialized software such as Blender or Maya, the motion capture data is imported and a 3D model is created. A virtual camera is placed in 3D space to match the angle and position of the actual video capture.
Rendering and compositing: the rendering engine of the 3D software renders motion capture scenes that match the video; the rendered motion capture data is then composited with the original video data using compositing software (e.g., Adobe After Effects or Nuke).
Fine-tuning and optimization: adjustments are made as necessary to ensure the motion matches the video seamlessly; this may include manually adjusting the motion data, or changing the lighting or camera position.
Export and evaluation: the final composited video is exported and the result evaluated. If necessary, the process returns to an earlier step for adjustment and optimization.
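A minimal sketch of the data synchronization step is shown below. It assumes both streams already carry timestamps on a common clock (for example via the time synchronization module described earlier) and simply pairs each video frame with the nearest motion capture sample; names and rates are illustrative:

import numpy as np

def pair_frames(mocap_ts, video_ts):
    # Pair each video frame with the nearest MoCap sample by timestamp
    idx = np.searchsorted(mocap_ts, video_ts)
    idx = np.clip(idx, 1, len(mocap_ts) - 1)
    left_closer = (video_ts - mocap_ts[idx - 1]) < (mocap_ts[idx] - video_ts)
    return np.where(left_closer, idx - 1, idx)   # MoCap sample index per video frame

# Example: 120 Hz MoCap stream against 30 fps video over one second
mocap_ts = np.arange(0.0, 1.0, 1.0 / 120)
video_ts = np.arange(0.0, 1.0, 1.0 / 30)
pairs = pair_frames(mocap_ts, video_ts)  # -> indices 0, 4, 8, ...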
Interaction module: this module allows the audience, under a virtual identity, to freely select different viewing angles in the virtual game scene and to interact through body movements, such as head rotation and hand gestures, enhancing the feeling of watching the game in person.
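As a simple illustration of how such viewpoint switching can work, the following Python sketch selects among preset virtual camera poses and builds the corresponding view matrix; all poses, names and coordinates are hypothetical, and head-rotation tracking would be composed on top of the chosen base view:

import numpy as np

# Hypothetical preset viewpoints in the virtual venue: (eye position, look-at target)
VIEWPOINTS = {
    "sideline":    (np.array([0.0, -30.0, 5.0]), np.array([0.0, 0.0, 1.5])),
    "behind_goal": (np.array([55.0, 0.0, 4.0]), np.array([0.0, 0.0, 1.5])),
    "aerial":      (np.array([0.0, 5.0, 40.0]), np.array([0.0, 0.0, 0.0])),
}

def look_at(eye, target, up=np.array([0.0, 0.0, 1.0])):
    # Build a right-handed 4x4 view matrix looking from eye toward target
    f = target - eye
    f = f / np.linalg.norm(f)          # forward
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)          # right
    u = np.cross(s, f)                 # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# When the viewer picks a viewpoint, the virtual camera jumps to that pose
eye, target = VIEWPOINTS["sideline"]
view_matrix = look_at(eye, target)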
In one embodiment, based on artificial intelligence and a big data platform, a large language model for human-computer interaction is preferably used to provide communication services for users.
In a specific embodiment, elements such as environmental sound and light-and-shadow effects are added to the virtual reality scene through the acousto-optic-electric simulation device, enhancing the realism of the virtual competition scene and giving viewers a more immersive viewing experience.
As can be seen from the description of the above embodiments, the application combines metaverse and digital twin technology to provide an innovative virtual reality live broadcast system that brings the audience a brand-new viewing experience. Using virtual reality technology, viewers can enter the metaverse scene under a virtual identity, watch sports games in real time, and interact synchronously with the real game scene. Compared with the traditional live broadcast mode, the system greatly improves audience participation and immersion: viewers are no longer limited to a fixed viewing angle, can freely select different viewing angles, and can interact through body movements such as head rotation and hand gestures, enhancing the interactivity and immersion of the viewing experience.
The virtual reality live broadcast system can be applied to the field of sports and entertainment and has broad application prospects. It can also provide strong data support for digital development in fields such as education, digital protection of cultural heritage, high-precision real-time building monitoring, smart city planning and digital twinning. Through digital twin technology, the application replicates the point cloud information of the sports scene and the athletes' motion data in real time, achieving seamless connection between the real scene and the virtual reality scene. Meanwhile, the application uses 5G communication technology to solve the problem of remote transmission of massive point cloud data, ensuring the smoothness and stability of real-time live broadcast. These characteristics give the application broad application value and significant economic benefit in digital development and sports entertainment, provide users with a brand-new viewing and interaction experience, and facilitate the wide application of virtual reality technology.
It will be appreciated by those skilled in the art that embodiments of the application may be provided as a system, method, or computer program product, or the like. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
It is to be noted that the term 'comprising' does not exclude the presence of elements or steps other than those listed in a claim, and that the word 'a' or 'an' preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer.
In the present specification, each embodiment is described in a progressive manner, with each embodiment focusing on its differences from the others; for identical and similar parts, the embodiments may be referred to one another.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A virtual reality live broadcast system based on metaverse and digital twin technology, characterized in that the system comprises: a hardware subsystem and a terminal subsystem, wherein:
the hardware subsystem comprises: a laser radar, a camera device, a tracking device, a metaverse server and a communication module, wherein:
the laser radar is used for scanning and acquiring three-dimensional point cloud information of a target;
the camera device is used for acquiring live real-time images;
the tracking device is used for tracking and capturing motion data of the target and feeding the motion data back in real time to the corresponding model in the virtual reality scene;
the metaverse server is used for storing and processing preset scene and model data;
the communication module is used for transmitting the three-dimensional point cloud information, live real-time images and motion data to the terminal subsystem in real time;
the terminal subsystem comprises: a virtual reality device, a data processing module and an interaction module, wherein:
the virtual reality device is used for enabling a user to enter the virtual reality scene and watch the virtual reality live broadcast in real time;
the data processing module is used for carrying out point cloud registration and format conversion on the motion data transmitted in real time;
the interaction module is used for enabling a user, under a virtual identity, to select different viewing angles in the virtual reality scene and to interact through body movements.
2. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the hardware subsystem further comprises: an embedded microcontroller and a time synchronization module, used for synchronizing time information among the components.
3. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the terminal subsystem further comprises: an acousto-optic-electric simulation device, used for adding environmental sound and light-and-shadow effects to the virtual reality scene.
4. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the laser radar is integrally provided with: a multi-line solid-state three-dimensional laser scanner, a rotary platform, a motor module, a dual bubble level and a compass, wherein:
the multi-line solid-state three-dimensional laser scanner is used for scanning a target and acquiring three-dimensional point cloud information;
the rotary platform is connected with the motor module and used for controlling the movement of the multi-line solid-state three-dimensional laser scanner;
the dual bubble level and the compass are used for ensuring the initial north orientation of the multi-line solid-state three-dimensional laser scanner.
5. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the camera device is a high-definition camera or a high-definition industrial camera.
6. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the communication module is a 5G communication module or a WIFI module.
7. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the virtual reality device is a VR head-mounted display device.
8. The virtual reality live broadcast system based on metaverse and digital twin technology according to claim 1, wherein the tracking device adopts a Kalman filter tracking algorithm to track motion changes of the target in real time.
CN202311483894.XA 2023-11-09 2023-11-09 Virtual reality live broadcast system based on meta universe and digital twin technology Pending CN117221633A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311483894.XA CN117221633A (en) 2023-11-09 2023-11-09 Virtual reality live broadcast system based on meta universe and digital twin technology

Publications (1)

Publication Number Publication Date
CN117221633A true CN117221633A (en) 2023-12-12

Family

ID=89046698

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311483894.XA Pending CN117221633A (en) 2023-11-09 2023-11-09 Virtual reality live broadcast system based on meta universe and digital twin technology

Country Status (1)

Country Link
CN (1) CN117221633A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190025241A (en) * 2017-09-01 2019-03-11 전남과학대학교 산학협력단 Virtual Reality exploration interaction using head mounted display
CN115525144A (en) * 2022-04-18 2022-12-27 颜峰 Multi-object interaction equipment based on virtual reality and interaction method thereof
CN115079709A (en) * 2022-06-07 2022-09-20 曹欣 Unmanned aerial vehicle distribution network tower inspection task planning method and system based on metauniverse
CN115454240A (en) * 2022-09-05 2022-12-09 无锡雪浪数制科技有限公司 Meta universe virtual reality interaction experience system and method
CN116668605A (en) * 2023-05-10 2023-08-29 北京国际云转播科技有限公司 Meta-universe studio system, playing method, storage medium and electronic device
CN116744027A (en) * 2023-06-12 2023-09-12 武汉灏存科技有限公司 Meta universe live broadcast system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117692610A (en) * 2024-02-02 2024-03-12 建龙西林钢铁有限公司 AR workshop inspection system
CN117692610B (en) * 2024-02-02 2024-04-26 建龙西林钢铁有限公司 AR workshop inspection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination