CN112423014A - Remote review method and device - Google Patents

Remote review method and device

Info

Publication number
CN112423014A
CN112423014A (application CN202011304916.8A)
Authority
CN
China
Prior art keywords: panoramic, live, live video, video, dimensional model
Prior art date
Legal status
Pending
Application number
CN202011304916.8A
Other languages
Chinese (zh)
Inventor
杨顺超
徐欣
毕航
钱广璞
方玮祎
Current Assignee
Shanghai Electric Group Corp
Original Assignee
Shanghai Electric Group Corp
Priority date
Filing date
Publication date
Application filed by Shanghai Electric Group Corp filed Critical Shanghai Electric Group Corp
Priority to CN202011304916.8A priority Critical patent/CN112423014A/en
Publication of CN112423014A publication Critical patent/CN112423014A/en
Pending legal-status Critical Current

Classifications

    • H04N 21/2187: Live feed (selective content distribution; servers for content distribution; source of audio or video content)
    • G06T 13/20: 3D [Three Dimensional] animation (image data processing or generation; animation)
    • G06T 19/006: Mixed reality (manipulating 3D models or images for computer graphics)
    • H04N 21/23424: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for inserting or substituting an advertisement
    • H04N 21/4223: Cameras (input-only peripherals of client devices)
    • H04N 21/42653: Internal components of the client for processing graphics
    • H04N 21/44016: Processing of video elementary streams involving splicing one content stream with another content stream, e.g. for substituting a video clip

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application relates to the technical field of virtual reality, and in particular to a remote review method and device. A cloud server acquires three-dimensional model information of an object to be reviewed sent by an application platform and acquires a panoramic live video transmitted by a panoramic camera. The cloud server sends the three-dimensional model information and the panoramic live video to a third-party platform, so that the third-party platform generates a panoramic virtual reality (VR) live video from the three-dimensional model information and the panoramic live video and sends the panoramic VR live video to a client for display. A user can then remotely review the object to be reviewed based on the panoramic VR live video. In this way, a panoramic VR live video is generated from the three-dimensional model information and the panoramic live video, enabling remote review of the object to be reviewed.

Description

Remote review method and device
Technical Field
The application relates to the technical field of virtual reality, in particular to a remote review method and device.
Background
With the development of internet technology, more and more users use mobile phone live streaming for online functions such as teaching and entertainment. In some professional fields, however, mobile phone live streaming cannot meet users' requirements. For example, when an object to be reviewed is evaluated, each reviewer is generally required to travel to the competition site for a centralized review. When the site environment does not meet the required conditions, a centralized review cannot be held at the competition site, and when the review is conducted over a mobile phone live stream, the object to be reviewed cannot be observed comprehensively. Remote review therefore cannot be achieved by mobile phone live streaming alone, and how to realize remote review of an object to be reviewed has become an urgent problem to be solved.
Disclosure of Invention
The embodiment of the application provides a remote review method and a remote review device, so as to realize remote review of an object to be reviewed.
The embodiment of the application provides the following specific technical scheme:
a remote review method, comprising:
the cloud server acquires three-dimensional model information of an object to be evaluated, which is sent by the application platform, and acquires a panoramic live broadcast video transmitted by the panoramic camera;
and sending the three-dimensional model information and the panoramic live video to a third-party platform so that the third-party platform generates a panoramic Virtual Reality (VR) live video according to the three-dimensional model information and the panoramic live video, and sending the panoramic VR live video to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
Optionally, the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map of the planarized modeling model to obtain a planar map of the modeling model, and obtaining the three-dimensional model information from the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
Optionally, acquiring a panoramic live video transmitted by a panoramic camera specifically includes:
generating live broadcast address information of a newly-built live broadcast room, wherein the live broadcast address information at least comprises a stream pushing address;
sending the streaming address to the panoramic camera so that the panoramic camera can push the live video shot by each camera to the cloud server according to the streaming address;
and receiving live videos pushed by the panoramic camera, performing live video combination on the live videos to generate a panoramic live video, and sending the panoramic live video to the third-party platform.
Optionally, live video combination is performed on each live video to generate a panoramic live video, and the method specifically includes:
transcoding each live video respectively to obtain each transcoded live video;
performing video splicing processing on each transcoded live video to obtain live videos subjected to video splicing;
and packaging the spliced live video to obtain a panoramic live video.
Optionally, if the live broadcast address information further includes a live broadcast address and a stream pulling address corresponding to the stream pushing address, the three-dimensional model information and the panoramic live broadcast video are sent to a third-party platform, so that the third-party platform generates a panoramic VR live broadcast video according to the three-dimensional model information and the panoramic live broadcast video, and sends the panoramic VR live broadcast video to a client for display, so that a user can remotely review an object to be reviewed according to the panoramic VR live broadcast video, including:
sending the live broadcast address to the third-party platform, so that the third-party platform obtains the panoramic live broadcast video from the cloud server according to the live broadcast address, and inserts the three-dimensional model information into the panoramic live broadcast video to generate a panoramic VR live broadcast video;
sending the streaming address to the client, so that the client pulls the panoramic VR live video from the third-party platform according to the streaming address, and displays the panoramic VR live video according to preset configuration information, so that a user can remotely review the object to be reviewed according to the panoramic VR live video, wherein the configuration information at least comprises one or any combination of the following items: resolution, code rate, viewing page configuration information, and buffer information, where the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
A remote review method, comprising:
the method comprises the steps that a third-party platform receives three-dimensional model information and a panoramic live broadcast video sent by a cloud server, wherein the three-dimensional model information is a three-dimensional model of an object to be evaluated;
generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video;
and sending the panoramic VR live video to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
Optionally, the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map of the planarized modeling model to obtain a planar map of the modeling model, and obtaining the three-dimensional model information from the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
Optionally, the panoramic live video is generated by the cloud server generating live address information for a newly created live room, sending the stream pushing address in the live address information to the panoramic camera, receiving the live videos shot by each camera of the panoramic camera and pushed according to the stream pushing address, and performing live video combination on the live videos.
Optionally, if the live broadcast address information further includes a live broadcast address, generating a panoramic VR live broadcast video according to the three-dimensional model information and the panoramic live broadcast video, specifically including:
receiving a live broadcast address sent by the cloud server;
acquiring the panoramic live broadcast video from the cloud server according to the live broadcast address;
and inserting the three-dimensional model information into the panoramic live video to generate a panoramic VR live video.
Optionally, if the live address information further includes a stream pulling address, sending the panoramic VR live video to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video specifically includes:
sending the panoramic VR live video to the client, so that the client displays the panoramic VR live video according to preset configuration information and the user can remotely review the object to be reviewed according to the panoramic VR live video, wherein the stream pulling address is sent to the client by the cloud server, and the stream pulling address is used by the client to pull the panoramic VR live video from the third-party platform.
Optionally, the configuration information at least includes one or any combination of the following items: resolution, code rate, viewing page configuration information and buffer area information;
the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
A remote monitoring system, comprising:
the application platform is used for generating three-dimensional model information of an object to be evaluated and sending the three-dimensional model information to the cloud server;
the panoramic camera is used for transmitting the shot panoramic live broadcast video to the cloud server;
the cloud server is used for sending the three-dimensional model information and the panoramic live video to a third-party platform;
the third-party platform is used for generating a VR live video according to the three-dimensional model information and the panoramic live video and sending the panoramic VR live video to a client;
and the client is used for receiving the panoramic VR live video and displaying the panoramic VR live video so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
A remote review device, applied to a cloud server, comprising:
the system comprises a first processing module, a second processing module and a third processing module, wherein the first processing module is used for acquiring three-dimensional model information of an object to be evaluated, which is sent by an application platform, and acquiring a panoramic live broadcast video transmitted by a panoramic camera;
and the second processing module is used for sending the three-dimensional model information and the panoramic live video to a third-party platform so that the third-party platform can generate a panoramic Virtual Reality (VR) live video according to the three-dimensional model information and the panoramic live video, and the panoramic VR live video is sent to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
Optionally, the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map of the planarized modeling model to obtain a planar map of the modeling model, and obtaining the three-dimensional model information from the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
Optionally, when acquiring a live panoramic video transmitted by the panoramic camera, the first processing module is specifically configured to:
generating live broadcast address information of a newly-built live broadcast room, wherein the live broadcast address information at least comprises a stream pushing address;
sending the streaming address to the panoramic camera so that the panoramic camera can push the live video shot by each camera to the cloud server according to the streaming address;
and receiving live videos pushed by the panoramic camera, performing live video combination on the live videos to generate a panoramic live video, and sending the panoramic live video to the third-party platform.
Optionally, the live video combination is performed on each live video, and when the panoramic live video is generated, the first processing module is specifically configured to:
transcoding each live video respectively to obtain each transcoded live video;
performing video splicing processing on each transcoded live video to obtain live videos subjected to video splicing;
and packaging the spliced live video to obtain a panoramic live video.
Optionally, if the live address information further includes a live address and a stream pulling address corresponding to the stream pushing address, the second processing module is specifically configured to:
sending the live broadcast address to the third-party platform, so that the third-party platform obtains the panoramic live broadcast video from the cloud server according to the live broadcast address, and inserts the three-dimensional model information into the panoramic live broadcast video to generate a panoramic VR live broadcast video;
sending the streaming address to the client, so that the client pulls the panoramic VR live video from the third-party platform according to the streaming address, and displays the panoramic VR live video according to preset configuration information, so that a user can remotely review the object to be reviewed according to the panoramic VR live video, wherein the configuration information at least comprises one or any combination of the following items: resolution, code rate, viewing page configuration information, and buffer information, where the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
A remote review device applied to a third-party platform comprises:
the receiving module is used for receiving three-dimensional model information and a panoramic live broadcast video sent by the cloud server, wherein the three-dimensional model information is a three-dimensional model of an object to be evaluated;
the generating module is used for generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video;
and the processing module is used for sending the panoramic VR live video to a client for displaying so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
Optionally, the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map of the planarized modeling model to obtain a planar map of the modeling model, and obtaining the three-dimensional model information from the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
Optionally, the panoramic live video is generated by the cloud server generating live address information for a newly created live room, sending the stream pushing address in the live address information to the panoramic camera, receiving the live videos shot by each camera of the panoramic camera and pushed according to the stream pushing address, and performing live video combination on the live videos.
Optionally, if the live address information further includes a live address, the generating module is specifically configured to:
receiving a live broadcast address sent by the cloud server;
acquiring the panoramic live broadcast video from the cloud server according to the live broadcast address;
and inserting the three-dimensional model information into the panoramic live video to generate a panoramic VR live video.
Optionally, if the live address information further includes a stream pulling address, the processing module is specifically configured to:
and sending the panoramic VR live video to the client so that the client displays the panoramic VR live video according to preset configuration information to enable a user to remotely review the object to be reviewed according to the panoramic VR live video, wherein the streaming address is sent to the client by the cloud server, and the streaming address is used for the client to pull the panoramic VR live video from the third-party platform.
Optionally, the configuration information at least includes one or any combination of the following items: resolution, code rate, viewing page configuration information and buffer area information;
the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
An electronic device comprises a memory, a processor and a computer program stored on the memory and operable on the processor, wherein the processor implements the steps of the remote review method when executing the program.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned remote review method.
In the embodiment of the application, the cloud server acquires the three-dimensional model information of the object to be reviewed sent by the application platform, acquires the panoramic live video transmitted by the panoramic camera, and sends the three-dimensional model information and the panoramic live video to the third-party platform, so that the third-party platform generates a panoramic VR live video from the three-dimensional model information and the panoramic live video and sends the panoramic VR live video to the client for display, enabling a user to remotely review the object to be reviewed based on the panoramic VR live video. In this way, after acquiring the three-dimensional model information of the object to be reviewed and the panoramic live video transmitted by the panoramic camera, the cloud server sends both to the third-party platform, the third-party platform embeds the three-dimensional model information of the object to be reviewed into the panoramic live video to generate the panoramic VR video, and the user can review the object to be reviewed based on the three-dimensional model information in the panoramic VR live video, thereby realizing remote review.
Drawings
FIG. 1 is a flow chart of a remote review method in an embodiment of the present application;
FIG. 2 is a flow chart of another remote review method in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a remote review system according to an embodiment of the present application;
FIG. 4 is a block diagram illustrating a remote review method according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the effect of remote review in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a remote review device in an embodiment of the present application;
FIG. 7 is a schematic structural diagram of another remote review device in the embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
With the development of internet technology and the popularization of 5G communication technology, more and more users use mobile phone live streaming to realize various online functions; for example, users can watch online game competitions, science education and technical training through a mobile phone live stream. In some professional fields, however, mobile phone live streaming still cannot meet users' needs. For example, when objects to be reviewed are evaluated, each reviewer is generally invited to the competition site, and the objects to be reviewed are evaluated by examining drawings, reports and PPT slides. When the site environment does not satisfy the required conditions, a centralized review cannot be held at the competition site, and when the review is conducted over a mobile phone live stream, the objects to be reviewed cannot be observed from all aspects, so remote review cannot be realized by mobile phone live streaming alone. How to realize remote review of the objects to be reviewed has therefore become an urgent problem to be solved.
In view of this, the cloud server acquires the three-dimensional model information of the object to be reviewed sent by the application platform and the panoramic live video transmitted by the panoramic camera, and sends the three-dimensional model information and the panoramic live video to the third-party platform, so that the third-party platform generates a panoramic VR live video from them and sends the panoramic VR live video to the client for display, allowing a user to remotely review the object to be reviewed based on the panoramic VR live video. In this way, the cloud server sends the three-dimensional model information of the object to be reviewed and the panoramic live video shot by the panoramic camera to the third-party platform, the third-party platform combines the three-dimensional model information with the panoramic live video to generate the panoramic VR video, and the user can remotely review the object to be reviewed online based on the panoramic VR video.
Based on the foregoing embodiment, referring to fig. 1, a flowchart of a remote review method in the embodiment of the present application is applied to a cloud server, and specifically includes:
step 100: the cloud server acquires the three-dimensional model information of the object to be evaluated, which is sent by the application platform, and acquires the panoramic live broadcast video transmitted by the panoramic camera.
In the embodiment of the application, the cloud server sends a model obtaining request about an object to be evaluated to the application platform, the application platform generates three-dimensional model information of the object to be evaluated after receiving the model obtaining request, the generated three-dimensional model information is sent to the cloud server, and then the cloud server receives the three-dimensional model information sent by the application platform.
The three-dimensional model information in the embodiment of the application is obtained by three-dimensionally modeling an object to be evaluated by an application platform to obtain a modeling model, planarizing the modeling model, making a planar map of the modeling model, obtaining a planar map of the modeling model and further obtaining the three-dimensional model information according to the planar map and the modeling model.
Wherein the ratio of the object to be evaluated to the three-dimensional model is 1: 1.
The manner in which the three-dimensional model information is obtained in the embodiments of the present application is described in detail below.
Firstly, the application platform carries out three-dimensional modeling on an object to be evaluated to obtain a modeling model.
In the embodiment of the application, the application platform carries out three-dimensional modeling on the object to be evaluated, which needs to be evaluated, so as to obtain the complete parts and equipment assembly body of the object to be evaluated, and further obtain a modeling model.
The application platform may be, for example, three-dimensional design software, which is not limited in the embodiment of the present application.
The object to be evaluated may be, for example, a device that needs to be reviewed, an applause animation, a congratulatory message, and the like, which is not limited in the embodiment of the present application.
It should be noted that, in order to improve the exhibition effect and increase the realism of the modeling model, the object to be evaluated is modeled in three dimensions at a 1:1 scale, details such as chamfers and fillets are retained, and no optimization or simplification is performed, so that the ratio between the object to be evaluated and the modeling model is 1:1.
Then, after the modeling model is obtained, the three-dimensional animation software is used for carrying out dynamic behavior simulation on the modeling model, and the principle, the processing technology and the assembly process of the object to be evaluated are presented by using the visual animation effect.
The three-dimensional animation software may be, for example, Maya or 3ds Max, which is not limited in the embodiment of the present application.
It should be noted that the motion trajectory and the appearance size of the modeling model in the animation are simulated strictly according to the standard file.
The standard document may be, for example, a drawing, a craft card, and the like, which is not limited in the embodiment of the present application.
And finally, carrying out planarization treatment on the modeling model to obtain a planarized modeling model, carrying out charting on the planarized modeling model to obtain a plane chartlet of the modeling model, and attaching the plane chartlet to the modeling model to obtain three-dimensional model information.
It should be noted that, in order to attach the plane map to the modeling model properly, the modeling model needs to be unwrapped (split) into planes to obtain a planarized modeling model, because the generated plane map itself is planar; in addition, the position of each part of the plane map content must correspond accurately to the corresponding position on the modeling model.
In addition, three-dimensional model rendering needs to be performed on the three-dimensional model, for example, materials, lights, shadows and the like are added to the three-dimensional model, so that a more realistic model appearance is obtained, and the manufactured three-dimensional model is more real.
In the embodiment of the application, the cloud server can also send a video acquisition request to the panoramic camera, and after the panoramic camera receives the video acquisition request, the shot panoramic live video is sent to the cloud server, so that the cloud server receives the panoramic live video sent by the panoramic camera.
The following describes in detail the step of acquiring the panoramic video transmitted by the panoramic camera in the embodiment of the present application, and specifically includes:
s1: and generating the live broadcast address information of the newly-built live broadcast room.
Wherein, the live broadcast address information at least comprises a stream pushing address.
In the embodiment of the application, a user can create a new live room on the panoramic live broadcast cloud platform and configure the live room information, which at least includes an upper limit on the number of terminals and the live time; the cloud server then generates the live address information of the new live room, which at least includes a stream pushing address.
And the stream pushing address is used for pushing the shot panoramic live video to the cloud server by the panoramic camera.
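As an illustration only (the original disclosure contains no code), the sketch below shows one way a cloud server might generate live address information for a newly created live room, including a stream pushing address for the panoramic camera and a stream pulling address for clients. The host names, URL layout, and the choice of RTMP/HLS are assumptions made for the example, not details from the patent.

```python
import uuid
from dataclasses import dataclass

@dataclass
class LiveAddressInfo:
    push_url: str    # used by the panoramic camera to push its streams
    pull_url: str    # used by clients to pull the processed live stream
    live_url: str    # used by the third-party platform to fetch the panoramic video
    stream_key: str

def create_live_room(app_name: str = "review") -> LiveAddressInfo:
    """Generate address information for a newly created live room.

    The RTMP/HLS URL scheme below is purely illustrative; a real panoramic
    live cloud platform would define its own addresses and authentication.
    """
    stream_key = uuid.uuid4().hex
    return LiveAddressInfo(
        push_url=f"rtmp://push.example.com/{app_name}/{stream_key}",
        pull_url=f"https://pull.example.com/{app_name}/{stream_key}.m3u8",
        live_url=f"rtmp://live.example.com/{app_name}/{stream_key}",
        stream_key=stream_key,
    )

if __name__ == "__main__":
    info = create_live_room()
    print(info.push_url)   # sent to the panoramic camera
    print(info.pull_url)   # sent to the client
```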
S2: and sending the stream pushing address to the panoramic camera so that the panoramic camera pushes the live video shot by each camera to the cloud server according to the stream pushing address.
In the embodiment of the application, after the stream pushing address is generated, the stream pushing address is sent to the panoramic camera, and the panoramic camera pushes the live video shot by each camera to the cloud server according to the stream pushing address.
S3: and receiving live broadcast videos pushed by the panoramic camera, performing live broadcast video combination on the live broadcast videos to generate panoramic live broadcast videos, and sending the panoramic live broadcast videos to a third-party platform.
In the embodiment of the application, the cloud server receives live videos pushed by the panoramic camera according to the pushing address, the live videos are spliced and combined to generate the panoramic live video, and then the generated panoramic live video is sent to the third-party platform.
The following describes in detail the steps of splicing live videos in the embodiment of the present application, and specifically includes:
s1: and transcoding each live video respectively to obtain each transcoded live video.
In the embodiment of the application, each live video is transcoded into a video format which can be identified by the client, and each transcoded live video is obtained.
S2: and performing video splicing processing on each transcoded live video to obtain the live video subjected to video splicing.
In the embodiment of the application, the transcoded live videos are subjected to video splicing to form a panoramic live video.
Further, the splicing effect of the live videos shot by the cameras of the panoramic camera depends on the specific shooting scene; for example, the effect may differ between long-range and close-range scenes. Therefore, when previewing or test-shooting a video, if the user is not satisfied with the real-time splicing effect, the user can calibrate through the calibration function of the panoramic camera and set the splicing calibration used when each camera of the panoramic camera shoots the live video, so that the cloud server can obtain a better-spliced live video when splicing the live videos.
S3: and packaging the spliced live video to obtain the panoramic live video.
In the embodiment of the application, after the live videos are spliced, the spliced panoramic live video is packaged to obtain the panoramic live video.
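The transcode, splice, and package sequence above can be illustrated with a minimal sketch that drives ffmpeg from Python. It assumes exactly two camera feeds and simply stacks them side by side; a real panoramic camera needs lens calibration and fisheye-to-equirectangular projection, as noted above, so this only shows the shape of the pipeline, not the patent's actual processing.

```python
import subprocess

def combine_live_videos(inputs, output="panorama.m3u8"):
    """Illustrative transcode + splice + package pipeline using ffmpeg.

    Assumes two side-by-side camera feeds; the hstack filter stands in for
    real panoramic stitching, which would involve projection and calibration.
    """
    cmd = [
        "ffmpeg",
        "-i", inputs[0], "-i", inputs[1],
        # splice: stack the two streams horizontally into one frame
        "-filter_complex", "[0:v][1:v]hstack=inputs=2[pano]",
        "-map", "[pano]",
        # transcode to a format the client can recognise
        "-c:v", "libx264", "-preset", "veryfast",
        # package the spliced video as an HLS playlist
        "-f", "hls", "-hls_time", "2", output,
    ]
    subprocess.run(cmd, check=True)

# e.g. combine_live_videos(["rtmp://push.example.com/cam0",
#                           "rtmp://push.example.com/cam1"])
```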
Step 110: and sending the three-dimensional model information and the panoramic live video to a third-party platform so that the third-party platform generates a VR live video according to the three-dimensional model information and the panoramic live video, and sending the panoramic VR live video to a client for display so that a user can remotely review an object to be reviewed according to the panoramic VR live video.
In this embodiment, when step 110 is executed, the method specifically includes:
s1: and sending the live broadcast address to a third-party platform so that the third-party platform can acquire the panoramic live broadcast video from the cloud server according to the live broadcast address, and inserting the three-dimensional model information into the panoramic live broadcast video to generate the panoramic VR live broadcast video.
In the embodiment of the application, because the cloud server can also generate a live broadcast address when generating the streaming address, the cloud server sends the live broadcast address to the third-party platform, so that the third-party platform can acquire the panoramic live broadcast video from the cloud server according to the live broadcast address, and can embed the three-dimensional model information into the panoramic live broadcast video, so that the panoramic live broadcast video and the three-dimensional model information can be fused to generate the panoramic VR live broadcast video.
S2: and sending the streaming address to the client so that the client pulls the panoramic VR live video from the third-party platform according to the streaming address, and displays the panoramic VR live video according to preset configuration information so that the client carries out remote review on the object to be reviewed according to the panoramic VR live video.
In this application embodiment, the cloud server sends the generated streaming address to the client, so that the client pulls the panoramic VR live video from the third party platform according to the streaming address, and displays the pulled panoramic VR live video, so that the user can remotely review the object to be reviewed according to the panoramic VR live video.
The stream pulling address is generated by the cloud server according to the live room information and corresponds to the stream pushing address; the stream pulling address is used by the client to pull the panoramic VR live video from the third-party platform.
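As a rough sketch of the client side (assuming an OpenCV build with FFmpeg support and a hypothetical pull URL; neither appears in the patent), the code below pulls the stream from the stream pulling address and shows each frame as a flat preview. A real VR client would map the frames onto a sphere for head-tracked viewing rather than displaying them flat.

```python
import cv2  # requires opencv-python built with FFmpeg support

def play_panoramic_vr(pull_url: str) -> None:
    """Pull and preview a live stream from the stream pulling address (illustrative)."""
    cap = cv2.VideoCapture(pull_url)
    if not cap.isOpened():
        raise RuntimeError(f"could not open stream: {pull_url}")
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("panoramic VR live (flat preview)", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

# play_panoramic_vr("https://pull.example.com/review/<stream_key>.m3u8")
```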
It should be noted that, in this embodiment of the application, the client displays the panoramic VR live video according to the pre-configured configuration information, where the configuration information at least includes one or any combination of the following items: resolution, code rate, viewing page configuration information, buffer information.
The configuration information may be the resolution: setting the playback resolution in the client in advance, before the panoramic VR live broadcast starts, effectively improves the clarity of the panoramic VR live broadcast.
The configuration information may also be the code rate: setting the playback code rate in the client in advance, before the panoramic VR live broadcast starts, effectively improves the smoothness of the panoramic VR live broadcast.
The configuration information may also be viewing page configuration information, which includes any one of the following: a business version page, an entertainment version page, and a business Chinese and English version page. The user can freely select a viewing page according to the live scene, and can also set items such as the viewing page background picture, an advertisement page, and the live covers for the Personal Computer (PC) side and the HyperText Markup Language 5 (HTML5, H5) side, which is not limited in the embodiment of the present application.
The configuration information may also be buffer information: the buffer size is set before the client displays the panoramic VR live video; for example, the audio and video buffers of the panoramic VR live video may each be set to 30 frames, which helps ensure the fluency of the panoramic VR live video output.
After the configuration information is set, the client can display the panoramic VR live video according to the set configuration information, and the user can then remotely review the object to be reviewed based on the panoramic VR live video.
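The configuration items listed above could be collected into a simple structure on the client, as in the sketch below. The field names and default values are illustrative assumptions, since the patent only names the categories of configuration information, not concrete values.

```python
from dataclasses import dataclass

@dataclass
class PlaybackConfig:
    """Pre-set client configuration for displaying the panoramic VR live video (illustrative)."""
    resolution: str = "3840x1920"      # affects the clarity of the live broadcast
    bitrate_kbps: int = 8000           # affects the smoothness of the live broadcast
    viewing_page: str = "business"     # "business", "entertainment", or "business_cn_en"
    audio_buffer_frames: int = 30      # buffer sizes that help keep output fluent
    video_buffer_frames: int = 30

default_config = PlaybackConfig()
```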
Furthermore, after the live broadcast starts, the stream pulling address can be published so that users can watch the live broadcast; the stream pulling address can also be sent as a link to other clients, which can then pull the panoramic VR live video according to the stream pulling address.
Further, during the live broadcast, real-time data statistics and analysis can be viewed, such as viewing duration, a viewing peak chart, a geographical distribution chart, and viewing device analysis.
In the embodiment of the application, the three-dimensional model information and the panoramic live video shot by the panoramic camera are acquired, and the three-dimensional model information is embedded into the panoramic live video to generate the panoramic VR live video. This breaks through the traditional flat live-broadcast mechanism of mobile phones and computers, realizes a live presentation that combines the virtual and the real, and achieves remote immersive digital review.
Based on the foregoing embodiment, referring to fig. 2, a flowchart of another remote review method in the embodiment of the present application is applied to a third party platform, and specifically includes:
step 200: and receiving the three-dimensional model information and the panoramic live broadcast video sent by the cloud server.
And the three-dimensional model information is a three-dimensional model of the object to be evaluated.
The three-dimensional model information is obtained by the application platform performing three-dimensional modeling on an object to be evaluated to obtain a modeling model, performing planarization processing on the modeling model, making a planar map of the modeling model, obtaining a planar map of the modeling model and obtaining the three-dimensional model information according to the planar map and the modeling model.
Wherein the ratio of the object to be evaluated to the three-dimensional model is 1: 1.
The panoramic live video is generated by the cloud server generating live address information for a newly created live room, sending the stream pushing address in the live address information to the panoramic camera, receiving the live videos shot by each camera of the panoramic camera and pushed according to the stream pushing address, and combining the live videos.
Step 210: and generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video.
In this embodiment of the application, if the live address information further includes a live address, when step 210 is executed, the method specifically includes:
s1: and receiving a live broadcast address sent by the cloud server.
S2: and acquiring the panoramic live broadcast video from the cloud server according to the live broadcast address.
S3: and inserting the three-dimensional model information into the panoramic live video to generate the panoramic VR live video.
Step 220: and sending the panoramic VR live video to a client for display so that a user can remotely review an object to be reviewed according to the panoramic VR live video.
If the live address information further includes a stream pulling address, the step 220 is executed, which specifically includes:
and sending the panoramic VR live video to the client so that the client can display the panoramic VR live video according to preset configuration information, and a user can remotely review the object to be reviewed according to the panoramic VR live video.
The streaming address is sent to the client by the cloud server, and the streaming address is used for the client to pull the panoramic VR live video from the third-party platform.
The configuration information at least comprises one or any combination of the following items: resolution, code rate, viewing page configuration information, buffer information.
The viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
In the embodiment of the application, the third-party platform receives the three-dimensional model information and the panoramic live video sent by the cloud server, generates the panoramic VR live video from the three-dimensional model information and the panoramic live video, and sends the panoramic VR live video to the client for display, so that the user can remotely review the object to be reviewed based on the panoramic VR live video. In this way, remote review can be realized by adding a digital three-dimensional model.
Based on the above embodiments, referring to fig. 3, a schematic structural diagram of a remote review system in the embodiments of the present application is shown, which specifically includes:
and the application platform is used for generating three-dimensional model information of the object to be evaluated and sending the three-dimensional model information to the cloud server.
And the panoramic camera is used for transmitting the shot panoramic live video to the cloud server.
And the cloud server is used for sending the three-dimensional model information and the panoramic live video to a third-party platform.
And the third-party platform is used for generating a VR live video according to the three-dimensional model information and the panoramic live video and sending the panoramic VR live video to the client.
And the client is used for receiving the panoramic VR live video and displaying the panoramic VR live video so as to enable the user to remotely review the object to be reviewed according to the panoramic VR live video.
In the embodiment of the application, through VR live broadcast, a user can be immersed in the environment where the panoramic camera is located, the dynamic state of an evaluation and review site is sensed in real time, and remote digital evaluation can be realized by adding a digital three-dimensional model in a review stage.
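The data flow among the five components of the system can be summarized in a schematic sketch such as the one below. All class and method names are placeholders and transport details are intentionally omitted, so this is only a reading aid for the architecture described above, not an implementation from the patent.

```python
# Schematic data flow of the remote review system described above.
class ApplicationPlatform:
    def build_model(self, obj: str) -> dict:
        return {"object": obj, "scale": "1:1", "textures": "planar maps"}

class PanoramicCamera:
    def capture(self) -> list:
        return ["cam0 stream", "cam1 stream"]

class CloudServer:
    def combine(self, streams: list) -> str:
        return "panoramic live video"        # transcode + splice + package

class ThirdPartyPlatform:
    def fuse(self, model: dict, panorama: str) -> str:
        return f"panoramic VR live video ({model['object']} embedded in {panorama})"

class Client:
    def display(self, vr_video: str) -> None:
        print("reviewing remotely via:", vr_video)

if __name__ == "__main__":
    model = ApplicationPlatform().build_model("device under review")
    panorama = CloudServer().combine(PanoramicCamera().capture())
    Client().display(ThirdPartyPlatform().fuse(model, panorama))
```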
Based on the foregoing embodiment, referring to fig. 4, a structural block diagram of a remote review method in the embodiment of the present application is specifically included:
1. Building the hardware system.
In the embodiment of the application, a hardware system for panoramic live broadcast is first built: a tripod dedicated to VR live broadcast is prepared, and the panoramic camera, the router and the computer control platform are connected over a dedicated network line of more than 20 Mbps.
2. Cloud server.
The cloud server is used for acquiring a three-dimensional model of an object to be evaluated, which is sent by the application platform, acquiring a panoramic live broadcast video transmitted by the panoramic camera, and sending the three-dimensional model and the panoramic live broadcast video to the third-party platform.
3. Video processing.
In the embodiment of the application, the live videos shot by the cameras of the panoramic camera are transcoded to obtain the transcoded live videos, the transcoded live videos are spliced to obtain the video-spliced live video, and the spliced live video is packaged to obtain the panoramic live video.
4. And (5) live broadcasting and plug streaming.
After the live broadcast starts, the third-party platform creates a live room and sends the configured live room information to the cloud server; the cloud server then generates the live address information according to the upper limit on the number of terminals and the live time in the live room information, and sends the stream pushing address in the live address information to the panoramic camera, so that the panoramic camera pushes the live video shot by each camera to the cloud server according to the stream pushing address.
5. Three-dimensional modeling.
The application platform performs three-dimensional modeling on the object to be reviewed to obtain a modeling model, performs planarization processing on the modeling model, makes a planar map of the modeling model, obtains the three-dimensional model from the planar map and the modeling model, and then renders the three-dimensional model to obtain the rendered three-dimensional model.
6. OBS software.
The live address is input into the OBS software; the OBS software then obtains the panoramic live video from the cloud server according to the live address, and embeds the three-dimensional model into the panoramic live video to generate the panoramic VR live video.
7. Client.
The client receives the streaming address sent by the cloud server, pulls the panoramic VR live video from the OBS software according to the streaming address, and displays the panoramic VR live video according to preset configuration information.
The client may be, for example, a player, a web page end, a mobile end, a helmet end, and the like, which is not limited in the embodiment of the present application.
8. Remote review.
After the client displays the panoramic VR live video, the user can remotely review the object to be reviewed according to the panoramic VR live video.
In the embodiment of the application, the three-dimensional model is embedded into the panoramic live video, breaking the traditional flat live-broadcast mechanism of mobile phones and computers. With immersive VR live broadcast, the surrounding environment can be presented in all directions over 360 degrees through a head-mounted display or a mobile communication device, and virtual animation elements are added to form a virtual-real combined effect, realizing remote digital review by multiple people in different places and improving working efficiency.
Based on the above embodiments, the remote review method in the embodiments of the present application is explained in detail by taking knowledge competition as an example.
Firstly, three-dimensional modeling is carried out on special effect elements involved in each link of knowledge competition.
In the embodiment of the application, brief information about the participants, the titles of the different competition rounds, the running score status, and the promotion and elimination animations are modeled in three dimensions, and corresponding background music is made.
Special-effect elements such as applause, congratulations and cheering are modeled and animated with 3ds Max software; mapping, material production and other procedures are then carried out on the modeling model according to the real situation so that the animation effect is more realistic; finally, the three-dimensional model with materials added is rendered and exported. A sequence of frames in PNG format can be selected for export, and the resulting sequence frames are spliced into a video animation or a GIF animation.
Because PNG images contain no background color (they support transparency), the underlying picture is not blocked when the PNG frames are later added to the real panoramic live scene.
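Splicing the exported PNG sequence frames into a GIF animation can be done, for example, with a two-pass ffmpeg palette workflow, sketched below. The frame naming pattern and frame rate are assumptions for the example, not values given in the patent.

```python
import subprocess

def frames_to_gif(pattern: str = "frame_%04d.png", fps: int = 25,
                  out: str = "effect.gif") -> None:
    """Splice exported PNG sequence frames into a GIF animation (illustrative)."""
    # pass 1: derive a colour palette from the frames
    subprocess.run([
        "ffmpeg", "-framerate", str(fps), "-i", pattern,
        "-vf", "palettegen", "-y", "palette.png",
    ], check=True)
    # pass 2: encode the GIF using that palette
    subprocess.run([
        "ffmpeg", "-framerate", str(fps), "-i", pattern, "-i", "palette.png",
        "-lavfi", "paletteuse", "-y", out,
    ], check=True)

# frames_to_gif()  # produces effect.gif from the exported PNG frames
```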
Then, a panoramic live broadcast system is set up, selecting a monopod dedicated to panoramic VR live broadcast, a recording device, a panoramic camera, a control terminal, and a router.
Because a monopod dedicated to VR live broadcast is used, there is no need to patch the ground (nadir) area of the panorama afterwards, which makes operation very convenient.
Alternatively, a tripod with a 1/4-inch mount can be used, and after the panoramic live video is shot, the support at the bottom of the picture is removed with Photoshop (PS).
The recording device may be, for example, an H2n recorder, a 3.5 mm jack microphone, a USB microphone, a mixing console, a wireless microphone, or the built-in microphone of the panoramic camera.
The panoramic camera may be, for example, an insta360 panoramic camera.
The control terminal can be, for example, a mobile phone, a computer, etc.
It should be noted that the panoramic camera, the control terminal and the router need to be connected to the same local area network, and the network bandwidth may be, for example, 20 Mbps or more.
After the panoramic live broadcast system is built, a user can create a live broadcast room on a live broadcast platform, live broadcast room information is set, the live broadcast room information is transmitted to a cloud server, and the cloud server generates live broadcast address information according to the live broadcast room information after receiving the live broadcast room information.
The live room information may include, for example, an upper limit on the number of terminals, the live time, and the cover image and password of the live room.
The live address information may be, for example, a live address, a push stream address, a pull stream address, a stream key.
And then, after the stream pushing address is generated, the stream pushing address and the stream key are sent to the panoramic camera, so that the panoramic camera pushes the live videos shot by the cameras to the cloud server according to the stream pushing address and the stream key, and the cloud server combines the live videos to generate the panoramic live videos.
Because virtual elements need to be blended into the panoramic video, OBS live-broadcast software is used, and the source of the panoramic live video and the live address are added in its parameter settings.
Then, the GIF animation is added to the real-time panoramic live broadcast picture, and then the panoramic VR live broadcast video can be generated.
Finally, after the cloud server generates the streaming address, the streaming address is sent to the client, the client pulls the panoramic VR live video from the cloud server according to the streaming address, and the panoramic VR live video is displayed according to configuration information preset by a user, so that an interactive effect of virtual and real results can be realized, and as shown in fig. 5, an effect schematic diagram of remote review in the embodiment of the application is shown.
The configuration information may be, for example, a code rate, a resolution, audio, and the like.
In the embodiment of the application, the actions needed in the knowledge competition are modeled in three dimensions to generate three-dimensional models, and the generated three-dimensional models are fused with the shot panoramic live video to generate the panoramic VR live video. In this way, users outside the main venue can participate in real time: during the live broadcast the panoramic VR live video can be transmitted to each terminal, and the competition status, competition titles, player information, competition scores, special-effect animations and so on are also shown on the panoramic VR live video, forming a rich digital live panorama and realizing remote immersive interactive exchange.
Based on the same inventive concept, the embodiment of the present application further provides a remote review device, which may be, for example, the cloud server in the above embodiments, and the remote review device may be a hardware structure, a software module, or a hardware structure plus a software module. Based on the above embodiments, referring to fig. 6, a schematic structural diagram of a remote review device in an embodiment of the present application is shown, which specifically includes:
the first processing module 600 is configured to obtain three-dimensional model information of an object to be evaluated, which is sent by an application platform, and obtain a panoramic live broadcast video transmitted by a panoramic camera;
and the second processing module 610 is used for sending the three-dimensional model information and the panoramic live video to a third-party platform, so that the third-party platform generates a panoramic Virtual Reality (VR) live video according to the three-dimensional model information and the panoramic live video, and sends the panoramic VR live video to a client for display, so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
Optionally, the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map from the planarized modeling model to obtain the planar map of the modeling model, and then obtaining the three-dimensional model information according to the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
Optionally, when acquiring a live panoramic video transmitted by a panoramic camera, the first processing module 600 is specifically configured to:
generating live broadcast address information of a newly-built live broadcast room, wherein the live broadcast address information at least comprises a stream pushing address;
sending the streaming address to the panoramic camera so that the panoramic camera can push the live video shot by each camera to the cloud server according to the streaming address;
and receiving live videos pushed by the panoramic camera, performing live video combination on the live videos to generate a panoramic live video, and sending the panoramic live video to the third-party platform.
Optionally, when performing live video combination on the live videos to generate the panoramic live video, the first processing module 600 is specifically configured to perform the following steps (a minimal sketch of this pipeline is given after the list):
transcoding each live video respectively to obtain each transcoded live video;
performing video splicing processing on each transcoded live video to obtain live videos subjected to video splicing;
and packaging the spliced live video to obtain a panoramic live video.
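A minimal, purely illustrative sketch of this three-stage combination pipeline (transcode, splice, encapsulate) follows; the stage functions are hypothetical placeholders, and a real system would implement them with a media framework such as FFmpeg or GStreamer.

```python
# Illustrative pipeline only: transcode each camera stream, splice the
# streams into a panorama, then encapsulate for streaming delivery.
from typing import List

def transcode(stream: bytes, target_codec: str = "h264") -> bytes:
    # Placeholder: re-encode the incoming camera stream into a common
    # codec/bitrate so all streams can be spliced together.
    return stream

def splice(streams: List[bytes]) -> bytes:
    # Placeholder: stitch the per-camera views into one panoramic sequence.
    return b"".join(streams)

def encapsulate(panorama: bytes, container: str = "flv") -> bytes:
    # Placeholder: wrap the spliced video in a streamable container.
    return panorama

def combine_live_videos(camera_streams: List[bytes]) -> bytes:
    """Run the pipeline: transcode each stream, splice them, then encapsulate."""
    transcoded = [transcode(s) for s in camera_streams]
    return encapsulate(splice(transcoded))

if __name__ == "__main__":
    print(len(combine_live_videos([b"cam-front", b"cam-back"])))
```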
Optionally, if the live address information further includes a live address and a stream pulling address corresponding to the stream pushing address, the second processing module 610 is specifically configured to:
sending the live broadcast address to the third-party platform, so that the third-party platform obtains the panoramic live broadcast video from the cloud server according to the live broadcast address, and inserts the three-dimensional model information into the panoramic live broadcast video to generate a panoramic VR live broadcast video;
sending the streaming address to the client, so that the client pulls the panoramic VR live video from the third-party platform according to the streaming address, and displays the panoramic VR live video according to preset configuration information, so that a user can remotely review the object to be reviewed according to the panoramic VR live video, wherein the configuration information at least comprises one or any combination of the following items: resolution, code rate, viewing page configuration information, and buffer information, where the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
Based on the same inventive concept, the embodiments of the present application further provide another remote review device, which may be, for example, the third-party platform in the embodiments described above, and which may be a hardware structure, a software module, or a hardware structure plus a software module. Based on the above embodiments, referring to fig. 7, a schematic structural diagram of another remote review device in the embodiment of the present application is shown, which specifically includes:
the receiving module 700 is configured to receive three-dimensional model information and a panoramic live broadcast video sent by a cloud server, where the three-dimensional model information is a three-dimensional model of an object to be evaluated;
a generating module 710, configured to generate a panoramic VR live video according to the three-dimensional model information and the panoramic live video;
and the processing module 720 is used for sending the panoramic VR live video to a client for displaying so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
Optionally, the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map from the planarized modeling model to obtain the planar map of the modeling model, and then obtaining the three-dimensional model information according to the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
Optionally, the panoramic live video is generated as follows: the cloud server generates live broadcast address information for a newly created live broadcast room, sends the stream pushing address in the live broadcast address information to the panoramic camera, receives the live videos shot by each camera of the panoramic camera and pushed according to the stream pushing address, and performs live video combination on these live videos to generate the panoramic live video.
Optionally, if the live address information further includes a live address, the generating module 710 is specifically configured to:
receiving a live broadcast address sent by the cloud server;
acquiring the panoramic live broadcast video from the cloud server according to the live broadcast address;
and inserting the three-dimensional model information into the panoramic live video to generate a panoramic VR live video.
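As a rough illustration of inserting rendered three-dimensional model elements (for example, scores, titles or effect-animation frames) into the panoramic picture, the sketch below alpha-composites an overlay onto a single equirectangular image using Pillow; the frame size, overlay content and position are assumptions, and a live system would perform this compositing for every frame of the video stream.

```python
# Illustrative overlay of a rendered model element onto one panoramic frame.
from PIL import Image

def insert_overlay(panorama: Image.Image, overlay: Image.Image, pos: tuple) -> Image.Image:
    """Alpha-composite the rendered overlay onto the panoramic frame."""
    frame = panorama.convert("RGBA")
    layer = Image.new("RGBA", frame.size, (0, 0, 0, 0))
    layer.paste(overlay, pos, overlay)          # keep the overlay's transparency
    return Image.alpha_composite(frame, layer)

if __name__ == "__main__":
    pano = Image.new("RGBA", (3840, 1920), (30, 30, 30, 255))   # stand-in equirectangular frame
    badge = Image.new("RGBA", (400, 200), (255, 215, 0, 180))   # stand-in rendered model element
    out = insert_overlay(pano, badge, (1720, 860))
    out.convert("RGB").save("vr_frame_preview.jpg")
```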
Optionally, if the live address information further includes a stream pulling address, the processing module 720 is specifically configured to:
and sending the panoramic VR live video to the client so that the client displays the panoramic VR live video according to preset configuration information to enable a user to remotely review the object to be reviewed according to the panoramic VR live video, wherein the streaming address is sent to the client by the cloud server, and the streaming address is used for the client to pull the panoramic VR live video from the third-party platform.
Optionally, the configuration information at least includes one or any combination of the following items: resolution, code rate, viewing page configuration information and buffer area information;
the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
Based on the above embodiments, fig. 8 is a schematic structural diagram of an electronic device in an embodiment of the present application.
An embodiment of the present application provides an electronic device, which may include a processor 810 (CPU), a memory 820, an input device 830, an output device 840, and the like, where the input device 830 may include a keyboard, a mouse, a touch screen, and the like, and the output device 840 may include a Display device, such as a Liquid Crystal Display (LCD), a Cathode Ray Tube (CRT), and the like.
Memory 820 may include Read Only Memory (ROM) and Random Access Memory (RAM), and provides processor 810 with program instructions and data stored in memory 820. In the embodiment of the present application, the memory 820 may be used for storing a program of any one of the remote review methods in the embodiment of the present application.
The processor 810 is configured to execute any of the remote review methods of the embodiments of the present application according to the obtained program instructions by calling the program instructions stored in the memory 820.
Based on the above embodiments, in the embodiments of the present application, a computer-readable storage medium is provided, on which a computer program is stored, and the computer program, when executed by a processor, implements the remote review method in any of the above method embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A remote review method, comprising:
the cloud server acquires three-dimensional model information of an object to be evaluated, which is sent by the application platform, and acquires a panoramic live broadcast video transmitted by the panoramic camera;
and sending the three-dimensional model information and the panoramic live video to a third-party platform so that the third-party platform generates a panoramic Virtual Reality (VR) live video according to the three-dimensional model information and the panoramic live video, and sending the panoramic VR live video to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
2. The method of claim 1, wherein the three-dimensional model information is obtained by the application platform performing three-dimensional modeling on the object to be reviewed to obtain a modeling model, performing planarization processing on the modeling model, making a planar map from the planarized modeling model to obtain the planar map of the modeling model, and then obtaining the three-dimensional model information according to the planar map and the modeling model, wherein the ratio between the object to be reviewed and the three-dimensional model is 1:1.
3. The method of claim 1, wherein obtaining the panoramic live video transmitted by the panoramic camera specifically comprises:
generating live broadcast address information of a newly-built live broadcast room, wherein the live broadcast address information at least comprises a stream pushing address;
sending the streaming address to the panoramic camera so that the panoramic camera can push the live video shot by each camera to the cloud server according to the streaming address;
and receiving live videos pushed by the panoramic camera, performing live video combination on the live videos to generate a panoramic live video, and sending the panoramic live video to the third-party platform.
4. The method of claim 3, wherein the live video combination of the live videos to generate the panoramic live video specifically comprises:
transcoding each live video respectively to obtain each transcoded live video;
performing video splicing processing on each transcoded live video to obtain live videos subjected to video splicing;
and packaging the spliced live video to obtain a panoramic live video.
5. The method of claim 3, wherein if the live address information further includes a live address and a streaming address corresponding to the streaming address, sending the three-dimensional model information and the panoramic live video to a third party platform, so that the third party platform generates a panoramic VR live video according to the three-dimensional model information and the panoramic live video, and sending the panoramic VR live video to a client for display, so that a user can remotely review the object to be reviewed according to the panoramic VR live video, specifically comprising:
sending the live broadcast address to the third-party platform, so that the third-party platform obtains the panoramic live broadcast video from the cloud server according to the live broadcast address, and inserts the three-dimensional model information into the panoramic live broadcast video to generate a panoramic VR live broadcast video;
sending the streaming address to the client, so that the client pulls the panoramic VR live video from the third-party platform according to the streaming address, and displays the panoramic VR live video according to preset configuration information, so that a user can remotely review the object to be reviewed according to the panoramic VR live video, wherein the configuration information at least comprises one or any combination of the following items: resolution, code rate, viewing page configuration information and buffer area information; the viewing page configuration information includes any one of: business version pages, entertainment version pages, and business Chinese and English version pages.
6. A remote review method, comprising:
the method comprises the steps that a third-party platform receives three-dimensional model information and a panoramic live broadcast video sent by a cloud server, wherein the three-dimensional model information is a three-dimensional model of an object to be evaluated;
generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video;
and sending the panoramic VR live video to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
7. The method of claim 6, wherein if the live address information further includes a live address, generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video, specifically including:
receiving a live broadcast address sent by the cloud server;
acquiring the panoramic live broadcast video from the cloud server according to the live broadcast address;
and inserting the three-dimensional model information into the panoramic live video to generate a panoramic VR live video.
8. A remote review system, comprising:
the application platform is used for generating three-dimensional model information of an object to be evaluated and sending the three-dimensional model information to the cloud server;
the panoramic camera is used for transmitting the shot panoramic live broadcast video to the cloud server;
the cloud server is used for sending the three-dimensional model information and the panoramic live video to a third-party platform;
the third-party platform is used for generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video and sending the panoramic VR live video to a client;
and the client is used for receiving the panoramic VR live video and displaying the panoramic VR live video so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
9. A remote review device, applied to a cloud server, characterized by comprising:
the system comprises a first processing module, a second processing module and a third processing module, wherein the first processing module is used for acquiring three-dimensional model information of an object to be evaluated, which is sent by an application platform, and acquiring a panoramic live broadcast video transmitted by a panoramic camera;
and the second processing module is used for sending the three-dimensional model information and the panoramic live video to a third-party platform so that the third-party platform can generate a panoramic Virtual Reality (VR) live video according to the three-dimensional model information and the panoramic live video, and the panoramic VR live video is sent to a client for display so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
10. A remote review device applied to a third-party platform is characterized by comprising:
the receiving module is used for receiving three-dimensional model information and a panoramic live broadcast video sent by the cloud server, wherein the three-dimensional model information is a three-dimensional model of an object to be evaluated;
the generating module is used for generating a panoramic VR live video according to the three-dimensional model information and the panoramic live video;
and the processing module is used for sending the panoramic VR live video to a client for displaying so that a user can remotely review the object to be reviewed according to the panoramic VR live video.
CN202011304916.8A 2020-11-19 2020-11-19 Remote review method and device Pending CN112423014A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011304916.8A CN112423014A (en) 2020-11-19 2020-11-19 Remote review method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011304916.8A CN112423014A (en) 2020-11-19 2020-11-19 Remote review method and device

Publications (1)

Publication Number Publication Date
CN112423014A true CN112423014A (en) 2021-02-26

Family

ID=74773734

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011304916.8A Pending CN112423014A (en) 2020-11-19 2020-11-19 Remote review method and device

Country Status (1)

Country Link
CN (1) CN112423014A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105704501A (en) * 2016-02-06 2016-06-22 普宙飞行器科技(深圳)有限公司 Unmanned plane panorama video-based virtual reality live broadcast system
CN106210703A (en) * 2016-09-08 2016-12-07 北京美吉克科技发展有限公司 The utilization of VR environment bust shot camera lens and display packing and system
US20190052870A1 (en) * 2016-09-19 2019-02-14 Jaunt Inc. Generating a three-dimensional preview from a two-dimensional selectable icon of a three-dimensional reality video
US20180101966A1 (en) * 2016-10-07 2018-04-12 Vangogh Imaging, Inc. Real-time remote collaboration and virtual presence using simultaneous localization and mapping to construct a 3d model and update a scene based on sparse data
CN106648083A (en) * 2016-12-09 2017-05-10 广州华多网络科技有限公司 Playing scene synthesis enhancement control method and device
CN110096368A (en) * 2018-01-31 2019-08-06 上海汽车集团股份有限公司 A kind of review information processing method and processing device
CN110544314A (en) * 2019-09-05 2019-12-06 上海电气集团股份有限公司 Fusion method, system, medium and device of virtual reality and simulation model
CN111540055A (en) * 2020-04-16 2020-08-14 广州虎牙科技有限公司 Three-dimensional model driving method, device, electronic device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Huoshun et al.: "Big Video VR Live Broadcasting Services and Technologies", ZTE Technology Journal (《中兴通讯技术》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115474803A (en) * 2022-10-14 2022-12-16 深圳市分米互联科技有限公司 Remote science and technology project display platform and system thereof
CN115474803B (en) * 2022-10-14 2024-01-26 深圳市分米互联科技有限公司 Remote science and technology project display platform

Similar Documents

Publication Publication Date Title
CN107835436B (en) A kind of real-time virtual reality fusion live broadcast system and method based on WebGL
CN105931288A (en) Construction method and system of digital exhibition hall
US20170206708A1 (en) Generating a virtual reality environment for displaying content
CN107438183A (en) A kind of virtual portrait live broadcasting method, apparatus and system
CN108597032B (en) Method and system for importing building information model into Unity3D for display
CN108960889B (en) Method and device for controlling voice speaking room progress in virtual three-dimensional space of house
WO2022257480A1 (en) Livestreaming data generation method and apparatus, storage medium, and electronic device
CN113781660A (en) Method and device for rendering and processing virtual scene on line in live broadcast room
CN112734947B (en) Method and device for 3D content delivery in VR house
Zerman et al. User behaviour analysis of volumetric video in augmented reality
CN112135158A (en) Live broadcasting method based on mixed reality and related equipment
CN114202576A (en) Virtual scene processing method and device, storage medium and electronic equipment
CN112423014A (en) Remote review method and device
WO2024027611A1 (en) Video live streaming method and apparatus, electronic device and storage medium
KR20200004009A (en) Platform for video mixing in studio environment
KR101752691B1 (en) Apparatus and method for providing virtual 3d contents animation where view selection is possible
CN111167119A (en) Game development display method, device, equipment and storage medium
KR20200028830A (en) Real-time computer graphics video broadcasting service system
KR100403942B1 (en) System for emboding dynamic image of it when selected object in three dimensions imagination space
US9131252B2 (en) Transmission of 3D models
CN115311919A (en) Spoken language training method and system based on VR technology
CN113628324A (en) Wisdom highway VR interactive teaching system
CN112037341A (en) Method and device for processing VR panorama interaction function based on Web front end
Al Hashimi Building 360-degree VR video for AquaFlux and Epsilon research instruments
KR20000024334A (en) The method of a three dimensional virtual operating simulation

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210226