CN116860193A - Movie theatre synchronous network film watching system - Google Patents

Movie theatre synchronous network film watching system

Info

Publication number
CN116860193A
CN116860193A (application CN202310796525.XA)
Authority
CN
China
Prior art keywords
movie
virtual reality
data
cinema
visual field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310796525.XA
Other languages
Chinese (zh)
Inventor
万继东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202310796525.XA
Publication of CN116860193A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F 16/78 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/953 Querying, e.g. by the use of web search engines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The application provides a cinema synchronous network film watching system and relates to the field of computer technology. The system comprises a user terminal, a VR device control end, an internet platform, a cinema platform and a plurality of virtual reality devices. A connection is established between the user terminal and a target virtual reality device, and movie information is acquired and matched against a cinema movie database to obtain the corresponding movie projection video data and projection time. Binocular position data and binocular size data are collected and input into a visual field range recognition model to obtain a visual field recognition result. Adjusted movie projection video data are obtained from the display screen data and the visual field recognition result and transmitted to the VR device control end according to the projection time. According to the ID number, the adjusted movie projection video data are projected onto the display screen of the target virtual reality device so that the device plays synchronously with the cinema screen. The aim of synchronously watching a cinema showing at home is thereby achieved.

Description

Movie theatre synchronous network film watching system
Technical Field
The application relates to the technical field of computers, in particular to a cinema synchronous network film watching system.
Background
In today's rapidly developing movie era, the way audiences watch films, in contrast to advances in movie playback technology, still relies mainly on going to a physical cinema. Fast-paced urban life leaves people's schedules ever fuller and more pressured, with less and less spare time, and the time cost of enjoying a favourite movie in a relaxed, burden-free setting has become too high. Therefore, how to enable a user at home to watch, synchronously, a movie being shown in a cinema is an urgent problem to be solved.
Disclosure of Invention
The application aims to provide a cinema synchronous network film watching system which enables a user at home to watch, in synchronization, a film being shown in a cinema.
In order to solve the technical problems, the application adopts the following technical scheme:
in a first aspect, an embodiment of the present application provides a cinema synchronization network film viewing system, which includes a user terminal, a VR device control end, an internet platform, a cinema platform, and a plurality of virtual reality devices, where the user terminal, the VR device control end, and the cinema platform are respectively connected to the internet platform, and any one of the virtual reality devices is connected to the VR device control end, and the cinema platform includes a cinema film database;
establishing a connection between the user terminal and a target virtual reality device; the internet platform acquires the movie information input by the user terminal and matches it against the cinema movie database to obtain the corresponding movie projection video data and projection time;
collecting binocular position data and binocular size data of a user through a camera arranged on target virtual reality equipment, and uploading the binocular position data and the binocular size data to an Internet platform through a VR equipment control end;
the Internet platform inputs the binocular position data and binocular size data into a trained visual field range recognition model to obtain a visual field recognition result;
the internet platform acquires the display screen data and ID number of the target virtual reality device, adjusts each frame of the movie projection video data according to the display screen data and the visual field recognition result to obtain adjusted movie projection video data, and transmits the adjusted movie projection video data to the VR device control end according to the projection time;
and the VR device control end projects the adjusted movie projection video data onto the display screen of the target virtual reality device according to the ID number.
In some embodiments of the present application, the cinema platform comprises:
a movie information to be projected acquisition module for acquiring all movie information to be projected of each movie theatre, wherein any one of the movie information to be projected includes movie video data to be projected and projection time;
a cinema movie database construction module for constructing a cinema movie database by using all movie information to be projected of all movie theaters;
and the cinema movie database updating module is used for acquiring the latest movie information of each cinema in real time and storing the latest movie information into the cinema movie database.
In some embodiments of the present application, the above visual field recognition result includes a left eye visual field range, a right eye visual field range, and left-right eye parallax;
the step of adjusting each frame of the movie projection video data according to the display screen data and the visual field recognition result to obtain adjusted movie projection video data comprises the following steps:
dividing a display screen of the target virtual reality device into a first sub-screen and a second sub-screen according to the parallax of the left eye and the right eye;
determining a display range of the first sub-screen according to the left eye visual field range and the display screen data;
determining the display range of the second sub-screen according to the right eye visual field range and the display screen data;
and adjusting each frame of the movie projection video data according to the display range of the first sub-screen and the display range of the second sub-screen, so as to obtain the adjusted movie projection video data.
In some embodiments of the present application, the step of establishing a connection between the user terminal and the target virtual reality device includes:
the user terminal scans the two-dimensional code of the target virtual reality device to obtain the ID number of the target virtual reality device;
binding the user terminal with the ID number of the target virtual reality device to establish a connection between the user terminal and the target virtual reality device.
In some embodiments of the present application, the internet platform includes:
the initial model building module is used for building a visual field range identification initial model;
a sample acquisition module for acquiring a plurality of samples, wherein the samples comprise binocular data and visual field range data of a plurality of users;
and the model training module is used for training the visual field range recognition initial model by using a plurality of samples to obtain a trained visual field range recognition model.
In a second aspect, an embodiment of the present application provides an electronic device comprising a memory for storing one or more programs and a processor; when the one or more programs are executed by the processor, the system of any one of the above first aspects is implemented.
In a third aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a system as in any of the first aspects above.
Compared with the prior art, the embodiment of the application has at least the following advantages or beneficial effects:
the application provides a cinema synchronous network film watching system which comprises a user terminal, a VR equipment control end, an Internet platform, a cinema platform and a plurality of virtual reality equipment, wherein the user terminal, the VR equipment control end and the cinema platform are respectively connected with the Internet platform, any one of the virtual reality equipment is connected with the VR equipment control end, and the cinema platform comprises a cinema film database. And establishing connection between the user terminal and the target virtual reality equipment, acquiring movie information input by the user terminal by the Internet platform, inputting the movie information into a movie database of a movie theater for matching, and obtaining corresponding movie showing video data and showing time. The method comprises the steps that binocular position data and binocular size data of a user are collected through a camera arranged on target virtual reality equipment, and the binocular position data and the binocular size data are uploaded to an Internet platform through a VR equipment control end. The Internet platform inputs the binocular position data and binocular size data into a trained visual field range recognition model to obtain a visual field recognition result. The internet platform acquires the display screen data and ID number of the target virtual reality device, adjusts each picture of the film projection video data according to the display screen data and the visual field identification result to obtain the film projection video data, and transmits the film projection video data to the VR device control end according to the projection time. And the VR equipment control end throws the video data of the movie projection screen onto a display screen of the target virtual reality equipment according to the ID number so as to synchronously play the target virtual reality equipment and a curtain of a movie theater. 
Thereby achieving the purpose of synchronously watching the movie in the movie theatre at home.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; other related drawings may be obtained from these drawings by a person skilled in the art without inventive effort.
Fig. 1 is a block diagram of a cinema synchronized network film viewing system according to an embodiment of the present application;
fig. 2 is a flowchart of a cinema synchronized network film viewing system according to an embodiment of the present application;
fig. 3 is a block diagram of a cinema platform according to an embodiment of the present application;
FIG. 4 is a flowchart of adjusting a video frame of a film showing according to an embodiment of the present application;
fig. 5 is a schematic block diagram of an electronic device according to an embodiment of the present application.
Icon: 1-a user terminal; a 2-VR device control end; 3-an internet platform; 4-cinema platform; 410-a movie information acquisition module to be projected; 420-a cinema movie database construction module; 430-a cinema movie database update module; 5-virtual reality devices; 101-memory; 102-a processor; 103-communication interface.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Examples
Referring to fig. 1 and fig. 2, fig. 1 is a block diagram illustrating a structure of a cinema synchronized network film viewing system according to an embodiment of the present application, and fig. 2 is a flowchart illustrating a cinema synchronized network film viewing system according to an embodiment of the present application. The embodiment of the application provides a cinema synchronous network film watching system, which comprises a user terminal 1, a VR equipment control end 2, an Internet platform 3, a cinema platform 4 and a plurality of virtual reality equipment 5, wherein the user terminal 1, the VR equipment control end 2 and the cinema platform 4 are respectively connected with the Internet platform 3, any one of the virtual reality equipment 5 is connected with the VR equipment control end 2, and the cinema platform 4 comprises a cinema film database;
specifically, the VR device control end 2 is configured to transmit information and a control instruction to any one of the virtual reality devices 5, and control any one of the virtual reality devices 5.
A connection is established between the user terminal 1 and the target virtual reality device 5; the internet platform 3 acquires the movie information input by the user terminal 1 and matches it against the cinema movie database to obtain the corresponding movie projection video data and projection time;
The cinema movie database contains the movie projection video data and projection time of every movie currently showing in each movie theatre.
In some implementations of this embodiment, the step of establishing the connection between the user terminal 1 and the target virtual reality device 5 includes: the user terminal 1 scans the two-dimensional code of the target virtual reality device 5 to obtain the ID number of the target virtual reality device 5. The ID numbers of the user terminal 1 and the target virtual reality device 5 are bound to establish a connection of the user terminal 1 and the target virtual reality device 5.
Specifically, the user wears the target virtual reality device 5, and a connection between the user terminal 1 and the target virtual reality device 5 is established by scanning the two-dimensional code on the device with the user terminal 1. The user then inputs movie information via the user terminal 1, and the internet platform 3 matches the movie information against the cinema movie database to obtain the movie projection video data and projection time consistent with that information.
For example, the movie information may be a movie name, names of cast and crew members, a showing time, and the like.
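The scan-bind-match flow above can be sketched as follows. This is a minimal illustration only: the record type, the in-memory stand-in for the cinema movie database, and all names and URIs are assumptions, not details given in the patent.

```python
from dataclasses import dataclass

# Hypothetical record type; field names are illustrative.
@dataclass
class Showing:
    title: str
    video_uri: str
    showing_time: str  # e.g. "2023-07-01T19:30"

# A minimal in-memory stand-in for the cinema movie database.
CINEMA_DB = {
    "Example Film": Showing("Example Film", "rtmp://cinema/example", "2023-07-01T19:30"),
}

def bind_terminal(qr_payload: str) -> str:
    """Scanning the device's two-dimensional code yields its ID number;
    binding is modelled here as returning that ID for later addressing."""
    return qr_payload.strip()

def match_movie(movie_info: str):
    """Match the user's movie information (here: the title) against the
    database, returning the video data reference and projection time."""
    showing = CINEMA_DB.get(movie_info)
    if showing is None:
        return None
    return showing.video_uri, showing.showing_time

device_id = bind_terminal("VR-0042")
result = match_movie("Example Film")
```

In a real system the lookup would also consult the other movie-information fields (cast and crew names, showing time) mentioned above.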
Collecting binocular position data and binocular size data of a user through a camera arranged on target virtual reality equipment 5, and uploading the binocular position data and the binocular size data to an Internet platform 3 through a VR equipment control end 2;
for example, when the user wears the target virtual reality device 5, the camera provided on the device can photograph both of the user's eyes to determine the binocular position data and binocular size data. The collected binocular position data and binocular size data are then uploaded to the internet platform 3 via the VR device control end 2.
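The upload step above might package the collected measurements roughly as follows. The JSON field names and units are assumptions made for illustration; the patent does not specify a wire format.

```python
import json

def build_eye_payload(device_id, left_xy, right_xy, left_size, right_size):
    """Package binocular position and size data for upload to the internet
    platform via the VR device control end. Field names are illustrative."""
    return json.dumps({
        "device_id": device_id,
        "binocular_position": {"left": left_xy, "right": right_xy},
        "binocular_size": {"left": left_size, "right": right_size},
    })

# Example measurements (pixel coordinates and millimetre sizes, assumed units).
payload = build_eye_payload("VR-0042", [310, 220], [410, 221], 28.5, 28.1)
```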
The Internet platform 3 inputs the binocular position data and binocular size data into a trained visual field range recognition model to obtain a visual field recognition result;
specifically, the internet platform 3 analyses the binocular position data and the binocular size data through the trained visual field range recognition model and obtains a visual field recognition result, i.e. the result of analysing the user's eyes.
The internet platform 3 acquires the display screen data and ID number of the target virtual reality device 5, adjusts each frame of the movie projection video data according to the display screen data and the visual field recognition result to obtain adjusted movie projection video data, and transmits the adjusted movie projection video data to the VR device control end 2 according to the projection time;
referring to fig. 4, fig. 4 is a flowchart illustrating an adjustment of a film projection video according to an embodiment of the application. In some implementations of this embodiment, the visual field recognition results include a left eye visual field range, a right eye visual field range, and a left-right eye parallax. The step of adjusting each picture of the film projection video data to obtain the film projection video data according to the display screen data and the visual field recognition result comprises the following steps: the display screen of the target virtual reality device 5 is divided into a first sub-screen and a second sub-screen according to the left-right eye parallax. And determining the display range of the first sub-screen according to the left eye visual field range and the display screen data. And determining the display range of the second sub-screen according to the right eye visual field range and the display screen data. And adjusting each picture of the film projection video data according to the display range of the first sub-screen and the display range of the second sub-screen so as to obtain the film projection video data.
Specifically, the internet platform 3 may acquire the display screen data and the ID number of the target virtual reality device 5 through the user terminal 1 or the VR device control end 2, and then, using the left-right eye parallax in the visual field recognition result and the principle of binocular parallax, divide the display screen into a first sub-screen and a second sub-screen. The display range of the first sub-screen is determined from the left eye visual field range and the display screen data, and the display range of the second sub-screen from the right eye visual field range and the display screen data. Each frame is then adjusted to fit the display ranges of the two sub-screens, yielding the adjusted movie projection video data. The adjusted data are transmitted to the VR device control end 2, which projects them onto the display screen of the target virtual reality device 5 so that the left and right sub-screens present the picture to the left and right eyes respectively, realising a 3D effect.
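The sub-screen division described above can be sketched numerically. The patent does not give formulas, so the parallax offset and the treatment of each eye's visual field range as a fraction of its sub-screen are illustrative assumptions.

```python
def split_display(screen_w, screen_h, parallax_px, left_fov, right_fov):
    """Divide a display of screen_w x screen_h pixels into two sub-screens
    (one per eye) and compute each sub-screen's display range.

    left_fov / right_fov are assumed fractions in (0, 1] of the sub-screen
    width that the corresponding eye can comfortably see; parallax_px is
    the left-right eye parallax expressed in pixels (an assumption).
    """
    half = screen_w // 2
    # First (left-eye) sub-screen, shifted by half the parallax.
    first = {
        "x": 0,
        "width": int(half * left_fov),
        "height": screen_h,
        "offset": -parallax_px // 2,
    }
    # Second (right-eye) sub-screen, shifted the opposite way.
    second = {
        "x": half,
        "width": int(half * right_fov),
        "height": screen_h,
        "offset": parallax_px // 2,
    }
    return first, second

first, second = split_display(1920, 1080, parallax_px=12, left_fov=0.75, right_fov=0.5)
```

Each video frame would then be cropped and scaled to the two display ranges before projection.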
And the VR device control end 2 projects the adjusted movie projection video data onto the display screen of the target virtual reality device 5 according to the ID number.
Specifically, according to the projection time, the internet platform 3 transmits the adjusted movie projection video data to the VR device control end 2, which projects the data onto the display screen of the target virtual reality device 5 so that the target virtual reality device 5 plays synchronously with the cinema screen. The aim of synchronously watching a cinema showing at home is thereby achieved.
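The projection-time synchronisation above reduces to scheduling the transmission so that playback starts with the cinema showing. A minimal sketch, assuming ISO-style time strings and ignoring network latency compensation, which the patent does not discuss:

```python
import datetime

def seconds_until_showing(showing_time: str, now: str) -> float:
    """Return the delay in seconds before the adjusted movie projection
    video data should be pushed to the VR device control end; 0 if the
    showing time has already arrived."""
    fmt = "%Y-%m-%dT%H:%M"
    start = datetime.datetime.strptime(showing_time, fmt)
    current = datetime.datetime.strptime(now, fmt)
    return max(0.0, (start - current).total_seconds())

# Thirty minutes before the showing, the platform would wait 1800 seconds.
delay = seconds_until_showing("2023-07-01T19:30", "2023-07-01T19:00")
```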
Referring to fig. 3, fig. 3 is a block diagram illustrating a cinema platform 4 according to an embodiment of the present application. In some implementations of this embodiment, the cinema platform 4 includes:
a movie information to be shown acquisition module 410, configured to acquire all movie information to be shown in each movie theater, where any movie information to be shown includes movie video data to be shown and showing time;
a cinema movie database construction module 420 for constructing a cinema movie database using all movie information to be projected for all movie theaters;
and the cinema movie database updating module 430 is configured to acquire the latest movie information of each cinema in real time, and store the latest movie information in the cinema movie database.
Specifically, each cinema inputs all the movie information to be shown in the cinema platform 4, the cinema platform 4 establishes a cinema movie database by using all the collected movie information to be shown in all the cinemas, and collects the latest movie information of each cinema in real time so as to update the cinema movie database by using the latest movie information, thereby ensuring that the movie video data and showing time in the cinema movie database are consistent with the movies shown in the actual cinema.
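The three cinema-platform modules above (acquisition, database construction, real-time update) can be sketched as one small class. The key-value structure and all names are illustrative assumptions, not the patent's implementation.

```python
class CinemaMovieDatabase:
    """Minimal stand-in for the cinema movie database, keyed by
    (cinema, title) and storing (video data reference, projection time)."""

    def __init__(self):
        self._db = {}

    def build(self, all_showings):
        """Construct the database from every cinema's to-be-shown movie
        information (the construction module's role)."""
        for cinema, title, video, when in all_showings:
            self._db[(cinema, title)] = (video, when)

    def update(self, cinema, title, video, when):
        """Store the latest movie information, overwriting stale entries so
        the database stays consistent with what cinemas actually show
        (the update module's role)."""
        self._db[(cinema, title)] = (video, when)

    def lookup(self, cinema, title):
        return self._db.get((cinema, title))

db = CinemaMovieDatabase()
db.build([("Cinema A", "Example Film", "uri-1", "19:30")])
# A schedule change arrives in real time and replaces the stale entry.
db.update("Cinema A", "Example Film", "uri-2", "20:00")
```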
In some implementations of the present embodiment, the internet platform 3 includes:
the initial model building module is used for building a visual field range identification initial model;
a sample acquisition module for acquiring a plurality of samples, wherein the samples comprise binocular data and visual field range data of a plurality of users;
and the model training module is used for training the visual field range recognition initial model by using a plurality of samples to obtain a trained visual field range recognition model.
In particular, a plurality of volunteers may be recruited, and their binocular data and visual field range data collected as samples. The visual field range recognition initial model is trained with these samples to obtain the trained visual field range recognition model, which can then analyse binocular position data and binocular size data to produce the corresponding visual field recognition result.
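The patent does not specify the model architecture, so as a stand-in the sketch below uses a nearest-neighbour lookup over volunteer samples: "training" memorises (binocular data, visual field result) pairs and "prediction" returns the result of the closest stored sample. All feature choices and numbers are invented for illustration.

```python
import math

class FieldOfViewModel:
    """Illustrative stand-in for the visual field range recognition model."""

    def __init__(self):
        self._samples = []  # list of (binocular feature vector, fov result)

    def train(self, samples):
        """Memorise (binocular data, visual field range) sample pairs."""
        self._samples = list(samples)

    def predict(self, features):
        """Return the visual field result of the nearest stored sample."""
        best = min(self._samples, key=lambda s: math.dist(s[0], features))
        return best[1]

model = FieldOfViewModel()
# Features here are (interpupillary distance, eye size); units are assumed.
model.train([
    ((62.0, 28.0), {"left_fov": 0.94, "right_fov": 0.95, "parallax": 12}),
    ((58.0, 26.5), {"left_fov": 0.92, "right_fov": 0.92, "parallax": 10}),
])
result = model.predict((61.5, 27.8))
```

A production system would more plausibly use a trained regression or neural model; the lookup merely shows the train-then-predict interface the modules describe.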
Referring to fig. 5, fig. 5 is a schematic block diagram of an electronic device according to an embodiment of the present application. The electronic device comprises a memory 101, a processor 102 and a communication interface 103, wherein the memory 101, the processor 102 and the communication interface 103 are electrically connected with each other directly or indirectly to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The memory 101 may be used to store software programs and modules, such as program instructions/modules corresponding to a cinema synchronized network viewing system provided in an embodiment of the present application, and the processor 102 executes the software programs and modules stored in the memory 101, thereby performing various functional applications and data processing. The communication interface 103 may be used for communication of signaling or data with other node devices.
The Memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), etc.
The processor 102 may be an integrated circuit chip with signal processing capability. The processor 102 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), etc.; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
It will be appreciated that the configuration shown in fig. 5 is merely illustrative, and that the electronic device may also include more or fewer components than shown in fig. 5, or have a different configuration than shown in fig. 5. The components shown in fig. 5 may be implemented in hardware, software, or a combination thereof.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may also be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, segment or portion of code comprising one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures; for example, two blocks shown in succession may in fact be executed substantially concurrently, or sometimes in the reverse order, depending on the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of such blocks, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, or the part of it contributing to the prior art, may essentially be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program code.
It will be evident to those skilled in the art that the application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (7)

1. A cinema synchronous network film watching system, characterized by comprising a user terminal, a VR equipment control end, an Internet platform, a cinema platform, and a plurality of virtual reality equipment, wherein the user terminal, the VR equipment control end, and the cinema platform are each connected with the Internet platform, each of the virtual reality equipment is connected with the VR equipment control end, and the cinema platform comprises a cinema movie database;
the user terminal establishes a connection with a target virtual reality equipment; the Internet platform acquires movie information input by the user terminal and matches it against the cinema movie database to obtain the corresponding movie projection video data and projection time;
a camera arranged on the target virtual reality equipment collects binocular position data and binocular size data of a user, which are uploaded to the Internet platform through the VR equipment control end;
the Internet platform inputs the binocular position data and the binocular size data into a trained visual field range recognition model to obtain a visual field recognition result;
the Internet platform acquires the display screen data and ID number of the target virtual reality equipment, adjusts each frame of the movie projection video data according to the display screen data and the visual field recognition result to obtain adjusted movie projection video data, and transmits the adjusted movie projection video data to the VR equipment control end at the projection time;
and the VR equipment control end, according to the ID number, casts the adjusted movie projection video data to the display screen of the target virtual reality equipment.
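The matching step of claim 1 (looking up user-supplied movie information in the cinema movie database to retrieve the video data and projection time) can be sketched as follows. This is an illustrative in-memory mock, not the patent's implementation; all class and attribute names here are assumptions.

```python
# Minimal sketch of the claim-1 lookup flow: the Internet platform
# matches movie information from the user terminal against the cinema
# movie database. Names (MovieRecord, CinemaMovieDatabase, etc.) are
# illustrative stand-ins, not taken from the patent.
from dataclasses import dataclass


@dataclass
class MovieRecord:
    title: str
    video_data: bytes      # movie projection video data
    projection_time: str   # e.g. "2023-06-30 19:30"


class CinemaMovieDatabase:
    def __init__(self):
        self._records = {}

    def add(self, record):
        self._records[record.title] = record

    def match(self, movie_info):
        # Match the movie information entered at the user terminal
        return self._records.get(movie_info)


class InternetPlatform:
    def __init__(self, db):
        self.db = db

    def handle_request(self, movie_info):
        record = self.db.match(movie_info)
        if record is None:
            raise KeyError(f"no scheduled showing for {movie_info!r}")
        return record


db = CinemaMovieDatabase()
db.add(MovieRecord("Example Film", b"...", "2023-06-30 19:30"))
platform = InternetPlatform(db)
hit = platform.handle_request("Example Film")
print(hit.projection_time)  # prints 2023-06-30 19:30
```

A real deployment would back the database with persistent storage and push the video data to the VR equipment control end at the scheduled projection time; the sketch only shows the lookup.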
2. The theater synchronous network viewing system of claim 1, wherein the cinema platform comprises:
a to-be-projected movie information acquisition module for acquiring all to-be-projected movie information of each movie theatre, wherein each item of to-be-projected movie information comprises movie video data to be projected and a projection time;
a cinema movie database construction module for constructing the cinema movie database from all to-be-projected movie information of all movie theatres;
and a cinema movie database updating module for acquiring the latest movie information of each cinema in real time and storing it into the cinema movie database.
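The three modules of claim 2 (acquire per-theater showings, build the database, update it in real time) can be sketched as plain functions. This is a hedged illustration under the assumption that the database is a per-theater mapping; the function names are not from the patent.

```python
# Sketch of claim 2's modules: building the cinema movie database from
# all theaters' to-be-projected movie information, and appending the
# latest information as it arrives. All names are illustrative.
from collections import defaultdict


def build_cinema_movie_database(theaters):
    """theaters maps a theater name to a list of movie-info dicts,
    each with 'title', 'video', and 'projection_time' keys."""
    db = defaultdict(list)
    for theater, showings in theaters.items():
        db[theater].extend(showings)
    return db


def update_database(db, theater, latest):
    # Store the latest movie information acquired in real time
    db.setdefault(theater, []).append(latest)


theaters = {
    "Theater A": [{"title": "Film X", "video": b"", "projection_time": "19:00"}],
    "Theater B": [{"title": "Film Y", "video": b"", "projection_time": "21:00"}],
}
db = build_cinema_movie_database(theaters)
update_database(db, "Theater A",
                {"title": "Film Z", "video": b"", "projection_time": "23:00"})
```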
3. The theater synchronous network viewing system of claim 1, wherein the visual field recognition result comprises a left eye visual field range, a right eye visual field range, and a left-right eye parallax;
and wherein the step of adjusting each frame of the movie projection video data according to the display screen data and the visual field recognition result to obtain the adjusted movie projection video data comprises:
dividing the display screen of the target virtual reality equipment into a first sub-screen and a second sub-screen according to the left-right eye parallax;
determining the display range of the first sub-screen according to the left eye visual field range and the display screen data;
determining the display range of the second sub-screen according to the right eye visual field range and the display screen data;
and adjusting each frame of the movie projection video data according to the display range of the first sub-screen and the display range of the second sub-screen to obtain the adjusted movie projection video data.
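The sub-screen split of claim 3 can be sketched with simple rectangle arithmetic. The patent does not specify the geometry, so the assumptions here (a horizontal split whose boundary shifts by half the parallax, and a clamp to the recognized visual field range) are purely illustrative.

```python
# Hedged sketch of claim 3: divide the display screen into two
# per-eye sub-screens according to the left-right eye parallax, then
# restrict each to the recognized visual field range. The geometry is
# an assumption, not taken from the patent.
def split_display(width, height, parallax_px):
    """Return (first_sub_screen, second_sub_screen) as
    (x, y, w, h) rectangles for the left and right eye."""
    half = width // 2
    # Shift the split boundary by half the left-right eye parallax
    boundary = half + parallax_px // 2
    first = (0, 0, boundary, height)
    second = (boundary, 0, width - boundary, height)
    return first, second


def clamp_to_visual_field(rect, fov_width, fov_height):
    # Limit a sub-screen's display range to the eye's visual field
    x, y, w, h = rect
    return (x, y, min(w, fov_width), min(h, fov_height))


first, second = split_display(1920, 1080, 0)
print(first)   # prints (0, 0, 960, 1080)
print(second)  # prints (960, 0, 960, 1080)
```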
4. The theater synchronous network viewing system of claim 1, wherein the step of establishing the connection between the user terminal and the target virtual reality equipment comprises:
the user terminal scans a two-dimensional code on the target virtual reality equipment to obtain the ID number of the target virtual reality equipment;
and binding the user terminal to the ID number of the target virtual reality equipment, thereby establishing the connection between the user terminal and the target virtual reality equipment.
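The pairing step of claim 4 can be sketched as a small binding table. The QR decoding is stubbed out (a real client would use a camera and a QR library), and the assumption that the two-dimensional code encodes the device ID directly is illustrative.

```python
# Sketch of claim 4: the user terminal scans the device's QR code to
# obtain its ID number, then binds to it. decode_qr() is a stub under
# the assumption that the QR payload is the ID itself.
bindings = {}  # user terminal ID -> virtual reality equipment ID


def decode_qr(qr_payload):
    # Stand-in for real QR decoding; assume the code encodes the ID
    return qr_payload.strip()


def bind(user_id, qr_payload):
    device_id = decode_qr(qr_payload)
    bindings[user_id] = device_id  # establishes the connection
    return device_id


bound_id = bind("user-terminal-1", "  VR-0042  ")
print(bound_id)  # prints VR-0042
```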
5. The theater synchronous network viewing system of claim 1, wherein the Internet platform comprises:
an initial model building module for building an initial visual field range recognition model;
a sample acquisition module for acquiring a plurality of samples, wherein the samples comprise binocular data and visual field range data of a plurality of users;
and a model training module for training the initial visual field range recognition model with the plurality of samples to obtain the trained visual field range recognition model.
6. An electronic device, comprising:
a memory for storing one or more programs;
a processor;
wherein the one or more programs, when executed by the processor, implement the system of any one of claims 1-5.
7. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the system of any one of claims 1-5.
CN202310796525.XA, filed 2023-06-30 (priority 2023-06-30): Movie theatre synchronous network film watching system. Status: Pending. Published as CN116860193A.

Priority Applications (1)

CN202310796525.XA, priority/filing date 2023-06-30: Movie theatre synchronous network film watching system


Publications (1)

CN116860193A, published 2023-10-10

Family ID: 88233365

Family Applications (1)

CN202310796525.XA, priority/filing date 2023-06-30: Movie theatre synchronous network film watching system

Country Status (1)

CN: CN116860193A


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination