US10362283B2 - Virtual cinema and implementation method thereof - Google Patents

Virtual cinema and implementation method thereof

Info

Publication number
US10362283B2
US10362283B2
Authority
US
United States
Prior art keywords
ambient light
virtual
data
virtual screen
video content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/211,671
Other versions
US20170195646A1 (en)
Inventor
Ruisheng ZHANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Pico Technology Co Ltd
Original Assignee
Beijing Pico Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Pico Technology Co Ltd filed Critical Beijing Pico Technology Co Ltd
Assigned to BEIJING PICO TECHNOLOGY CO., LTD. reassignment BEIJING PICO TECHNOLOGY CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ZHANG, RUISHENG
Publication of US20170195646A1 publication Critical patent/US20170195646A1/en
Application granted granted Critical
Publication of US10362283B2 publication Critical patent/US10362283B2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Video Image Reproduction Devices For Color Tv Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present disclosure discloses a virtual cinema and an implementation method thereof. The implementation method of the virtual cinema comprises: arranging a virtual screen and playing a video content thereon; and projecting ambient light varying with the video content around the virtual screen. By projecting ambient light varying with the video content around the virtual screen while displaying the video content to the user through the virtual screen, the sense of reality is enhanced, thereby solving the problem that the user experience is degraded because ambient light is not sufficiently considered in existing virtual cinemas.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application claims the benefit of and priority to Chinese Application Number 201511026692.8, filed Dec. 31, 2015. The entire disclosure of the above application is incorporated herein by reference.
FIELD
The present disclosure relates to the technical field of virtual realities, and particularly, to a virtual cinema and an implementation method thereof.
BACKGROUND
This section provides background information related to the present disclosure which is not necessarily prior art.
With the gradual maturation of virtual reality technology, watching films in a virtual cinema has become an important virtual reality application. The virtual cinema, also referred to as a virtual reality player, enables a user to feel as if watching films in a real cinema. For this purpose, ambient light varying with the screen light is indispensable. Currently, however, no virtual cinema adds such ambient light, so the film watching effect does not feel real enough to the user and the user experience is degraded.
SUMMARY
This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
In order to solve the above problem, the present disclosure provides a virtual cinema and an implementation method thereof.
According to one aspect of the present disclosure, an implementation method of a virtual cinema is provided, comprising:
arranging a virtual screen, and playing a video content thereon; and
projecting ambient light varying with the video content around the virtual screen.
Wherein, the method further comprises:
obtaining data of each frame of image of the video content in real time, in a process of playing the video content on the virtual screen; and
obtaining ambient light data corresponding to the data of each frame of image in real time according to the obtained data of each frame of image.
Wherein, obtaining ambient light data corresponding to the data of each frame of image in real time according to the obtained data of each frame of image comprises:
performing a Gaussian blur algorithm on the data of each frame of image to process it into data including only colors, which serves as the ambient light data.
Wherein, the method arranges the virtual screen using a Unity3D or OpenGL technique, and projects ambient light varying with the video content around the virtual screen.
Wherein, the method is applicable to a virtual reality device, an augmented reality device and an ordinary video player.
According to another aspect of the present disclosure, a virtual cinema is provided, comprising a virtual screen and a projection module;
the virtual screen is configured to play a video content; and
the projection module is configured to project ambient light varying with the video content around the virtual screen.
Wherein, the virtual cinema further comprises an ambient light obtaining module;
the ambient light obtaining module is connected to the virtual screen and the projection module, respectively, and configured to obtain data of each frame of image played by the virtual screen in real time, obtain ambient light data corresponding to the data of each frame of image in real time according to the data of each frame of image, and transmit the ambient light data to the projection module.
Wherein, the ambient light obtaining module is specifically configured to:
perform a Gaussian blur algorithm on the data of each frame of image to process it into data including only colors, which serves as the ambient light data.
Wherein, the virtual cinema implements the virtual screen and the projection module by using a Unity3D or OpenGL technique.
Wherein, the virtual cinema is applicable to a virtual reality device, an augmented reality device and an ordinary video player.
The embodiments of the present disclosure have the following beneficial effect: by projecting ambient light varying with the video content around the virtual screen while displaying the video content to the user through the virtual screen, the sense of reality is enhanced and the user experience is improved. In the preferred embodiment, by obtaining ambient light varying with the played video content in real time according to the video content played by the virtual screen, the ambient effect is vivid and it is unnecessary to preprocess the video, thereby saving resources.
Further aspects and areas of applicability will become apparent from the description provided herein. It should be understood that various aspects of this disclosure may be implemented individually or in combination with one or more other aspects. It should also be understood that the description and specific examples herein are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS
The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
FIG. 1 is a flow diagram of an implementation method of a virtual cinema provided by an embodiment of the present disclosure;
FIG. 2 is a flow diagram of an implementation method of a virtual cinema provided by a preferred embodiment of the present disclosure; and
FIG. 3 is a system structure diagram of a virtual cinema provided by a preferred embodiment of the present disclosure.
DETAILED DESCRIPTION
Example embodiments will now be described more fully with reference to the accompanying drawings.
FIG. 1 is a flow diagram of an implementation method of a virtual cinema provided by an embodiment of the present disclosure. As illustrated in FIG. 1, the implementation method of the virtual cinema provided by the embodiment of the present disclosure comprises:
step S110: arranging a virtual screen, and playing a video content thereon;
step S120: projecting ambient light varying with the video content around the virtual screen.
By using the implementation method of the virtual cinema provided by the embodiment of the present disclosure, when a film is watched in the virtual cinema the ambient light varies with the played video content, i.e., the surrounding light may be dimmed or brightened as the image on the virtual screen switches, so the viewer feels as if watching the film in a real cinema and the user experience is improved.
Preferably, the ambient light in “step S120: projecting ambient light varying with the video content around the virtual screen” is obtained in the steps of:
obtaining data of each frame of image of the video content in real time, in a process of playing the video content on the virtual screen; and
obtaining ambient light data corresponding to the data of each frame of image in real time according to the obtained data of each frame of image.
In this embodiment, the ambient light is obtained in real time according to the played video content, and there is no special requirement on the video data, i.e., the video itself is not required to carry any ambient light data. Thus, by using the implementation method of the virtual cinema provided by the embodiment of the present disclosure, ambient light varying with the video content can be added while playing any ordinary video, thereby improving the user experience.
Further, obtaining ambient light data corresponding to the data of each frame of image in real time according to the obtained data of each frame of image is to perform a Gaussian blur algorithm for the data of each frame of image to process the data of each frame of image into data only including colors, which serves as the ambient light data.
Gaussian blur, also referred to as Gaussian smoothing, is widely used in image processing and can effectively reduce image noise and the level of detail. The Gaussian blur algorithm weights the pixel color values around a given point according to a Gaussian curve and obtains a color value as their weighted average. The Gaussian blur algorithm is performed on the data of each frame of image to process it into ambient data including only colors. Next, the obtained ambient light data is projected around the virtual screen to achieve a lighting effect. Because the ambient light data obtained after the Gaussian blur comes from the virtual screen, the ambient effect is vivid while varying with the video content, which immerses the user in the virtual cinema and improves the user experience.
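The Gaussian-weighted averaging described above can be sketched in plain Python. This is an illustrative approximation of the idea, not the patented Unity3D implementation; `ambient_color`, `gaussian_kernel` and the frame representation (a 2-D grid of (r, g, b) tuples) are hypothetical names and conventions chosen for the sketch:

```python
import math

def gaussian_kernel(radius, sigma):
    """1-D Gaussian weights, normalized so they sum to 1."""
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def ambient_color(frame, radius=1, sigma=1.0):
    """Collapse a frame (2-D grid of (r, g, b) tuples) into a single
    ambient color: separable Gaussian blur, then a plain average."""
    kernel = gaussian_kernel(radius, sigma)
    h, w = len(frame), len(frame[0])

    def blur_pass(img, horizontal):
        # One pass of the separable blur; indices are clamped at borders.
        out = [[None] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                acc = [0.0, 0.0, 0.0]
                for k, wgt in zip(range(-radius, radius + 1), kernel):
                    yy = min(max(y + (0 if horizontal else k), 0), h - 1)
                    xx = min(max(x + (k if horizontal else 0), 0), w - 1)
                    px = img[yy][xx]
                    for c in range(3):
                        acc[c] += wgt * px[c]
                out[y][x] = tuple(acc)
        return out

    blurred = blur_pass(blur_pass(frame, True), False)
    n = float(h * w)
    return tuple(sum(px[c] for row in blurred for px in row) / n
                 for c in range(3))
```

In practice such a blur would run on the GPU (e.g., as a shader pass over the video texture) rather than per pixel in script code; the sketch only shows the arithmetic.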
FIG. 2 is a flow diagram of an implementation method of a virtual cinema provided by a preferred embodiment of the present disclosure. In the implementation method of the virtual cinema provided by the present disclosure, a virtual screen may be arranged using techniques such as Unity3D or OpenGL, and ambient light varying with the video content may be projected around the virtual screen. As illustrated in FIG. 2, an implementation method of a virtual cinema provided by a preferred embodiment of the present disclosure comprises:
Step S210: arranging a virtual screen using the NGUI plugin in Unity3D. NGUI provides a powerful UI system and an event notification framework, with concise code, simple operation and high performance.
Step S220: arranging a projector on the virtual screen to project around the virtual cinema. Unity3D provides a Projector component, by which a material can be projected onto the scene.
Step S230: obtaining data of each frame of image of a video in real time in a process of playing the video in the virtual cinema.
Step S240: processing the data of each frame of image using a Gaussian blur algorithm to obtain data including only colors. The Gaussian blur algorithm weights the pixel color values around a given point according to a Gaussian curve and obtains a color value as their weighted average.
Step S250: applying the color-only data obtained in step S240 to the projector, which projects ambient light of the corresponding colors around the virtual screen. When a frame of image changes, a different color value is obtained and different ambient light is projected, so ambient light varying with the video content is achieved.
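Steps S230 through S250 amount to a per-frame loop: take a frame, collapse it to color-only data, hand that color to the projector. A minimal Python sketch follows, with a stand-in class for Unity3D's Projector component; `AmbientProjector`, `mean_color` and `play` are hypothetical names, and the frame is simplified to a flat list of (r, g, b) pixels:

```python
class AmbientProjector:
    """Stand-in for Unity3D's Projector component: records the color
    it is currently asked to cast around the virtual screen."""
    def __init__(self):
        self.color = None

    def project(self, color):
        self.color = color

def mean_color(frame):
    """Step S240 reduced to its essence: collapse a frame (a flat list
    of (r, g, b) pixels) into a single color value."""
    n = len(frame)
    return tuple(sum(px[c] for px in frame) / n for c in range(3))

def play(frames, projector):
    """Steps S230-S250: for each played frame, derive the ambient color
    and hand it to the projector, so the light varies with the content."""
    shown = []
    for frame in frames:
        color = mean_color(frame)   # S230 + S240
        projector.project(color)    # S250
        shown.append(projector.color)
    return shown
```

A red frame followed by a blue frame would thus project red and then blue ambient light, which is exactly the "ambient light varying with the video content" behavior the method describes.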
The implementation method of a virtual cinema provided by the present disclosure is particularly suitable for virtual reality devices. Ambient light varying with the screen light is added to the virtual cinema, and the ambient light around each frame of picture varies with that picture, which enables a user to feel as if watching films in a real cinema, thereby enhancing the sense of reality and improving the user experience. In addition, the implementation method of a virtual cinema provided by the present disclosure is also suitable for an augmented reality device and an ordinary video player.
FIG. 3 is a system structure diagram of a virtual cinema provided by a preferred embodiment of the present disclosure. As illustrated in FIG. 3, the virtual cinema provided by the embodiment of the present disclosure comprises a virtual screen 310 and a projection module 330.
The virtual screen 310 is configured to play a video content. It corresponds to the screen of a real cinema, and a user can watch the video content, such as a film, through the virtual screen 310.
The projection module 330 is configured to project ambient light varying with the video content around the virtual screen 310.
An existing virtual cinema usually plays a video within the entire range of visibility of the user, or within part of that range while displaying no content in the other parts. When watching a film in a real cinema, people see, in addition to the content presented on the screen, ambient light projected from the screen onto the walls, the ground, etc. Because this ambient light is not considered in existing virtual cinemas, the sense of reality is weakened. The virtual cinema provided by the embodiment of the present disclosure plays the video content using the virtual screen 310, which occupies only part of the user's range of visibility, usually the middle part, while the projection module 330 projects ambient light varying with the video content around the virtual screen 310, i.e., in the other parts of the user's range of visibility, so that the user experiences film watching as in a real cinema.
The virtual cinema provided by a preferred embodiment of the present disclosure further comprises an ambient light obtaining module 320 connected to the virtual screen 310 and the projection module 330, respectively. Firstly, the ambient light obtaining module 320 obtains data of each frame of image played by the virtual screen 310 in real time, then obtains ambient light data corresponding to the data of each frame of image in real time according to the data of each frame of image, and finally transmits the ambient light data to the projection module 330 which projects ambient light varying with the video content around the virtual screen 310. The ambient light is obtained by the ambient light obtaining module 320 in real time according to the content played by the virtual screen 310, thus it is unnecessary to preprocess the played video, and ambient light varying with the video content can be obtained when any ordinary video is played by the virtual cinema.
Preferably, the ambient light obtaining module 320 performs a Gaussian blur algorithm on the data of each frame of image, weighting the pixel color values around each point according to a Gaussian curve and obtaining a color value as their weighted average. It thereby processes the data of each frame of image into ambient data including only colors, and then projects the obtained color-only ambient light data around the virtual screen to achieve a lighting effect. The ambient light data comes from the virtual screen 310, thus the ambient effect is vivid while varying with the video content.
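The three-module structure of FIG. 3, virtual screen 310 feeding ambient light obtaining module 320 feeding projection module 330, can be sketched as follows. This is illustrative Python showing only the wiring, not the actual Unity3D implementation; all class and method names are hypothetical, and a frame is simplified to a flat list of (r, g, b) pixels:

```python
class VirtualScreen:
    """Module 310: plays the video content frame by frame."""
    def __init__(self, frames):
        self.frames = frames

    def play(self):
        yield from self.frames

class AmbientLightObtainer:
    """Module 320: turns each played frame into color-only
    ambient light data (here a simple average color)."""
    def obtain(self, frame):
        n = len(frame)
        return tuple(sum(px[c] for px in frame) / n for c in range(3))

class ProjectionModule:
    """Module 330: projects the received ambient light data
    around the virtual screen (recorded here for inspection)."""
    def __init__(self):
        self.history = []

    def project(self, color):
        self.history.append(color)

class VirtualCinema:
    """Wires the modules 310 -> 320 -> 330 as in FIG. 3."""
    def __init__(self, screen, obtainer, projector):
        self.screen = screen
        self.obtainer = obtainer
        self.projector = projector

    def run(self):
        for frame in self.screen.play():
            self.projector.project(self.obtainer.obtain(frame))
```

Because the obtaining module sits between the screen and the projector, no preprocessing of the video is needed: any ordinary video pushed through the screen yields per-frame ambient light data automatically.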
Further preferably, the virtual screen 310 and the projection module 330 are implemented using techniques such as Unity3D or OpenGL. For example, the virtual screen 310 may be arranged using the NGUI plugin in Unity3D, and the projection module 330 may be implemented using the Projector component provided by Unity3D.
The virtual cinema provided by the present disclosure is applicable to virtual reality devices, augmented reality devices and ordinary video players, and is particularly suitable for virtual reality devices, enabling a user to feel as if staying in a real scene.
In conclusion, a virtual cinema and an implementation method thereof provided by the present disclosure have the following beneficial effects as compared with the prior art:
1. The virtual cinema provided by the present disclosure projects ambient light varying with the video content around the virtual screen while displaying the video content to the user through the virtual screen, thereby enhancing the sense of reality and improving the user experience.
2. The virtual cinema provided by the present disclosure obtains, in real time, ambient light varying with the video content played by the virtual screen, thus the ambient effect is vivid and it is unnecessary to preprocess the video, thereby saving resources.
The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.

Claims (4)

The invention claimed is:
1. An implementation method of a virtual cinema, the implementation method applicable to a virtual reality device, the implementation method comprising:
arranging a virtual screen with a Unity3D technique;
providing a projector on the virtual screen and setting a direction of the projector to project around the virtual screen, via a projector component provided by Unity3D;
playing video content on the virtual screen;
obtaining data of each frame of image of the video content in real time when the video content is playing on the virtual screen;
obtaining ambient light data corresponding to the obtained data of each frame of image in real time;
applying the obtained ambient light data to the projector; and
projecting, via the projector, ambient light that varies with the video content around the virtual screen.
2. The implementation method of a virtual cinema according to claim 1, wherein obtaining ambient light data comprises performing a Gaussian blur algorithm to process the data of each frame of image into data including only colors, which serves as the ambient light data.
3. A virtual cinema applicable to a virtual reality device, the virtual cinema comprising a virtual screen, an ambient light obtaining module, and a projection module, the virtual screen arranged with a Unity3D technique and configured to play a video content, the ambient light obtaining module connected to the virtual screen and the projection module, the ambient light obtaining module configured to obtain data of each frame of image played by the virtual screen in real time, obtain ambient light data corresponding to the obtained data of each frame of image played by the virtual screen in real time, and transmit the obtained ambient light data to the projection module, and the projection module implemented using a projector component provided by Unity3D and configured to provide a projector on the virtual screen, set a direction of the projector to project around the virtual screen, and apply the obtained ambient light data to the projector, the projector configured to project ambient light that varies with the video content around the virtual screen.
4. The virtual cinema according to claim 3, wherein the ambient light obtaining module is configured to perform a Gaussian blur algorithm to process the data of each frame of image into data including only colors, which serves as the ambient light data.
US15/211,671 2015-12-31 2016-07-15 Virtual cinema and implementation method thereof Active 2036-12-07 US10362283B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201511026692.8 2015-12-31
CN201511026692 2015-12-31
CN201511026692.8A CN105657494B (en) 2015-12-31 2015-12-31 A kind of virtual theater and its implementation

Publications (2)

Publication Number Publication Date
US20170195646A1 US20170195646A1 (en) 2017-07-06
US10362283B2 true US10362283B2 (en) 2019-07-23

Family

ID=56491079

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/211,671 Active 2036-12-07 US10362283B2 (en) 2015-12-31 2016-07-15 Virtual cinema and implementation method thereof

Country Status (2)

Country Link
US (1) US10362283B2 (en)
CN (1) CN105657494B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11494994B2 (en) * 2018-05-25 2022-11-08 Tiff's Treats Holdings, Inc. Apparatus, method, and system for presentation of multimedia content including augmented reality content

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106291932A (en) * 2016-09-06 2017-01-04 京东方科技集团股份有限公司 A kind of virtual implementing helmet
CN106534830B (en) * 2016-10-10 2019-04-26 上海蒙彤文化传播有限公司 A kind of movie theatre play system based on virtual reality
CN106843471A (en) * 2016-12-28 2017-06-13 歌尔科技有限公司 A kind of method of cinema system and viewing film based on virtual implementing helmet
CN107016718B (en) * 2017-02-20 2020-06-19 北京奇艺世纪科技有限公司 Scene rendering method and device
CN107071555B (en) * 2017-03-31 2020-07-17 奇酷互联网络科技(深圳)有限公司 Method and device for loading images in VR (virtual reality) video and electronic equipment
CN107135420A (en) * 2017-04-28 2017-09-05 歌尔科技有限公司 Video broadcasting method and system based on virtual reality technology
CN109151539B (en) * 2017-06-16 2021-05-28 武汉斗鱼网络科技有限公司 Video live broadcasting method, system and equipment based on unity3d
CN107908401B (en) * 2017-12-13 2021-06-01 上海幻维数码创意科技股份有限公司 Multimedia file making method based on Unity engine
CN108335362B (en) * 2018-01-16 2021-11-12 重庆爱奇艺智能科技有限公司 Light control method and device in virtual scene and VR (virtual reality) equipment
CN112087663B (en) * 2020-09-10 2021-09-28 北京小糖科技有限责任公司 Method for generating dance video with adaptive light and shade environment by mobile terminal
CN115866311B (en) * 2023-02-15 2023-05-05 深圳市天趣星空科技有限公司 Virtual screen surrounding atmosphere rendering method for intelligent glasses

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030128337A1 (en) * 2001-12-07 2003-07-10 Jaynes Christopher O. Dynamic shadow removal from front projection displays
US20060268180A1 (en) * 2005-05-31 2006-11-30 Chih-Hsien Chou Method and system for automatic brightness and contrast adjustment of a video source
US20100201878A1 (en) * 2006-03-31 2010-08-12 Koninklijke Philips Electronics N.V. Adaptive content rendering based on additional frames of content
US20140080638A1 (en) * 2012-09-19 2014-03-20 Board Of Regents, The University Of Texas System Systems and methods for providing training and instruction to a football kicker
US20140333660A1 (en) * 2011-12-08 2014-11-13 Dolby Laboratories Licensing Corporation Mapping for display emulation based on image characteristics
CN104185087A (en) 2014-08-19 2014-12-03 厦门美图之家科技有限公司 Switching method based on different video streams of one video file
US20160044298A1 (en) * 2014-08-08 2016-02-11 Leap Motion, Inc. Augmented reality with motion sensing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Office action from Chinese Application No. 201511026692.8 dated Feb. 14, 2018 (6 pages).


Also Published As

Publication number Publication date
CN105657494A (en) 2016-06-08
US20170195646A1 (en) 2017-07-06
CN105657494B (en) 2018-12-25


Legal Events

Date Code Title Description
AS Assignment

Owner name: BEIJING PICO TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ZHANG, RUISHENG;REEL/FRAME:039176/0580

Effective date: 20160713

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY