WO2018050021A1 - Virtual reality scene adjustment method, apparatus, and storage medium - Google Patents

Virtual reality scene adjustment method, apparatus, and storage medium

Info

Publication number
WO2018050021A1
Authority
WO
WIPO (PCT)
Prior art keywords
scene
theme
movie
television
work
Prior art date
Application number
PCT/CN2017/100988
Other languages
English (en)
French (fr)
Inventor
樊邵婷
陈淼
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2018050021A1 publication Critical patent/WO2018050021A1/zh
Priority to US16/206,156 priority Critical patent/US10880595B2/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2668Creating a channel for a dedicated end-user group, e.g. insertion of targeted commercials based on end-user profiles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H04N21/8153Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand

Definitions

  • the present application relates to the field of virtual reality (VR), and in particular, to a VR scene adjustment method, apparatus, and storage medium.
  • VR technology is one of the most closely watched cutting-edge technologies today and, led by several world-class technology giants, it is developing rapidly. VR technology, also known as "spirit realm" or artificial environment technology, is defined as a computer simulation system built by combining techniques from fields such as simulation technology, computer graphics, human-machine interface technology, multimedia technology, sensing technology and network technology; it can create experiences that were previously only available in the real world and let users feel them. Simply put, VR technology brings the user's perception into the virtual world it creates and makes the user believe that everything in front of them is real.
  • An embodiment of the present application provides a VR scene adjustment method, including: storing VR theme scenes in advance and determining their corresponding feature information; obtaining the movie and television work that the user selects to play; determining the feature information of the selected work; and determining, according to the feature information of the selected work and the feature information of the VR theme scenes, the VR theme scene corresponding to the selected work.
  • the embodiment of the present application further provides a VR scene adjustment apparatus, including:
  • a storage module configured to store a VR theme scene
  • a feature information extraction module configured to determine feature information of the stored VR theme scene
  • the obtaining module is configured to obtain a movie and television work selected by the user to play;
  • a first determining module configured to determine feature information of the selected movie and television work
  • the second determining module is configured to determine, according to the feature information of the selected movie and the feature information of the VR theme scene, the VR theme scene corresponding to the selected movie and television work.
  • Another VR scene adjustment method provided by the present application includes: collecting the colors of n points on the movie and television work being played to obtain the color code parameters of the n points, wherein n is a natural number greater than or equal to 2; generating the parameters of a dynamic light source according to the color code parameters of the n points; and replacing the parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source.
  • An acquisition module configured to collect color of n points on the played movie and television work, to obtain color code parameters of n points; wherein n is a natural number greater than or equal to 2;
  • A simulation module configured to generate the parameters of the dynamic light source according to the color code parameters of the n points;
  • the replacement module is configured to replace parameters of all dynamic light sources in the current VR theme scene with parameters of the generated dynamic light source.
  • Still another VR scene adjustment method provided by the present application includes: loading a configuration file of the played movie and television work, wherein the configuration file includes time nodes and corresponding scene atmosphere events; and, according to the configuration file, executing the scene atmosphere event corresponding to a time node when playback reaches that time node.
  • a loading module configured to load a configuration file of the played movie and television work before playing the movie and television work; wherein the configuration file includes a time node and a corresponding scene atmosphere event;
  • the execution module is configured to execute a scene atmosphere event corresponding to the time node when the movie is played to a certain time node according to the configuration file.
  • The embodiment of the present application further provides a non-transitory computer readable storage medium storing computer readable instructions that can cause at least one processor to perform the methods described above.
  • FIG. 1 is a schematic flowchart of a VR scene adjustment method in an example of the present application
  • FIG. 2 is a schematic flowchart of another VR scene adjustment method in an example of the present application.
  • FIG. 3 is a schematic flowchart of still another VR scene adjustment method in an example of the present application.
  • FIG. 4 is a schematic flowchart of a VR scene adjustment method in an example of the present application.
  • FIG. 5 is a schematic flowchart diagram of another VR scene adjustment method in an example of the present application.
  • FIG. 6 is a schematic flowchart of still another VR scene adjustment method in an example of the present application.
  • FIG. 7 is a schematic diagram of an internal structure of a VR scene adjustment apparatus according to an example of the present application.
  • FIG. 8 is a schematic diagram of an internal structure of another VR scene adjusting device in an example of the present application.
  • FIG. 9 is a schematic diagram of an internal structure of another VR scene adjustment apparatus according to an example of the present application.
  • FIG. 10 is a schematic structural diagram of hardware of a VR scene adjustment apparatus according to an example of the present application.
  • In existing VR systems, in order to enhance the playback effect of film and television works, the VR system also provides, in addition to the works themselves, a light and shadow theme that can be played together with the movie, for example a 360-degree panoramic picture, forming a surrounding, panoramic viewing hall. In this application, such a light and shadow theme is referred to as a VR theme scene. If the played video and the VR theme scene match properly, the VR theme scene will highlight the film's environment and atmosphere, making the film more realistic, bringing users into the virtual world more easily and increasing their immersion. For example, matching a horror movie with an eerie, frightening VR theme scene better brings out the movie's terrifying atmosphere and enhances the viewing effect.
  • The present application is proposed so that the VR system software can, according to the characteristics of the movie and television work being played, automatically adapt the theater scene, change the color matching and/or configure scene atmosphere events, letting the user become perceptually immersed in the work while reducing the operation steps of selecting a VR theme scene before viewing and greatly reducing the loss of computer performance.
  • the embodiment of the present application proposes a VR scene adjustment method, as shown in FIG. 1 , the method includes the following steps:
  • Step 101 Pre-categorize the stored VR theme scenes, and determine the category identifier of each VR theme scene.
  • VR theme scenes will be stored in the database of the VR system.
  • the VR theme scenes are classified according to the light and shadow effects of the VR theme scene, and the category identifiers corresponding to each VR theme scene are stored in the database, thereby forming a classification mechanism of the VR theme scene.
  • the VR theme scenes may be classified according to the categories classified for the movie and television works.
  • the VR theme scenes are also classified into categories such as horror movies, comedy films, romance films, and the like.
  • each type of film and television work type can have only one unique corresponding VR theme scene.
  • each type of film and television work type can also have multiple corresponding VR theme scenes.
  • the classification mechanism of the VR theme scene formed by classifying the VR theme scene can facilitate automatic adaptation and classification of the VR theme scene uploaded by the user later. And this classification mechanism can also be applied to the classification of film and television works.
  • Step 102 Obtain a movie and television work selected by the user to play.
  • the user can select a movie and television work to be played from a local database or a network database through a user interface provided by the VR system software.
  • the user can select a movie and television work to be played through a touch screen or a basic operation provided by a VR system software such as voice.
  • Step 103 Determine the classification of the selected movie and television works.
  • the movie and television works stored in the local database or the network database of the VR system contain the category identifier of the movie and television work.
  • the above category identifier is used to identify the category of the film and television work, for example, to identify whether a movie is a comedy film, a horror film or a romantic love film.
  • When a film and television work is published, its producer will give the work a definite classification according to its scenes, mood or form.
  • the film and television works can be classified in advance by using manual recognition.
  • the classification mechanism can be classified by the same classification mechanism as the above-mentioned VR theme scene classification mechanism.
  • Step 104 Determine a VR theme scene corresponding to the selected movie work according to the classification of the selected movie work and the category identifier of the VR theme scene.
  • the VR theme scenes having the same classification as the selected movie and television works may be selected from the VR theme scenes stored in the database. That is, the classification of the selected movie works should be the same as the classification of the corresponding VR theme scene.
  • Film and television works can be classified into comedy films, horror films, and so on, and the VR theme scenes can be classified in the same way into comedy, horror, and so on, so as to correspond to the classification of those works.
  • In this way, if the movie the user selects to play is a horror movie, the type of the VR theme scene determined by the VR system should also be horror.
  • Further, if there is only one VR theme scene in the same category, the VR system can directly use it as the VR theme scene corresponding to the selected work; if there are several VR theme scenes in the same category, i.e. the number is greater than 1, the VR system can randomly pick one of them. For example, if the movie selected by the user is a horror movie and the VR system stores multiple horror VR theme scenes, the VR system will randomly pick one of them as the VR theme scene corresponding to the selected work.
  • the VR theme scene and its category identifier can be provided to the user through the user interface for the user to select. At this point, the user can select a VR theme scene. Of course, the user can also not select any VR theme scene.
  • The VR theme scene intelligently adapted to the classification information of the movie and television work, as provided by the present application, increases the user's immersion, reduces the operation steps of selecting a theme scene before VR viewing, and greatly reduces the consumption of computer performance.
  • the above method is a method for automatically adapting the VR theme scene according to the classification of the selected movie works.
  • each film and television work may also contain tags that may more specifically describe its content or characteristics.
  • the label of the movie and television work is similar to the keyword describing the content of the movie and television work. Therefore, in another example of the present application, the VR theme scene can also be automatically adapted according to the label of the movie and television work.
  • FIG. 2 is a schematic flowchart diagram of another VR scene adjustment method in the example of the present application, where the method includes the following steps:
  • Step 201 Label the stored VR theme scenes in advance, and determine the label identifier of each VR theme scene.
  • VR theme scenes will be stored in the database of the VR system.
  • In addition to storing the VR theme scene itself, the VR theme scene needs to be labeled according to its light and shadow effect, and the label identifier corresponding to each VR theme scene is stored in the database, thereby forming a labeling mechanism for VR theme scenes.
  • the VR theme scene can be labeled according to the labeling of the film and television works, and the VR theme scene is also labeled, such as: horror, comedy, romance, and love.
  • each type of film and television works can have only one unique VR theme scene; however, each type of film and television work can also have multiple corresponding VR theme scenes.
  • the labeling and labeling mechanism of the formed VR theme scene can facilitate labeling of the VR theme scene uploaded by the user later, and can also be applied to labeling of the movie and television works.
  • Step 202 Acquire a movie and television work selected by the user to play.
  • the method for obtaining the movie and television work selected by the user to be played may be the same as the example corresponding to step 102 in FIG. 1 , and therefore is not described in detail herein.
  • Step 203 Determine a label of the selected movie and television work.
  • the movie and television works stored in the local database or the network database of the VR system contain tag identifiers that reflect the content keywords of the film and television work.
  • the above label identification is used to identify the characteristics of a film and television work, such as identifying a film that is characterized by comedy, horror or romance, love, and the like.
  • the narrative elements of a film and television work with the same label will have certain similarities.
  • When a film and television work is published, its producer will give the work definite labels according to its scenes, mood or form.
  • the film and television works that do not have a label the film and television works can be labelled in advance by manual means.
  • the labeling mechanism can be performed by the same mechanism as the labeling and labeling mechanism of the above VR theme scene.
  • Step 204 Determine a VR theme scene corresponding to the selected movie and television work according to the label of the selected movie and the label identifier of the VR theme scene.
  • the VR theme scene having the same label as the selected movie and television work may be selected from the VR theme scene stored in the database. That is, the label of the selected movie work should be the same as the label of the corresponding VR theme scene.
  • the film and television works can be labeled with love, comedy, etc.
  • the VR theme scene is also labeled with the label of love, comedy, etc. according to the labeling method, so as to correspond to the label of the above-mentioned film and television work.
  • In this way, if the labels of the movie the user selects to play are love and comedy, the labels of the VR theme scene determined by the VR system should also be love and comedy.
  • Further, if there is only one VR theme scene under the same label, the VR system can directly use it as the VR theme scene corresponding to the selected work; if there are multiple VR theme scenes under the same label, the VR system can randomly pick one of them. For example, if the label of the movie the user selects to play is horror, and the VR system stores multiple VR theme scenes labeled horror, the VR system will randomly pick one of them as the VR theme scene corresponding to the selected work.
  • the VR theme scene and its label identifier can be provided to the user through the user interface for the user to select.
  • the VR system can preferentially provide VR theme scenes with similar tags for the user to select, for example, provide the same VR theme scene as the tag portion of the selected movie work.
  • the user can select a VR theme scene.
  • the user can also not select any VR theme scene.
  • The VR theme scene intelligently adapted to the label information of the movie and television work, as provided by the present application, increases the user's immersion, reduces the operation steps of selecting a theme scene before VR viewing, and greatly reduces the loss of computer performance.
  • FIG. 3 is a schematic flowchart diagram of still another VR scene adjustment method in the example of the present application, where the method includes the following steps:
  • Step 301 Perform classification and labeling on the stored VR theme scenes in advance, and determine the classification and label identification of each VR theme scene.
  • Step 302 Acquire a movie and television work selected by the user to play.
  • the method for obtaining the movie and television work selected by the user to play is the same as the step 102 shown in FIG. 1 , and therefore will not be described in detail herein.
  • Step 303 Determine the classification and label of the selected movie and television works.
  • the determining the classification and label of the selected movie and television works may also refer to the examples corresponding to the foregoing FIG. 1-2.
  • Step 304 Determine a VR theme scene corresponding to the selected movie work according to the classification and label of the selected movie work and the category and tag identifier of the VR theme scene.
  • In some examples, the screening of scenes according to the classification information and the label information of the movie and television work follows a priority: VR theme scenes having the same classification information as the work may be filtered out first, and scenes with the same label information are then selected from among them; or scenes with the same label information as the work may be filtered out first, and scenes with the same classification information are then selected from among them, and so on.
  • If classification screening takes priority and there are multiple VR theme scenes under the same classification, the VR system further determines the VR theme scene corresponding to the selected work according to the label of the selected work and the label identifiers of the VR theme scenes. If the labels of none of the VR theme scenes under the same classification match the labels of the work, one of those VR theme scenes may be chosen at random, or they may be offered to the user for manual selection, or the user may be asked to manually select a label and a VR theme scene with that label is then chosen at random or manually. If there is only one VR theme scene under the same classification, label screening may be performed further: if its label differs from that of the selected work, the user may be asked to select manually, or no label screening is performed and this VR theme scene is used directly as the one corresponding to the work.
  • Determining the VR theme scene corresponding to the selected work according to the label of the selected work and the label identifiers of the VR theme scenes is as shown in FIG. 2 and is not described in detail here.
  • If label screening takes priority and there are multiple VR theme scenes with the same label, the VR system further determines the VR theme scene corresponding to the selected work according to the classification of the selected work and the category identifiers of the VR theme scenes. If multiple VR theme scenes still satisfy the condition, one can be selected at random. If the classification of none of the VR theme scenes under the same label matches the classification of the work, one of those VR theme scenes may be chosen at random, or they may be offered to the user for manual selection, or the user may be asked to manually select a classification and a VR theme scene under that classification is then chosen at random or manually.
  • If there is only one VR theme scene under the same label, classification screening may be performed further: if its category identifier differs from that of the selected work, the user may be asked to select manually, or no classification screening is performed and this VR theme scene is used directly as the one corresponding to the work. Determining the VR theme scene corresponding to the selected work according to the classification of the selected work and the category identifiers of the VR theme scenes is as in the example corresponding to FIG. 1 and is not detailed here.
  • The VR theme scene intelligently adapted to the classification and label information of the movie and television work, as provided by the present application, increases the user's immersion, reduces the operation steps of selecting a theme scene before VR viewing, and greatly reduces the loss of computer performance.
  • In the examples shown in FIG. 1, FIG. 2 and FIG. 3 above, the VR system can determine the VR theme scene corresponding to the selected movie and television work according to the classification and/or labels of the selected work and the category and/or tag identifiers of the VR theme scenes.
  • The classification and/or labels of the film and television works and the VR theme scenes may be collectively referred to as their feature information; the feature information may also include other characteristics that distinguish the content of one work or VR theme scene from the content of others.
  • That is, summarizing the examples shown in FIG. 1, FIG. 2 and FIG. 3, the VR system stores the VR theme scenes in advance and determines their feature information; then, after the user selects a movie and television work to play, it determines the feature information of the selected work; finally, according to the feature information of the work, it determines from the VR theme scenes the one matching that feature information and uses it as the VR theme scene for the work being played.
  • the light and shadow color of the VR theme scene can be dynamically adjusted according to the light and shadow color of the played movie.
  • a further example of the present application provides such a method, the implementation of which is shown in Figure 4. The method can be applied to the VR theme scene automatically obtained by the above method, or can be applied to the VR theme scene manually selected by the user.
  • FIG. 4 is a schematic flowchart diagram of another VR scene adjustment method in the example of the present application, where the method includes the following steps:
  • Step 401 After starting to play the movie and television work, collect the color of n points on the played movie and television work, and obtain color code parameters of n points. Where n is a natural number greater than or equal to 2.
  • the color code parameter may be a parameter represented by an RGB color value or a hexadecimal color code.
  • the system may acquire more than two points in real time from the ongoing movie and television production, for example, taking 4 points to separately record the color code parameters of the colors of the points at each color taking time.
  • For example, if the film and television work being played is a comedy romance film, the color code parameters of the four points collected at a certain moment might be Light Pink (hex #FFB6C1 or RGB 255,182,193), Pink (hex #FFC0CB or RGB 255,192,203), Hot Pink (hex #FF69B4 or RGB 255,105,180) and Deep Pink (hex #FF1493 or RGB 255,20,147).
  • The actual choice of n can be adjusted according to the balance between the effect requirements of the software and the performance of the computer: the larger n is, the more points are sampled and the closer the sampled colors come to the true colors of the work, but the higher the demands placed on the computer's performance.
  • Step 402 Generate parameters of the dynamic light source according to the color code parameters of the n points.
  • the VR system can generate a dynamic light source based on the color code parameters of the n points to obtain parameters of the generated dynamic light source.
  • the parameter for generating the dynamic light source according to the color code parameters of the n points may be implemented by using existing VR software, for example, by using an engine such as Unity 3D or Unreal.
  • the generated dynamic light source may be: a point light source, a surface light source, a spotlight, and the like.
  • For example, if the color code parameters of the 4 points collected at a certain moment are Light Pink (hex #FFB6C1 or RGB 255,182,193), Pink (hex #FFC0CB or RGB 255,192,203), Hot Pink (hex #FF69B4 or RGB 255,105,180) and Deep Pink (hex #FF1493 or RGB 255,20,147), then after obtaining these color code parameters the VR software module can generate a new dynamic light source.
  • The dynamic light source includes four gradient nodes, from light to dark, recorded by the above color code parameters.
  • Step 403 Replace the parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source.
  • Replacing the parameters of all the dynamic light sources in the current VR theme scene with the parameters of the dynamic light source generated in step 402 adjusts the light and shadow colors of the VR theme scene, so that the colors of the VR theme scene stay matched with the colors of the played work.
  • The VR system will perform the above method periodically, that is, periodically collect the colors of n points, periodically generate dynamic light source parameters from the collected colors, and use them to replace the parameters of the dynamic light sources in the VR theme scene, so that the color of the VR theme scene always stays essentially consistent with the color of the movie.
  • The color-picking frequency of the VR system can be f, for example 10 Hz.
  • The actual choice of f can be adjusted according to the balance between the software's effect requirements and computer performance: the larger f is, the more times the colors are collected per unit time and the closer the simulated light source comes to the true colors of the work, but the higher the demands placed on the computer's performance, so in actual use f is selected according to this balance.
  • FIG. 5 is a schematic flowchart diagram of another VR scene adjustment method in the example of the present application, where the method includes the following steps:
  • Step 501 Load a configuration file of the played movie and television work.
  • the configuration file includes: a time node and a corresponding scene atmosphere event.
  • A scene atmosphere event refers to an effect that appears in the VR scene in the form of art assets and the like.
  • the configuration file of the played movie and television work may be loaded before playing the movie and television work, or the configuration file of the movie and television work may be loaded during the process of playing the movie and television work.
  • The above configuration file may be a scene atmosphere event list (library) including time node parameters and scene event parameters.
  • The configuration file of a film and television work can be produced by the service provider of the VR system, by the provider of the work, or even edited by users.
  • The scene event parameters in the scene atmosphere list may involve the use of art assets, which include but are not limited to special effects and may also be resources such as models and actions. For example, the scene event triggered by a time node while the user watches a horror movie may be a zombie, with a model and animations, appearing in the viewing hall, or a scene change such as the scene shaking violently or collapsing.
  • The time node may be the moment at which an important plot point of a movie and television work occurs, and the selection of these time points may be performed by the service provider of the VR system, by the provider of the work, or even by the user.
  • For example, suppose a movie has a lightning shot at 30 minutes 15 seconds of playback; the editor of the configuration file can then add the following record to the movie's configuration file: time node: 30 minutes 15 seconds, scene atmosphere event: flash event. Suppose the movie has a kiss shot at 58 minutes 35 seconds; the following record can then be added: time node: 58 minutes 35 seconds, scene atmosphere event: pink bubble event, and so on.
  • Step 502 According to the configuration file, when the movie is played to a certain time node, a scene atmosphere event corresponding to the time node is executed.
  • For example, if the video's configuration file contains the record "time node: 30 minutes 15 seconds, scene atmosphere event: flash event",
  • then when playback reaches 30 minutes 15 seconds the VR system will execute the flash event and the entire viewing hall will flash.
  • Likewise, if the video's configuration file contains the record "time node: 58 minutes 35 seconds, scene atmosphere event: pink bubble event",
  • then when playback reaches 58 minutes 35 seconds the VR system will execute the pink bubble event and pink bubbles will appear in the viewing hall.
  • FIG. 6 is a schematic flowchart of a method for adjusting a VR scene in the example of the present application.
  • The method can combine the automatic adaptation of a VR theme scene with the color matching scheme of the theme scene, and includes the following steps:
  • Step 601 preset a plurality of VR theme scenes, and automatically adapt the VR theme scenes matched with the selected movie works according to the feature information of the selected movie works.
  • the corresponding scenario corresponding to the movie and television work can be intelligently adapted through the examples corresponding to FIG. 1 to FIG. 3, that is, the step 601 is implemented, and thus is not described in detail herein.
  • Step 602 Pick colors in real time from the movie currently being played, superimpose the color codes onto a dynamic light source, and reset the colors of the theater.
  • the step 602 can be implemented by an example corresponding to FIG. 4, that is, the color of the matching with the movie and television works is automatically changed, and thus will not be described in detail herein.
  • the solution for configuring the scene atmosphere event may further be combined, that is, the foregoing method further includes:
  • Step 603 According to the configuration file of the played movie and television work, trigger the scene atmosphere event associated with a specified time node when playback of the work reaches that time node.
  • this step 603 can be implemented by an example corresponding to FIG. 5, and thus will not be described in detail herein.
  • In addition, if the VR theme scene is not obtained automatically through step 601 but in some other way, for example selected manually by the user, only the two steps 602 and 603 above may be performed.
  • The color transformation of the VR theme scene and the execution of scene atmosphere events enhance the user's immersion and sense of realism when watching film and television works in a VR environment.
  • the present application also proposes a virtual reality VR scene adjusting device.
  • the VR scene adjustment apparatus may be implemented by the structure shown in FIG. 7, and includes the following functional modules: a storage module 701, a feature information extraction module 702, an acquisition module 703, a first determination module 704, and a second determination module 705.
  • the storage module 701 is configured to store a VR theme scene.
  • the feature information extraction module 702 is configured to determine feature information of the stored VR theme scene. Further, the feature information includes a category identifier and/or a tag.
  • the obtaining module 703 is configured to obtain a movie and television work selected by the user to play.
  • the first determining module 704 determines feature information of the selected movie and television work.
  • the second determining module 705 determines, according to the feature information of the selected movie and the feature information of the VR theme scene, the VR theme scene corresponding to the selected movie work.
  • Through the work of these modules, the VR scene adjustment device can automatically adapt the VR theme scene according to the feature information of the movie and television work.
  • the apparatus may be implemented by the structure shown in FIG. 8, including:
  • the color picking module 801 is configured to collect the color of n points in real time on the played movie and television works, and obtain color code parameters of n points; wherein n is a natural number greater than or equal to 2;
  • A simulation module 802 configured to generate the parameters of the dynamic light source according to the color code parameters of the n points; and
  • the replacement module 803 is configured to replace the parameters of the generated dynamic light source with the parameters of all dynamic light sources in the current VR theme scene.
  • the VR scene adjusting device can automatically adjust the color of the VR theme scene according to the color of the movie and television work.
  • the device may be implemented by the structure shown in FIG. 9, including:
  • a loading module 901 configured to load a configuration file of the played movie and television work; wherein the configuration file includes a time node and a corresponding scene atmosphere event;
  • the execution module 902 is configured to execute, according to the configuration file, a scene atmosphere event corresponding to the time node when the movie is played to a certain time node.
  • the VR scene adjusting device can add an atmosphere event matched with the played movie and television works in the VR theme scene.
  • the VR scene adjusting device can include both the module shown in FIG. 7 and the module shown in FIG. 8, and at the same time realize automatic adaptation and color adjustment of the VR theme scene.
  • the VR scene adjustment device may include both the module shown in FIG. 8 and the module shown in FIG. 9 to simultaneously implement color adjustment of the VR theme scene and increase the atmosphere event synchronized with the movie and television work.
  • the VR scene adjustment device may include both the module shown in FIG. 7 and the modules shown in FIG. 8 and FIG. 9, and simultaneously implement automatic adaptation, color adjustment, and addition of an atmosphere event synchronized with the movie and television work.
  • the virtual reality VR scene adjustment method and the virtual reality VR scene adjustment apparatus and the modules therein may be integrated into one processing unit, or each module may exist physically on its own, or two or more modules may be integrated in one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the virtual reality VR scene adjustment apparatus described above can be executed in various computing devices that can perform user information processing based on the Internet and loaded in the memory of the computing device.
  • FIG. 10 is a schematic diagram showing the hardware structure of a VR scene adjusting device.
  • the VR scene adjustment apparatus includes one or more processors (CPUs) 1002, a head-mounted device 1004, a memory 1006, a database 1010, and a connection device 1008 for interconnecting these components.
  • the database 1010 can be implemented by a separate device independent of the VR scene adjustment device. In this case, the VR scene adjustment device will establish a connection with the database 1010 via a wired or wireless network to read data from the database.
  • the processor 1002 can receive and transmit data through the head display device 1004 to implement network communication and/or local communication.
  • the head display device 1004 can play a movie and television work selected by the user and a VR theme scene. In some examples, the head display device 1004 can present other effects such as 2D, 3D, and the like.
  • the head display device 1004 may include an external head-mounted display, an all-in-one head-mounted display, and a mobile-terminal head-mounted display device.
  • the database 1010 includes a VR theme scene library 1012 and a movie library 1014.
  • the database 1010 may further include a configuration file library corresponding to the movie and television work.
  • the memory 1006 may be a high speed random access memory such as DRAM, SRAM, DDR RAM, or other random access solid state storage device; or a nonvolatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, Or other non-volatile solid-state storage devices.
  • the memory 1006 stores a set of instructions executable by the processor 1002, including:
  • An operating system 1016 including a program for processing various basic system services and for performing hardware related tasks
  • the virtual reality VR scene adjustment module 1018 includes various applications for implementing automatic adaptation of the VR theme scene, changing the scene color, and matching the scene atmosphere event function, and the application program can implement the processing flow in each of the above examples.
  • the virtual reality VR scene adjustment module 1018 specifically includes the following instruction modules:
  • a storage module 701, a feature information extraction module 702, an acquisition module 703, a first determination module 704 and a second determination module 705, and/or a color picking module 801, a simulation module 802 and a replacement module 803 for transforming the color of the VR theme scene, and/or a loading module 901 and an execution module 902.
  • the processor 1002 can realize the functions of the above-described modules 701 to 705, 801 to 803, and 901 to 902 by executing the device executable instructions in the modules 701 to 705, 801 to 803, and 901 to 902 in the memory 1006.
  • each of the examples of the present application can be implemented by a data processing program executed by a data processing device such as a computer.
  • the data processing program constitutes the present application.
  • A data processing program is usually stored in a storage medium and is executed by reading the program out of the storage medium directly, or by installing or copying the program to a storage device of the data processing device (such as a hard disk and/or memory). Therefore, such a storage medium also constitutes the present application.
  • the storage medium can use any type of recording method, such as paper storage medium (such as paper tape, etc.), magnetic storage medium (such as floppy disk, hard disk, flash memory, etc.), optical storage medium (such as CD-ROM, etc.), magneto-optical storage medium ( Such as MO, etc.).
  • the present application therefore also discloses a non-volatile storage medium in which is stored a data processing program for performing any of the above-described methods of the present application.
  • the method steps described in this application can be implemented by a data processing program, and can also be implemented by hardware, for example by logic gates, switches, application specific integrated circuits (ASICs), programmable logic controllers, embedded controllers, and so on.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Studio Devices (AREA)

Abstract

A virtual reality (VR) scene adjustment method and apparatus, including: storing VR theme scenes in advance and determining their corresponding feature information; obtaining the film or television work that a user selects to play; determining the feature information of the selected work; and determining, according to the feature information of the selected work and the feature information of the VR theme scenes, the VR theme scene corresponding to the selected work.

Description

Virtual reality scene adjustment method, apparatus, and storage medium
This application claims priority to Chinese patent application No. 201610829939.8, entitled "Virtual reality scene adjustment method and apparatus" and filed with the China Patent Office on September 19, 2016, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of virtual reality (VR), and in particular to a VR scene adjustment method, apparatus, and storage medium.
Background
Virtual reality (VR) technology is currently one of the most closely watched cutting-edge technologies and, led by several world-class technology giants, it is developing rapidly. VR technology, also known as "spirit realm" or artificial environment technology, is defined as a computer simulation system built by combining techniques from many fields, such as simulation technology, computer graphics, human-machine interface technology, multimedia technology, sensing technology and network technology; it can create experiences that were previously only available in the real world and let users feel them. Simply put, VR technology brings the user's perception into the virtual world it creates and makes the user believe that everything in front of them is real.
Summary
An embodiment of the present application provides a VR scene adjustment method, including: storing VR theme scenes in advance and determining their corresponding feature information; obtaining the film or television work that a user selects to play; determining the feature information of the selected work; and determining, according to the feature information of the selected work and the feature information of the VR theme scenes, the VR theme scene corresponding to the selected work.
An embodiment of the present application further provides a VR scene adjustment apparatus, including:
a storage module, configured to store VR theme scenes;
a feature information extraction module, configured to determine feature information of the stored VR theme scenes;
an acquisition module, configured to obtain the film or television work that a user selects to play;
a first determination module, configured to determine the feature information of the selected work; and
a second determination module, configured to determine, according to the feature information of the selected work and the feature information of the VR theme scenes, the VR theme scene corresponding to the selected work. Another VR scene adjustment method provided by the present application includes: sampling the colors of n points on the film or television work being played to obtain the color code parameters of the n points, where n is a natural number greater than or equal to 2; generating the parameters of a dynamic light source according to the color code parameters of the n points; and replacing the parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source.
Another VR scene adjustment apparatus provided by an embodiment of the present application includes:
an acquisition module, configured to sample the colors of n points on the film or television work being played to obtain the color code parameters of the n points, where n is a natural number greater than or equal to 2;
a simulation module, configured to generate the parameters of a dynamic light source according to the color code parameters of the n points; and
a replacement module, configured to replace the parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source. Still another VR scene adjustment method provided by the present application includes: loading a configuration file of the film or television work being played, where the configuration file includes time nodes and corresponding scene atmosphere events; and, according to the configuration file, executing the scene atmosphere event corresponding to a time node when playback reaches that time node.
Still another VR scene adjustment apparatus provided by an embodiment of the present application includes:
a loading module, configured to load, before the film or television work is played, a configuration file of the work, where the configuration file includes time nodes and corresponding scene atmosphere events; and
an execution module, configured to execute, according to the configuration file, the scene atmosphere event corresponding to a time node when playback reaches that time node.
An embodiment of the present application further provides a non-volatile computer-readable storage medium storing computer-readable instructions that can cause at least one processor to perform the methods described above.
Brief Description of the Drawings
In order to explain the technical solutions in the examples of the present application or in the prior art more clearly, the drawings needed in the description of the examples or the prior art are briefly introduced below. Obviously, the drawings described below are only some examples of the present application, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a VR scene adjustment method in an example of the present application;
FIG. 2 is a schematic flowchart of another VR scene adjustment method in an example of the present application;
FIG. 3 is a schematic flowchart of still another VR scene adjustment method in an example of the present application;
FIG. 4 is a schematic flowchart of a VR scene adjustment method in an example of the present application;
FIG. 5 is a schematic flowchart of another VR scene adjustment method in an example of the present application;
FIG. 6 is a schematic flowchart of still another VR scene adjustment method in an example of the present application;
FIG. 7 is a schematic diagram of the internal structure of a VR scene adjustment apparatus in an example of the present application;
FIG. 8 is a schematic diagram of the internal structure of another VR scene adjustment apparatus in an example of the present application;
FIG. 9 is a schematic diagram of the internal structure of still another VR scene adjustment apparatus in an example of the present application; and
FIG. 10 is a schematic diagram of the hardware structure of a VR scene adjustment apparatus in an example of the present application.
Detailed Description
The technical solutions in the examples of the present application are described clearly and completely below with reference to the drawings in the examples of the present application. Obviously, the described examples are only some of the examples of the present application rather than all of them. All other examples obtained by a person of ordinary skill in the art based on the examples in the present application without creative effort fall within the scope of protection of the present application.
In existing VR systems, in order to enhance the playback of films and other works, the VR system also provides, in addition to the work itself, a light-and-shadow theme that can be played together with the film, for example a 360-degree panoramic picture, forming a surrounding, panoramic viewing hall. In the present application such a light-and-shadow theme is called a VR theme scene. If the film being played and the VR theme scene are matched appropriately, the VR theme scene sets off the film's environment and atmosphere, makes the film more realistic, brings the user into the virtual world more easily, and increases the user's sense of immersion. For example, pairing a horror film with an eerie, frightening VR theme scene better brings out the film's terrifying atmosphere and improves the viewing experience.
However, in existing VR applications the VR theme scene is usually selected by first letting the user choose the film to be played and then offering a variety of VR theme scenes for the user to choose from. If the user does not know the content of the film, it is usually difficult to choose a VR theme scene that matches it well, which greatly weakens the playback effect. Moreover, the step in which the user selects a VR theme scene also adds operation steps for the user and consumes computing performance of the VR system.
The present application is therefore proposed so that the VR system software can, according to the characteristics of the film or television work being played, automatically adapt the theater scene, change the color scheme and/or configure scene atmosphere events, letting the user become perceptually immersed in the work while reducing the operation steps of selecting a theme scene before VR viewing and greatly reducing the consumption of computing performance.
In view of this, an embodiment of the present application proposes a VR scene adjustment method. As shown in FIG. 1, the method includes the following steps:
Step 101: classify the stored VR theme scenes in advance, and determine the category identifier of each VR theme scene.
As mentioned above, in order to enhance the playback of films and other works, some VR theme scenes are stored in the database of the VR system. In the examples of the present application, in addition to storing the VR theme scenes themselves, the VR theme scenes also need to be classified according to their light-and-shadow effects, and the category identifier corresponding to each VR theme scene is stored in the database, thereby forming a classification mechanism for VR theme scenes.
In this step, the VR theme scenes may be classified according to the categories used to classify films and television works; for example, the VR theme scenes are likewise classified into categories such as horror, comedy and romance.
In general, each category of film or television work may have only one uniquely corresponding VR theme scene; of course, each category may also have several corresponding VR theme scenes.
In addition, the classification mechanism formed by classifying the VR theme scenes makes it easy to automatically adapt and classify VR theme scenes uploaded by users later, and the same classification mechanism can also be applied to the classification of films and television works.
Step 102: obtain the film or television work that the user selects to play.
In the examples of the present application, the user can select the work to be played from a local database or a network database through a user interface provided by the VR system software, for example through a touch screen or through basic operations provided by the VR system software such as voice.
Step 103: determine the classification of the selected work.
In this example, the works stored in the local database or network database of the VR system all carry a category identifier of the work. The category identifier is used to identify the category of the work, for example whether a film is a comedy, a horror film or a romance. Works with the same category identifier generally share certain similarities in their narrative elements. Usually, when a work is published, its producer assigns it a definite classification according to its scenes, mood or form. Works that have not been classified can also be classified manually in advance, for example using the same classification mechanism as the VR theme scene classification mechanism described above.
Step 104: determine, according to the classification of the selected work and the category identifiers of the VR theme scenes, the VR theme scene corresponding to the selected work.
Specifically, in this step, after the classification of the selected work has been determined, the VR theme scenes having the same classification as the selected work can be selected from the VR theme scenes stored in the database. That is, the classification of the selected work should be the same as the classification of the corresponding VR theme scene.
For example, as described above, films and television works can be divided into categories such as comedy and horror, and the VR theme scenes are divided into the same categories so as to correspond to them. In this way, if the film that the user selects to play is a horror film, the type of the VR theme scene determined by the VR system should also be horror.
Further, if there is only one VR theme scene in the same category, i.e. the number is 1, the VR system can directly use that VR theme scene as the VR theme scene corresponding to the selected work; if there is more than one VR theme scene in the same category, i.e. the number is greater than 1, the VR system can randomly pick one of them as the VR theme scene corresponding to the selected work. For example, if the film the user selects to play is a horror film and the VR system stores several horror VR theme scenes, the VR system will randomly pick one of them as the VR theme scene corresponding to the selected work.
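The selection logic of step 104 can be illustrated with a short sketch. The scene catalog, field names and scene names below are illustrative assumptions, not part of the patent; the sketch only shows the filter-then-pick behaviour described above.

```python
import random

# Minimal sketch of step 104: pick a VR theme scene whose category matches
# the selected film's category. All names here are illustrative.
SCENE_LIBRARY = [
    {"name": "haunted_hall", "category": "horror"},
    {"name": "creepy_forest", "category": "horror"},
    {"name": "pink_theater", "category": "romance"},
]

def pick_scene_by_category(film_category, scenes=SCENE_LIBRARY):
    """Return one VR theme scene matching the film's category, or None."""
    candidates = [s for s in scenes if s["category"] == film_category]
    if not candidates:
        return None           # no match: fall back to letting the user choose manually
    if len(candidates) == 1:
        return candidates[0]  # a unique match is used directly
    return random.choice(candidates)  # several matches: pick one at random

print(pick_scene_by_category("horror"))
```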
Furthermore, if there is no VR theme scene with the same classification, the VR theme scenes and their category identifiers can be presented to the user through the user interface for the user to choose from. The user can then select one VR theme scene, or of course choose not to select any.
It can be seen from the above technical solution that, by intelligently adapting a VR theme scene corresponding to the classification information of the film or television work, the present application increases the user's immersion, reduces the operation steps of selecting a theme scene before VR viewing, and greatly reduces the consumption of computing performance.
The method above automatically adapts a VR theme scene according to the classification of the selected work. Besides its classification, however, each work may also carry tags that describe its content or characteristics more specifically. In the present application, the tags of a work are similar to keywords describing its content; therefore, in another example of the present application, the VR theme scene can also be automatically adapted according to the tags of the work.
FIG. 2 is a schematic flowchart of another VR scene adjustment method in an example of the present application. The method includes the following steps:
Step 201: tag the stored VR theme scenes in advance, and determine the tag identifier of each VR theme scene.
As mentioned above, in order to enhance playback, some VR theme scenes are stored in the database of the VR system. In the examples of the present application, in addition to storing the VR theme scenes themselves, the VR theme scenes also need to be tagged according to their light-and-shadow effects, and the tag identifier corresponding to each VR theme scene is stored in the database, thereby forming a tagging mechanism for VR theme scenes.
In this step, the VR theme scenes may be tagged in the same way that films and television works are tagged, for example with tags such as horror, comedy, romance and love.
In general, the tag of each category of work may correspond to only one VR theme scene; of course, each category of work may also have several corresponding VR theme scenes.
In addition, the tagging mechanism thus formed makes it easy to tag VR theme scenes uploaded by users later, and can also be applied to the tagging of films and television works.
Step 202: obtain the film or television work that the user selects to play.
In the examples of the present application, the way the work selected by the user is obtained may be the same as in the example corresponding to step 102 of FIG. 1, and is therefore not described in detail here.
Step 203: determine the tags of the selected work.
In this example, the works stored in the local database or network database of the VR system all carry tag identifiers reflecting keywords of the work's content. The tag identifiers are used to identify the characteristics of a work, for example whether a film is characterized as comedy, horror, romance or love. Works with the same tags generally share certain similarities in their narrative elements. Usually, when a work is published, its producer assigns it definite tags according to its scenes, mood or form. Works without tags can also be tagged manually in advance, for example using the same mechanism as the VR theme scene tagging mechanism described above.
Step 204: determine, according to the tags of the selected work and the tag identifiers of the VR theme scenes, the VR theme scene corresponding to the selected work.
Specifically, in this step, after the tags of the selected work have been determined, the VR theme scenes having the same tags as the selected work can be selected from the VR theme scenes stored in the database. That is, the tags of the selected work should be the same as the tags of the corresponding VR theme scene.
For example, as described above, films and television works can be tagged with labels such as love and comedy, and the VR theme scenes are tagged in the same way so as to correspond to them. In this way, if the tags of the film the user selects to play are love and comedy, the tags of the VR theme scene determined by the VR system should also be love and comedy.
Further, if there is only one VR theme scene under the same tag, the VR system can directly use that VR theme scene as the VR theme scene corresponding to the selected work; if there are several, the VR system can randomly pick one of them as the VR theme scene corresponding to the selected work. For example, if the tag of the film the user selects to play is horror and the VR system stores several VR theme scenes tagged horror, the VR system will randomly pick one of them as the VR theme scene corresponding to the selected work.
Furthermore, if there is no VR theme scene with the same tags, the VR theme scenes and their tag identifiers can be presented to the user through the user interface for the user to choose from. The VR system may of course preferentially offer VR theme scenes with similar tags, for example VR theme scenes whose tags partially match those of the selected work. The user can then select one VR theme scene, or choose not to select any.
It can be seen from the above technical solution that, by intelligently adapting a VR theme scene corresponding to the tag information of the film or television work, the present application increases the user's immersion, reduces the operation steps of selecting a theme scene before VR viewing, and greatly reduces the consumption of computing performance.
Furthermore, if both the VR theme scenes and the work to be played carry category identifiers as well as tags, the automatic adaptation between VR theme scene and work can be performed according to classification and tags at the same time. FIG. 3 is a schematic flowchart of still another VR scene adjustment method in an example of the present application. The method includes the following steps:
Step 301: classify and tag the stored VR theme scenes in advance, and determine the classification and tag identifiers of each VR theme scene.
The way the VR theme scenes are classified and tagged in this step can refer to the methods shown in FIGS. 1-2 above, and is therefore not described in detail here.
Step 302: obtain the film or television work that the user selects to play.
In the examples of the present application, the way the work selected by the user is obtained is the same as step 102 shown in FIG. 1, and is therefore not described in detail here.
Step 303: determine the classification and tags of the selected work.
In the examples of the present application, determining the classification and tags of the selected work may also refer to the examples corresponding to FIGS. 1-2 above.
Step 304: determine, according to the classification and tags of the selected work and the category and tag identifiers of the VR theme scenes, the VR theme scene corresponding to the selected work.
In some examples, the filtering of scenes according to the classification information and the tag information of the work follows a priority: VR theme scenes with the same classification information as the work may be filtered out first, and scenes with the same tag information are then selected from among them; or scenes with the same tag information as the work may be filtered out first, and scenes with the same classification information are then selected from among them, and so on.
For example, if classification filtering takes priority and there are several VR theme scenes under the same classification, the VR system further determines the VR theme scene corresponding to the selected work according to the tags of the selected work and the tag identifiers of the VR theme scenes. At this point, if the tags of none of the VR theme scenes under the same classification are identical to the tags of the work, a VR theme scene under the same classification may be picked at random, or the VR theme scenes under the same classification may be presented to the user for manual selection, or the user may be asked to manually select a VR theme scene tag and a VR theme scene with that tag is then picked at random or manually. If there is only one VR theme scene under the same classification, tag filtering may be performed further: if the tags of the VR theme scene differ from the tags of the selected work, the user may be asked to select manually; alternatively, no tag filtering is performed and this VR theme scene is used directly as the VR theme scene corresponding to the work. Determining the VR theme scene corresponding to the selected work according to the tags of the selected work and the tag identifiers of the VR theme scenes is as in the example corresponding to FIG. 2 and is not detailed here.
If tag filtering takes priority and there are several VR theme scenes with the same tags, the VR system further determines the VR theme scene corresponding to the selected work according to the classification of the selected work and the category identifiers of the VR theme scenes. If several VR theme scenes still satisfy the condition, one of them may be picked at random. If the classification of none of the VR theme scenes under the same tags is the same as the classification of the work, a VR theme scene under the same tags may likewise be picked at random, or the VR theme scenes under the same tags may be presented to the user for manual selection, or the user may be asked to manually select a VR theme scene classification and a VR theme scene under that classification is then picked at random or manually. If there is only one VR theme scene under the same tags, classification filtering may be performed further: if the category identifier of the VR theme scene differs from the category identifier of the selected work, the user may be asked to select manually; alternatively, no classification filtering is performed and this VR theme scene is used directly as the VR theme scene corresponding to the work. Determining the VR theme scene corresponding to the selected work according to the classification of the selected work and the category identifiers of the VR theme scenes is as in the example corresponding to FIG. 1 and is not detailed here.
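The "classification first, then tags" priority of step 304 can be sketched as follows; the data layout, the tie-breaking by tag overlap and the fallback to manual selection (returning None) are assumptions for illustration.

```python
import random

# Sketch of step 304 with classification filtering first, then tag filtering.
def pick_scene_by_category_and_tags(film, scenes):
    same_category = [s for s in scenes if s["category"] == film["category"]]
    if len(same_category) == 1:
        return same_category[0]                    # single match: use it directly
    if len(same_category) > 1:
        # Among scenes of the same category, prefer the one sharing the most tags.
        best = max(same_category,
                   key=lambda s: len(set(s["tags"]) & set(film["tags"])))
        if set(best["tags"]) & set(film["tags"]):
            return best
        return random.choice(same_category)        # no tag overlap: random pick
    return None                                    # no category match: ask the user

film = {"title": "some_film", "category": "horror", "tags": ["horror", "thriller"]}
scenes = [
    {"name": "haunted_hall", "category": "horror", "tags": ["horror"]},
    {"name": "dark_cave", "category": "horror", "tags": ["thriller", "horror"]},
]
print(pick_scene_by_category_and_tags(film, scenes))
```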
It can be seen from the above technical solution that, by intelligently adapting a VR theme scene corresponding to the classification and tag information of the film or television work, the present application increases the user's immersion, reduces the operation steps of selecting a theme scene before VR viewing, and greatly reduces the consumption of computing performance.
In the examples shown in FIGS. 1, 2 and 3 above, the VR system can determine the VR theme scene corresponding to the selected work according to the classification and/or tags of the selected work and the category and/or tag identifiers of the VR theme scenes. The classification and/or tags of works and VR theme scenes can collectively be called the feature information of the works and the VR theme scenes; the feature information may also include other characteristics that distinguish the content of one work or VR theme scene from that of others. In other words, summarizing the examples shown in FIGS. 1, 2 and 3, the VR system stores the VR theme scenes in advance and determines their feature information; then, after the user selects a work to play, it determines the feature information of the selected work; finally, according to the feature information of the work, it determines from the VR theme scenes the one that matches this feature information and uses it as the VR theme scene for the work being played.
In order to further improve the viewing effect and increase the user's immersion, after the VR theme scene matching the work to be played has been determined, the light-and-shadow colors of the VR theme scene can also be adjusted dynamically according to the light-and-shadow colors of the film being played. A further example of the present application provides such a method, whose implementation flow is shown in FIG. 4. The method can be applied to a VR theme scene obtained automatically by the methods above, or to a VR theme scene selected manually by the user.
FIG. 4 is a schematic flowchart of another VR scene adjustment method in an example of the present application. The method includes the following steps:
Step 401: after playback of the work begins, sample the colors of n points on the work being played to obtain the color code parameters of the n points, where n is a natural number greater than or equal to 2.
The color code parameters may be parameters expressed as RGB color values, hexadecimal color codes, or the like.
In some examples, the system may sample two or more points in real time from the work being played, for example taking 4 points and recording the color code parameters of those points at each color-picking moment.
For example, suppose the work being played is a romantic comedy and the color code parameters of the 4 points sampled at a certain moment are Light Pink (hex #FFB6C1 or RGB 255,182,193), Pink (hex #FFC0CB or RGB 255,192,203), Hot Pink (hex #FF69B4 or RGB 255,105,180) and Deep Pink (hex #FF1493 or RGB 255,20,147).
In this example, the actual choice of n can be adjusted according to the balance between the effect requirements of the software and the performance of the computer. The larger n is, the more points are sampled, the more continuous the color formed by the n points is, and the closer it comes to the true colors of the work; however, this also places higher demands on the computer's performance, so in practice n is chosen according to this balance.
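A minimal sketch of the color sampling in step 401, assuming the current frame is available as an RGB array; the sampling positions and helper names are illustrative, not prescribed by the patent.

```python
import numpy as np

def sample_frame_colors(frame, n=4):
    """Return the RGB and hex color codes of n evenly spread points of a frame.

    frame is assumed to be an RGB image of shape (height, width, 3).
    """
    h, w, _ = frame.shape
    # n points spread along the frame's diagonal (an arbitrary, illustrative choice).
    points = [(int(h * (i + 1) / (n + 1)), int(w * (i + 1) / (n + 1))) for i in range(n)]
    colors = []
    for y, x in points:
        r, g, b = frame[y, x]
        colors.append({"rgb": (int(r), int(g), int(b)),
                       "hex": "#{:02X}{:02X}{:02X}".format(r, g, b)})
    return colors

# Example: a frame filled with Light Pink (RGB 255,182,193) yields four pink samples.
frame = np.full((720, 1280, 3), (255, 182, 193), dtype=np.uint8)
print(sample_frame_colors(frame))
```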
Step 402: generate the parameters of a dynamic light source according to the color code parameters of the n points.
In some examples, the VR system can generate a dynamic light source according to the color code parameters of the n points and obtain the parameters of the generated dynamic light source.
Here, generating the parameters of a dynamic light source according to the color code parameters of the n points can be implemented with existing VR software, for example with an engine such as Unity 3D or Unreal.
In this step, depending on the number of sampled points, the generated dynamic light source may be, for example, a point light source, an area light source, a spotlight, and so on.
For example, as described above, if the color code parameters of the 4 points sampled at a certain moment are Light Pink (hex #FFB6C1 or RGB 255,182,193), Pink (hex #FFC0CB or RGB 255,192,203), Hot Pink (hex #FF69B4 or RGB 255,105,180) and Deep Pink (hex #FF1493 or RGB 255,20,147), then after obtaining these color code parameters the VR software module can generate a new dynamic light source that includes four gradient nodes, from light to dark, recorded by these color code parameters.
步骤403:用所生成的动态光源的参数替换当前VR主题场景下所有动态光源的参数。
在一些实例中,用上述在步骤402生成的动态光源的参数替换掉当前VR主题场景内所有动态光源的参数,即可调整VR主题场景的光影色彩,使得VR主题场景得色彩和所播放影视作品的色彩保持匹配。
在本实例中,VR系统将周期性地执行上述方法,也即周期性地采集n个点的颜色,并周期性地根据采集的n个点的颜色生成动态光源参数,并用此替换VR主题场景中动态光源的参数,从而使得VR主题场景的色彩一直和影片的色彩保持基本一致。其中,VR系统的取色频率 可以为f,例如,可以为10赫兹。在本实例中,f的实际选取可以根据软件的效果要求和计算机性能两者的平衡来调整。f越大,单位时间内采集点颜色的次数越多,动态光源模拟出来的光源也更接近影视作品的真实颜色,然而对计算机的性能也提出更高的要求,从而在实际使用时,可以根据软件的效果要求和计算机性能两者的平衡来选取上述f。
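The periodic update loop could be sketched as follows, reusing `sample_color_codes` from the preceding sketch; the `scene` and `player` objects and their attributes are placeholder assumptions, not an actual engine API.

```python
import time

SAMPLE_RATE_HZ = 10  # the frequency f mentioned above; illustrative value

def update_scene_lights(scene, player, n=4):
    """One sampling cycle: grab the current frame, build gradient-node colors,
    and overwrite every dynamic light in the scene with them."""
    frame = player.current_frame()                        # assumed accessor
    gradient = [rgb for rgb, _hex in sample_color_codes(frame, n)]
    for light in scene.dynamic_lights:                    # assumed attribute
        light.gradient_nodes = gradient                   # replace light parameters

def run_color_sync(scene, player):
    """Repeat the cycle at f Hz so the scene color keeps tracking the film."""
    period = 1.0 / SAMPLE_RATE_HZ
    while player.is_playing():                            # assumed accessor
        update_scene_lights(scene, player)
        time.sleep(period)
```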
As can be seen from the above technical solution, by automatically changing the colors matched with the film or television work, the present application increases the user's sense of immersion and realism, reduces the operations the user must perform to choose a color scheme before viewing in VR, and greatly enhances the interest and entertainment value.
Furthermore, a film or television work being played usually contains some key or special events, for example lightning, an earthquake, or zombies appearing. In order to further increase the user's sense of immersion and improve the viewing effect, after the VR theme scene matching the work to be played has been determined, some associated scene atmosphere events may also be defined according to the film being played so as to heighten the effect of the film. Yet another example of the present application provides such a method, whose implementation flow is shown in FIG. 5. The method may be applied to a VR theme scene obtained by automatic adaptation through the methods above, or to a VR theme scene selected manually by the user.
FIG. 5 is a schematic flowchart of another VR scene adjustment method in an example of the present application, and the method includes the following steps:
Step 501: Load the configuration file of the film or television work being played, where the configuration file includes time nodes and corresponding scene atmosphere events. Here, a scene atmosphere event refers to a phenomenon that appears in the VR scene in the form of art assets or the like.
In this step, the configuration file of the work may be loaded before the work starts playing, or during playback of the work.
The configuration file may be a list (library) of scene atmosphere events, including time node parameters and scene event parameters. Generally, the configuration file of a film or television work may be edited and generated by the service provider of the VR system, by the provider of the work, or even by the user.
In some examples of the present application, the scene event parameters in the scene atmosphere list (library) may include the use of art assets, and the use of art assets includes but is not limited to special effects; resources such as models and animations may also be used. For example, when the user is watching a horror film, the scene event triggered by a time node may be a zombie with a model and animation appearing in the viewing hall, or a scene change such as the scene shaking violently or collapsing.
In some examples of the present application, the time node may be the moment at which an important plot point of the work occurs, and the selection of this time point may be performed by the service provider of the VR system, by the provider of the work, or even by the user.
For example, suppose a film contains a lightning shot at 30 minutes 15 seconds of playback; the editor of the configuration file may then add the following record to the film's configuration file: time node: 30 min 15 s, scene atmosphere event: flash event. Similarly, suppose a film contains a kissing shot at 58 minutes 35 seconds of playback; the following record may then be added to the film's configuration file: time node: 58 min 35 s, scene atmosphere event: pink bubble event, and so on.
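The specification does not prescribe a file format for this configuration; purely as an illustration, the following sketch encodes the two records above as JSON and parses them into a lookup table, with the field names being assumptions of the sketch.

```python
import json

# One possible encoding of the configuration file described above.
CONFIG_TEXT = """
{
  "work_id": "example_film",
  "atmosphere_events": [
    {"time_node_sec": 1815, "event": "flash_event"},
    {"time_node_sec": 3515, "event": "pink_bubble_event"}
  ]
}
"""

def load_atmosphere_config(text=CONFIG_TEXT):
    """Parse the configuration file into a {time_node_sec: event_name} map."""
    data = json.loads(text)
    return {e["time_node_sec"]: e["event"] for e in data["atmosphere_events"]}

print(load_atmosphere_config())   # {1815: 'flash_event', 3515: 'pink_bubble_event'}
```

Here 1815 seconds corresponds to 30 min 15 s and 3515 seconds to 58 min 35 s.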
Step 502: According to the configuration file, when the film reaches a certain time node during playback, execute the scene atmosphere event corresponding to that time node.
In this step, when playback of the work reaches a time node already configured in the configuration file, execution of the associated scene atmosphere event is triggered.
For example, as described above, a film contains a lightning shot at 30 minutes 15 seconds, and the film's configuration file contains the record: time node: 30 min 15 s, scene atmosphere event: flash event. According to this record, when the film reaches 30 min 15 s, the lightning shot appears in the film and, at the same time, the VR system executes the flash event, so that a flash of light appears throughout the viewing hall. As another example, a film contains a kissing shot at 58 minutes 35 seconds, and the film's configuration file contains the record: time node: 58 min 35 s, scene atmosphere event: pink bubble event. According to this record, when the film reaches 58 min 35 s, the kissing shot appears in the film and, at the same time, the VR system executes the pink bubble event, so that pink bubbles emerge in the viewing hall.
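One way to trigger the configured events could look like the following sketch, which polls the playback position and fires each event once; `player.position_sec()`, `player.is_playing()`, and `scene.play_effect()` are assumed placeholder interfaces, not a real engine API.

```python
import time

def run_atmosphere_events(player, scene, events):
    """Fire each configured atmosphere event once, when playback passes its
    time node. `events` is a {time_node_sec: event_name} map as produced by
    load_atmosphere_config in the preceding sketch."""
    fired = set()
    while player.is_playing():
        now = player.position_sec()
        for t, name in sorted(events.items()):
            if t <= now and t not in fired:
                scene.play_effect(name)      # e.g. flash_event, pink_bubble_event
                fired.add(t)
        time.sleep(0.1)                      # assumed short polling interval
```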
As can be seen from the above technical solution, by pre-configuring scene atmosphere events as provided in the present application, users who watch the marked plot points experience striking 3D atmosphere effects or related scene events in the VR scene, which increases the user's sense of immersion and realism and greatly enhances the interest and entertainment value.
It can thus be seen that the various VR scene adjustment methods shown in FIG. 1 to FIG. 5 above, whether by automatically adapting the VR theme scene according to the feature information of the film or television work, by changing the color scheme of the VR theme scene, or by configuring scene atmosphere events, allow the user to perceptually merge into the work. In addition, the examples of the present application may also combine the different methods above.
FIG. 6 is a schematic flowchart of a VR scene adjustment method in an example of the present application. The method may combine the automatic adaptation of the VR theme scene with the scheme of changing the color scheme of the theme scene, and includes the following steps:
Step 601: Preset multiple VR theme scenes, and automatically adapt a VR theme scene matching the selected film or television work according to the feature information of the work selected by the user.
As described above, step 601 may be implemented by intelligently adapting the scene corresponding to the work through the examples corresponding to FIG. 1 to FIG. 3, and is therefore not detailed here.
Step 602: Sample colors in real time from the film or television work being played, superimpose the color codes onto a dynamic light source, and reset the colors of the theater.
As described above, step 602 may be implemented through the example corresponding to FIG. 4, that is, by automatically changing the colors matched with the work, and is therefore not detailed here.
On the basis of the above method, the scheme of configuring scene atmosphere events may be further combined, that is, the above method further includes:
Step 603: According to the configuration file of the work being played, when playback reaches a specified time node, trigger the scene atmosphere event associated with that time node.
As described above, step 603 may be implemented through the example corresponding to FIG. 5, and is therefore not detailed here.
In addition, if the VR theme scene is not obtained by the automatic adaptation of step 601 but by other means, for example by manual selection by the user, only steps 602 and 603 may be executed to change the colors of the VR theme scene and execute scene atmosphere events, thereby enhancing the user's sense of immersion and realism when watching a film or television work in a VR environment.
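Purely for illustration, the combined flow of steps 601 to 603 could be wired together as in the following sketch, which reuses the helper functions from the earlier sketches; the `work`, `player`, and `scenes` objects and their fields are assumptions of this sketch.

```python
import threading

def start_vr_viewing(work, scenes, player, user_scene=None):
    """Combine scene adaptation (601), color synchronization (602), and
    atmosphere events (603). Helper functions come from the earlier sketches."""
    scene = user_scene or match_scene_by_tags(work["tags"], scenes)   # step 601
    events = load_atmosphere_config(work["config_text"])              # FIG. 5 config
    player.play(work)                                                  # assumed API
    threading.Thread(target=run_color_sync,
                     args=(scene, player), daemon=True).start()        # step 602
    run_atmosphere_events(player, scene, events)                       # step 603
```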
Based on the above examples, the present application further provides a virtual reality (VR) scene adjustment apparatus. In some examples, the VR scene adjustment apparatus may be implemented by the structure shown in FIG. 7, which includes five functional modules: a storage module 701, a feature information extraction module 702, an acquisition module 703, a first determination module 704 and a second determination module 705.
The storage module 701 is configured to store VR theme scenes.
The feature information extraction module 702 is configured to determine the feature information of the stored VR theme scenes. Further, the feature information includes category identifiers and/or tags.
The acquisition module 703 is configured to acquire the film or television work that the user selects to play.
The first determination module 704 is configured to determine the feature information of the selected film or television work.
The second determination module 705 is configured to determine the VR theme scene corresponding to the selected work according to the feature information of the selected work and the feature information identifiers of the VR theme scenes.
Through the operation of the storage module 701, the feature information extraction module 702, the acquisition module 703, the first determination module 704 and the second determination module 705, the VR scene adjustment apparatus can complete the automatic adaptation of the VR theme scene according to the feature information of the film or television work.
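A minimal object-oriented sketch of the module layout in FIG. 7 is shown below; the class and method names are illustrative only and are not the claimed apparatus, and `match_scene_by_tags` is the helper from the first sketch.

```python
class VRSceneAdjuster:
    """Illustrative grouping of the five functional modules of FIG. 7."""

    def __init__(self, scenes):
        self.scenes = scenes                        # storage module 701

    def extract_scene_features(self, scene):       # feature information extraction module 702
        return scene["tags"]

    def acquire_selected_work(self, request):      # acquisition module 703
        return request["work"]

    def determine_work_features(self, work):       # first determination module 704
        return work["tags"]

    def determine_matching_scene(self, work):      # second determination module 705
        features = self.determine_work_features(work)
        return match_scene_by_tags(features, self.scenes)
```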
In the present application, for the specific implementation of the functions of the above modules, reference may be made to the methods shown in FIG. 1 to FIG. 3 above, and details are not repeated here.
In addition, in order to automatically adjust the colors of the VR theme scene, the apparatus may be implemented by the structure shown in FIG. 8, which includes:
a color sampling module 801, configured to sample the colors of n points on the film or television work being played in real time to obtain color code parameters of the n points, where n is a natural number greater than or equal to 2;
a simulation module 802, configured to generate parameters of a dynamic light source according to the color code parameters of the n points; and
a replacement module 803, configured to replace the parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source.
Through the operation of the color sampling module 801, the simulation module 802 and the replacement module 803, the VR scene adjustment apparatus can automatically adjust the colors of the VR theme scene according to the colors of the film or television work.
In the present application, for the specific implementation of the functions of the above modules, reference may be made to the method shown in FIG. 4 above, and details are not repeated here.
In addition, in order to add, to the VR theme scene, atmosphere events synchronized with the work being played, the apparatus may be implemented by the structure shown in FIG. 9, which includes:
a loading module 901, configured to load the configuration file of the film or television work being played, where the configuration file includes time nodes and corresponding scene atmosphere events; and
an execution module 902, configured to execute, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
Through the operation of the loading module 901 and the execution module 902, the VR scene adjustment apparatus can add, to the VR theme scene, atmosphere events matching the work being played.
The specific implementation principles of the functions of the above modules have been described above with reference to FIG. 5 and are not repeated here.
Moreover, in the present application the VR scene adjustment apparatus may include both the modules shown in FIG. 7 and the modules shown in FIG. 8, thereby simultaneously implementing the automatic adaptation and the color adjustment of the VR theme scene. Alternatively, the VR scene adjustment apparatus may include both the modules shown in FIG. 8 and the modules shown in FIG. 9, thereby simultaneously implementing the color adjustment of the VR theme scene and adding atmosphere events synchronized with the work. Alternatively, the VR scene adjustment apparatus may include the modules shown in FIG. 7 as well as those shown in FIG. 8 and FIG. 9, thereby simultaneously implementing the automatic adaptation and color adjustment of the VR theme scene and adding atmosphere events synchronized with the work.
In addition, the virtual reality VR scene adjustment method and the virtual reality VR scene adjustment apparatus in the examples of the present application, as well as the modules therein, may be integrated into one processing unit, or each module may exist physically on its own, or two or more apparatuses or modules may be integrated into one unit. The above integrated unit may be implemented either in the form of hardware or in the form of a software functional unit.
In one example, the above virtual reality VR scene adjustment apparatus may run on various computing devices capable of processing user information based on the Internet, and may be loaded into the memory of such a computing device.
FIG. 10 is a schematic diagram of the hardware structure of a VR scene adjustment apparatus. As shown in FIG. 10, the VR scene adjustment apparatus includes one or more processors (CPUs) 1002, a head-mounted display device 1004, a memory 1006, a database 1010, and a connection means 1008 for interconnecting these components. Besides the case shown in FIG. 10, the database 1010 may also be implemented by a device independent of the VR scene adjustment apparatus, in which case the VR scene adjustment apparatus establishes a connection with the database 1010 through a wired or wireless network and reads data from the database.
The processor 1002 may receive and send data through the head-mounted display device 1004 so as to implement network communication and/or local communication.
The head-mounted display device 1004 can play the film or television work selected by the user as well as the VR theme scene. In some examples, the head-mounted display device 1004 can present 2D, 3D and other effects.
The head-mounted display device 1004 may include a tethered head-mounted display device, an all-in-one head-mounted display device, or a mobile head-mounted display device.
The database 1010 includes a VR theme scene library 1012 and a film and television work library 1014. The database 1010 may further include a library of configuration files corresponding to the works.
The memory 1006 may be a high-speed random access memory such as DRAM, SRAM, DDR RAM, or another random access solid-state storage device; or a non-volatile memory such as one or more magnetic disk storage devices, optical disc storage devices, flash memory devices, or other non-volatile solid-state storage devices.
The memory 1006 stores an instruction set executable by the processor 1002, including:
an operating system 1016, including programs for handling various basic system services and for performing hardware-related tasks; and
a virtual reality VR scene adjustment module 1018, including various application programs for implementing the functions of automatically adapting the VR theme scene, changing the scene colors, and arranging scene atmosphere events; such application programs can implement the processing flows in the examples above.
The virtual reality VR scene adjustment module 1018 specifically includes the following instruction modules:
the storage module 701, the feature information extraction module 702, the acquisition module 703, the first determination module 704 and the second determination module 705 for automatically adapting the VR theme scene, and/or the color sampling module 801, the simulation module 802 and the replacement module 803 for changing the colors of the VR theme scene, and/or the loading module 901 and the execution module 902 for adding atmosphere events.
By executing the machine-executable instructions in the modules 701 to 705, 801 to 803 and 901 to 902 in the memory 1006, the processor 1002 can implement the functions of the modules 701 to 705, 801 to 803 and 901 to 902 described above.
In addition, each example of the present application may be implemented by a data processing program executed by a data processing device such as a computer. Obviously, such a data processing program constitutes the present application. Furthermore, a data processing program usually stored in a storage medium is executed by reading the program directly out of the storage medium, or by installing or copying the program into a storage device (such as a hard disk and/or memory) of the data processing device. Therefore, such a storage medium also constitutes the present application. The storage medium may use any type of recording method, for example a paper storage medium (such as paper tape), a magnetic storage medium (such as a floppy disk, a hard disk, or flash memory), an optical storage medium (such as a CD-ROM), a magneto-optical storage medium (such as an MO), and so on.
Therefore, the present application further discloses a non-volatile storage medium in which a data processing program is stored, the data processing program being used to execute any one of the examples of the above methods of the present application.
In addition, the method steps described in the present application may be implemented not only by data processing programs but also by hardware, for example by logic gates, switches, application-specific integrated circuits (ASICs), programmable logic controllers, embedded microcontrollers, and the like. Therefore, hardware that can implement the methods described in the present application may also constitute the present application.
The above descriptions are merely preferred examples of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the scope of protection of the present application.

Claims (20)

  1. A virtual reality (VR) scene adjustment method, wherein the method comprises:
    storing VR theme scenes in advance and determining their corresponding feature information;
    acquiring a film or television work selected by a user for playback;
    determining feature information of the selected film or television work; and
    determining, according to the feature information of the selected film or television work and the feature information of the VR theme scenes, a VR theme scene corresponding to the selected film or television work.
  2. The method according to claim 1, wherein the feature information is a category identifier;
    the determining, according to the feature information of the selected film or television work and the feature information of the VR theme scenes, a VR theme scene corresponding to the selected film or television work comprises:
    screening out, from the stored VR theme scenes, a VR theme scene having the same category identifier as the selected film or television work.
  3. The method according to claim 2, wherein the screening out a VR theme scene having the same category identifier as the selected film or television work comprises:
    determining the number of VR theme scenes having the same category identifier as the selected film or television work,
    if the number is 1, directly taking the VR theme scene as the VR theme scene corresponding to the selected film or television work; or
    if the number is greater than 1, randomly selecting one VR theme scene from the VR theme scenes having the same category identifier as the selected film or television work as the VR theme scene corresponding to the selected film or television work.
  4. The method according to claim 1, wherein the feature information is a tag;
    the determining, according to the feature information of the selected film or television work and the feature information of the VR theme scenes, a VR theme scene corresponding to the selected film or television work comprises:
    screening out, from the stored VR theme scenes, a VR theme scene having the same tag as the selected film or television work.
  5. The method according to claim 4, wherein the screening out a VR theme scene having the same tag as the selected film or television work comprises:
    determining the number of VR theme scenes having the same tag as the selected film or television work,
    if the number is 1, directly taking the VR theme scene as the VR theme scene corresponding to the selected film or television work;
    if the number is greater than 1, randomly selecting one VR theme scene from the VR theme scenes having the same tag as the selected film or television work as the VR theme scene corresponding to the selected film or television work.
  6. The method according to claim 1, wherein the feature information comprises a category identifier and a tag;
    the determining, according to the feature information of the selected film or television work and the feature information of the VR theme scenes, a VR theme scene corresponding to the selected film or television work comprises:
    screening out, from the stored VR theme scenes, VR theme scenes having the same category identifier as the selected film or television work; and
    screening out, from the VR theme scenes having the same category identifier as the selected film or television work, a VR theme scene having the same tag.
  7. The method according to any one of claims 1 to 6, further comprising:
    after playback of the film or television work starts, sampling colors of n points on the work being played to obtain a set of color code parameters of the n points, where n is a natural number greater than or equal to 2;
    generating parameters of a dynamic light source according to the color code parameters of the n points; and
    replacing parameters of all dynamic light sources in the VR theme scene with the parameters of the generated dynamic light source.
  8. The method according to claim 7, wherein the sampling colors of n points comprises: periodically sampling the colors of the n points;
    the generating parameters of a dynamic light source comprises: periodically generating the parameters of the dynamic light source according to the color code parameters of the n points; and
    the replacing parameters of all dynamic light sources in the VR theme scene comprises: periodically replacing the parameters of all dynamic light sources in the VR theme scene with the parameters of the generated dynamic light source.
  9. The method according to any one of claims 1 to 8, further comprising:
    loading a configuration file of the film or television work being played, wherein the configuration file comprises time nodes and corresponding scene atmosphere events; and
    executing, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
  10. A virtual reality (VR) scene adjustment apparatus, wherein the apparatus comprises:
    one or more memories;
    one or more processors;
    wherein the one or more memories store one or more instruction modules configured to be executed by the one or more processors; and
    the one or more instruction modules comprise:
    a storage module, configured to store VR theme scenes;
    a feature information extraction module, configured to determine feature information of the stored VR theme scenes;
    an acquisition module, configured to acquire a film or television work selected by a user for playback;
    a first determination module, configured to determine feature information of the selected film or television work; and
    a second determination module, configured to determine, according to the feature information of the selected film or television work and the feature information of the VR theme scenes, a VR theme scene corresponding to the selected film or television work.
  11. The apparatus according to claim 10, wherein the feature information comprises a category identifier and/or a tag.
  12. The apparatus according to claim 10 or 11, wherein the one or more instruction modules further comprise:
    a color sampling module, configured to sample colors of n points on the film or television work being played in real time to obtain color code parameters of the n points, where n is a natural number greater than or equal to 2;
    a simulation module, configured to generate parameters of a dynamic light source according to the color code parameters of the n points; and
    a replacement module, configured to replace parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source.
  13. The apparatus according to any one of claims 10 to 12, wherein the one or more instruction modules further comprise:
    a loading module, configured to load a configuration file of the film or television work being played, wherein the configuration file comprises time nodes and corresponding scene atmosphere events; and
    an execution module, configured to execute, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
  14. A virtual reality (VR) scene adjustment method, wherein the method comprises:
    sampling colors of n points on a film or television work being played to obtain color code parameters of the n points, where n is a natural number greater than or equal to 2;
    generating parameters of a dynamic light source according to the color code parameters of the n points; and
    replacing parameters of all dynamic light sources in a current VR theme scene with the parameters of the generated dynamic light source.
  15. The method according to claim 14, further comprising:
    loading a configuration file of the film or television work being played, wherein the configuration file comprises time nodes and corresponding scene atmosphere events; and
    executing, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
  16. A virtual reality (VR) scene adjustment apparatus, wherein the apparatus comprises:
    one or more memories;
    one or more processors;
    wherein the one or more memories store one or more instruction modules configured to be executed by the one or more processors; and
    the one or more instruction modules comprise:
    a sampling module, configured to sample colors of n points on the film or television work being played to obtain color code parameters of the n points, where n is a natural number greater than or equal to 2;
    a simulation module, configured to generate parameters of a dynamic light source according to the color code parameters of the n points; and
    a replacement module, configured to replace parameters of all dynamic light sources in the current VR theme scene with the parameters of the generated dynamic light source.
  17. The apparatus according to claim 16, wherein the one or more instruction modules further comprise:
    a loading module, configured to load a configuration file of the film or television work being played, wherein the configuration file comprises time nodes and corresponding scene atmosphere events; and
    an execution module, configured to execute, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
  18. A virtual reality (VR) scene adjustment method, wherein the method comprises:
    loading a configuration file of the film or television work being played, wherein the configuration file comprises time nodes and corresponding scene atmosphere events; and
    executing, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
  19. A virtual reality (VR) scene adjustment apparatus, wherein the apparatus comprises:
    one or more memories;
    one or more processors;
    wherein the one or more memories store one or more instruction modules configured to be executed by the one or more processors; and
    the one or more instruction modules comprise:
    a loading module, configured to load a configuration file of the film or television work being played, wherein the configuration file comprises time nodes and corresponding scene atmosphere events; and
    an execution module, configured to execute, according to the configuration file, the scene atmosphere event corresponding to a time node when playback of the film reaches that time node.
  20. A non-volatile computer-readable storage medium storing computer-readable instructions capable of causing at least one processor to perform the method according to any one of claims 1 to 10, the method according to claim 16 or 17, and the method according to claim 20.
PCT/CN2017/100988 2016-09-19 2017-09-08 Virtual reality scene adjustment method, apparatus, and storage medium WO2018050021A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/206,156 US10880595B2 (en) 2016-09-19 2018-11-30 Method and apparatus for adjusting virtual reality scene, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610829939.8A CN106371605B (zh) 2016-09-19 2016-09-19 Virtual reality scene adjustment method and apparatus
CN201610829939.8 2016-09-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/206,156 Continuation US10880595B2 (en) 2016-09-19 2018-11-30 Method and apparatus for adjusting virtual reality scene, and storage medium

Publications (1)

Publication Number Publication Date
WO2018050021A1 true WO2018050021A1 (zh) 2018-03-22

Family

ID=57896818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/100988 WO2018050021A1 (zh) 2016-09-19 2017-09-08 Virtual reality scene adjustment method, apparatus, and storage medium

Country Status (3)

Country Link
US (1) US10880595B2 (zh)
CN (1) CN106371605B (zh)
WO (1) WO2018050021A1 (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107688392B (zh) * 2017-09-01 2020-08-11 广州励丰文化科技股份有限公司 Method and system for controlling an MR head-mounted display device to display a virtual scene
CN110310356B (zh) * 2019-06-26 2023-06-02 北京奇艺世纪科技有限公司 Scene rendering method and apparatus
CN110534094B (zh) * 2019-07-31 2022-05-31 大众问问(北京)信息科技有限公司 Voice interaction method, apparatus and device
CN111105294A (zh) * 2019-12-20 2020-05-05 武汉市奥拓智能科技有限公司 VR navigation method, system, client, server and storage medium
CN111897426A (zh) * 2020-07-23 2020-11-06 许桂林 Method and system for intelligent immersive scene presentation and interaction
CN114237401A (zh) * 2021-12-28 2022-03-25 广州卓远虚拟现实科技有限公司 Method and system for seamlessly linking multiple virtual scenes
CN115904089B (zh) * 2023-01-06 2023-06-06 深圳市心流科技有限公司 App theme scene recommendation method, apparatus, terminal device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823549A (zh) * 2012-11-19 2014-05-28 大连天地伟业数码科技有限公司 Motion-sensing movie playback system
CN105163191A (zh) * 2015-10-13 2015-12-16 腾叙然 System and method for applying a virtual reality device to KTV karaoke
CN105704501A (zh) * 2016-02-06 2016-06-22 普宙飞行器科技(深圳)有限公司 Virtual reality live-streaming system based on panoramic video from an unmanned aerial vehicle
CN105929959A (zh) * 2016-04-29 2016-09-07 四川数字工匠科技有限公司 Virtual reality helmet positioning control system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102565847B1 (ko) * 2015-07-06 2023-08-10 Samsung Electronics Co., Ltd. Electronic device and method for controlling display in the electronic device
US9881584B2 (en) * 2015-09-10 2018-01-30 Nbcuniversal Media, Llc System and method for presenting content within virtual reality environment
KR20170060485A (ko) * 2015-11-24 2017-06-01 Samsung Electronics Co., Ltd. Electronic device and method for displaying content according to a display mode
US20170206708A1 (en) * 2016-01-19 2017-07-20 Immersv, Inc. Generating a virtual reality environment for displaying content
US10269158B2 (en) * 2016-05-26 2019-04-23 Disney Enterprises, Inc. Augmented or virtual reality digital content representation and interaction

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823549A (zh) * 2012-11-19 2014-05-28 大连天地伟业数码科技有限公司 Motion-sensing movie playback system
CN105163191A (zh) * 2015-10-13 2015-12-16 腾叙然 System and method for applying a virtual reality device to KTV karaoke
CN105704501A (zh) * 2016-02-06 2016-06-22 普宙飞行器科技(深圳)有限公司 Virtual reality live-streaming system based on panoramic video from an unmanned aerial vehicle
CN105929959A (zh) * 2016-04-29 2016-09-07 四川数字工匠科技有限公司 Virtual reality helmet positioning control system

Also Published As

Publication number Publication date
CN106371605B (zh) 2018-03-30
US20190098372A1 (en) 2019-03-28
US10880595B2 (en) 2020-12-29
CN106371605A (zh) 2017-02-01

Similar Documents

Publication Publication Date Title
WO2018050021A1 (zh) Virtual reality scene adjustment method, apparatus, and storage medium
US10679063B2 (en) Recognizing salient video events through learning-based multimodal analysis of visual features and audio-based analytics
US9712862B2 (en) Apparatus, systems and methods for a content commentary community
WO2021042605A1 (zh) Video processing method, apparatus, terminal and computer-readable storage medium
JP5214825B1 (ja) Presentation content generation device, presentation content generation method, presentation content generation program, and integrated circuit
US20160330522A1 (en) Apparatus, systems and methods for a content commentary community
US9542975B2 (en) Centralized database for 3-D and other information in videos
US9443337B2 (en) Run-time techniques for playing large-scale cloud-based animations
JP2008529150A (ja) Dynamic photo collage
WO2008109233A1 (en) Automatically generating audiovisual works
US20090143881A1 (en) Digital media recasting
WO2022061806A1 (zh) Film generation method, terminal device, shooting device and film generation system
WO2021068958A1 (zh) Method, apparatus and electronic device for generating dynamic images of commodity objects
CN111667557B (zh) Animation production method and apparatus, storage medium, and terminal
KR20070011093A (ko) Method and apparatus for encoding/reproducing multimedia content
TW200849030A (en) System and method of automated video editing
TW201939322A (zh) Method, apparatus and device for producing a film or television work
WO2017157135A1 (zh) Media information processing method, media information processing apparatus, and storage medium
JP5878523B2 (ja) Content processing device and integrated circuit, method, and program thereof
CN112422844A (zh) Method, apparatus, device and readable storage medium for adding special effects to a video
US7610554B2 (en) Template-based multimedia capturing
WO2023125393A1 (zh) Method and apparatus for controlling smart home devices, and mobile terminal
WO2013187796A1 (ru) Method for automatic editing of digital video files
US20230209003A1 (en) Virtual production sets for video content creation
Adams et al. Situated event bootstrapping and capture guidance for automated home movie authoring

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17850223

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17850223

Country of ref document: EP

Kind code of ref document: A1