WO2019037040A1 - Method for recording video based on a virtual reality application, terminal device, and storage medium

Method for recording video based on a virtual reality application, terminal device, and storage medium

Info

Publication number
WO2019037040A1
Authority
WO
WIPO (PCT)
Prior art keywords: virtual, person perspective, perspective, person, virtual reality
Prior art date
Application number
PCT/CN2017/098865
Other languages
English (en)
French (fr)
Inventor
陈阳
黄雨川
沈晓斌
麥偉強
Original Assignee
Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to PCT/CN2017/098865 (WO2019037040A1)
Priority to CN201780054182.2 (CN109952757B)
Priority to EP17922767.3 (EP3675488B1)
Publication of WO2019037040A1
Priority to US16/588,506 (US11000766B2)

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 - Input arrangements characterised by their sensors using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/212 - Input arrangements characterised by their sensors using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/45 - Controlling the progress of the video game
    • A63F13/49 - Saving the game status; Pausing or ending the game
    • A63F13/497 - Partially or entirely replaying previous game actions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 - Changing parameters of virtual cameras
    • A63F13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F13/5258 - Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/90 - Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92 - Video game devices specially adapted to be hand-held while playing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 - Detection arrangements using opto-electronic means
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality

Definitions

  • the present application relates to the field of virtual reality technologies, and in particular, to a method, a terminal device, and a storage medium for recording video based on a virtual reality application.
  • the current third-person perspective shooting can only rely on a physical handle, and the physical handle is unstable under human manipulation, which eventually causes the synthesized video to shake and the video quality to be poor.
  • shooting from the third person perspective is also limited by the physical handle, making it impossible to capture the desired picture over a wider range.
  • the present application provides a method for recording video based on a virtual reality application, a terminal device, and a storage medium, which can solve the problem that the third person perspective shooting in the prior art is limited by hardware.
  • the present application provides, on the one hand, a method for recording a video based on a virtual reality application, the method being used to record a third person perspective image of a virtual character in the virtual reality application, where the virtual character refers to the role, in the virtual reality application, of an object that interacts with the virtual reality application. The method for recording video based on the virtual reality application includes:
  • the third person perspective is a shooting angle of a virtual controller configured as a third person perspective virtual camera of the virtual reality application;
  • the terminal device is configured to record a third person perspective image of a virtual character in the virtual reality application, where the virtual character refers to an object that interacts with the virtual reality application.
  • the terminal device includes a configurator, an input and output unit, and a virtual controller;
  • the configurator is configured to configure a third person perspective, wherein the third person perspective is a shooting angle of a virtual controller configured as a third person perspective virtual camera of the virtual reality application;
  • the virtual controller is configured to acquire, by the input and output unit, location information of a virtual character in the virtual reality application, and acquire current orientation information of a third person perspective;
  • the third person perspective refers to the shooting angle of the virtual controller, and the current orientation information of the third person perspective refers to the orientation information of the third person perspective virtual camera in the virtual reality application;
  • Yet another aspect of the present application provides a terminal device including at least one connected processor, a memory, and an input and output unit, wherein the memory is configured to store program code, and the processor is configured to invoke the program code in the memory to perform the operations performed by the terminal device in the first aspect described above.
  • Yet another aspect of the present application provides a computer storage medium comprising instructions that, when executed on a computer, cause the computer to perform the operations performed by the terminal device in the first aspect described above.
  • Yet another aspect of the present application provides a computer program product comprising instructions which, when executed on a computer, cause the computer to perform the operations performed by the terminal device in the first aspect described above.
  • the virtual controller for capturing the third person perspective picture first acquires the location information and the current orientation information of the third person perspective, then acquires the scene data according to the location information and the current orientation information of the third person perspective, and finally captures the third person perspective image according to the scene data, the current orientation information of the third person perspective, and the posture data of the virtual character.
  • FIG. 1 is a schematic diagram of a user interacting with a VR application using a virtual reality system according to an embodiment of the present invention;
  • FIG. 2 is a schematic flowchart of a method for recording video based on a virtual reality application according to an embodiment of the present invention;
  • FIG. 3a is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
  • FIG. 3b is another schematic structural diagram of a terminal device according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of an interface of the automatic detection configuration part in the configurator according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of an interface of the preview shooting control portion in the view controller according to an embodiment of the present invention;
  • FIG. 6a is another schematic flowchart of a method for recording video based on a virtual reality application according to an embodiment of the present invention;
  • FIG. 6b is a schematic diagram of an application interface of a VR application according to an embodiment of the present invention;
  • FIG. 6c is another schematic diagram of an application interface of a VR application according to an embodiment of the present invention;
  • FIG. 6d is another schematic diagram of an application interface of a VR application according to an embodiment of the present invention;
  • FIG. 6e is another schematic diagram of an application interface of a VR application according to an embodiment of the present invention;
  • FIG. 7a is a schematic flowchart of adjusting the viewing angle position and viewing angle orientation of the third person perspective according to an embodiment of the present invention;
  • FIG. 7b is a schematic diagram of the interfaces used by the Steam VR platform to load the virtual controller according to an embodiment of the present invention;
  • FIG. 8 is a schematic flowchart of capturing a third person picture with the third person perspective virtual camera in an embodiment of the present invention;
  • FIG. 9 is another schematic structural diagram of a terminal device according to an embodiment of the present invention.
  • modules may be combined or integrated into another device, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection displayed or discussed may be through some interfaces, and the indirect coupling or communication connection between the modules may be electrical or in another form, which is not limited in the present application.
  • the modules or sub-modules described as separate components may or may not be physically separated and may or may not be physical modules, or may be distributed across multiple circuit modules; some or all of the modules may be selected according to actual needs to achieve the objectives of the present application.
  • the present application provides a method, a terminal device, and a storage medium for recording video based on a virtual reality application, and is used in the field of virtual reality technology. The details are described below.
  • the entire virtual reality system includes a VR helmet, a terminal device equipped with VR resources (e.g., VR games) and/or VR applications, software, apps, etc., and two handheld controllers.
  • the handheld controller can be a standard handheld controller.
  • the terminal device is not limited to a conventionally used personal computer or notebook, and may also be a similar electronic terminal or device equipped with VR resources, applications, software, and the like.
  • the user wears the VR helmet when participating in the VR application, the VR helmet is a head-mounted display, and the VR helmet is connected to the computer with the VR game through a human-computer interaction interface. Users can participate in VR applications in a first-person perspective through the VR helmet and manipulate the props in the VR application in a VR application.
  • the VR application in the terminal device can be run on the Steam VR platform.
  • an application plug-in can be loaded in the VR resource of the terminal device, thereby providing a third-person perspective recording function.
  • the application plug-in can be loaded in the VR application by configuring the VR application, so that the application plug-in can simulate the third-person view virtual camera in the VR application.
  • when the user wears the VR helmet to play the VR application, the user can activate the third person perspective recording function of the third person perspective virtual camera in the application plugin on the computer. In this way, during the process of playing the VR application, the user can record the posture and scene of the virtual character of the VR application through the application plug-in.
  • the VR application refers to an interactive application that comprehensively utilizes a computer graphics system and various interface devices, such as display and control devices, to provide an immersive experience in an interactive three-dimensional environment generated on a computer.
  • the computer-generated, interactive 3D environment is called a virtual environment (Virtual Environment, VE).
  • the VR application can provide a human-computer interface to the user, enabling the user to command the VR device on which the VR application is installed and determining how the VR device provides information to the user.
  • a virtual character refers to a virtual role that a user needs to substitute in a VR application when interacting with a VR application.
  • the first person perspective means that the user who manipulates the VR application cannot see the virtual character that he or she embodies, but can see the scene in which that virtual character is located, giving the user an immersive perspective.
  • the third person perspective refers to a perspective from which both the virtual character that the user embodies and the interaction scenario in which the virtual character is located can be seen.
  • the terminal device can include an input output unit, a virtual controller, a view controller, and a configurator.
  • the virtual controller, the view controller, and the configurator are part of the application plug-in described above, and the virtual controller, the view controller, and the configurator are all computer program products, and are not limited by physical hardware.
  • the configurator can be used to detect and configure the Steam VR platform, enabling the Steam VR platform to recognize and load the virtual controller, enabling the virtual controller to simulate a VR physical handle for adjustment of the viewing position and viewing angle of the third person perspective, and for Video recording in a third-person perspective in a VR application.
  • the configurator is also used to configure the VR application to accept the virtual controller as a third-person view virtual camera.
  • the Steam VR platform refers to the open VR technology platform based on the OpenVR standard.
  • the view controller provides a user interface (UI), so that the user can conveniently adjust the viewing angle position and viewing angle orientation of the virtual controller in real time.
  • the virtual reality application video recording method in the present application is exemplified below.
  • the method can be used to record a third person perspective picture of a virtual character in the virtual reality application.
  • the virtual reality application refers to an interactive application, such as a virtual reality game, in which the user can enter a realistic game scene by wearing a VR helmet and obtain an interactive sensory and tactile experience in the game scene.
  • the virtual character refers to a role of an object interacting with the virtual reality application in the virtual reality application.
  • a virtual character may refer to a user participating in a VR game. When participating in a VR game, the user needs to substitute a character in the VR game.
  • the VR game may include multiple game scenes, and each game scene may include one or more virtual objects, which can be rooms, roads, trees, buildings, cars, etc., and can be referred to as materials.
  • the embodiments of the present application mainly include:
  • the terminal device configures a third person perspective.
  • the third person perspective is a shooting angle of a virtual controller configured as a third person perspective virtual camera of the virtual reality application.
  • the terminal device receives a third person perspective recording operation input by the object.
  • the object refers to a user interacting with the virtual reality application; for example, the object may refer to the user in FIG. 1 who wears the VR helmet and interacts with the VR application in the terminal device.
  • the terminal device obtains location information of the virtual character in the virtual reality application and current orientation information of the third person perspective in response to the third person perspective recording operation.
  • the location information may refer to a location of the role that the user substitutes in the VR game in the VR game.
  • the location information may be a certain room in the VR game, a certain road, or the like.
  • the third person perspective refers to a shooting angle of a virtual controller configured as a third person perspective virtual camera of the virtual reality application.
  • the current orientation information of the third person perspective refers to the orientation information of the current third person perspective virtual camera in the virtual reality application. It may be the initially configured orientation information, or orientation information adjusted during the recording of the third person perspective image; if adjusted orientation information is used, the current third person perspective virtual camera captures the third person perspective picture with the adjusted orientation information.
  • the acquisition timing of the location information and of the current orientation information of the third person perspective is not limited in this application.
  • before the terminal device responds to the third person perspective recording operation, the terminal device further needs to configure the virtual reality application, so that the virtual reality application uses the virtual controller as the third person perspective virtual camera.
  • the PC can turn on the function of the virtual controller as a third person perspective virtual camera, so that the third person perspective virtual camera can be used to capture a third person perspective image of the user's virtual character playing the VR game.
  • the third person perspective picture includes the posture of the virtual character, thereby recording the scene in which the user plays the VR game.
  • the terminal device acquires scene data corresponding to the virtual reality application where the current virtual character is located, according to the location information and the current orientation information of the third person perspective.
  • the scene data may refer to a picture material currently presented by the virtual reality application, for example, the scene data may be a certain game scene in the VR game.
  • the terminal device can determine the game scene a of the virtual character currently in the VR game according to the position of the virtual character in the VR game.
  • the terminal device records, according to the scene data, the current orientation information of the third person perspective, and the posture data of the virtual character, a third person perspective image of the virtual character currently in the virtual reality application.
  • the posture data of the virtual character refers to the posture data exhibited by the user when interacting with the virtual reality application.
  • the posture data may be input by the user shaking the physical handle while playing the VR game, and may be acquired by a sensor.
  • the specific acquisition manner is not limited in this application. It should be noted that the acquisition timing of the location information, the current orientation information of the third person perspective, and the posture data of the virtual character is not limited in this application, as long as the posture data of the virtual character is acquired before the third person perspective picture of the virtual character currently in the virtual reality application is recorded.
  • the third person perspective virtual camera is used to capture a third person perspective picture of the virtual character currently in the virtual reality application.
  • specifically, the scene data corresponding to the current orientation information can be obtained, and the scene data and the posture data are then synthesized to obtain the final third person perspective picture.
  • the terminal device may be further configured so that the orientation information of the third person perspective virtual camera changes with the position of the virtual character.
  • in this way, the third person perspective camera can follow the movement of the virtual character in the virtual reality application, which makes tracking the virtual character convenient, as the sketch below illustrates.
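  • As an illustration only, the following C++ sketch shows one way such follow behavior could be implemented: the third person camera is kept at a fixed offset behind the virtual character, so its orientation information changes as the character's position changes. All names (Vec3, CameraPose, followCharacter) are hypothetical and not taken from the patent.

      // Minimal follow-camera sketch (hypothetical, not the patented code):
      // keep the third-person virtual camera a fixed distance behind and
      // above the virtual character so the character stays in frame.
      #include <cmath>

      struct Vec3 { float x, y, z; };

      struct CameraPose {
          Vec3  position;  // viewing-angle position
          float yawDeg;    // viewing-angle orientation (yaw only, for brevity)
      };

      CameraPose followCharacter(const Vec3& charPos, float charYawDeg,
                                 float distance, float height) {
          const float rad = charYawDeg * 3.14159265f / 180.0f;
          CameraPose cam;
          // Step back along the character's facing direction, then raise.
          cam.position = { charPos.x - distance * std::sin(rad),
                           charPos.y + height,
                           charPos.z - distance * std::cos(rad) };
          cam.yawDeg = charYawDeg;  // look the way the character faces
          return cam;
      }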
  • in summary, the virtual controller for capturing the third-person perspective picture first acquires the location information and the current orientation information of the third person perspective, then acquires the scene data according to the location information and the current orientation information of the third person perspective, and finally captures the third person perspective image according to the scene data, the current orientation information of the third person perspective, and the posture data of the virtual character. It can be seen that, after adopting this scheme, the third person perspective picture can be captured by the virtual controller without an additional handheld controller, so no additional manpower or equipment is required, there is no physical hardware limitation, and a full range of third-person perspective images can be captured.
  • the method further includes:
  • the second adjustment operation is an adjustment operation for the field of view angle (Field of View, FOV) of the third person perspective;
  • the terminal device then adjusts the angle of view of the virtual controller, where the angle of view refers to the range of view within which the third person perspective virtual camera captures the third person perspective image. It can be seen that the present application is not limited by physical hardware when adjusting the viewing angle, and the angle of view can be adjusted arbitrarily to obtain a suitable field-of-view effect; a sketch follows.
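  • A minimal sketch of such a FOV adjustment, assuming a simple camera class; the default of 110° and the clamp limits are illustrative assumptions, not values from the patent.

      // Hypothetical FOV adjustment for the third-person virtual camera.
      #include <algorithm>

      class ThirdPersonCamera {
      public:
          // Widen or narrow the field of view by deltaDeg degrees.
          void adjustFov(float deltaDeg) {
              fovDeg_ = std::clamp(fovDeg_ + deltaDeg, 30.0f, 170.0f);
          }
          float fovDeg() const { return fovDeg_; }
      private:
          float fovDeg_ = 110.0f;  // assumed default, a typical VR value
      };

    For example, adjustFov(+20.0f) would correspond to the 20° field-of-view increase described later in this document.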
  • the method further includes:
  • the orientation information of the third person perspective is set to a fixed value to suppress picture jitter of the third person perspective.
  • with this fixed setting, it is possible to achieve still shooting from the third person perspective.
  • for a virtual reality application that is not suitable for first person perspective shooting, or when shaking in the captured picture would make the viewer's eyes uncomfortable, a still third person perspective can be selected for shooting.
  • the orientation information includes a viewing angle position and a viewing angle orientation
  • the object corresponding to the virtual character may wear a virtual reality helmet, and the first person perspective image may be captured based on the perspective orientation of the virtual reality helmet.
  • the method further includes:
  • by binding the perspective orientation of the third person perspective to the perspective orientation of the virtual reality helmet, the perspective orientation of the third person perspective can follow changes in the perspective orientation of the virtual reality helmet and remain consistent with it.
  • the picture taken by the virtual reality helmet is the first person perspective picture taken in the first person perspective, so that the shooting of the first person perspective can be realized by controlling the shooting of the third person perspective.
  • for example, the user wearing the virtual reality helmet can see a 130° picture, while the virtual reality helmet itself can only capture the 110° picture seen by the user; after the perspective orientation of the third person perspective is bound to that of the virtual reality helmet, a picture of 130° or more can be captured.
  • the virtual reality helmet may shake severely as the posture of the user wearing it changes, which may cause the picture taken from the third person perspective to shake accordingly.
  • on this basis, the present application can also algorithmically process the rotation of the third person perspective to reduce the screen shake caused by helmet shake.
  • the viewing angle orientation of the virtual reality helmet includes a roll angle, and the roll angle can be set to a fixed value to control the horizontal scrolling of the third person perspective.
  • a threshold range can be set around the fixed value; keeping the roll angle within this threshold range alleviates picture shaking, as sketched below.
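  • A sketch of the stabilization just described, under the assumption that orientation is expressed as pitch/yaw/roll Euler angles: roll is held near a fixed value within a threshold range, and a simple low-pass filter further damps high-frequency shake. The threshold and smoothing factor are illustrative assumptions.

      // Hypothetical third-person orientation stabilization.
      #include <algorithm>

      struct EulerDeg { float pitch, yaw, roll; };

      // Hold roll near 0 deg within a small threshold range to keep the
      // horizon level while still following the helmet's pitch and yaw.
      EulerDeg stabilizeRoll(EulerDeg hmd, float rollLimitDeg = 5.0f) {
          hmd.roll = std::clamp(hmd.roll, -rollLimitDeg, rollLimitDeg);
          return hmd;
      }

      // One-pole low-pass filter to damp sudden rotation changes.
      float smooth(float previous, float current, float alpha = 0.2f) {
          return previous + alpha * (current - previous);
      }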
  • the virtual controller may perform at least one of the following items to adjust the perspective information of the third person perspective:
  • the third person perspective can be controlled through the above-mentioned orientation parameters, and can also be controlled manually, so that displacement and rotation can be adjusted rapidly and autonomously according to the actual interaction scenario of the VR application, which also enables a self-directed shooting mode. For example, it is possible to control the displacement or rotation of the third person perspective throughout the virtual reality application by inputting an operation command to the virtual controller.
  • the object inputs a fourth adjustment operation to the terminal device, and the terminal device, in response to the fourth adjustment operation, adjusts the orientation parameters of the third person perspective accordingly.
  • the adjustment boundary when adjusting the following items is infinity:
  • since the adjustment boundary of the perspective orientation of the third person perspective is not limited, the perspective orientation can be adjusted freely without the constraints of physical hardware, thereby obtaining an extra-large field of view similar to that of a wide-angle lens and providing users with a variety of choices for VR application video capture.
  • the terminal device may include an input and output unit, a configurator, and a virtual controller.
  • the terminal device may also include a view controller as shown in Figure 3b.
  • the configurator, the virtual controller, and the view controller may be independent or integrated, and the three may be application plug-ins installed on the terminal device, which is not limited in this application.
  • the terminal device is configured to record a third person perspective image of the virtual character in the virtual reality application.
  • the virtual character refers to the role of an object interacting with the virtual reality application in the virtual reality application.
  • the configurator is configured to configure a virtual reality application, such that the virtual reality application uses the virtual controller as a third person perspective virtual camera.
  • the configurator is further configured to configure the virtual controller as a third person perspective virtual camera of the virtual reality application, and configure the third person perspective.
  • the third person perspective is a shooting angle of a virtual controller configured as a third person perspective virtual camera of the virtual reality application. Specifically, when the virtual reality application is run on the Steam VR platform, the configurator can configure a driver to load the virtual controller on the Steam VR platform.
  • the input/output unit is configured to receive a third person perspective recording operation.
  • the virtual controller is configured to acquire location information of the virtual character in the virtual reality application, and to acquire current orientation information of the third person perspective; the orientation information of the third person perspective refers to the orientation information of the third person perspective virtual camera in the virtual reality application.
  • the virtual controller is further configured to acquire, according to the location information and the current orientation information of the third person perspective, the scenario data corresponding to the virtual reality application where the current virtual character is located.
  • the virtual controller is further configured to record, according to the scene data, the current orientation information of the third person perspective, and the posture data of the virtual character, a third person perspective image of the virtual character currently in the virtual reality application.
  • the orientation information of the third person perspective virtual camera changes as the position of the virtual character changes.
  • in this embodiment, the virtual controller for capturing the third person perspective picture first acquires the location information and the current orientation information of the third person perspective, then acquires the scene data according to the location information and the current orientation information of the third person perspective, and then captures a third person perspective image according to the scene data, the current orientation information of the third person perspective, and the posture data of the virtual character.
  • in this way, the third person perspective picture can be captured by the virtual controller without an additional handheld controller, so no additional manpower or equipment is required, there is no physical hardware limitation, and a full range of third-person perspective images can be captured.
  • the input and output unit is further configured to receive a first adjustment operation after the virtual controller captures a third person view image of the virtual character currently in the virtual reality application.
  • the virtual controller is further configured to adjust the orientation information of the third person perspective according to the first orientation parameter of the first adjustment operation received by the input and output unit, to control the displacement and rotation of the third person perspective virtual camera in the virtual reality application.
  • the first orientation parameter may be input by the user through the perspective controller as shown in FIG. 3b.
  • the input and output unit is further configured to receive a second adjustment operation
  • the virtual controller is further configured to adjust the angle of view of the virtual controller according to the first instruction of the second adjustment operation received by the input and output unit, where the angle of view refers to the range of view within which the third person perspective virtual camera captures the third person perspective image. It can be seen that the present application is not limited by physical hardware when adjusting the viewing angle, and the angle of view can be adjusted arbitrarily to obtain a suitable field-of-view effect.
  • the virtual controller is further configured to:
  • the orientation information of the third person perspective is set to a fixed value.
  • the orientation information includes a viewing angle position and a viewing angle orientation
  • the camera may capture a first person perspective view based on a viewing angle orientation of the virtual reality helmet.
  • the configurator is also used to:
  • the first person perspective picture refers to a picture within the field of view angle captured by the third person perspective virtual camera when the virtual character is at the location indicated by the location information, thereby realizing the function of acting as the first person perspective.
  • by binding the perspective orientation of the third person perspective to the perspective orientation of the virtual reality helmet, the perspective orientation of the third person perspective can follow changes in the perspective orientation of the virtual reality helmet and remain consistent with it.
  • the picture taken by the virtual reality helmet is the first person perspective picture taken in the first person perspective, so that the shooting of the first person perspective can be realized by controlling the shooting of the third person perspective.
  • the virtual reality helmet may shake severely as the posture of the user wearing it changes, which may cause the picture taken from the third person perspective to shake accordingly.
  • on this basis, the present application can also algorithmically process the rotation of the third person perspective to reduce the screen shake caused by helmet shake.
  • the viewing angle orientation of the virtual reality helmet includes a roll angle, and the roll angle can be set to a fixed value to control the horizontal scrolling of the third person perspective.
  • a threshold range can be set around the fixed value; keeping the roll angle within this threshold range alleviates picture shaking.
  • the virtual controller may perform at least one of the following items to adjust the perspective information of the third person perspective:
  • the third person perspective can be controlled through the above-mentioned orientation parameters, and can also be controlled manually, so that displacement and rotation can be adjusted rapidly and autonomously according to the actual interaction scenario of the VR application, which also enables a self-directed shooting mode. For example, it is possible to control the displacement or rotation of the third person perspective throughout the virtual reality application by inputting an operation command to the virtual controller.
  • the object inputs a fourth adjustment operation to the terminal device, and the terminal device, in response to the fourth adjustment operation, adjusts the orientation parameters of the third person perspective accordingly.
  • the adjustment boundary when adjusting the following items is infinity:
  • since the adjustment boundary of the perspective orientation of the third person perspective is not limited, the perspective orientation can be adjusted freely without the constraints of physical hardware, thereby obtaining an extra-large field of view similar to that of a wide-angle lens and providing users with a variety of choices for VR application video capture.
  • the Steam VR platform is installed on a terminal device, for example, as a VR client of the Steam VR platform installed on the computer side.
  • This application does not define a platform on which the virtual reality application can run.
  • the configurator, virtual controller, and perspective controller in the present application can be configured as application plug-ins in the virtual reality application.
  • the following describes the functions of the configurator, virtual controller, and view controller:
  • the above configurator is mainly used to configure the Steam VR platform and configure the VR application.
  • configuring the Steam VR platform enables Steam VR to recognize and load the virtual controller.
  • the virtual controller's driver module needs to be placed in the specified directory of the Steam VR platform, so that the virtual controller can be successfully called.
  • the VR application can be configured to accept the virtual controller as a virtual camera with a third-person perspective.
  • the Steam VR platform provides a set of development kits for Unity developers.
  • the development kit includes a script that automatically adds virtual cameras.
  • conventionally, shooting a third person view based on the Steam VR platform requires an additional third handle (which is used to capture the third person view).
  • the orientation of the third handle is taken as the orientation of the virtual camera in the VR application.
  • the configurator in this application can automatically configure the VR application without the need for an additional third handle.
  • the configurator's configuration allows the Steam VR platform to use the virtual controller as the virtual third handle and to take the orientation of the virtual controller as the orientation of the virtual camera.
  • the virtual controller can realize the function of the third physical handle as a virtual camera, thereby enabling the VR application to meet the shooting requirements of the virtual controller.
  • the configurator is also used to configure the FOV of the virtual controller, that is, the FOV of the third-person perspective.
  • the relationship between the virtual controller and the view controller may also be configured such that the view controller may provide the virtual controller with an orientation parameter for controlling the third person view virtual camera.
  • the virtual controller can simulate a VR handheld handle, and can also simulate a third-person view virtual camera in a VR application.
  • the virtual controller can be used for the movement and direction rotation of the third-person view position.
  • the virtual controller can be a dynamic link library (DLL) file that follows the OpenVR driver interface standard.
  • the DLL file is located in the root directory of the Steam VR platform, and is used to drive the Steam VR platform to simulate the third physical handle (i.e., the virtual controller), as sketched below.
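  • For orientation, a plausible on-disk layout for such a driver is sketched below, based on how OpenVR generally discovers driver DLLs; the directory names and the driver name "virtual_controller" are assumptions, not taken from the patent.

      SteamVR/
        drivers/
          virtual_controller/                    (hypothetical driver name)
            driver.vrdrivermanifest              (declares the driver)
            bin/
              win64/
                driver_virtual_controller.dll    (DLL implementing the
                                                  OpenVR driver interfaces)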
  • the photographing function of the third person view virtual camera may be configured in the automatic detection configuration portion of the configurator and controlled in the preview shooting control portion of the view controller.
  • the automatic detection configuration part of the configurator can be seen in the interface shown in FIG. 4; it includes an effect comparison area between the first person view video GIF and the third person view video GIF, which can intuitively present the difference between the first person view and the third person view.
  • the automatic detection configuration portion further includes an information prompting area to remind the user of an action that is to be automatically or manually performed, such as left and right movement, up and down movement, up and down rotation, and the like.
  • the preview shooting control portion of the viewing angle controller may refer to the interface shown in FIG. 5, the left side of FIG. 5 is a preview window, and the right side of FIG. 5 is a control area.
  • the control area includes an orientation control area and a rotation control area.
  • the azimuth control area can mainly adjust the three dimensions of up, down, left and right, and front and back.
  • the left and right, up and down, and front and rear displacement movements can be realized by dragging the slider. This application does not limit the boundary of the displacement change.
  • the rotation control area can mainly adjust the rotation of the two dimensions of up and down and left and right.
  • the slider can be rotated 180° left and right and rotated 90° up and down. This application does not limit the angle of rotation.
  • the view controller can provide a user interface (UI) through which the user can conveniently adjust the orientation parameters of the virtual controller in real time, that is, adjust the orientation parameters of the third person perspective virtual camera.
  • through the view controller, the displacement of the third-person view virtual camera in three-dimensional space can be continuously adjusted in real time up and down, left and right, and back and forth, and there is no adjustment boundary when the user adjusts the orientation of the third person perspective virtual camera.
  • the preview shooting control portion interface shown in FIG. 5 can be implemented.
  • the viewing angle controller, the virtual controller and the configurator can cooperate to complete the shooting of the third-person perspective picture and the adjustment of the viewing angle orientation and the viewing angle position of the third-person perspective.
  • the specific process is shown in FIG. 6a.
  • the process of Figure 6a includes:
  • the configurator configures the VR application, the virtual controller, and the view controller.
  • the VR application is configured to enable the VR application to accept the virtual controller as a third-person view virtual camera, and also configure the virtual controller's FOV and configure the relationship between the virtual controller and the view controller.
  • the configurator starts a third person view recording function.
  • the virtual controller is activated as a virtual camera of the third person perspective.
  • the virtual controller records the third person perspective image in a third person perspective.
  • the user is interacting with the VR application through the terminal device.
  • the user can click the application icon "virtual controller" on the VR application interface of the terminal device at any time, and select "Start third person perspective recording" in the drop-down menu of the application icon "virtual controller", so that the third person perspective virtual camera can be activated to record the third person perspective picture in the third person perspective.
  • when the user interacts with the VR application and wants to take a third-person perspective picture at a new angle, he can click the application icon "view controller" in FIG. 6b and select "adjust orientation parameter" to enter the view controller panel interface shown in FIG. 5.
  • the user can adjust the parameters within the allowed range in the orientation control area or the rotation control area of the application interface shown in FIG. 5, and the adjusted values are transmitted to the virtual controller, so that the virtual controller can follow the orientation parameter changes of the view controller in real time, control the position information of the third-person view virtual camera in real time, and then adjust the shooting direction of the third-person perspective to obtain a variety of images. The specific steps are as follows:
  • the view controller sends the first orientation parameter to the virtual controller.
  • the virtual controller receives the first orientation parameter, and adjusts the orientation information of the third person perspective virtual camera according to the first orientation parameter to control displacement and rotation of the third person perspective virtual camera in the VR application.
  • the user may also adjust the third-person view virtual camera without using the view controller, by directly dragging, touching, clicking, sliding, etc. on the VR application interface, which is more intuitive and flexible for the user.
  • the user can directly drag and drop the “camera icon” in the VR application interface as shown in FIG. 6c, and after dragging, the current viewing angle of the virtual camera of the third person perspective can be changed.
  • the photographed third person perspective picture becomes a scene as shown in Fig. 6d.
  • when the user interacts with the VR application and wants to take a third-person perspective image with a wider field of view, he can click the application icon "view controller" as shown in FIG. 6d, and select "Adjust the field of view" in the pull-down menu of the application icon "view controller", for example, increasing the field of view by 20°, so that the field of view of the third-person view virtual camera in the VR application is expanded; the application interface of the VR application after expansion is shown in FIG. 6e.
  • the viewing angle controller, the virtual controller, and the Steam VR platform are associated, and the three can cooperate to complete the adjustment function of the perspective orientation and the viewing angle position of the third person perspective.
  • the specific process is shown in FIG. 7a.
  • the Steam VR platform can load the DLL file that drives the virtual controller, and then load the virtual controller as a virtual handle, which is used to implement the function of the third-person view virtual camera.
  • in this way, when the user uses the virtual controller to take a third-person perspective picture, the virtual controller can be invoked to implement the function of the third-person view virtual camera, and the virtual controller then shoots the interaction scenario of the VR application where the user is located from a third-person perspective.
  • after the configurator is correctly configured, the Steam VR platform will load the virtual controller.
  • the Steam VR platform can implement loading of the virtual controller through the interfaces shown in FIG. 7b (including IServerTrackedDeviceProvider, IServerDriverHost, ITrackedDeviceServerDriver, and IVRControllerComponent).
  • the Steam VR platform can periodically call the RunFrame function in the server driver (ServerDriver) class to load the virtual controller, and the interface IServerDriverHost initializes the orientation parameters of the virtual controller. When RunFrame is called, the virtual controller also needs to report its current orientation parameters.
  • the orientation parameter of the virtual controller can be updated by ITrackedDevicePoseUpdated in the interface IServerDriverHost.
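  • The following abbreviated C++ sketch illustrates the RunFrame/pose-reporting flow just described, written against the modern openvr_driver.h API (where the host interface is IVRServerDriverHost and the call is TrackedDevicePoseUpdated; the patent's IServerDriverHost/ITrackedDevicePoseUpdated names correspond to older OpenVR versions). It is a hedged illustration, not the patented implementation, and several housekeeping details are reduced to comments.

      // Sketch of a virtual-controller OpenVR driver (illustrative only).
      #include <openvr_driver.h>

      class VirtualController : public vr::ITrackedDeviceServerDriver {
      public:
          vr::EVRInitError Activate(uint32_t objectId) override {
              objectId_ = objectId;
              return vr::VRInitError_None;
          }
          void Deactivate() override { objectId_ = vr::k_unTrackedDeviceIndexInvalid; }
          void EnterStandby() override {}
          void* GetComponent(const char*) override { return nullptr; }
          void DebugRequest(const char*, char* resp, uint32_t n) override {
              if (n) resp[0] = '\0';
          }

          vr::DriverPose_t GetPose() override {
              vr::DriverPose_t pose = {};
              pose.poseIsValid = true;
              pose.deviceIsConnected = true;
              pose.result = vr::TrackingResult_Running_OK;
              pose.qWorldFromDriverRotation.w = 1.0;
              pose.qDriverFromHeadRotation.w = 1.0;
              // Position/rotation would be read here from the shared memory
              // that the view controller writes (see the sketch below).
              return pose;
          }

          void ReportPose() {
              if (objectId_ != vr::k_unTrackedDeviceIndexInvalid)
                  vr::VRServerDriverHost()->TrackedDevicePoseUpdated(
                      objectId_, GetPose(), sizeof(vr::DriverPose_t));
          }

      private:
          uint32_t objectId_ = vr::k_unTrackedDeviceIndexInvalid;
      };

      class ServerDriver : public vr::IServerTrackedDeviceProvider {
      public:
          vr::EVRInitError Init(vr::IVRDriverContext* ctx) override {
              VR_INIT_SERVER_DRIVER_CONTEXT(ctx);
              // Register the simulated third handle with SteamVR.
              vr::VRServerDriverHost()->TrackedDeviceAdded(
                  "virtual_controller_01", vr::TrackedDeviceClass_Controller,
                  &controller_);
              return vr::VRInitError_None;
          }
          void Cleanup() override {}
          const char* const* GetInterfaceVersions() override {
              return vr::k_InterfaceVersions;
          }
          void RunFrame() override {
              // Called periodically by SteamVR: report the latest pose.
              controller_.ReportPose();
          }
          bool ShouldBlockStandbyMode() override { return false; }
          void EnterStandby() override {}
          void LeaveStandby() override {}

      private:
          VirtualController controller_;
      };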
  • the user may set orientation parameters for adjusting the virtual controller in the UI interface of the view controller and save them to the shared memory.
  • the shared memory mainly includes azimuth parameters (for example, represented by x, y, and z in a three-axis coordinate system) and rotation parameters (for example, represented by rx, ry, and rz in a three-axis coordinate system).
  • the virtual controller may request, from the shared memory, an orientation parameter input by the user at the view controller.
  • the virtual controller acquires a bearing parameter from the shared memory.
  • the virtual controller sends the acquired orientation parameter to the Steam VR platform.
  • the Steam VR platform receives the orientation parameter.
  • the Steam VR platform updates the current orientation parameter of the virtual controller according to the received orientation parameter.
  • the displacement adjustment or rotation operation of the third-person view virtual camera can be realized.
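  • A minimal sketch of that shared-memory exchange, assuming the x/y/z and rx/ry/rz fields described above; the struct layout and function names are assumptions for illustration, and a real implementation would place the block in a named shared-memory mapping (e.g., CreateFileMapping on Windows).

      // Hypothetical shared block written by the view controller and read
      // by the virtual controller inside RunFrame.
      #include <atomic>

      struct SharedOrientation {
          // azimuth (position) parameters in a three-axis coordinate system
          std::atomic<float> x{0}, y{0}, z{0};
          // rotation parameters in a three-axis coordinate system
          std::atomic<float> rx{0}, ry{0}, rz{0};
      };

      // View controller side: publish the user's latest adjustment.
      void publish(SharedOrientation& shm,
                   float x, float y, float z,
                   float rx, float ry, float rz) {
          shm.x = x;   shm.y = y;   shm.z = z;
          shm.rx = rx; shm.ry = ry; shm.rz = rz;
      }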
  • the user can also adjust the viewing angle and the viewing angle position of the third-person view virtual camera directly by using a virtual button, a keyboard or a mouse drag and drop in the VR application interface.
  • when launching third person perspective video shooting, the configurator also needs to determine whether the configuration for third person perspective shooting is complete and whether it is correct. For the specific judgment process, refer to the flow diagram shown in FIG. 8.
  • the configurator first determines whether the virtual controller exists in the specified directory of the Steam VR platform. If not, the module file is placed in the specified directory of Steam VR, and it is then determined whether the Steam VR configuration allows third-party device drivers; if it exists, it is determined directly whether the Steam VR configuration allows third-party device drivers.
  • if the Steam VR configuration does not allow third-party device drivers, the configurator sets up the Steam VR configuration, automatically restarts Steam VR, and loads the virtual controller in Steam VR. It then determines whether the VR application has been given a third person perspective configuration.
  • if the VR application has been given a third-person perspective configuration, it is further determined whether the two standard physical controllers are ready to shoot the VR application picture.
  • if the VR application is not configured with a third-person perspective configuration, the configurator sets the third-person perspective configuration and restarts the VR application, and then further determines whether the two standard physical controllers are ready.
  • if the two standard physical controllers are ready, an open command is sent to the virtual controller to turn it on, thereby enabling the third person perspective shooting function of the third person perspective virtual camera; if it is determined that the two standard physical controllers are not ready to shoot the VR application picture, the user is guided to turn on the two physical controllers, and after detecting that they are ready, the virtual controller is turned on, so that the virtual controller is recognized by the Steam VR platform, which enables third-person perspective shooting. A sketch of these checks follows.
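  • A hedged sketch of those start-up checks: the settings key "activateMultipleDrivers" is the usual way SteamVR is told to load third-party device drivers, but treat the key, the paths, and the "virtual_controller" directory name as assumptions rather than facts from the patent.

      // Illustrative configurator checks (C++17 std::filesystem).
      #include <filesystem>
      #include <fstream>
      #include <iterator>
      #include <string>

      namespace fs = std::filesystem;

      // Ensure the driver module file sits in SteamVR's driver directory.
      bool ensureDriverInstalled(const fs::path& steamVrRoot,
                                 const fs::path& driverDll) {
          const fs::path target = steamVrRoot / "drivers" / "virtual_controller"
                                  / "bin" / "win64" / driverDll.filename();
          if (!fs::exists(target)) {
              fs::create_directories(target.parent_path());
              fs::copy_file(driverDll, target);  // place the module file
          }
          return fs::exists(target);
      }

      // Naive check of steamvr.vrsettings; a real configurator would parse
      // the JSON instead of searching for substrings.
      bool thirdPartyDriversAllowed(const fs::path& settingsJson) {
          std::ifstream in(settingsJson);
          std::string text((std::istreambuf_iterator<char>(in)),
                           std::istreambuf_iterator<char>());
          return text.find("\"activateMultipleDrivers\"") != std::string::npos
              && text.find("true") != std::string::npos;
      }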
  • the view controller updates the orientation parameters in the shared memory and then sends the updated orientation parameters to the virtual controller, so that the virtual controller reports the updated orientation parameters to the Steam VR platform and the terminal device can shoot the third person perspective picture with the updated orientation parameters.
  • the update of the azimuth parameter in the embodiment of the present application may be triggered based on the user input, or may be triggered based on the change of the location information of the user during the interaction between the user and the VR application, which is not limited in this application.
  • the terminal device in the embodiment of the present invention is described above from the perspective of a modular functional entity.
  • the terminal device in the embodiment of the present invention is described below from the perspective of hardware processing.
  • the embodiment of the present invention further provides another terminal device.
  • the terminal device can be a PC, a tablet computer, a personal digital assistant (PDA), a point-of-sale terminal (POS), an in-vehicle computer, or another terminal device. The following takes a PC as an example:
  • FIG. 9 is a block diagram showing a partial structure of a PC related to a terminal device provided by an embodiment of the present invention.
  • the PC includes components such as a radio frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a sensor 950, an audio circuit 960, a wireless fidelity (WiFi) module 970, a processor 980, and a power supply 990.
  • the PC structure shown in FIG. 9 does not constitute a limitation on the PC, which may include more or fewer components than those illustrated, a combination of certain components, or a different arrangement of components.
  • the RF circuit 910 can be used to receive and transmit signals during the transmission or reception of information or during a call. Specifically, downlink information received from the base station is handed to the processor 980 for processing, and uplink data is sent to the base station.
  • the RF circuit 910 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
  • the RF circuit 910 can also communicate with the network and other devices via wireless communication.
  • the above wireless communication may use any communication standard or protocol, including but not limited to the global mobile communication system (English full name: Global System of Mobile communication, English abbreviation: GSM), general packet radio service (English full name: General Packet Radio Service, English Abbreviation: GPRS), code division multiple access (English full name: Code Division Multiple Access, English abbreviation: CDMA), wideband code division multiple access (English full name: Wideband Code Division Multiple Access, English abbreviation: WCDMA), long-term evolution (English full name : Long Term Evolution, English abbreviation: LTE), e-mail, short message service (English full name: Short Messaging Service, English abbreviation: SMS).
• The memory 920 can be used to store software programs and modules; the processor 980 executes the various functional applications and data processing of the PC by running the software programs and modules stored in the memory 920.
• The memory 920 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the PC (such as audio data and a phone book).
• In addition, the memory 920 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
• The input unit 930 can be configured to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the PC.
• Specifically, the input unit 930 may include a touch panel 931 and other input devices 932.
• The touch panel 931, also referred to as a touch screen, can collect touch operations of the user on or near it (such as operations performed by the user on or near the touch panel 931 with a finger, a stylus, or any suitable object or accessory), and drive the corresponding connecting apparatus according to a preset program.
• Optionally, the touch panel 931 may include two parts: a touch detection apparatus and a touch controller.
• The touch detection apparatus detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, and sends them to the processor 980, and can receive commands from the processor 980 and execute them.
• The touch panel 931 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
• In addition to the touch panel 931, the input unit 930 may also include other input devices 932.
• Specifically, the other input devices 932 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
• The display unit 940 can be used to display information input by the user or information provided to the user, as well as the various menus of the PC.
• The display unit 940 may include a display panel 941. Optionally, the display panel 941 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
• Further, the touch panel 931 can cover the display panel 941. When the touch panel 931 detects a touch operation on or near it, it transmits the operation to the processor 980 to determine the type of the touch event, and the processor 980 then provides a corresponding visual output on the display panel 941 according to the type of the touch event.
• Although in FIG. 9 the touch panel 931 and the display panel 941 are two independent components implementing the input and output functions of the PC, in some embodiments the touch panel 931 and the display panel 941 may be integrated to implement the input and output functions of the PC.
• The PC may also include at least one type of sensor 950, such as a light sensor, a motion sensor, and other sensors.
• Specifically, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 941 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 941 and/or the backlight when the PC is moved to the ear.
• As one type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize the PC's attitude (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition-related functions (such as a pedometer and tapping).
• As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors that the PC may also be configured with, details are not described herein again.
• The audio circuit 960, a speaker 961, and a microphone 962 can provide an audio interface between the user and the PC.
• The audio circuit 960 can transmit the electrical signal converted from the received audio data to the speaker 961, which converts it into a sound signal for output.
• On the other hand, the microphone 962 converts the collected sound signal into an electrical signal, which the audio circuit 960 receives and converts into audio data; after the audio data is processed by the processor 980, it is sent, for example, to another PC via the RF circuit 910, or output to the memory 920 for further processing.
• WiFi is a short-range wireless transmission technology.
• Through the WiFi module 970, the PC can help users send and receive e-mails, browse web pages, access streaming media, and the like; it provides users with wireless broadband Internet access.
• Although FIG. 9 shows the WiFi module 970, it can be understood that it is not an essential part of the PC and may be omitted as needed without changing the essence of the invention.
• The processor 980 is the control center of the PC. It connects all parts of the entire PC using various interfaces and lines, and performs the various functions of the PC and processes data by running or executing the software programs and/or modules stored in the memory 920 and invoking the data stored in the memory 920, thereby monitoring the PC as a whole.
• Optionally, the processor 980 may include one or more processing units. Preferably, the processor 980 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication.
• It can be understood that the modem processor may also not be integrated into the processor 980.
• The PC also includes a power supply 990 (such as a battery) that supplies power to the components. Preferably, the power supply can be logically connected to the processor 980 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
• Although not shown, the PC may further include a camera, a Bluetooth module, and the like; details are not described herein again.
• In the embodiments of the present invention, the processor 980 included in the PC also controls the execution of the operations performed by the terminal device described above.
• In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners.
• For example, the apparatus embodiments described above are merely illustrative.
• The division of the modules is only a logical function division; in actual implementation there may be other division manners, for example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not performed.
• In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, apparatuses, or modules, and may be in electrical, mechanical, or other forms.
• The modules described as separate components may or may not be physically separate, and the components displayed as modules may or may not be physical modules; that is, they may be located in one place or distributed to multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.
• In addition, the functional modules in the embodiments of this application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module.
• The integrated module may be implemented in the form of hardware or in the form of a software functional module.
• If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium.
• The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present invention are generated in whole or in part.
• The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus.
• The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another.
• The computer-readable storage medium may be any available medium that a computer can store, or a data storage device, such as a communication device or a data center, that integrates one or more available media.
• The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).


Abstract

This application provides a method for recording video based on a virtual reality application, a terminal device, and a storage medium. The method is used to record a third-person perspective picture of a virtual character in the virtual reality application and includes: configuring a third-person perspective, where the third-person perspective is the shooting perspective of a virtual controller configured as a third-person perspective virtual camera of the virtual reality application; obtaining position information of the virtual character in the virtual reality application; obtaining current orientation information of the third-person perspective; obtaining, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located; and recording, according to the scene data, the orientation information, and posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application. Unrestricted by physical hardware, the third-person perspective picture can be shot in all directions.

Description

Method for recording video based on a virtual reality application, terminal device, and storage medium
Technical Field
This application relates to the field of virtual reality technologies, and in particular, to a method for recording video based on a virtual reality application, a terminal device, and a storage medium.
Background
With the rapid growth of the personal computer (PC) virtual reality (VR) market and user base in recent years, users have an increasing demand for VR recording; in particular, users want to capture the scene of themselves playing a VR game. To capture this scene, a third-person perspective shooting manner is currently used to photograph the user's real posture; the final video is then synthesized by combining the captured footage of the user's real posture, the first-person perspective picture captured by the VR headset, and the game scene the user is in, within the VR game, at each posture.
However, current third-person perspective shooting can only rely on a physical controller, and the physical controller is unstable because it is operated by hand, which ultimately causes the synthesized video to shake and the video quality to be poor. In addition, third-person perspective shooting is also limited by the physical controller and cannot capture the desired picture over a larger range.
Summary
This application provides a method for recording video based on a virtual reality application, a terminal device, and a storage medium, which can solve the problem in the existing technology that third-person perspective shooting is limited by hardware.
One aspect of this application provides a method for recording video based on a virtual reality application, the method being used to record a third-person perspective picture of a virtual character in the virtual reality application, the virtual character being the role, in the virtual reality application, of an object that interacts with the virtual reality application. The method includes:
configuring a third-person perspective, where the third-person perspective is the shooting perspective of a virtual controller configured as a third-person perspective virtual camera of the virtual reality application;
obtaining position information of the virtual character in the virtual reality application;
obtaining current orientation information of the third-person perspective, where the current orientation information of the third-person perspective is the orientation information of the third-person perspective virtual camera in the virtual reality application;
obtaining, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located; and
recording, according to the scene data, the current orientation information of the third-person perspective, and posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application.
Another aspect of this application provides a terminal device, the terminal device being used to record a third-person perspective picture of a virtual character in the virtual reality application, the virtual character being the role, in the virtual reality application, of an object that interacts with the virtual reality application. The terminal device includes a configurator, an input/output unit, and a virtual controller.
The configurator is configured to configure a third-person perspective, where the third-person perspective is the shooting perspective of the virtual controller configured as a third-person perspective virtual camera of the virtual reality application.
The virtual controller is configured to obtain, through the input/output unit, position information of the virtual character in the virtual reality application, and obtain current orientation information of the third-person perspective, where the third-person perspective is the shooting perspective of the virtual controller, and the current orientation information of the third-person perspective is the orientation information of the third-person perspective virtual camera in the virtual reality application;
obtain, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located; and
record, according to the scene data, the current orientation information of the third-person perspective, and posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application.
Yet another aspect of this application provides a terminal device, including at least one connected processor, a memory, and an input/output unit, where the memory is configured to store program code, and the processor is configured to invoke the program code in the memory to perform the operations performed by the terminal device in the first aspect above.
Yet another aspect of this application provides a computer storage medium, including instructions that, when run on a computer, cause the computer to perform the operations performed by the terminal device in the first aspect above.
Yet another aspect of this application provides a computer program product containing instructions that, when run on a computer, cause the computer to perform the operations performed by the terminal device in the first aspect above.
Compared with the existing technology, in the solution provided by this application, the virtual controller used for shooting the third-person perspective picture first obtains the position information and the current orientation information of the third-person perspective, obtains the above scene data according to the position information and the current orientation information of the third-person perspective, and then shoots the third-person perspective picture according to the scene data, the current orientation information of the third-person perspective, and the posture data of the virtual character. It can be seen that with this solution, on the one hand, the third-person perspective picture can be shot through the virtual controller without an additional handheld controller; no extra manpower or equipment is needed, the shooting is not limited by physical hardware, and the third-person perspective picture can be shot in all directions.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of a user interacting with a VR application using a virtual reality system according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for recording video based on a virtual reality application according to an embodiment of the present invention;
FIG. 3a is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
FIG. 3b is another schematic structural diagram of a terminal device according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the interface of the automatic detection and configuration part of the configurator according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the interface of the preview shooting control part of the perspective controller according to an embodiment of the present invention;
FIG. 6a is another schematic flowchart of a method for recording video based on a virtual reality application according to an embodiment of the present invention;
FIG. 6b is a schematic diagram of an application interface of a VR application according to an embodiment of the present invention;
FIG. 6c is another schematic diagram of an application interface of the VR application according to an embodiment of the present invention;
FIG. 6d is another schematic diagram of an application interface of the VR application according to an embodiment of the present invention;
FIG. 6e is another schematic diagram of an application interface of the VR application according to an embodiment of the present invention;
FIG. 7a is a schematic flowchart of adjusting the perspective position and perspective direction of the third-person perspective according to an embodiment of the present invention;
FIG. 7b is a schematic diagram of the interfaces used by the Steam VR platform to load the virtual controller according to an embodiment of the present invention;
FIG. 8 is a schematic flowchart of starting the third-person perspective virtual camera to shoot a third-person picture according to an embodiment of the present invention;
FIG. 9 is another schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The terms "first", "second", and the like in the specification, claims, and accompanying drawings of this application are used to distinguish between similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way may be interchanged where appropriate, so that the embodiments described here can be implemented in an order other than what is illustrated or described here. In addition, the terms "include" and "have" and any variants of them are intended to cover non-exclusive inclusion; for example, a process, method, apparatus, product, or device that includes a series of steps or modules is not necessarily limited to the steps or modules clearly listed, but may include other steps or modules that are not clearly listed or that are inherent to the process, method, product, or device. The division of modules in this application is merely a logical division; there may be other division manners in actual implementation, for example, multiple modules may be combined or integrated into another apparatus, or some features may be ignored or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and indirect coupling or communication connection between modules may be electrical or in other similar forms; none of this is limited in this application. Moreover, modules or sub-modules described as separate components may or may not be physically separate, may or may not be physical modules, and may be distributed into multiple circuit modules; some or all of them may be selected according to actual needs to achieve the purposes of the solutions of this application.
This application provides a method for recording video based on a virtual reality application, a terminal device, and a storage medium, which are used in the field of virtual reality technologies. Detailed descriptions are given below.
As shown in FIG. 1, the entire virtual reality system includes a VR headset, a terminal device installed with VR resources (for example, VR games) and/or VR applications, software, apps, and the like, and two handheld controllers. In a preferred embodiment of this application, to make the configuration of the entire virtual reality system convenient and universal, the handheld controllers may be standard handheld controllers. The terminal device is not limited to a conventional personal computer or laptop; it may also be a similar electronic terminal, apparatus, or device capable of carrying and running VR resources, applications, software, and the like. The user wears the VR headset when participating in a VR application; the VR headset is a head-mounted display and is communicatively connected, through a human-computer interaction interface, to the computer on which the VR game is installed. Through the VR headset, the user can participate in the VR application from a first-person perspective and manipulate props in the VR application.
In the embodiments of this application, the VR application in the terminal device can run on the Steam VR platform. To break free from the limitations of physical hardware, an application plug-in can be loaded into the VR resources of the terminal device to implement the third-person perspective recording function. Specifically, the VR application can be configured to load the application plug-in so that the plug-in can simulate the third-person perspective virtual camera in the VR application. When the user wears the VR headset to play the VR application, the user can start, on the computer, the third-person perspective recording function of the third-person perspective virtual camera in the plug-in. In this way, while the user plays the VR application, the plug-in can record the posture and scene, in the VR application, of the virtual character that the user embodies.
A VR application is an interactive application that comprehensively uses a computer graphics system and various reality and control interface devices to provide an immersive feeling in an interactive three-dimensional environment generated on a computer. The computer-generated, interactive three-dimensional environment is called a virtual environment (VE). The VR application provides a human-machine interface that allows the user to command the VR apparatus on which the VR application is installed, and determines how the VR apparatus provides information to the user.
In the embodiments of this application, the virtual character is the character in the VR application that the user embodies when interacting with the VR application. The first-person perspective is the user perspective in which the user operating the VR application cannot see the virtual character they embody but can see the scene in which that character is located, giving the user an immersive experience. The third-person perspective is a perspective from which both the virtual character embodied by the user and the interactive scene in which that character is located can be seen.
In some implementations, the terminal device may include an input/output unit, a virtual controller, a perspective controller, and a configurator. The virtual controller, perspective controller, and configurator are part of the above application plug-in; all three are computer program products and are not limited by physical hardware.
The configurator can be used to detect and configure the Steam VR platform so that the Steam VR platform recognizes and loads the virtual controller, allowing the virtual controller to simulate a VR physical controller, which is used for adjusting the perspective position and perspective direction of the third-person perspective and for recording video from the third-person perspective in the VR application.
The configurator is also used to configure the VR application so that it accepts the virtual controller as the third-person perspective virtual camera. The Steam VR platform is an open VR technology platform based on the OpenVR standard.
The perspective controller provides a user interface (UI) through which the user can conveniently adjust the perspective direction and perspective position of the virtual controller in real time.
Referring to FIG. 2, the method for recording video in a virtual reality application in this application is described below by way of example. The method can be used to record a third-person perspective picture of a virtual character in the virtual reality application. A virtual reality application is an interactive application, for example a virtual reality game: by putting on a VR headset, the user can enter a realistic game scene and obtain interactive experiences such as the senses and touch in the game scene.
The virtual character is the role, in the virtual reality application, of the object that interacts with the virtual reality application. For example, the virtual character may correspond to a user participating in a VR game; when participating, the user embodies a character in the VR game. A VR game may include multiple game scenes, and each game scene may include one or more virtual objects, such as rooms, roads, trees, buildings, and vehicles; these game scenes may be called materials. The embodiments of this application mainly include:
201. The terminal device configures a third-person perspective.
The third-person perspective is the shooting perspective of the virtual controller configured as the third-person perspective virtual camera of the virtual reality application.
202. The terminal device receives a third-person perspective recording operation input by an object.
The object is the user who interacts with the virtual reality application; for example, the object may be the user in FIG. 1 who wears a VR headset and interacts with the VR application in the terminal device.
203. In response to the third-person perspective recording operation, the terminal device obtains position information of the virtual character in the virtual reality application and current orientation information of the third-person perspective.
The position information may be the position, in the VR game, of the character that the user embodies; for example, the position information may be a room or a road in the VR game.
The third-person perspective is the shooting perspective of the virtual controller configured as the third-person perspective virtual camera of the virtual reality application.
The current orientation information of the third-person perspective is the orientation information of the third-person perspective virtual camera currently in the virtual reality application. It may be the initially configured orientation information, or orientation information adjusted during the recording of the third-person perspective picture; if it is adjusted orientation information, what the third-person perspective virtual camera currently shoots is the third-person perspective picture shot with the adjusted orientation information.
It should be noted that this application does not limit the order in which the position information and the current orientation information of the third-person perspective are obtained. Before the terminal device responds to the third-person perspective recording operation, the terminal device also needs to configure the virtual reality application so that the virtual reality application uses the virtual controller as the third-person perspective virtual camera. For example, when the user is playing a VR game and clicks the third-person perspective recording operation on the PC, the PC enables the function of the virtual controller serving as the third-person perspective virtual camera, so that the third-person perspective virtual camera can shoot the third-person perspective picture of the user's virtual character playing the VR game; the third-person perspective picture includes the posture of the virtual character, thereby recording the scene of the user playing the VR game.
204. The terminal device obtains, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located.
The scene data may be the picture material currently presented by the virtual reality application; for example, the scene data may be a game scene in the VR game. The terminal device can determine, according to the position of the virtual character in the VR game, the game scene a in which the virtual character is currently located in the VR game.
205. The terminal device records, according to the scene data, the current orientation information of the third-person perspective, and the posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application.
The posture data of the virtual character is the posture data exhibited by the above user when interacting with the virtual reality application. For example, the posture data may be the data input to the terminal device when the user shakes the physical controller while playing the VR game; the posture data can be obtained through sensors, and the specific obtaining manner is not limited in this application. It should be noted that this application does not limit the order in which the position information, the current orientation information of the third-person perspective, and the posture data of the virtual character are obtained, as long as the posture data of the virtual character is obtained before the third-person perspective picture of the virtual character currently in the virtual reality application is recorded.
Specifically, the third-person perspective virtual camera shoots the third-person perspective picture of the virtual character currently in the virtual reality application. In actual operation, the scene data as shot with the given orientation information can be obtained from the scene data and the current orientation information of the third-person perspective; the scene data shot with that orientation information is then synthesized with the posture data to obtain the final third-person perspective picture.
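Read conceptually, step 205 renders the scene from the camera's orientation and then composites the character's posture into that render before the frame is written out. A toy sketch of that composition order, in which every type and helper is an illustrative stub rather than an API from the patent:

```cpp
// Toy outline of step 205; all types and helpers are illustrative stubs.
#include <cstdio>
#include <vector>

struct Pose { float x, y, z, rx, ry, rz; };          // camera orientation parameters
struct SceneData { const char* name; };               // stands in for the loaded scene
struct PostureData { float headPitchDeg; };           // stands in for tracked posture
struct Frame { std::vector<unsigned char> pixels; };  // stands in for a rendered image

Frame renderScene(const SceneData& scene, const Pose& cam, float fovDeg) {
    std::printf("render %s from (%.1f, %.1f, %.1f), fov=%.0f\n",
                scene.name, cam.x, cam.y, cam.z, fovDeg);
    return Frame{};
}
void drawAvatar(Frame&, const PostureData& p) {
    std::printf("composite avatar, head pitch %.1f\n", p.headPitchDeg);
}
void writeFrame(const Frame&) { std::printf("frame appended to recording\n"); }

int main() {
    SceneData scene{"game scene a"};
    Pose cam{0.0f, 1.5f, -2.0f, 0.0f, 30.0f, 0.0f};
    PostureData posture{12.5f};
    Frame f = renderScene(scene, cam, 90.0f); // scene as seen by the virtual camera
    drawAvatar(f, posture);                   // add the virtual character's posture
    writeFrame(f);                            // step 205: record the composed picture
    return 0;
}
```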
In some implementations, the terminal device may further be configured so that the orientation information of the third-person perspective virtual camera follows changes in the position of the virtual character. In this way, when the position information of the virtual character in the virtual reality application changes, the third-person perspective camera can follow the virtual character to track-shoot the virtual character in the virtual reality application, so that the above object can enjoy the interactive scene in which they are embodied in the virtual reality application, or the footage can be provided to other users for viewing.
Compared with the existing mechanism, in the solution provided by this application, the virtual controller used for shooting the third-person perspective picture first obtains the position information and the current orientation information of the third-person perspective, obtains the above scene data according to them, and then shoots the third-person perspective picture according to the scene data, the current orientation information of the third-person perspective, and the posture data of the virtual character. It can be seen that with this solution, on the one hand, the third-person perspective picture can be shot through the virtual controller without an additional handheld controller; no extra manpower or equipment is needed, the shooting is not limited by physical hardware, and the third-person perspective picture can be shot in all directions.
On the other hand, when shooting a VR game, there is no need for a photographer experienced in shooting mixed reality to continuously adjust the shooting orientation according to the game progress and the player's position during shooting; an ordinary player can complete the entire operation independently. There is also no need for the player to set up a green-screen environment and video synchronization, or to perform video overlay processing after shooting. It can thus be seen that the solution of this application can lower the threshold for players and shorten the overall time for obtaining the shot video.
Optionally, in some embodiments of the invention, after the recording of the third-person perspective picture of the virtual character currently in the virtual reality application, the method further includes:
receiving a first adjustment operation; and
adjusting the orientation information of the third-person perspective according to a first orientation parameter input by the first adjustment operation, to control the displacement and rotation of the third-person perspective virtual camera in the virtual reality application.
Optionally, in some embodiments of the invention, the method further includes:
receiving a second adjustment operation, where the second adjustment operation is an adjustment operation on the field of view (FOV) of the third-person perspective; and
adjusting the field of view of the virtual controller according to a first instruction input by the second adjustment operation, where the field of view is the perspective range within which the third-person perspective virtual camera shoots the third-person perspective picture. It can be seen that when this application adjusts the field of view, it is not limited by physical hardware; the field of view can be enlarged or reduced arbitrarily to obtain a suitable visual effect.
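The FOV value feeds directly into the virtual camera's projection. The standard symmetric perspective-projection construction (ordinary graphics math, not something specified by the patent) shows why enlarging the FOV widens the captured picture:

```cpp
// Standard symmetric perspective projection built from a vertical FOV.
#include <cmath>
#include <cstdio>

struct Mat4 { float m[16]; }; // column-major 4x4, zero-initialized below

Mat4 perspective(float fovYDeg, float aspect, float zNear, float zFar) {
    const float f = 1.0f / std::tan(fovYDeg * 3.14159265f / 360.0f); // cot(fov/2)
    Mat4 p{};
    p.m[0]  = f / aspect;
    p.m[5]  = f;
    p.m[10] = (zFar + zNear) / (zNear - zFar);
    p.m[11] = -1.0f;
    p.m[14] = (2.0f * zFar * zNear) / (zNear - zFar);
    return p;
}

int main() {
    // Widening the FOV from 90 to 110 degrees shrinks the focal term f,
    // so more of the scene projects into the same image: a wider shot.
    const float fovs[] = {90.0f, 110.0f};
    for (float fov : fovs) {
        Mat4 p = perspective(fov, 16.0f / 9.0f, 0.01f, 100.0f);
        std::printf("fov=%.0f  focal=%.3f\n", fov, p.m[5]);
    }
    return 0;
}
```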
Optionally, in some embodiments of the invention, the method further includes:
receiving a third adjustment operation; and
setting the orientation information of the third-person perspective to a fixed value according to the third adjustment operation, to suppress picture shake of the third-person perspective. With this fixed setting, static shooting from the third-person perspective can be achieved; for virtual reality applications that are unsuited to first-person perspective shooting, or where the shot footage would cause discomfort to viewers' eyes because of shaking, a static third-person perspective can be chosen for shooting.
Optionally, in some embodiments of the invention, the orientation information includes a perspective position and a perspective direction. The object corresponding to the virtual character may wear a virtual reality headset, and based on the perspective direction of the virtual reality headset, a first-person perspective picture can be shot.
The method further includes:
in response to a third-person perspective configuration operation, binding the perspective direction of the third-person perspective to the perspective direction of the virtual reality headset, so that the perspective direction of the third-person perspective follows changes in the perspective direction of the virtual reality headset, to obtain a first-person perspective picture, where the first-person perspective picture is the picture within the field of view shot by the third-person perspective virtual camera when the virtual character is at the position indicated by the position information. It can be seen that through this binding, this application enables the third-person perspective to serve as a first-person perspective.
Specifically, by binding the perspective direction of the third-person perspective to the perspective direction of the virtual reality headset, the perspective direction of the third-person perspective follows the changes of the headset's perspective direction and remains consistent with it. In this way, the picture shot by the virtual reality headset is the first-person perspective picture shot from the first-person perspective, so shooting from the first-person perspective can be achieved by controlling the shooting of the third-person perspective. Moreover, a broader view can be achieved by adjusting the FOV of the third-person perspective, wider than the picture shot by a headset with the same perspective direction. For example, a user wearing the virtual reality headset may see a 130° picture while the headset can only capture the 110° picture in front of the user's eyes; after the perspective direction of the third-person perspective is bound to that of the headset, a picture of 130° or more can be captured.
Optionally, in some embodiments of the invention, after the perspective direction of the third-person perspective is bound to that of the virtual reality headset, the headset may be shaken violently by changes in the posture of the user wearing it, which may cause the picture shot from the third-person perspective to shake correspondingly. To solve this problem, this application may additionally apply algorithmic processing to the rotation of the third-person perspective to mitigate the picture shake caused by headset shake. In some implementations, the perspective direction of the virtual reality headset includes a roll angle, and the roll angle can be set to a fixed value to control the roll action of the third-person perspective. A threshold range can be set for the fixed value; any roll angle within the threshold range can mitigate picture shake. For example, it can be set to 0 or to a small value; this application does not limit the specific value. When the roll angle of the virtual reality headset is a fixed value, the virtual controller can perform at least one of the following to adjust the perspective information of the third-person perspective:
moving left/right, moving forward/backward, rotating left/right, and rotating forward/backward. It can be seen that through this mechanism, the picture shot from the third-person perspective does not shake in the direction of the roll angle, reducing the dizziness caused to the user by picture shake in the roll direction.
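A minimal sketch of this binding-plus-roll-lock idea: each frame, the camera copies the headset's yaw and pitch but pins the roll to a fixed value (0 here). Euler angles are used for readability; a real driver would more likely work with quaternions, and all names are illustrative:

```cpp
// Illustrative roll-locked binding of the third-person view to the headset.
#include <cstdio>

struct Euler { float yawDeg, pitchDeg, rollDeg; };

// Follow the headset's viewing direction, but hold roll at a fixed value
// so headset wobble about the view axis never reaches the recording.
Euler boundCameraPose(const Euler& headset, float fixedRollDeg = 0.0f) {
    return Euler{headset.yawDeg, headset.pitchDeg, fixedRollDeg};
}

int main() {
    Euler headset{35.0f, -10.0f, 17.0f};  // shaken headset: 17 degrees of roll
    Euler cam = boundCameraPose(headset); // camera keeps yaw/pitch, drops roll
    std::printf("cam yaw=%.0f pitch=%.0f roll=%.0f\n",
                cam.yawDeg, cam.pitchDeg, cam.rollDeg);
    return 0;
}
```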
Optionally, in some embodiments of the invention, in addition to being controlled through the updates of the above orientation parameters, the third-person perspective can also be controlled manually, so that displacement and rotation can be adjusted quickly and autonomously according to the actual interactive scene of the VR application, enabling a self-directed shooting style. For example, the user can control the displacement or rotation of the third-person perspective in the entire virtual reality application by inputting operation instructions to the virtual controller. Specifically,
the above object inputs a fourth adjustment operation to the terminal device, and the terminal device may then respond to the fourth adjustment operation and adjust the orientation parameters of the third-person perspective according to it.
Optionally, in some embodiments of the invention, when the virtual controller adjusts the perspective direction of the third-person perspective, the adjustment boundary for adjusting the following items is infinite:
moving left/right, moving forward/backward, moving up/down, rotating left/right, rotating forward/backward, and rotating up/down.
It can be seen that by not limiting the adjustment boundary for the perspective direction of the third-person perspective, the adjustment is not bound by physical hardware and can be performed freely and without restriction, thereby achieving a shooting effect with an ultra-wide field of view similar to a wide-angle lens and providing users with diverse choices for VR application video shooting.
Referring to FIG. 3a, a terminal device provided by this application is described below by way of example. The terminal device may include an input/output unit, a configurator, and a virtual controller. The terminal device may also include a perspective controller, as shown in FIG. 3b. The configurator, virtual controller, and perspective controller may be independent or integrated; all three may be application plug-ins installed on the terminal device, which is not limited in this application. The terminal device is used to record a third-person perspective picture of a virtual character in the virtual reality application, the virtual character being the role, in the virtual reality application, of an object that interacts with the virtual reality application. The functions of the configurator, virtual controller, and perspective controller are described below:
The configurator is configured to configure the virtual reality application so that the virtual reality application uses the virtual controller as the third-person perspective virtual camera.
The configurator is also configured to configure the virtual controller as the third-person perspective virtual camera of the virtual reality application and to configure the third-person perspective, where the third-person perspective is the shooting perspective of the virtual controller configured as the third-person perspective virtual camera of the virtual reality application. Specifically, when the virtual reality application runs on the Steam VR platform, the configurator can configure, on the Steam VR platform, the loading of the virtual controller's driver.
The input/output unit is configured to receive a third-person perspective recording operation.
The virtual controller is configured to obtain position information of the virtual character in the virtual reality application and current orientation information of the third-person perspective, where the orientation information of the third-person perspective is the orientation information of the third-person perspective virtual camera in the virtual reality application.
The virtual controller is also configured to obtain, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located.
The virtual controller is also configured to record, according to the scene data, the current orientation information of the third-person perspective, and the posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application.
In some implementations, the orientation information of the third-person perspective virtual camera follows changes in the position of the virtual character.
Compared with the existing technology, in the solution provided by this application, the virtual controller used for shooting the third-person perspective picture first obtains the position information and the current orientation information of the third-person perspective, obtains the above scene data according to them, and then shoots the third-person perspective picture according to the scene data, the current orientation information of the third-person perspective, and the posture data of the virtual character. It can be seen that with this solution, on the one hand, the third-person perspective picture can be shot through the virtual controller without an additional handheld controller; no extra manpower or equipment is needed, the shooting is not limited by physical hardware, and the third-person perspective picture can be shot in all directions.
On the other hand, when shooting a VR game, there is no need for a photographer experienced in shooting mixed reality to continuously adjust the shooting orientation according to the game progress and the player's position; an ordinary player can complete the entire operation independently. There is also no need for the player to set up a green-screen environment and video synchronization, or to perform video overlay processing after shooting. It can thus be seen that the solution of this application can lower the threshold for players and shorten the overall time for obtaining the shot video.
Optionally, in some embodiments of the invention, after the virtual controller shoots the third-person perspective picture of the virtual character currently in the virtual reality application, the input/output unit is further configured to receive a first adjustment operation.
The virtual controller is further configured to adjust the orientation information of the third-person perspective according to the first orientation parameter input by the first adjustment operation received by the input/output unit, to control the displacement and rotation of the third-person perspective virtual camera in the virtual reality application. The first orientation parameter can be input to the virtual controller by the user through the perspective controller shown in FIG. 3b.
Optionally, in some embodiments of the invention, the input/output unit is further configured to receive a second adjustment operation;
the virtual controller is further configured to adjust the field of view of the virtual controller according to the first instruction input by the second adjustment operation received by the input/output unit, where the field of view is the perspective range within which the third-person perspective virtual camera shoots the third-person perspective picture. It can be seen that when this application adjusts the field of view, it is not limited by physical hardware; the field of view can be enlarged or reduced arbitrarily to obtain a suitable visual effect.
Optionally, in some embodiments of the invention, the virtual controller is further configured to:
receive a third adjustment operation through the input/output unit; and
set the orientation information of the third-person perspective to a fixed value.
Optionally, in some embodiments of the invention, the orientation information includes a perspective position and a perspective direction; based on the perspective direction of the virtual reality headset, the camera can shoot a first-person perspective picture. The configurator is further configured to:
receive a third-person perspective configuration operation through the input/output unit; and
bind the perspective direction of the third-person perspective to the perspective direction of the virtual reality headset, so that the perspective direction of the third-person perspective follows changes in the perspective direction of the virtual reality headset, to obtain a first-person perspective picture, where the first-person perspective picture is the picture within the field of view shot by the third-person perspective virtual camera when the virtual character is at the position indicated by the position information, thereby enabling the third-person perspective to serve as a first-person perspective.
Specifically, by binding the perspective direction of the third-person perspective to the perspective direction of the virtual reality headset, the perspective direction of the third-person perspective follows the changes of the headset's perspective direction and remains consistent with it. In this way, the picture shot by the virtual reality headset is the first-person perspective picture shot from the first-person perspective, so shooting from the first-person perspective can be achieved by controlling the shooting of the third-person perspective.
Optionally, in an embodiment of the present invention, after the perspective direction of the third-person perspective is bound to that of the virtual reality headset, the headset may be shaken violently by changes in the posture of the user wearing it, which may cause the picture shot from the third-person perspective to shake correspondingly. To solve this problem, this application may additionally apply algorithmic processing to the rotation of the third-person perspective to mitigate the picture shake caused by headset shake. In some implementations, the perspective direction of the virtual reality headset includes a roll angle, and the roll angle can be set to a fixed value to control the roll action of the third-person perspective. A threshold range can be set for the fixed value; any roll angle within the threshold range can mitigate picture shake. For example, it can be set to 0 or to a small value; this application does not limit the specific value. When the roll angle of the virtual reality headset is a fixed value, the virtual controller can perform at least one of the following to adjust the perspective information of the third-person perspective:
moving left/right, moving forward/backward, rotating left/right, and rotating forward/backward. It can be seen that through this mechanism, the picture shot from the third-person perspective does not shake in the direction of the roll angle, reducing the dizziness caused to the user by picture shake in the roll direction.
Optionally, in some embodiments of the invention, in addition to being controlled through the updates of the above orientation parameters, the third-person perspective can also be controlled manually, so that displacement and rotation can be adjusted quickly and autonomously according to the actual interactive scene of the VR application, enabling a self-directed shooting style. For example, the user can control the displacement or rotation of the third-person perspective in the entire virtual reality application by inputting operation instructions to the virtual controller. Specifically,
the above object inputs a fourth adjustment operation to the terminal device, and the terminal device may then respond to the fourth adjustment operation and adjust the orientation parameters of the third-person perspective according to it.
Optionally, in some embodiments of the invention, when the virtual controller adjusts the perspective direction of the third-person perspective, the adjustment boundary for adjusting the following items is infinite:
moving left/right, moving forward/backward, moving up/down, rotating left/right, rotating forward/backward, and rotating up/down.
It can be seen that by not limiting the adjustment boundary for the perspective direction of the third-person perspective, the adjustment is not bound by physical hardware and can be performed freely and without restriction, thereby achieving a shooting effect with an ultra-wide field of view similar to a wide-angle lens and providing users with diverse choices for VR application video shooting.
For ease of understanding, the following takes a virtual reality application implemented on the Steam VR platform as an example; the Steam VR platform is installed on the terminal device, for example in a VR client on a computer. This application does not limit the platform on which the virtual reality application can run. The configurator, virtual controller, and perspective controller in this application can be configured as application plug-ins in the virtual reality application. The functions of the configurator, virtual controller, and perspective controller are described below:
The above configurator is mainly used to configure the Steam VR platform and the VR application.
1. When configuring the Steam VR platform, the configuration of the Steam VR platform is modified to allow loading third-party device drivers, so that Steam VR recognizes and loads the virtual controller. In addition, according to the path requirements of the Steam VR platform, to load the virtual controller successfully, the driver module of the virtual controller also needs to be placed in a specified directory of the Steam VR platform, so that the virtual controller can be successfully invoked.
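For reference, an OpenVR driver module exposes a single factory entry point that the platform calls after discovering the DLL in its driver directory. A minimal sketch of that shape; the provider implementation is elided, the export keyword is MSVC-specific, and interface version strings vary across OpenVR releases:

```cpp
// Minimal shape of the DLL entry point SteamVR looks for in a driver module.
// Sketch only: the provider class body is omitted, so this illustrates the
// contract rather than a complete, linkable driver.
#include <openvr_driver.h>
#include <cstring>

class VirtualControllerProvider;      // implements vr::IServerTrackedDeviceProvider
extern VirtualControllerProvider g_provider; // defined elsewhere in the driver

extern "C" __declspec(dllexport)
void* HmdDriverFactory(const char* pInterfaceName, int* pReturnCode) {
    if (0 == std::strcmp(vr::IServerTrackedDeviceProvider_Version, pInterfaceName))
        return reinterpret_cast<void*>(&g_provider); // hand SteamVR our provider
    if (pReturnCode)
        *pReturnCode = vr::VRInitError_Init_InterfaceNotFound;
    return nullptr;
}
```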
2. When configuring the VR application, the VR application can be configured to accept the virtual controller as the virtual camera of the third-person perspective. Since the mainstream tool for developing VR applications is currently Unity, the Steam VR platform provides Unity developers with a development kit that includes a script for automatically adding a virtual camera. To invoke Unity successfully, a configuration file in the format required by Unity needs to be placed in the root directory of the VR application. Shooting a third-person perspective based on the Steam VR platform normally requires an additional third controller (used to shoot the third-person perspective picture), whose orientation serves as the orientation of the virtual camera in the VR application. The configurator in this application, however, can configure the VR application automatically without an additional third controller: through the configurator, the Steam VR platform treats the virtual controller as a virtual third controller, and the orientation of the virtual controller serves as the orientation of the virtual camera in the VR application. The virtual controller can thus perform the function of the third physical controller as a virtual camera, so the VR application meets the shooting requirements of the virtual controller.
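The SteamVR Unity plugin conventionally reads a plain-text external-camera configuration (commonly named externalcamera.cfg) from the application's root directory. A sketch that emits such a file; the field names follow that common convention, while the values are purely illustrative:

```cpp
// Writes a minimal external-camera config of the kind the SteamVR Unity
// plugin reads from the VR application's root directory. Field names follow
// the common externalcamera.cfg convention; the values are illustrative.
#include <fstream>

int main() {
    std::ofstream cfg("externalcamera.cfg");
    cfg << "x=0\n" << "y=0\n" << "z=0\n"    // camera position offset
        << "rx=0\n" << "ry=0\n" << "rz=0\n" // camera rotation offset
        << "fov=60\n"                        // third-person field of view
        << "near=0.01\n" << "far=100\n";     // clip planes
    return 0;
}
```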
3. The configurator is also used to configure the FOV of the virtual controller, that is, the FOV of the third-person perspective. The larger the FOV value, the larger the visual range of the third-person perspective and the larger the shooting range of the third-person perspective virtual camera. Thus, even though the FOV cannot be adjusted under the first-person perspective because of hardware limitations, under the third-person perspective there is no such hardware limitation: simply increasing the FOV as needed achieves the effect of shooting a wider view. The configurator can also configure the association between the virtual controller and the perspective controller, so that the perspective controller can provide the virtual controller with the orientation parameters used to control the third-person perspective virtual camera.
The above virtual controller can simulate a VR handheld controller and can also simulate the third-person perspective virtual camera in the VR application; the virtual controller can be used to move the position and rotate the direction of the third-person perspective. The virtual controller may be a dynamic link library (DLL) file that follows the OpenVR driver interface standard; the DLL file is located in the root directory of the Steam VR platform and is used to drive the third physical controller (that is, the virtual controller) simulated by the Steam VR platform.
In some implementations, the shooting function of the third-person perspective virtual camera can be configured in the automatic detection and configuration part of the configurator, and the control of the shooting of the third-person perspective virtual camera can be configured in the preview shooting control part of the perspective controller.
The automatic detection and configuration part of the configurator can refer to the interface shown in FIG. 4. The automatic detection and configuration part includes an area comparing the effects of a first-person perspective video GIF and a third-person perspective video GIF, which can intuitively show the user the difference between what the first-person and third-person perspectives shoot. The automatic detection and configuration part also includes an information prompt area to remind the user of actions that are about to be performed automatically or that need to be performed manually, such as moving left/right, moving up/down, and rotating up/down.
The preview shooting control part of the perspective controller can refer to the interface shown in FIG. 5; the left side of FIG. 5 is the preview window, and the right side is the control area. For example, the control area includes an orientation control area and a rotation control area. The orientation control area can mainly adjust three dimensions: up/down, left/right, and forward/backward. For example, displacement in the left/right, up/down, and forward/backward directions can be achieved by dragging sliders; this application does not limit the boundary of displacement changes.
The rotation control area can mainly adjust rotation in two dimensions: up/down and left/right. For example, dragging sliders can achieve 180° left/right rotation and 90° up/down rotation; this application does not limit the rotation angles.
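One plausible reading of the FIG. 5 controls, sketched below: the translation sliders accumulate unbounded offsets, while the rotation sliders map into the stated ±180° and ±90° ranges. The mapping itself is an assumption made for illustration:

```cpp
// Illustrative mapping from FIG. 5-style sliders to orientation parameters.
#include <algorithm>
#include <cstdio>

struct OrientationParams { float x, y, z, rx, ry, rz; };

// Translation sliders accumulate without any boundary; rotation sliders are
// normalized to [-1, 1] and scaled to +/-180 (yaw) and +/-90 (pitch) degrees.
void applySliders(OrientationParams& p, float dx, float dy, float dz,
                  float yawSlider, float pitchSlider) {
    p.x += dx; p.y += dy; p.z += dz;                    // no adjustment boundary
    p.ry = std::clamp(yawSlider, -1.0f, 1.0f) * 180.0f; // left/right rotation
    p.rx = std::clamp(pitchSlider, -1.0f, 1.0f) * 90.0f; // up/down rotation
}

int main() {
    OrientationParams p{};
    applySliders(p, 0.0f, 0.0f, -5.0f, 0.5f, -0.25f);
    std::printf("z=%.1f yaw=%.1f pitch=%.1f\n", p.z, p.ry, p.rx);
    return 0;
}
```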
The perspective controller can provide the user with a user interface (UI) through which the user can conveniently adjust, in real time, the orientation parameters of the virtual controller, that is, the orientation parameters of the third-person perspective virtual camera. For the specific appearance of the perspective controller, refer to the schematic diagram of the preview shooting control part shown in FIG. 5. The displacement of the third-person perspective virtual camera in three-dimensional space can be continuously adjusted up/down, left/right, and forward/backward in real time by the perspective controller, with no adjustable boundary, so that the user can adjust the orientation of the third-person perspective virtual camera at will. The user can adjust the orientation of the third-person perspective virtual camera through the preview shooting control interface shown in FIG. 5.
The perspective controller, virtual controller, and configurator can work together to complete the shooting of the third-person perspective picture and the adjustment of the perspective direction and perspective position of the third-person perspective; the specific flow is shown in FIG. 6a and includes:
601. The configurator configures the VR application, the virtual controller, and the perspective controller.
Specifically, the VR application is configured to accept the virtual controller as the third-person perspective virtual camera; the FOV of the virtual controller and the association between the virtual controller and the perspective controller can also be configured.
602. The configurator starts the third-person perspective recording function.
That is, the function of the virtual controller serving as the third-person perspective virtual camera is started.
603. The virtual controller records the third-person perspective picture from the third-person perspective.
It should be noted that in this step the user is interacting with the VR application through the terminal device. During the interaction, as shown in FIG. 6b, the user can at any time click the application icon "Virtual Controller" on the VR application interface of the terminal device and select "Start third-person perspective recording" from the icon's drop-down menu; this starts the third-person perspective virtual camera to record the third-person perspective picture from the third-person perspective.
In some implementations, when interacting with the VR application, if the user wants to shoot the third-person perspective picture from a new angle, the user can click the application icon "Perspective Controller" shown in FIG. 6b and select "Adjust orientation parameters" from its drop-down menu to enter the perspective controller panel shown in FIG. 5. The user can adjust the range parameters in the orientation control area or the rotation control area of the application interface shown in FIG. 5; the adjusted values are passed to the virtual controller, so that the virtual controller can follow the changes of the perspective controller's orientation parameters in real time to control the orientation information of the third-person perspective virtual camera, thereby adjusting the shooting orientation of the third-person perspective and obtaining diverse pictures. The specific steps are as follows:
604. The perspective controller sends a first orientation parameter to the virtual controller.
605. The virtual controller receives the first orientation parameter and adjusts the orientation information of the third-person perspective virtual camera according to the first orientation parameter, to control the displacement and rotation of the third-person perspective virtual camera in the VR application.
In some implementations, instead of adjusting the third-person perspective virtual camera through the perspective controller, the user can also drag, touch, click, or slide directly on the VR application interface, which allows the third-person perspective virtual camera to be adjusted more intuitively and flexibly. As shown in FIG. 6c, the user can drag the "camera icon" on the VR application interface shown in FIG. 6c with a finger; after dragging, the shooting perspective of the third-person perspective virtual camera changes, and the third-person perspective picture it shoots becomes the scene shown in FIG. 6d.
In some implementations, when interacting with the VR application, if the user wants to shoot the third-person perspective picture with a wider view, the user can click the application icon "Perspective Controller" shown in FIG. 6d and select "Adjust field of view" from its drop-down menu, for example enlarging the field of view by 20°, so that the view shot by the third-person perspective virtual camera in the VR application is widened; the widened application interface of the VR application is shown in FIG. 6e.
In some implementations, the perspective controller, the virtual controller, and the Steam VR platform are associated; the three can work together to complete the adjustment of the perspective direction and perspective position of the third-person perspective, as shown in FIG. 7a.
In FIG. 7a, after the configurator places the virtual controller in the specific directory of the Steam VR platform, the Steam VR platform can load the DLL file that drives the virtual controller, and then load the virtual controller as a virtual handle, which implements the function of the third-person perspective virtual camera. In this way, when the user shoots a third-person perspective picture with the virtual controller, the virtual controller can be invoked to implement the function of the third-person perspective virtual camera, and the virtual controller then shoots, from the third-person perspective, the interactive scene of the VR application the user is in.
Specifically, provided the configurator is configured correctly, the Steam VR platform will load the virtual controller. The Steam VR platform can load the virtual controller through the interfaces shown in FIG. 7b (including IServerTrackedDeviceProvider, IServerDriverHost, ITrackedDeviceServerDriver, and IVRControllerComponent). The Steam VR platform can periodically call the RunFrame function in the server driver (ServerDriver) class to load the virtual controller, and the IServerDriverHost interface initializes the orientation parameters of the virtual controller. When RunFrame is called, the virtual controller also needs to report its current orientation parameters. In addition, the update of the virtual controller's orientation parameters can be implemented through ITrackedDevicePoseUpdated in the IServerDriverHost interface. The adjustment of the orientation parameters of the third-person perspective virtual camera is described in detail below:
701. After the perspective controller receives the orientation adjustment instruction input by the user through the UI, the user can set, on the UI of the perspective controller, the orientation parameters used to adjust the virtual controller, which are saved to shared memory.
The shared memory mainly contains position parameters (for example, represented by x, y, and z in a three-axis coordinate system) and rotation parameters (for example, represented by rx, ry, and rz in a three-axis coordinate system).
702. The virtual controller can request, from the shared memory, the orientation parameters input by the user in the perspective controller.
703. The virtual controller obtains the orientation parameters from the shared memory.
704. The virtual controller sends the obtained orientation parameters to the Steam VR platform.
705. The Steam VR platform receives the orientation parameters.
706. The Steam VR platform updates the current orientation parameters of the virtual controller according to the received orientation parameters.
It can be seen that by synchronously updating the orientation parameters, the displacement adjustment or rotation operation of the third-person perspective virtual camera can be achieved. Optionally, the user can also adjust the perspective direction and perspective position of the third-person perspective virtual camera directly on the VR application interface through virtual keys, a keyboard, or mouse dragging.
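Putting steps 702 to 706 together on the driver side: on each RunFrame tick the virtual controller reads the shared block written in step 701 and reports the pose upward. The sketch below is simplified against the interface names cited above; exact OpenVR signatures vary by version, the pose reporting is reduced to a stub, and the shared-memory name matches the illustrative writer shown earlier:

```cpp
// Driver-side half of the 701-706 flow: poll shared memory on each RunFrame
// tick and report the pose to the Steam VR platform. Sketch only.
#include <windows.h>

struct OrientationParams { float x, y, z, rx, ry, rz; };

class VirtualController {
public:
    bool open() {
        mapping_ = OpenFileMappingA(FILE_MAP_READ, FALSE, "ThirdPersonCamPose");
        if (!mapping_) return false;
        view_ = static_cast<OrientationParams*>(
            MapViewOfFile(mapping_, FILE_MAP_READ, 0, 0, sizeof(OrientationParams)));
        return view_ != nullptr;
    }
    // Called periodically by the server driver's RunFrame (steps 702/703).
    void runFrame() {
        if (!view_) return;
        OrientationParams p = *view_; // step 703: fetch the latest parameters
        reportPose(p);                // steps 704-706: hand them to Steam VR
    }
private:
    // Stand-in for IServerDriverHost::ITrackedDevicePoseUpdated in the real API.
    void reportPose(const OrientationParams&) { /* update platform-side pose */ }
    HANDLE mapping_ = nullptr;
    OrientationParams* view_ = nullptr;
};

int main() {
    VirtualController vc;
    if (vc.open()) vc.runFrame(); // one polling tick, for illustration
    return 0;
}
```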
In some implementations, when starting third-person perspective video shooting, the configurator also needs to determine whether the configuration for this third-person perspective shooting has been completed, and whether the third-person perspective shooting configuration is correct. For the specific decision flow, refer to the schematic flowchart shown in FIG. 8.
In FIG. 8, the configurator first determines whether the virtual controller exists in the specified directory of the Steam VR platform. If it does not exist, the module file is placed into the specified directory in Steam VR, and it is then determined whether the Steam VR configuration allows third-party device drivers; if it exists, it is determined whether the Steam VR configuration allows third-party device drivers.
If the Steam VR configuration allows third-party device drivers, it is determined whether the VR application has been configured with the third-person perspective configuration.
If the Steam VR configuration does not allow third-party device drivers, the Steam VR configuration is set, Steam VR is automatically restarted, and the virtual controller is loaded in Steam VR. It is then determined whether the VR application has been configured with the third-person perspective configuration.
If the VR application has been configured with the third-person perspective configuration, it is further determined whether the two standard physical controllers are ready for shooting the picture of the VR application.
If the VR application has not been configured with the third-person perspective configuration, the third-person perspective configuration is set and the VR application is restarted; it is then further determined whether the two standard physical controllers are ready.
If it is determined that the two standard physical controllers are ready, an open command is sent to the virtual controller to turn it on, thereby enabling the third-person perspective shooting function of the third-person perspective virtual camera. If it is determined that the two standard physical controllers are not yet ready for shooting the picture of the VR application, the user is guided to turn on the two physical controllers first; after the two physical controllers are detected to be ready, the virtual controller is turned on so that it is recognized by the Steam VR platform, thereby enabling the third-person perspective shooting function.
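The FIG. 8 checks can be pictured as a short gating routine run before recording starts. In the sketch below, the paths, file names, and probe functions are entirely hypothetical placeholders:

```cpp
// Illustrative gating logic for the FIG. 8 startup checks. Paths, file
// names, and probe functions are hypothetical placeholders.
#include <filesystem>
#include <cstdio>

namespace fs = std::filesystem;

// Stub probes so the sketch compiles standalone.
bool thirdPartyDriversAllowed() { return true; } // reads Steam VR settings
bool appHasThirdPersonConfig()  { return true; } // checks the VR app's config
int  readyPhysicalControllers() { return 2; }    // counts powered-on controllers

bool readyToRecord(const fs::path& steamVrDriverDir) {
    if (!fs::exists(steamVrDriverDir / "virtual_controller.dll"))
        return false;                    // driver module not yet installed
    if (!thirdPartyDriversAllowed())
        return false;                    // Steam VR must be reconfigured and restarted
    if (!appHasThirdPersonConfig())
        return false;                    // VR app must be reconfigured and restarted
    return readyPhysicalControllers() >= 2; // both standard controllers on
}

int main() {
    std::printf(readyToRecord("drivers/virtual") ? "start recording\n"
                                                 : "guide the user first\n");
    return 0;
}
```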
After the virtual controller is turned on, shooting of the third-person perspective picture can begin. While the user interacts with the VR application, after the user adjusts the orientation parameters of the third-person perspective virtual camera through the UI, the perspective controller updates the orientation parameters in the shared memory and then sends the updated parameters to the virtual controller; after receiving the updated orientation parameters from the perspective controller, the virtual controller updates them to the Steam VR platform, so that the terminal device can shoot the third-person perspective picture with the updated orientation parameters. It should be noted that the update of the orientation parameters in the embodiments of this application may be triggered by user input, or by changes in the user's position information during interaction with the VR application; this is not limited in this application.
The terminal device in the embodiments of the present invention is described above from the perspective of modular functional entities; the terminal device in the embodiments of the present invention is described below from the perspective of hardware processing.
An embodiment of the present invention further provides another terminal device, as shown in FIG. 9. For ease of description, only the parts related to the embodiments of the present invention are shown; for specific technical details not disclosed, refer to the method part of the embodiments of the present invention. The terminal device may be any terminal device such as a PC, a tablet computer, a personal digital assistant (PDA), a point-of-sale (POS) terminal, or an in-vehicle computer. The terminal device being a PC is taken as an example:
FIG. 9 is a block diagram of a partial structure of the PC related to the terminal device provided by an embodiment of the present invention. Referring to FIG. 9, the PC includes components such as a radio frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a sensor 950, an audio circuit 960, a wireless fidelity (WiFi) module 970, a processor 980, and a power supply 990. Those skilled in the art can understand that the PC structure shown in FIG. 9 does not constitute a limitation on the PC; it may include more or fewer components than illustrated, combine certain components, or have a different component arrangement.
The components of the PC are introduced below in detail with reference to FIG. 9:
The RF circuit 910 can be used to receive and send signals during information transmission and reception or during a call. In particular, after receiving downlink information from a base station, the RF circuit delivers it to the processor 980 for processing; in addition, it sends uplink data to the base station. Generally, the RF circuit 910 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 910 can also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, and Short Messaging Service (SMS).
The memory 920 can be used to store software programs and modules; the processor 980 executes the various functional applications and data processing of the PC by running the software programs and modules stored in the memory 920. The memory 920 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the PC (such as audio data and a phone book). In addition, the memory 920 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The input unit 930 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the PC. Specifically, the input unit 930 may include a touch panel 931 and other input devices 932. The touch panel 931, also called a touch screen, can collect touch operations of the user on or near it (such as operations performed by the user on or near the touch panel 931 with a finger, a stylus, or any suitable object or accessory) and drive the corresponding connecting apparatus according to a preset program. Optionally, the touch panel 931 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 980, and can receive commands from the processor 980 and execute them. In addition, the touch panel 931 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 931, the input unit 930 may also include other input devices 932. Specifically, the other input devices 932 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 940 can be used to display information input by the user or information provided to the user, as well as the various menus of the PC. The display unit 940 may include a display panel 941. Optionally, the display panel 941 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 931 can cover the display panel 941. When the touch panel 931 detects a touch operation on or near it, it transmits the operation to the processor 980 to determine the type of the touch event, and the processor 980 then provides a corresponding visual output on the display panel 941 according to the type of the touch event. Although in FIG. 9 the touch panel 931 and the display panel 941 are two independent components implementing the input and output functions of the PC, in some embodiments the touch panel 931 and the display panel 941 may be integrated to implement the input and output functions of the PC.
The PC may also include at least one type of sensor 950, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 941 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 941 and/or the backlight when the PC is moved to the ear. As one type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize the PC's attitude (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition-related functions (such as a pedometer and tapping). As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors that the PC may also be configured with, details are not described herein again.
The audio circuit 960, a speaker 961, and a microphone 962 can provide an audio interface between the user and the PC. The audio circuit 960 can transmit the electrical signal converted from the received audio data to the speaker 961, which converts it into a sound signal for output. On the other hand, the microphone 962 converts the collected sound signal into an electrical signal, which the audio circuit 960 receives and converts into audio data; after the audio data is processed by the processor 980, it is sent, for example, to another PC via the RF circuit 910, or output to the memory 920 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 970, the PC can help users send and receive e-mails, browse web pages, access streaming media, and the like; it provides users with wireless broadband Internet access. Although FIG. 9 shows the WiFi module 970, it can be understood that it is not an essential part of the PC and may be omitted as needed without changing the essence of the invention.
The processor 980 is the control center of the PC. It connects all parts of the entire PC using various interfaces and lines, and performs the various functions of the PC and processes data by running or executing the software programs and/or modules stored in the memory 920 and invoking the data stored in the memory 920, thereby monitoring the PC as a whole. Optionally, the processor 980 may include one or more processing units. Preferably, the processor 980 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 980.
The PC also includes a power supply 990 (such as a battery) that supplies power to the components. Preferably, the power supply can be logically connected to the processor 980 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
Although not shown, the PC may further include a camera, a Bluetooth module, and the like; details are not described herein again.
In the embodiments of the present invention, the processor 980 included in the PC also controls the execution of the operations performed by the terminal device described above.
In the above embodiments, the description of each embodiment has its own focus; for parts not detailed in one embodiment, refer to the related descriptions of other embodiments.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and modules described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not described herein again.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the modules is only a logical function division, and there may be other division manners in actual implementation, for example, multiple modules or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, apparatuses, or modules, and may be in electrical, mechanical, or other forms.
The modules described as separate components may or may not be physically separate, and the components displayed as modules may or may not be physical modules; that is, they may be located in one place or distributed to multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solutions of the embodiments.
In addition, the functional modules in the embodiments of this application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented by software, they may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present invention are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another. The computer-readable storage medium may be any available medium that a computer can store, or a data storage device, such as a communication device or a data center, that integrates one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
The technical solutions provided by this application are introduced above in detail. Specific examples are used in this application to explain the principles and implementations of this application; the descriptions of the above embodiments are only intended to help understand the method and core idea of this application. At the same time, those of ordinary skill in the art may make changes to the specific implementations and application scope according to the idea of this application. In summary, the content of this specification should not be construed as a limitation on this application.

Claims (17)

  1. A method for recording video based on a virtual reality application, wherein the method is used to record a third-person perspective picture of a virtual character in the virtual reality application, the virtual character being the role, in the virtual reality application, of an object that interacts with the virtual reality application, and the method comprises:
    configuring a third-person perspective, wherein the third-person perspective is the shooting perspective of a virtual controller configured as a third-person perspective virtual camera of the virtual reality application;
    obtaining position information of the virtual character in the virtual reality application;
    obtaining current orientation information of the third-person perspective, wherein the current orientation information of the third-person perspective is the orientation information of the third-person perspective virtual camera in the virtual reality application;
    obtaining, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located; and
    recording, according to the scene data, the current orientation information of the third-person perspective, and posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application.
  2. The method according to claim 1, wherein before responding to a third-person perspective recording operation, the method further comprises:
    configuring the virtual reality application so that the virtual reality application uses the virtual controller as the third-person perspective virtual camera.
  3. The method according to claim 2, wherein the orientation information of the third-person perspective virtual camera follows changes in the position of the virtual character.
  4. The method according to any one of claims 1 to 3, wherein after the recording of the third-person perspective picture of the virtual character currently in the virtual reality application, the method further comprises:
    receiving a first adjustment operation; and
    adjusting the orientation information of the third-person perspective according to a first orientation parameter input by the first adjustment operation, to control displacement and rotation of the third-person perspective virtual camera in the virtual reality application.
  5. The method according to any one of claims 1 to 3, wherein the method further comprises:
    receiving a second adjustment operation; and
    adjusting the field of view of the virtual controller according to a first instruction input by the second adjustment operation, wherein the field of view is the perspective range within which the third-person perspective virtual camera shoots the third-person perspective picture.
  6. The method according to claim 5, wherein the method further comprises:
    receiving a third adjustment operation; and
    setting the orientation information of the third-person perspective to a fixed value according to the third adjustment operation.
  7. The method according to claim 4, wherein the orientation information comprises a perspective position and a perspective direction, and the method further comprises:
    receiving a third-person perspective configuration operation; and
    binding, according to the third-person perspective configuration operation, the perspective direction of the third-person perspective to the perspective direction of a virtual reality headset, so that the perspective direction of the third-person perspective follows changes in the perspective direction of the virtual reality headset, to obtain a first-person perspective picture, wherein the first-person perspective picture is the picture within the field of view shot by the third-person perspective virtual camera when the virtual character is at the position indicated by the position information.
  8. A terminal device, wherein the terminal device is used to record a third-person perspective picture of a virtual character in the virtual reality application, the virtual character being the role, in the virtual reality application, of an object that interacts with the virtual reality application, and the terminal device comprises a configurator, an input/output unit, and a virtual controller;
    the configurator is configured to configure a third-person perspective, wherein the third-person perspective is the shooting perspective of the virtual controller configured as a third-person perspective virtual camera of the virtual reality application;
    the virtual controller is configured to obtain, through the input/output unit, position information of the virtual character in the virtual reality application, and obtain current orientation information of the third-person perspective, wherein the current orientation information of the third-person perspective is the orientation information of the third-person perspective virtual camera in the virtual reality application;
    obtain, according to the position information and the current orientation information of the third-person perspective, scene data corresponding to the virtual reality application where the virtual character is currently located; and
    record, according to the scene data, the current orientation information of the third-person perspective, and posture data of the virtual character, the third-person perspective picture of the virtual character currently in the virtual reality application.
  9. The terminal device according to claim 8, wherein the configurator is further configured to:
    configure the virtual reality application to use the virtual controller as the third-person perspective virtual camera.
  10. The terminal device according to claim 9, wherein the orientation information of the third-person perspective virtual camera follows changes in the position of the virtual character.
  11. The terminal device according to any one of claims 8 to 10, wherein the input/output unit is further configured to receive a first adjustment operation after the virtual controller records the third-person perspective picture of the virtual character currently in the virtual reality application;
    and the virtual controller is further configured to adjust the orientation information of the third-person perspective according to the first orientation parameter input by the first adjustment operation received by the input/output unit, to control displacement and rotation of the third-person perspective virtual camera in the virtual reality application.
  12. The terminal device according to any one of claims 8 to 10, wherein the input/output unit is further configured to receive a second adjustment operation;
    and the virtual controller is further configured to adjust the field of view of the virtual controller according to the first instruction input by the second adjustment operation received by the input/output unit, wherein the field of view is the perspective range within which the third-person perspective virtual camera shoots the third-person perspective picture.
  13. The terminal device according to claim 12, wherein the virtual controller is further configured to:
    receive a third adjustment operation through the input/output unit; and
    set the current orientation information of the third-person perspective to a fixed value.
  14. The terminal device according to claim 11, wherein the orientation information comprises a perspective position and a perspective direction, and the configurator is further configured to:
    receive a third-person perspective configuration operation through the input/output unit; and
    bind the perspective direction of the third-person perspective to the perspective direction of the virtual reality headset, so that the perspective direction of the third-person perspective follows changes in the perspective direction of the virtual reality headset, to obtain a first-person perspective picture, wherein the first-person perspective picture is the picture within the field of view shot by the third-person perspective virtual camera when the virtual character is at the position indicated by the position information.
  15. A terminal device, wherein the terminal device comprises:
    at least one processor, a memory, and an input/output unit;
    wherein the memory is configured to store program code, and the processor is configured to invoke the program code stored in the memory to perform the method according to any one of claims 1 to 7.
  16. A computer storage medium, comprising instructions that, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 7.
  17. A computer program product containing instructions that, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 7.
PCT/CN2017/098865 2017-08-24 2017-08-24 Method for recording video based on virtual reality application, terminal device, and storage medium WO2019037040A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/CN2017/098865 WO2019037040A1 (zh) 2017-08-24 2017-08-24 Method for recording video based on virtual reality application, terminal device, and storage medium
CN201780054182.2A CN109952757B (zh) 2017-08-24 2017-08-24 Method for recording video based on virtual reality application, terminal device, and storage medium
EP17922767.3A EP3675488B1 (en) 2017-08-24 2017-08-24 Method for recording video on the basis of a virtual reality application, terminal device, and storage medium
US16/588,506 US11000766B2 (en) 2017-08-24 2019-09-30 Video recording method based on virtual reality application, terminal device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/098865 WO2019037040A1 (zh) 2017-08-24 2017-08-24 Method for recording video based on virtual reality application, terminal device, and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/588,506 Continuation US11000766B2 (en) 2017-08-24 2019-09-30 Video recording method based on virtual reality application, terminal device, and storage medium

Publications (1)

Publication Number Publication Date
WO2019037040A1 true WO2019037040A1 (zh) 2019-02-28

Family

ID=65439683

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/098865 WO2019037040A1 (zh) 2017-08-24 2017-08-24 Method for recording video based on virtual reality application, terminal device, and storage medium

Country Status (4)

Country Link
US (1) US11000766B2 (zh)
EP (1) EP3675488B1 (zh)
CN (1) CN109952757B (zh)
WO (1) WO2019037040A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111131735A (zh) * 2019-12-31 2020-05-08 歌尔股份有限公司 Video recording method, video playback method, apparatus, and computer storage medium
CN112527112A (zh) * 2020-12-08 2021-03-19 中国空气动力研究与发展中心计算空气动力研究所 Multi-channel immersive flow-field visualization human-computer interaction method
CN112887695A (zh) * 2021-01-07 2021-06-01 深圳市大富网络技术有限公司 Panorama sharing processing method, system, and terminal
CN114415840A (zh) * 2022-03-30 2022-04-29 北京华建云鼎科技股份公司 Virtual reality interaction system

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110460794A (zh) * 2019-09-09 2019-11-15 北京西山居互动娱乐科技有限公司 Video recording method and apparatus
CN113518189B (zh) * 2020-04-09 2022-10-18 华为技术有限公司 Photographing method and system, electronic device, and storage medium
CN111586304B (zh) * 2020-05-25 2021-09-14 重庆忽米网络科技有限公司 Panoramic camera system and method based on 5G and VR technologies
CN112090071B (zh) * 2020-09-18 2022-02-11 腾讯科技(深圳)有限公司 Virtual environment loading method and apparatus, electronic device, and computer storage medium
CN112214910A (zh) * 2020-10-30 2021-01-12 株洲中车时代电气股份有限公司 Console verification method and apparatus, electronic device, and storage medium
CN112619166A (zh) * 2020-12-21 2021-04-09 网易(杭州)网络有限公司 Game screen recording method and apparatus, electronic device, and storage medium
CN112843732B (zh) * 2020-12-31 2023-01-13 上海米哈游天命科技有限公司 Image capturing method and apparatus, electronic device, and storage medium
CN112905000A (zh) * 2020-12-31 2021-06-04 国网河北省电力有限公司雄安新区供电公司 Station building VR demonstration method, computer-readable storage medium, and station building VR demonstration system
CN112843692B (zh) * 2020-12-31 2023-04-18 上海米哈游天命科技有限公司 Image capturing method and apparatus, electronic device, and storage medium
CN114286142B (zh) * 2021-01-18 2023-03-28 海信视像科技股份有限公司 Virtual reality device and VR scene screenshot method
CN112817453A (zh) * 2021-01-29 2021-05-18 聚好看科技股份有限公司 Virtual reality device and gaze-following method for objects in a virtual reality scene
US11868526B2 (en) 2021-05-03 2024-01-09 Apple Inc. Method and device for debugging program execution and content playback
CN113325951B (zh) * 2021-05-27 2024-03-29 百度在线网络技术(北京)有限公司 Virtual-character-based operation control method and apparatus, device, and storage medium
CN113538640A (zh) * 2021-07-08 2021-10-22 潘宁馨 Cartoon production method
CN113648649B (zh) * 2021-08-23 2024-06-07 网易(杭州)网络有限公司 Game interface control method and apparatus, computer-readable medium, and terminal device
CN114237396B (zh) * 2021-12-15 2023-08-15 北京字跳网络技术有限公司 Action adjustment method and apparatus, electronic device, and readable storage medium
CN116828131A (zh) * 2022-03-17 2023-09-29 北京字跳网络技术有限公司 Virtual-reality-based photographing processing method and apparatus, and electronic device
CN115050228B (zh) * 2022-06-15 2023-09-22 北京新唐思创教育科技有限公司 Material collection method and apparatus, and electronic device
CN115665461B (zh) * 2022-10-13 2024-03-22 聚好看科技股份有限公司 Video recording method and virtual reality device
CN115639976B (zh) * 2022-10-28 2024-01-30 深圳市数聚能源科技有限公司 Multi-mode multi-angle synchronous display method and system for virtual reality content
CN117119294A (zh) * 2023-08-24 2023-11-24 腾讯科技(深圳)有限公司 Virtual scene photographing method, apparatus, device, medium, and program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130069804A1 (en) * 2010-04-05 2013-03-21 Samsung Electronics Co., Ltd. Apparatus and method for processing virtual world
CN103942384A (zh) * 2014-04-17 2014-07-23 北京航空航天大学 Real-time stereoscopic visualization method for dynamic aircraft assembly scenes based on a head-mounted display
CN105915766A (zh) * 2016-06-07 2016-08-31 腾讯科技(深圳)有限公司 Virtual-reality-based control method and apparatus
CN106131530A (zh) * 2016-08-26 2016-11-16 万象三维视觉科技(北京)有限公司 Naked-eye 3D virtual reality display system and display method thereof
CN106412555A (zh) * 2016-10-18 2017-02-15 网易(杭州)网络有限公司 Game recording method and apparatus, and virtual reality device
CN106843456A (zh) * 2016-08-16 2017-06-13 深圳超多维光电子有限公司 Display method and apparatus based on posture tracking, and virtual reality device
CN106980369A (zh) * 2017-03-01 2017-07-25 广州市英途信息技术有限公司 System and method for synthesizing and outputting third-person-perspective video of a virtual reality project

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247481A (zh) * 2007-02-16 2008-08-20 李西峙 System and method for producing and playing real-time three-dimensional movies/games based on role playing
JP2008259154A (ja) * 2007-04-06 2008-10-23 Kubo Tex Corp Method for real-time status grasping and control of a sensing device using a three-dimensional virtual space
US8854298B2 (en) * 2010-10-12 2014-10-07 Sony Computer Entertainment Inc. System for enabling a handheld device to capture video of an interactive application
US9861882B2 (en) * 2014-09-05 2018-01-09 Trigger Global Inc. Augmented reality gaming systems and methods
JP2018524134A (ja) * 2015-06-14 2018-08-30 株式会社ソニー・インタラクティブエンタテインメント Expanded field-of-view re-rendering for VR spectating
US10724874B2 (en) * 2015-10-13 2020-07-28 Here Global B.V. Virtual reality environment responsive to predictive route navigation
US9573062B1 (en) * 2015-12-06 2017-02-21 Silver VR Technologies, Inc. Methods and systems for virtual reality streaming and replay of computer video games
US20170249785A1 (en) * 2016-02-29 2017-08-31 Vreal Inc Virtual reality session capture and replay systems and methods
US11163358B2 (en) * 2016-03-17 2021-11-02 Sony Interactive Entertainment Inc. Spectating virtual (VR) environments associated with VR user interactivity
US11181990B2 (en) * 2016-03-18 2021-11-23 Sony Interactive Entertainment Inc. Spectator view tracking of virtual reality (VR) user in VR environments
US10112111B2 (en) * 2016-03-18 2018-10-30 Sony Interactive Entertainment Inc. Spectator view perspectives in VR environments
US10245507B2 (en) * 2016-06-13 2019-04-02 Sony Interactive Entertainment Inc. Spectator management at view locations in virtual reality environments
CN205987196U (zh) * 2016-08-26 2017-02-22 万象三维视觉科技(北京)有限公司 Naked-eye 3D virtual reality display system
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US11071915B2 (en) * 2016-09-30 2021-07-27 Sony Interactive Entertainment Inc. Delivery of spectator feedback content to virtual reality environments provided by head mounted display
US10688396B2 (en) * 2017-04-28 2020-06-23 Sony Interactive Entertainment Inc. Second screen virtual window into VR environment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130069804A1 (en) * 2010-04-05 2013-03-21 Samsung Electronics Co., Ltd. Apparatus and method for processing virtual world
CN103942384A (zh) * 2014-04-17 2014-07-23 北京航空航天大学 Real-time stereoscopic visualization method for dynamic aircraft assembly scenes based on a head-mounted display
CN105915766A (zh) * 2016-06-07 2016-08-31 腾讯科技(深圳)有限公司 Virtual-reality-based control method and apparatus
CN106843456A (zh) * 2016-08-16 2017-06-13 深圳超多维光电子有限公司 Display method and apparatus based on posture tracking, and virtual reality device
CN106131530A (zh) * 2016-08-26 2016-11-16 万象三维视觉科技(北京)有限公司 Naked-eye 3D virtual reality display system and display method thereof
CN106412555A (zh) * 2016-10-18 2017-02-15 网易(杭州)网络有限公司 Game recording method and apparatus, and virtual reality device
CN106980369A (zh) * 2017-03-01 2017-07-25 广州市英途信息技术有限公司 System and method for synthesizing and outputting third-person-perspective video of a virtual reality project

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3675488A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111131735A (zh) * 2019-12-31 2020-05-08 歌尔股份有限公司 Video recording method, video playback method, apparatus, and computer storage medium
CN112527112A (zh) * 2020-12-08 2021-03-19 中国空气动力研究与发展中心计算空气动力研究所 Multi-channel immersive flow-field visualization human-computer interaction method
CN112887695A (zh) * 2021-01-07 2021-06-01 深圳市大富网络技术有限公司 Panorama sharing processing method, system, and terminal
CN114415840A (zh) * 2022-03-30 2022-04-29 北京华建云鼎科技股份公司 Virtual reality interaction system

Also Published As

Publication number Publication date
EP3675488A4 (en) 2021-03-17
EP3675488A1 (en) 2020-07-01
CN109952757A (zh) 2019-06-28
US11000766B2 (en) 2021-05-11
EP3675488B1 (en) 2024-02-28
CN109952757B (zh) 2020-06-05
US20200023276A1 (en) 2020-01-23

Similar Documents

Publication Publication Date Title
US11000766B2 (en) Video recording method based on virtual reality application, terminal device, and storage medium
US11151773B2 (en) Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium
CN110636324B (zh) Interface display method and apparatus, computer device, and storage medium
WO2019105438A1 (zh) Video special-effect adding method and apparatus, and smart mobile terminal
KR102227087B1 (ko) Glass-type terminal and control method thereof
WO2021104236A1 (zh) Method for sharing photographing parameters and electronic device
WO2018161426A1 (zh) Photographing method and terminal
WO2019149049A1 (zh) Service processing method, terminal, server, and related products
CN109729411B (zh) Live-streaming interaction method and apparatus
WO2020186988A1 (zh) Information display method and apparatus, terminal, and storage medium
WO2016173427A1 (zh) Method and apparatus for implementing an afterimage effect, and computer-readable medium
US11954200B2 (en) Control information processing method and apparatus, electronic device, and storage medium
CN108694073B (zh) Virtual scene control method and apparatus, device, and storage medium
WO2019196929A1 (zh) Video data processing method and mobile terminal
WO2021136266A1 (zh) Virtual picture synchronization method and wearable device
WO2021197121A1 (zh) Image photographing method and electronic device
CN109151546A (zh) Video processing method, terminal, and computer-readable storage medium
CN112612439B (zh) Bullet-screen comment display method and apparatus, electronic device, and storage medium
WO2019105441A1 (zh) Video recording method and video recording terminal
CN110769313B (zh) Video processing method and apparatus, and storage medium
CN112565911B (zh) Bullet-screen display method, bullet-screen generation method, apparatus, device, and storage medium
WO2022095465A1 (zh) Information display method and apparatus
CN109710151B (zh) File processing method and terminal device
WO2021136329A1 (zh) Video editing method and head-mounted device
CN109068055A (zh) Composition method, terminal, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17922767

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017922767

Country of ref document: EP

Effective date: 20200324