WO2018103635A1 - Climbing operation processing method and apparatus in a VR scene, and readable storage medium - Google Patents

Climbing operation processing method and apparatus in a VR scene, and readable storage medium Download PDF

Info

Publication number
WO2018103635A1
WO2018103635A1 (PCT/CN2017/114616)
Authority
WO
WIPO (PCT)
Prior art keywords
climbing
model
character
point
hand model
Prior art date
Application number
PCT/CN2017/114616
Other languages
English (en)
French (fr)
Inventor
刘小宁
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2018103635A1

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 - Output arrangements for video game devices
    • A63F13/28 - Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 - Generating tactile feedback signals via the game input device, e.g. force feedback
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 - Input arrangements being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/30 - Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/302 - Output arrangements specially adapted for receiving control signals not targeted to a display device or game input means, e.g. vibrating driver's seat, scent dispenser

Definitions

  • The present invention relates to the field of computer technologies, and in particular to a climbing operation processing method and apparatus in a VR scene, and a readable storage medium.
  • Virtual Reality (VR) technology is a computer simulation technology for creating and experiencing virtual worlds. A computer generates a simulated environment, an interactive three-dimensional dynamic scene and entity-behavior simulation built from multi-source information, so that the user is immersed in that environment.
  • An important application of VR technology is VR games. One special feature of VR games is that, by using interaction controllers, players can interact in the game as they would in the real world; performing climbing operations in the game by operating an interaction controller is therefore an important form of gameplay for VR games.
  • In current VR games, the game uses an interaction controller with a positioning function to interact with the user, which differs greatly from conventional games operated with a mouse, keyboard, gamepad, and the like. In the prior art, the climbing operation is performed with a traditional game controller: in existing climbing scene designs, climbing areas are placed along the climbing route of the hand model, each climbing area carries an indicator of the player's body orientation, and multiple climbing areas have to be connected in series by a climbing route.
  • In the existing climbing operation processing method for VR scenes, besides specifying climbing points in the VR scene, the hand model is restricted to completing the climbing operation only along the designated climbing line inside the climbing area, because controlling the position of the game's virtual camera lens requires mapping the movement vector of the player's hand onto the movement route defined by the climbing route. This greatly limits the hand model's ability to simulate the player's operation in the VR scene, so that the climbing operation simulated by the hand model is inconsistent with the player's climbing action in the real world.
  • Embodiments of the present invention provide a climbing operation processing method and apparatus in a VR scene, which solve the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action, and achieve an accurate simulation of the user's climbing action.
  • To solve the above technical problem, the embodiments of the present invention provide the following technical solutions.
  • In a first aspect, an embodiment of the present invention provides a climbing operation processing method in a VR scene, including:
  • when the hand model in the VR scene grabs a climbing point, acquiring a motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, where the climbing motion of the interaction controller controls the hand model to perform a climbing operation in the VR scene, and the hand model is part of a character model;
  • performing rotation correction on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the character space and the direction of the climbing point in the character space, to obtain a corrected movement scaling factor; and
  • adjusting, by using the corrected movement scaling factor and the motion vector, the position, in the character space, of the virtual camera lens mounted on the character model, and changing the display of the VR scene according to the position adjustment of the virtual camera lens.
  • In a second aspect, an embodiment of the present invention further provides a climbing operation processing apparatus in a VR scene, including:
  • a motion vector acquisition module, configured to acquire, when the hand model in the VR scene grabs a climbing point, a motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, where the climbing motion of the interaction controller controls the hand model to perform a climbing operation in the VR scene, and the hand model is part of a character model;
  • a scaling factor correction module, configured to perform rotation correction on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the character space and the direction of the climbing point in the character space, to obtain a corrected movement scaling factor; and
  • a display adjustment module, configured to adjust, according to the corrected movement scaling factor and the motion vector, the position in the world coordinate system of the virtual camera lens mounted on the character model, where the position adjustment of the virtual camera lens changes the display of the VR scene.
  • In the embodiments of the present invention, when the hand model in the VR scene grabs a climbing point, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time is acquired; the climbing motion of the interaction controller controls the hand model to perform the climbing operation in the VR scene, and the hand model is part of the character model. Then, rotation correction is performed on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the character space and the direction of the climbing point in the character space, to obtain a corrected movement scaling factor. Finally, the corrected movement scaling factor and the motion vector are used to adjust the position, in the character space, of the virtual camera lens mounted on the character model, and the position adjustment of the virtual camera lens changes the display of the VR scene.
  • The movement of the hand model in the embodiments of the present invention is controlled entirely by the interaction controller operated by the user; how the user moves the interaction controller is decided by the user and is not constrained by climbing areas or climbing routes. This solves the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action, and achieves an accurate simulation of the user's climbing action.
  • The motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time represents the direction and distance of the user's climbing motion in the real world. The climbing point in the VR scene is configured with an initial movement scaling factor, which needs to be corrected according to the difference between the direction of the character model in the world coordinate system and the direction of the climbing point in the character space; the corrected movement scaling factor and the motion vector can then be used to adjust the position, in the character space, of the virtual camera lens mounted on the character model. The adjustment of the virtual camera lens in the embodiments of the present invention therefore does not depend on a movement route defined by climbing areas and climbing routes; the climbing operation can be realized in any VR scene, letting the user experience a climbing motion close to real physics.
  • FIG. 1 is a schematic architectural diagram of a VR positioning system according to an embodiment of the present invention;
  • FIG. 2 is a schematic diagram of a climbing point scene according to an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of a hand model climbing scene according to an embodiment of the present invention;
  • FIG. 4 is a schematic block flowchart of a climbing operation processing method in a VR scene according to an embodiment of the present invention;
  • FIG. 5 is a schematic block diagram of the calculation flow of climbing operation processing in a VR scene according to an embodiment of the present invention;
  • FIG. 6 is a schematic structural diagram of a climbing operation processing apparatus in a VR scene according to an embodiment of the present invention;
  • FIG. 7 is a schematic structural diagram of a display adjustment module according to an embodiment of the present invention;
  • FIG. 8 is a schematic structural diagram of another climbing operation processing apparatus in a VR scene according to an embodiment of the present invention;
  • FIG. 9 is a schematic structural diagram of another climbing operation processing apparatus in a VR scene according to an embodiment of the present invention;
  • FIG. 10 is a schematic structural diagram of another climbing operation processing apparatus in a VR scene according to an embodiment of the present invention;
  • FIG. 11 is a schematic structural diagram of a server to which the climbing operation processing method in a VR scene is applied according to an embodiment of the present invention.
  • Embodiments of the present invention provide a climbing operation processing method and apparatus in a VR scene, which solve the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action, and achieve an accurate simulation of the user's climbing action.
  • An embodiment of the climbing operation processing method in a VR scene of the present invention may be specifically applied to scenarios in which a hand model is used in a VR scene to simulate a climbing motion performed by a user with an interaction controller. The VR scene in the embodiments of the present invention may specifically refer to a VR game scene, or to a VR operation scene of an application, for example a VR application operation scene of office software, a VR application operation scene of a character, and the like.
  • The climbing operation processing method in a VR scene provided in this embodiment of the present application is applied to a VR positioning system. Referring to FIG. 1, which shows a VR positioning system provided in an embodiment of the present application, the VR positioning system 10 includes a processing device 11, a VR display device 12, and an interaction controller 13.
  • The VR display device 12 is connected to the processing device 11.
  • The VR display device 12 is a VR head-mounted display and may be VR glasses, a VR helmet, or the like.
  • A VR display device uses a head-mounted display to shut out the user's external visual and auditory input and guide the user into the feeling of being inside a virtual environment.
  • The processing device 11 may be a terminal device such as a computer.
  • The interaction controller 13 is a control device produced by a VR device manufacturer; for example, the user may hold one interaction controller in each hand.
  • The interaction controller is the hand-control device matched with the VR device; it can track the position, rotation, and input of the hands, and the acquired data is used in the VR scene.
  • For ease of understanding, the terms involved in the embodiments of the present application are first explained:
  • Hand model: the model used to represent the player's hands in a VR game. The position and orientation of the hand model are changed by the positioning result of the positioning controller.
  • Climbing point: a transparent collision bounding box placed in the game scene to indicate an area where the character's hand model can climb and grab.
  • Measurement space: the real physical space calibrated by the positioning system produced by the VR hardware manufacturer, usually several square meters to several tens of square meters. The measurement space is used to obtain the rotation and movement of the VR helmet and the positioning controller in space.
  • Character space: the local space representing the player model itself in a VR game. The origin of its coordinates is usually at the soles of the feet or the waist of the player model.
  • World coordinate system: the single coordinate system that uniformly calibrates all objects in the entire VR game scene.
  • A climbing operation processing method in a VR scene provided by an embodiment of the present invention may include the following steps.
  • 101. When the hand model in the VR scene grabs a climbing point, obtain the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time; the climbing motion of the interaction controller controls the hand model to perform the climbing operation in the VR scene, and the hand model is part of the character model.
  • In this embodiment of the present invention, a sensor is disposed in the interaction controller. When the user holds the interaction controller to perform a climbing operation and moves it, the sensor detects that the interaction controller has been moved; the interaction controller acquires first position coordinates in the measurement space and sends them to the processing device, and the processing device converts the first position coordinates into second position coordinates in the world coordinate system of the character space. The measurement space is the real physical space calibrated by the positioning system produced by the VR hardware manufacturer, usually several square meters to several tens of square meters; it is used to acquire the rotation and movement of the VR display (for example, a VR helmet) and the interaction controller in that real physical space.
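  • The patent does not spell out how the first position coordinates reported in the measurement space are converted into world coordinates. A common approach, shown below as an assumed sketch in Python (the function name and the rigid yaw-plus-translation transform are illustrative assumptions, not taken from the patent), is to anchor the calibrated play area to a fixed origin and orientation in the game world:

```python
import numpy as np

def measurement_to_world(p_measure, play_area_origin_world, play_area_yaw_deg):
    """Convert a controller position reported in the calibrated measurement space
    into the world coordinate system (assumed rigid transform: yaw + translation)."""
    a = np.radians(play_area_yaw_deg)
    c, s = np.cos(a), np.sin(a)
    rot_z = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
    return np.asarray(play_area_origin_world) + rot_z @ np.asarray(p_measure)

# First position coordinates from the controller sensor: 1.2 m in front of the
# play-area origin at a height of 1.4 m, mapped into the game world.
second_position = measurement_to_world([1.2, 0.0, 1.4],
                                       play_area_origin_world=[100.0, 50.0, 0.0],
                                       play_area_yaw_deg=180.0)
```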
  • In this embodiment of the present invention, the interaction controller is held and operated by the user; the user can grip the interaction controller and perform a climbing motion, so that the interaction controller goes from a stationary state into a climbing motion state. During the climbing motion, at the climbing start time (for example, a first instant) and at every instant after it, the interaction controller produces a first position in the measurement space; the processing device acquires the first position information generated by the interaction controller at the different instants and then converts each first position into a second position in the VR scene.
  • The climbing start time is the first instant at which the interaction controller goes from the stationary state into the climbing motion state. An instant may correspond to an image frame of the VR scene, and one position of the interaction controller may be acquired every image frame; taking the current climbing time as the N-th instant as an example, the second positions produced at each instant from the climbing start time to the current climbing time are acquired.
  • It should be noted that, in this embodiment of the present invention, the interaction controller is held by the user, who thereby controls the climbing motion. The processing device controls the second position change of the hand model in the VR scene according to the first position change of the interaction controller in the measurement space, so that the climbing operation of the hand model is implemented in the VR scene; the climbing motion of the interaction controller is controlled entirely by the user, and no climbing area needs to be set up and no climbing route needs to be planned in advance.
  • In addition, in this embodiment of the present invention the hand model is part of the character model, and the movement of the hand model in the VR scene is used to simulate the climbing operation performed with the interaction controller. In the VR scene the hand model may be displayed as the shape of a hand, or it may be a game prop that takes the place of the hand for the climbing operation, such as a glove or a hook. The character model itself may not appear in the VR scene; the viewing angle of the virtual camera lens mounted on the character model is the viewing angle at which the user views the VR scene while wearing the VR display. As the hand model moves, the viewing angle of the virtual camera lens also needs to move, so that the user experiences the movement of the character model during the climbing process.
  • In some embodiments of the present invention, in addition to the foregoing step 101, the climbing operation processing method in a VR scene provided by the embodiment of the present invention further includes the following steps:
  • A1. Control the movement of the hand model in the character space through the movement of the interaction controller;
  • A2. Detect whether the moved hand model collides with the collision bounding box configured for the climbing point, and when the hand model collides with the collision bounding box, determine that the hand model has touched the climbing point;
  • A3. Detect whether the climbing button set on the interaction controller is triggered, and when the climbing button is triggered, determine that the hand model has grabbed the climbing point;
  • A4. Start timing the climbing motion of the interaction controller, and record the positions of the interaction controller in the measurement space at different times.
  • FIG. 3 is a schematic diagram of a climbing point, and FIG. 4 is a schematic diagram of a hand model climbing scene.
  • The user moves the interaction controller, and the movement of the interaction controller controls the movement of the hand model in the character space. The climbing point 30 is configured with a collision bounding box 40. After the hand model moves, it is detected whether the hand model collides with the collision bounding box 40: when the hand model collides with the collision bounding box 40, it is determined that the hand model has touched the climbing point; when it does not, it is determined that the hand model has not touched the climbing point.
  • When the hand model touches the climbing point, step A3 is executed to detect whether the climbing button set on the interaction controller is triggered. The climbing button determines whether the hand model grabs the climbing point; whether to carry out the climbing motion is decided by the interaction controller under the user's control. For example, if the user triggers the climbing button, it is determined that the hand model has grabbed the climbing point. When the hand model grabs the climbing point, timing of the climbing motion of the interaction controller can be started, the positions of the interaction controller in the measurement space are recorded at different times, and then step 101 is performed.
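  • Steps A1 to A4 amount to a small check: the hand model must overlap the climbing point's collision bounding box and the climbing button must be pressed before timing and position recording start. The sketch below illustrates this with an axis-aligned bounding box and a polled button state; the class and function names are hypothetical and the box test stands in for whatever collision query the game engine actually provides:

```python
from dataclasses import dataclass

@dataclass
class ClimbPointBox:
    """Axis-aligned collision bounding box configured for a climbing point."""
    min_corner: tuple
    max_corner: tuple

    def contains(self, point):
        return all(lo <= x <= hi
                   for x, lo, hi in zip(point, self.min_corner, self.max_corner))

def hand_grabs_climb_point(hand_pos, box: ClimbPointBox, climb_button_pressed: bool) -> bool:
    """A2 + A3: the hand model touches the climbing point and the climbing button is triggered."""
    return box.contains(hand_pos) and climb_button_pressed

# A4: when the grab is detected, save the climb start position of the controller
# in the measurement space and begin recording its position at each frame.
box = ClimbPointBox(min_corner=(-0.1, -0.1, 1.9), max_corner=(0.1, 0.1, 2.1))
hand_pos = (0.0, 0.05, 2.0)
if hand_grabs_climb_point(hand_pos, box, climb_button_pressed=True):
    climb_start_position = hand_pos          # used later for the motion-vector subtraction
    recorded_positions = [climb_start_position]
```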
  • In some embodiments of the present invention, the climbing operation processing method in a VR scene provided by the embodiment of the present invention further includes the following step: fixing the position, in the VR scene, of the hand model corresponding to the player's climbing hand, so that the hand model no longer changes with the change of position of the interaction controller in the player's hand; if the climbing button of the interaction controller in the player's hand is released, the hand model's response to changes in the position of the interaction controller in the player's hand is restored.
  • In this embodiment of the present invention, the interaction controller is provided with a climbing button. When the user holds the interaction controllers with both hands to perform the climbing action and the climbing button is triggered, it indicates that the user wants to fix the hand at this position. For example, if the hand model grabs the climbing point at point A, the hand model is fixed at point A, waiting to continue moving. When the climbing button of the player's interaction controller is released, correspondingly in the character space the hand model can move away from point A and prepare to move from point A to another point (for example, point B).
  • In some embodiments of the present invention, the climbing operation processing method in a VR scene further includes the following step: when the player touches and grabs, through the hand model, a climbing point placed in the character space, giving the player feedback of a successful grab through the mechanical vibration of the motor built into the interaction controller, so that the player can determine from the vibration of the interaction controller that the current climbing point has been grabbed and subsequent climbing can be carried out.
  • 102. Perform rotation correction on the initial movement scaling factor configured for the climbing point according to the direction of the character model and the direction of the climbing point, to obtain a corrected movement scaling factor.
  • In this embodiment of the present invention, the user controls the hand model to carry out the climbing action in the character space by moving the interaction controller; the hand model is part of the character model, and the character model also needs to move with the movement of the hand model. Which climbing point in the VR scene the hand model moves to is decided by the interaction controller under the user's control. When the hand model grabs the climbing point and the direction of the climbing point is not the same as the direction of the character model, the movement scaling factor configured for the climbing point cannot be used directly: the direction of the climbing point and the direction of the character model must be used to perform rotation correction on the initial movement scaling factor configured for the climbing point, to obtain a corrected movement scaling factor. The initial movement scaling factor configured for the climbing point is pre-configured, and the corrected movement scaling factor is obtained from it according to the direction of the climbing point and the direction of the character model.
  • An initial movement scaling factor can be configured at the climbing point to handle the case in which the player's hand moves a large distance in real space while only a small movement is wanted in the game. If no scaling is applied, for example if the movement amplitude of the interaction controller is reproduced at a 1:1 ratio, a large movement of the player's hand causes the game lens to move back and forth by a large amount, and such large lens movement causes the lens to clip through walls in some terrains.
  • In some embodiments of the present invention, step 102, performing rotation correction on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system, includes: calculating the rotation direction difference between the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system; and performing rotation correction on the initial movement scaling factor configured for the climbing point by using the rotation direction difference.
  • In this embodiment of the present invention, the initial movement scaling factor configured for the climbing point needs to be rotation-corrected because the direction of the climbing point is not the same as the direction of the character model, so the rotation direction difference between the direction of the climbing point and the direction of the character model must be calculated. For example, subtracting the rotation value of the climbing point's direction in the world coordinate system from the rotation value of the character model's direction in the world coordinate system gives the rotation direction difference; the initial movement scaling factor configured for the climbing point is then rotated by the rotation direction difference, and the rotation-corrected movement scaling factor is obtained.
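  • A minimal sketch of the rotation correction described above, under the simplifying assumption that the directions of the character model and of the climbing point are expressed as yaw angles about the vertical axis in the world coordinate system; the difference of the two angles is applied as a rotation to the per-axis scaling factors configured at the climbing point. The function name and the yaw-only simplification are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def rotation_correct_scale(initial_scale_xyz, character_yaw_deg, climb_point_yaw_deg):
    """Rotate the climbing point's per-axis movement scaling factors by the rotation
    direction difference (character model direction minus climbing point direction)."""
    yaw_diff = np.radians(character_yaw_deg - climb_point_yaw_deg)
    c, s = np.cos(yaw_diff), np.sin(yaw_diff)
    rot_z = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
    return rot_z @ np.asarray(initial_scale_xyz, dtype=float)

# Climbing point scales lateral hand movement by 0.3 and vertical movement by 0.6;
# the character model faces 45 degrees away from the climbing point's direction.
corrected_scale = rotation_correct_scale([0.3, 0.3, 0.6],
                                         character_yaw_deg=45.0,
                                         climb_point_yaw_deg=0.0)
```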
  • 103. Adjust, by using the corrected movement scaling factor and the motion vector, the position in the world coordinate system of the virtual camera lens mounted on the character model; the position adjustment of the virtual camera lens changes the display of the VR scene.
  • In this embodiment of the present invention, the motion vector generated by the interaction controller from the climbing start time to the current climbing time is obtained in the measurement space; it needs to be converted according to the corrected movement scaling factor before it is used to control the position adjustment of the virtual camera lens, so as to avoid a large movement of the user's hand causing the virtual camera lens to move back and forth by a large amount, since such large lens movement may cause the lens to clip through walls in some terrains. The corrected movement scaling factor and the motion vector are used to adjust the position in the world coordinate system of the virtual camera lens mounted on the character model: together they determine the position of the virtual camera lens in the world coordinate system. The position adjustment of the virtual camera lens meets the display scaling requirements of the VR scene and changes the display of the VR scene, for example the display of the hand model and the climbing point in the VR scene, as well as the display of other objects in the VR scene.
  • In some embodiments of the present invention, step 103, adjusting the position in the world coordinate system of the virtual camera lens mounted on the character model by using the corrected movement scaling factor and the motion vector, includes:
  • E1. Scale, by using the corrected movement scaling factor, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, to obtain a scaled motion vector; the real-time value of the scaled motion vector can be obtained by multiplying the real-time value of the interaction controller's motion vector by the rotation-corrected movement scaling factor configured for the climbing point;
  • E2. Adjust, by using the scaled motion vector, the position in the world coordinate system of the virtual camera lens mounted on the character model.
  • Further, step E2 may include:
  • E21. Perform a reverse calculation on the scaled motion vector to obtain the final vector by which the virtual camera lens needs to move;
  • E22. Adjust the movement of the character model according to the final vector, and drive the position adjustment of the virtual camera lens in the world coordinate system through the movement of the character model.
  • In this embodiment of the present invention, when the hand model grabs the climbing point and the player's hand moves, the positions of the character model and of the game's virtual camera lens need to be moved so that they change by following the movement of the player's hand. The scaled motion vector is inverted to obtain the final vector by which the virtual camera lens needs to move: simulating the climbing experience requires translating the movement of the player's hand into the opposite movement of the virtual camera lens. For example, a top-to-bottom movement of the player's hand in the measurement space needs to be transformed into a bottom-to-top movement of the virtual camera lens in the game scene to simulate the experience of climbing upward. According to the final vector obtained by inverting the motion vector, the positions of the character model and of the game's virtual camera lens are moved, and the VR scene rendered by the lens changes, thereby completing the entire climbing process.
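  • Steps E1, E21, and E22 can be sketched as follows: the controller's motion vector is multiplied component-wise by the corrected scaling factors, the result is negated, and the negated vector is applied to the character model, which carries the virtual camera lens with it. The function names, the simple vector types, and the fixed camera offset are assumptions for illustration:

```python
import numpy as np

def final_camera_vector(controller_start, controller_now, corrected_scale_xyz):
    """E1: scale the controller motion vector component-wise; E21: invert it to get
    the final vector by which the virtual camera lens (and character) must move."""
    motion = np.asarray(controller_now) - np.asarray(controller_start)  # measurement space
    scaled = motion * np.asarray(corrected_scale_xyz)                   # E1
    return -scaled                                                      # E21

def apply_climb_frame(character_start_world, camera_offset_on_character,
                      controller_start, controller_now, corrected_scale_xyz):
    """E22: move the character model by the final vector; the mounted camera follows it."""
    final_vector = final_camera_vector(controller_start, controller_now, corrected_scale_xyz)
    character_pos = np.asarray(character_start_world) + final_vector
    camera_pos = character_pos + np.asarray(camera_offset_on_character)
    return character_pos, camera_pos

# The hand is pulled 0.5 m downward: the character and camera rise by 0.5 * 0.6 = 0.3 m.
character_pos, camera_pos = apply_climb_frame(
    character_start_world=[10.0, 2.0, 5.0],
    camera_offset_on_character=[0.0, 0.0, 1.7],
    controller_start=[0.0, 0.0, 1.5],
    controller_now=[0.0, 0.0, 1.0],
    corrected_scale_xyz=[0.4, 0.4, 0.6],
)
```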
  • In the embodiment of the present invention, when the hand model in the VR scene grabs a climbing point, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time is acquired; the climbing motion of the interaction controller controls the hand model to perform the climbing operation in the VR scene, and the hand model is part of the character model. Then, according to the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system, rotation correction is performed on the initial movement scaling factor configured for the climbing point to obtain a corrected movement scaling factor. Finally, the corrected movement scaling factor and the motion vector are used to adjust the position in the world coordinate system of the virtual camera lens mounted on the character model, and the position adjustment of the virtual camera lens changes the display of the VR scene.
  • The movement of the hand model in the embodiment of the present invention is controlled entirely by the interaction controller operated by the user; how the user moves the interaction controller is decided by the user and is not constrained by climbing areas or climbing routes. This solves the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action, and achieves an accurate simulation of the user's climbing action.
  • The motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time represents the direction and distance of the user's climbing motion in the real world. The climbing point in the VR scene is configured with an initial movement scaling factor, which needs to be corrected according to the difference between the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system; the corrected movement scaling factor and the motion vector can then be used to adjust the position in the world coordinate system of the virtual camera lens mounted on the character model. The adjustment of the virtual camera lens in the embodiment of the present invention therefore does not depend on a movement route defined by climbing areas and climbing routes; the climbing operation can be realized in any VR scene, letting the user experience a climbing motion close to real physics.
  • The following description takes a VR game scene as an example of the VR scene. In the embodiment of the present invention, the climbing demand for any scene such as a mountain, a ladder, or a house is met by simulating a real physical climbing operation, which enhances the player's sense of immersion in the game.
  • Referring to the accompanying drawings, a climbing point is set in the 3D game scene, and initial movement scaling factors for the player's movement distance along the three directions of the X, Y, and Z axes are configured at the climbing point. When the hand model touches and grabs the climbing point, the position of the hand model in the game scene is kept unchanged, and the movement vector of the player's hands in the measurement space is recorded and tracked. The motion vector is transformed according to the rotation direction difference between the climbing point and the character model and the movement scaling factor configured at the climbing point, and the transformed vector is used to modify, in real time, the positions of the character model and of the game's virtual camera lens in the world coordinate system. The change in the position of the game's virtual camera lens causes the game scene the player sees in the VR helmet to change, so the player experiences an operating experience close to real physical climbing in the VR game.
  • The basic experience flow is as follows: the player wears the VR helmet and grips an interaction controller in each hand inside the measurement space of the VR positioning system; rotating and moving the head, and thus the VR helmet, controls the rotation and movement of the virtual camera lens in the game, and the interaction controllers held in both hands control the movement and rotation of the corresponding hand models in the game. The climbing operation usually requires both hands, but the method applies equally to one hand or to both hands.
  • The hand model touches and grabs a climbing point set in the scene, and the mechanical vibration of the motor built into the interaction controller gives the player feedback of a successful grab; the mechanical vibration refers to the vibration felt in the player's hand as feedback after the hand model successfully grabs the climbing point.
  • In the embodiment of the present invention, only climbing points need to be placed, with movement scaling factors for the player's movement distance in different directions configured at each climbing point. No climbing area needs to be set up: any climbing point can be grabbed and climbed as long as the hands can touch it, and climbing is not limited to routes between connected climbing areas. The game's virtual camera lens is controlled by the distance moved by the player's hands and by the scaling factors set for the different directions at the climbing point, without the limitation of a preset movement route.
  • The climbing grab accuracy in the embodiment of the present invention is ensured by controlling the position and shape of the climbing point's collision bounding box. The so-called target direction is the direction in which the game design intends to guide the player. To complete a climb, the player touches and grabs a climbing point with one hand, moves toward the target direction, and completes the next grab; the two hands operate alternately to complete the whole climbing process. The direction of movement of the hand model is determined by the direction and distance of movement of the interaction controller held by the player.
  • Referring to FIG. 5, the climbing operation calculation process mainly includes the following steps.
  • Step 1: Collision detection between the hand and the climbing point.
  • Climbing points with collision bounding boxes are set at different positions in the scene, and the player controls the movement of the hand model in the game by moving the hand-held interaction controller. When the hand model moves into the collision bounding box of a climbing point, the physics system of the game engine detects the collision and triggers an entry event notification; an exit notification is likewise triggered when the hand model moves out of the collision bounding box.
  • The positioning system is the hardware device that uses sensors to acquire the position and rotation information of the VR helmet and the interaction controller in physical space. The VR helmet, also called VR glasses, is worn on the player's head and displays the virtual scene on the screen in front of the eyes; the positioning system can capture the position and orientation of the VR helmet in real time and feed them back into the virtual world.
  • The hand model is the model used to represent the player's hands in the VR game, and the position and orientation of the hand model in the virtual scene are changed by the positioning result of the interaction controller. The climbing operation is related to the hand; climbing with the hand model itself or with another model held in the hand can be regarded as the same type of operation.
  • Step 2: Monitor whether the button is pressed.
  • After receiving the event that the hand model has entered the collision bounding box of the climbing point, the processing device starts to monitor whether the climbing button on the interaction controller in the player's hand is pressed by the player. When the button is pressed by the player, the processing device receives the signal generated by the interaction controller and determines that the climb starts.
  • Step 3: The climb begins.
  • First, the motor of the interaction controller held by the player who triggered the climb vibrates, informing the player that the climbing point has been gripped. Second, the position of the hand model corresponding to the player's climbing hand is fixed, so that the hand model no longer follows the change of position of the interaction controller in the player's hand. Finally, the coordinate value of the interaction controller held by the player in the measurement space is acquired through the interface provided by the hardware device and is saved.
  • Step 4: Continuously monitor whether the climb button remains pressed.
  • If the climb button of the interaction controller in the player's hand is released, the hand model's response to changes in the position of the interaction controller in the player's hand is restored. If the climbing button remains pressed, the climbing process is entered.
  • Step 5: The climbing process.
  • First, the coordinate value of the interaction controller held by the player in the measurement space is continuously acquired in real time through the interface provided by the hardware device, and the initial coordinate value saved at the start of the climb in Step 3 is subtracted from it, giving the real-time value of the movement vector of the interaction controller held by the player in the measurement space since the start of the climb.
  • Second, the movement scaling factors configured at the climbing point along the X, Y, and Z axes are rotation-corrected. Because the direction of the climbing point is not the same as the direction of the character model, the correction uses the rotation direction difference between the direction of the character model in the world coordinate system and the direction of the climbing point model in the world coordinate system, calculated as described above.
  • Finally, the real-time value of the hand movement vector is multiplied by the rotation-corrected movement amplitude scaling values of the climbing point along the three X, Y, and Z directions, giving the real-time value of the scaled, corrected motion vector.
  • Step 6: Move the character model and the position of the game's virtual camera lens.
  • When the hand model grabs the climbing point and the player's hand moves, the positions of the character model and of the game's virtual camera lens need to be moved so that they change by following the movement of the player's hand.
  • First, the real-time value of the scaled, corrected motion vector is inverted to obtain the final value by which the character model and the game's virtual camera lens need to move: simulating the climbing experience requires translating the movement of the player's hand into the opposite movement of the game's virtual camera lens. For example, a top-to-bottom movement of the player's hand in the measurement space needs to be translated into a bottom-to-top movement of the game's virtual camera lens in the game scene to simulate the experience of climbing upward.
  • Second, according to the final value obtained by inverting the motion vector, the positions of the character model and of the game's virtual camera lens are moved, and the game scene rendered by the lens changes, completing the entire climbing process.
  • In this embodiment of the present invention, the measured data is the displacement vector of the player's hands in the real physical space; the helmet worn by the player does not move in the real space, yet the climbing experience requires the lens to move. This is achieved by inverting the motion vector of the interaction controller and applying it to the virtual camera lens; for example, a top-to-bottom movement of the player's hand becomes a bottom-to-top movement of the lens in the game. When the lens position changes, the game scene the player sees changes, and as the camera moves upward the player has the experience of climbing upward.
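  • Taken together, Steps 3 to 6 form a small per-frame loop: save the controller position when the climb starts, and while the climb button stays pressed, convert the accumulated hand motion into an opposite move of the character and its camera. The stateful sketch below assumes a yaw-only rotation correction and hypothetical class and method names; it is an illustration of the calculation flow, not the patent's actual implementation:

```python
import numpy as np

class ClimbProcess:
    """Per-frame climbing loop covering Steps 3 to 6 of the calculation process."""

    def __init__(self, climb_point_scale_xyz, climb_point_yaw_deg):
        self.initial_scale = np.asarray(climb_point_scale_xyz, dtype=float)
        self.climb_point_yaw = float(climb_point_yaw_deg)
        self.controller_start = None      # controller position saved at climb start (Step 3)
        self.character_anchor = None      # character position when the climb started

    def begin(self, controller_pos, character_pos_world, character_yaw_deg):
        """Step 3: fix the hand model, save the start coordinates, and rotation-correct
        the climbing point's per-axis scaling factors by the direction difference."""
        self.controller_start = np.asarray(controller_pos, dtype=float)
        self.character_anchor = np.asarray(character_pos_world, dtype=float)
        yaw = np.radians(character_yaw_deg - self.climb_point_yaw)
        c, s = np.cos(yaw), np.sin(yaw)
        rot_z = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        self.corrected_scale = rot_z @ self.initial_scale

    def update(self, controller_pos, button_pressed):
        """Steps 4 to 6: while the climb button stays pressed, return the new character
        position (the mounted camera lens follows it); on release, the climb ends and
        the character keeps the position it has reached."""
        if self.controller_start is None:                               # not climbing
            return self.character_anchor
        motion = np.asarray(controller_pos) - self.controller_start     # Step 5: motion since start
        new_pos = self.character_anchor - motion * self.corrected_scale # Step 6: inverted, scaled
        if not button_pressed:                                          # Step 4: button released
            self.character_anchor = new_pos                             # keep the reached position
            self.controller_start = None                                # restore normal hand control
        return new_pos

climb = ClimbProcess(climb_point_scale_xyz=[0.4, 0.4, 0.6], climb_point_yaw_deg=0.0)
climb.begin(controller_pos=[0.0, 0.0, 1.5], character_pos_world=[10.0, 2.0, 5.0], character_yaw_deg=0.0)
new_pos = climb.update([0.0, 0.0, 1.2], button_pressed=True)  # hand down 0.3 m -> character up 0.18 m
```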
  • The embodiment of the present invention realizes the climbing operation in any VR game scene. By placing climbing points and configuring, at each climbing point, scaling factors for the player's movement distance in different directions, the lens can be effectively prevented from clipping through walls during the climbing process, and the bad experience of the player pushing and pulling the entire game scene is avoided, allowing the player to experience a climbing operation close to real physics.
  • In some embodiments of the present invention, a climbing operation processing apparatus 300 in a VR scene may include a motion vector acquisition module 301, a scaling factor correction module 302, and a display adjustment module 303, where:
  • the motion vector acquisition module 301 is configured to acquire, when the hand model in the VR scene grabs a climbing point, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, where the climbing motion of the interaction controller controls the hand model to perform a climbing operation in the VR scene, and the hand model is part of a character model;
  • the scaling factor correction module 302 is configured to perform rotation correction on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system, to obtain a corrected movement scaling factor;
  • the display adjustment module 303 is configured to adjust, by using the corrected movement scaling factor and the motion vector, the position in the world coordinate system of the virtual camera lens mounted on the character model, where the position adjustment of the virtual camera lens changes the display of the VR scene.
  • In some embodiments of the present invention, the scaling factor correction module 302 is specifically configured to calculate the rotation direction difference between the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system, and to perform rotation correction on the initial movement scaling factor configured for the climbing point by using the rotation direction difference.
  • In some embodiments of the present invention, the display adjustment module 303 includes:
  • a motion vector scaling module 3031, configured to scale, by using the corrected movement scaling factor, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, to obtain a scaled motion vector;
  • a lens position adjustment module 3032, configured to adjust, by using the scaled motion vector, the position in the world coordinate system of the virtual camera lens mounted on the character model.
  • In some embodiments of the present invention, the lens position adjustment module 3032 is specifically configured to perform a reverse calculation on the scaled motion vector to obtain the final vector by which the virtual camera lens needs to move, and to adjust the movement of the character model according to the final vector, the position adjustment of the virtual camera lens in the world coordinate system being driven by the movement of the character model.
  • In some embodiments of the present invention, the climbing operation processing apparatus 300 in a VR scene further includes:
  • a hand model control module 304, configured to control the movement of the hand model in the VR scene through the movement of the interaction controller;
  • a collision detection module 305, configured to detect whether the moved hand model collides with the collision bounding box configured for the climbing point, and, when the hand model collides with the collision bounding box, determine that the hand model has touched the climbing point;
  • a grab detection module 306, configured to detect whether the climbing button set on the interaction controller is triggered, and, when the climbing button is triggered, determine that the hand model has grabbed the climbing point;
  • a position recording module 307, configured to time the start of the climbing motion of the interaction controller and record the positions of the interaction controller in the measurement space at different times.
  • In some embodiments of the present invention, the climbing operation processing apparatus 300 in a VR scene further includes:
  • a locking module 308, configured to keep the position of the hand model in the VR scene unchanged when the hand model grabs the climbing point.
  • In some embodiments of the present invention, the climbing operation processing apparatus 300 in a VR scene further includes a vibration feedback module 309, configured to send a feedback signal to the interaction controller when the hand model in the VR scene grabs the climbing point, so that the interaction controller issues, according to the feedback signal, feedback indicating that the climbing point has been successfully grabbed.
  • In the embodiment of the present invention, when the hand model in the VR scene grabs a climbing point, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time is acquired; the climbing motion of the interaction controller controls the hand model to perform the climbing operation in the VR scene, and the hand model is part of the character model. Then, according to the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system, rotation correction is performed on the initial movement scaling factor configured for the climbing point to obtain a corrected movement scaling factor. Finally, the corrected movement scaling factor and the motion vector are used to adjust the position in the world coordinate system of the virtual camera lens mounted on the character model, and the position adjustment of the virtual camera lens changes the display of the VR scene.
  • The movement of the hand model in the embodiment of the present invention is controlled entirely by the interaction controller operated by the user; how the user moves the interaction controller is decided by the user and is not constrained by climbing areas or climbing routes. This solves the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action, and achieves an accurate simulation of the user's climbing action.
  • The motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time represents the direction and distance of the user's climbing motion in the real world. The climbing point in the VR scene is configured with an initial movement scaling factor, which needs to be corrected according to the difference between the direction of the character model in the world coordinate system and the direction of the climbing point in the world coordinate system; the corrected movement scaling factor and the motion vector can then be used to adjust the position in the world coordinate system of the virtual camera lens mounted on the character model. The adjustment of the virtual camera lens in the embodiment of the present invention therefore does not depend on a movement route defined by climbing areas and climbing routes; the climbing operation can be realized in any VR scene, letting the user experience a climbing motion close to real physics.
  • FIG. 11 is a schematic structural diagram of a processing device that executes the climbing operation processing method in a VR scene (hereinafter referred to as the processing device) according to an embodiment of the present invention. The processing device is applied to a VR positioning system, which, as shown in FIG. 1, further includes a VR display device and an interaction controller. The VR display device is connected to the processing device in a wired or wireless manner, and the interaction controller is connected to the processing device in a wired or wireless manner.
  • The processing device 1100 may vary considerably depending on configuration or performance, and may include one or more central processing units (CPUs) 1122 (for example, one or more processors), a memory 1132, and one or more storage media 1130 (for example, one or more mass storage devices) storing application programs 1142 or data 1144.
  • The memory 1132 and the storage medium 1130 may provide transient storage or persistent storage. The program stored in the storage medium 1130 may include one or more modules (not shown), and each module may include a series of instruction operations for the processing device. The central processing unit 1122 may be configured to communicate with the storage medium 1130 and execute, on the processing device 1100, the series of instruction operations in the storage medium 1130. The processing device 1100 may further include one or more power supplies 1126, one or more wired or wireless network interfaces 1150, one or more input/output interfaces 1158, and/or one or more operating systems 1141.
  • The steps of the climbing operation processing method in a VR scene performed by the processing device in the above embodiments may be based on the processing device structure shown in FIG. 11.
  • The central processing unit 1122 is configured to: acquire, when the hand model in the VR scene grabs a climbing point, a motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, where the climbing motion of the interaction controller controls the hand model to perform a climbing operation in the VR scene, and the hand model is part of a character model; perform rotation correction on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the character space and the direction of the climbing point in the character space, to obtain a corrected movement scaling factor; adjust the position of the virtual camera lens in the character space by using the corrected movement scaling factor and the motion vector; and change the display of the VR scene according to the position adjustment of the virtual camera lens.
  • In some embodiments of the present invention, the central processing unit 1122 is configured to calculate the rotation direction difference between the direction of the character model in the world coordinate system of the character space and the direction of the climbing point in that world coordinate system, and to perform rotation correction on the initial movement scaling factor configured for the climbing point according to the rotation direction difference.
  • In some embodiments of the present invention, the central processing unit 1122 is further configured to scale, by using the corrected movement scaling factor, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time, to obtain a scaled motion vector, and to adjust, according to the scaled motion vector, the position in the world coordinate system of the character space of the virtual camera lens mounted on the character model.
  • In some embodiments of the present invention, the central processing unit 1122 is configured to perform a reverse calculation on the scaled motion vector to obtain the final vector by which the virtual camera lens needs to move, and to adjust the movement of the character model according to the final vector, the movement of the character model driving the position adjustment of the virtual camera lens in the world coordinate system.
  • In some embodiments of the present invention, the central processing unit 1122 is further configured to: control the movement of the hand model in the character space through the movement of the interaction controller; detect whether the hand model collides with the collision bounding box configured for the climbing point; when detecting that the hand model collides with the collision bounding box, determine that the hand model has touched the climbing point; detect whether the climbing button is triggered; when detecting that the climbing button is triggered, determine that the hand model has grabbed the climbing point; and record the positions of the interaction controller in the measurement space at different times.
  • In some embodiments of the present invention, the central processing unit 1122 is further configured to keep the position of the hand model in the character space fixed and unchanged.
  • In some embodiments of the present invention, the network interface 1150 is configured to send a feedback signal to the interaction controller when the hand model in the VR scene grabs the climbing point, so that the interaction controller issues, according to the feedback signal, feedback indicating that the climbing point has been successfully grabbed.
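  • A minimal sketch of that feedback path; send_haptic_pulse stands in for whatever haptics call the controller SDK actually provides (a hypothetical placeholder, not an API named in the patent):

```python
def send_haptic_pulse(controller_id: int, duration_ms: int) -> None:
    """Placeholder for the SDK/driver call that makes the controller's motor vibrate."""
    print(f"haptic pulse on controller {controller_id} for {duration_ms} ms")

def on_climb_point_grabbed(controller_id: int) -> None:
    # Feedback signal sent to the interaction controller once the hand model has
    # successfully grabbed the climbing point, so the player feels a short vibration.
    send_haptic_pulse(controller_id, duration_ms=50)

on_climb_point_grabbed(controller_id=1)
```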
  • In addition, the device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • The connection relationships between the modules indicate that there are communication connections between them, which may specifically be implemented as one or more communication buses or signal lines. A person of ordinary skill in the art can understand and implement this without creative effort.
  • Based on the descriptions of the foregoing implementations, the present invention can be implemented by software plus the necessary general-purpose hardware, and of course can also be implemented by dedicated hardware, including a dedicated CPU, dedicated memory, dedicated components, and so on. In general, any function performed by a computer program can easily be implemented with corresponding hardware, and the specific hardware structures used to implement the same function can vary, for example analog circuits, digital circuits, or dedicated circuits. However, in most cases a software program implementation is the better implementation for the present invention.
  • Based on this understanding, the technical solutions of the present invention, or the part contributing to the prior art, can be embodied in the form of a software product stored in a readable storage medium, such as a computer floppy disk, a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc, and the software product includes a number of instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A climbing operation processing method and apparatus in a VR scene, used to solve the problem that the climbing operation simulated by a hand model does not match the user's actual climbing action. In the method, when the hand model in the VR scene grabs a climbing point, a motion vector generated by an interaction controller in a measurement space from the climbing start time to the current climbing time is acquired, the hand model being part of a character model (101); rotation correction is performed on an initial movement scaling factor configured for the climbing point according to the direction of the character model in a character space and the direction of the climbing point in the character space, to obtain a corrected movement scaling factor (102); and the corrected movement scaling factor and the motion vector are used to adjust the position, in the character space, of a virtual camera lens mounted on the character model, the display of the VR scene being changed according to the position adjustment of the virtual camera lens (103).

Description

Climbing operation processing method and apparatus in VR scene, and readable storage medium
This application claims priority to Chinese Patent Application No. 201611116880.4, filed with the Chinese Patent Office on December 07, 2016 and entitled "Climbing operation processing method and apparatus in VR scene", which is incorporated herein by reference in its entirety.
Technical Field
The present invention relates to the field of computer technologies, and in particular to a climbing operation processing method and apparatus in a VR scene, and a readable storage medium.
Background Art
Virtual Reality (VR) technology is a computer simulation technology for creating and experiencing virtual worlds. A computer generates a simulated environment, an interactive three-dimensional dynamic scene and entity-behavior simulation built from multi-source information, so that the user is immersed in that environment.
An important application of VR technology is VR games. One major feature of VR games is that, by using interaction controllers, players can interact in the game as they would in the real world; performing climbing operations in the game by operating an interaction controller is therefore an important form of gameplay in VR games.
In current VR games, one scenario is that the VR game uses an interaction controller with a positioning function to interact with the user, which differs greatly from conventional games operated with a mouse, keyboard, gamepad, and the like. In the prior art, the climbing operation is performed with a traditional game controller. In existing climbing scene designs, climbing areas are placed along the climbing route of the hand model, each climbing area carries an indicator of the player's body orientation, and multiple climbing areas need to be connected in series by a climbing route.
In the existing climbing operation processing method for VR scenes, besides specifying climbing points in the VR scene, the hand model is restricted to completing the climbing operation only along the designated climbing line inside the climbing area, because controlling the position of the game's virtual camera lens requires mapping the movement vector of the player's hand onto the movement route defined by the climbing route. This greatly limits the hand model's ability to simulate the player's operation in the VR scene, so that the climbing operation simulated by the hand model is inconsistent with the player's climbing action in the real world.
Summary of the Invention
Embodiments of the present invention provide a climbing operation processing method and apparatus in a VR scene, which solve the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action, and achieve an accurate simulation of the user's climbing action.
To solve the above technical problem, the embodiments of the present invention provide the following technical solutions.
In a first aspect, an embodiment of the present invention provides a climbing operation processing method in a VR scene, including:
when a hand model in the VR scene grabs a climbing point, acquiring a motion vector generated by an interaction controller in a measurement space from a climbing start time to a current climbing time, where the climbing motion of the interaction controller controls the hand model to perform a climbing operation in the VR scene, and the hand model is part of a character model;
performing rotation correction on an initial movement scaling factor configured for the climbing point according to a direction of the character model in a character space and a direction of the climbing point in the character space, to obtain a corrected movement scaling factor; and
adjusting, by using the corrected movement scaling factor and the motion vector, a position of a virtual camera lens mounted on the character model in the character space, and changing the display of the VR scene according to the position adjustment of the virtual camera lens.
In a second aspect, an embodiment of the present invention further provides a climbing operation processing apparatus in a VR scene, including:
a motion vector acquisition module, configured to acquire, when a hand model in the VR scene grabs a climbing point, a motion vector generated by an interaction controller in a measurement space from a climbing start time to a current climbing time, where the climbing motion of the interaction controller controls the hand model to perform a climbing operation in the VR scene, and the hand model is part of a character model;
a scaling factor correction module, configured to perform rotation correction on an initial movement scaling factor configured for the climbing point according to a direction of the character model in a character space and a direction of the climbing point in the character space, to obtain a corrected movement scaling factor; and
a display adjustment module, configured to adjust, according to the corrected movement scaling factor and the motion vector, a position of the virtual camera lens mounted on the character model in the world coordinate system, where the position adjustment of the virtual camera lens changes the display of the VR scene.
It can be seen from the above technical solutions that the embodiments of the present invention have the following advantages:
In the embodiments of the present invention, when the hand model in the VR scene grabs a climbing point, the motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time is acquired; the climbing motion of the interaction controller controls the hand model to perform the climbing operation in the VR scene, and the hand model is part of the character model. Then, rotation correction is performed on the initial movement scaling factor configured for the climbing point according to the direction of the character model in the character space and the direction of the climbing point in the character space, to obtain a corrected movement scaling factor. Finally, the corrected movement scaling factor and the motion vector are used to adjust the position, in the character space, of the virtual camera lens mounted on the character model, and the position adjustment of the virtual camera lens changes the display of the VR scene. In the embodiments of the present invention, the movement of the hand model is controlled entirely by the interaction controller operated by the user; how the user moves the interaction controller is decided by the user and is not constrained by climbing areas or climbing routes, which solves the problem that the climbing operation simulated by the hand model does not match the user's actual climbing action and achieves an accurate simulation of the user's climbing action. The motion vector generated by the interaction controller in the measurement space from the climbing start time to the current climbing time represents the direction and distance of the user's climbing motion in the real world. The climbing point in the VR scene is configured with an initial movement scaling factor, which needs to be corrected according to the difference between the direction of the character model in the world coordinate system and the direction of the climbing point in the character space; the corrected movement scaling factor and the motion vector can then be used to adjust the position, in the character space, of the virtual camera lens mounted on the character model. The adjustment of the virtual camera lens in the embodiments of the present invention therefore does not depend on a movement route defined by climbing areas and climbing routes; the climbing operation can be realized in any VR scene, letting the user experience a climbing motion close to real physics.
附图说明
为了更清楚地说明本发明实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域的技术人员来讲,还可以根据这些附图获得其他的附图。
图1为本发明实施例中提供的一种VR定位系统的架构示意图;
图2为本发明实施例中攀爬点的场景示意图;
图3为本发明实施例中手部模型攀爬的场景示意图;
图4为本发明实施例提供的一种VR场景下的攀爬操作处理方法的流程方框示意图;
图5为本发明实施例提供的VR场景下的攀爬操作处理的计算流程方框示意图;
图6为本发明实施例提供的一种VR场景下的攀爬操作处理装置的组成结构示意图;
图7为本发明实施例提供的一种显示调整模块的组成结构示意图;
图8为本发明实施例提供的另一种VR场景下的攀爬操作处理装置的组成结构示意图;
图9为本发明实施例提供的另一种VR场景下的攀爬操作处理装置的组成结构示意图;
图10为本发明实施例提供的另一种VR场景下的攀爬操作处理装置的组成结构示意图;
图11为本发明实施例提供的VR场景下的攀爬操作处理方法应用于服务器的组成结构示意图。
具体实施方式
本发明实施例提供了一种VR场景下的攀爬操作处理方法和装置,用于解决手部模型模拟的攀爬操作与用户实际攀爬动作不匹配的问题,实现对用户攀爬动作的准确模拟。
为使得本发明的发明目的、特征、优点能够更加的明显和易懂,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,下面所描述的实施例仅仅是本发明一部分实施例,而非全部实施例。基于本发明中的实施例,本领域的技术人员所获得的所有其他实施例,都属于本发明保护的范围。
本发明的说明书和权利要求书及上述附图中的术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,以便包含一系列单元的过程、方法、系统、产品或设备不必限于那些单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它单元。
本发明VR场景下的攀爬操作处理方法的一个实施例，具体可以应用于VR场景下使用手部模型对用户使用交互控制器执行的攀爬运动进行模拟的场景中，本发明实施例中VR场景具体可以指的是VR游戏场景，也可以指的是应用程序的VR操作场景，例如办公软件的VR应用操作场景、角色的VR应用操作场景等。
本申请实施例中提供的一种VR场景下的攀爬操作处理方法,该方法应用于一种VR定位系统,请参阅图1所示,图1为本申请实施例中提供的一种VR定位系统,该VR定位系统10包括处理装置11、VR显示装置12和交互控制器13。VR显示装置12与处理装置11连接。该VR显示设备12为VR头显(虚拟现实头戴式显示设备),可以为VR眼镜、VR头盔等。VR显示装置是一种利用头戴式显示器将人的对外界的视觉、听觉封闭,引导用户产生一种身在虚拟环境中的感觉的装置。该处理装置11可以为计算机等终端设备。交互控制器13是VR设备厂商生产的控制设备,例如,用户可以使用左右手各手持一个交互控制器,交互控制器是VR设备配套的手部操控设备,可以追踪双手的位置、旋转和输入,获取到的相关数据在VR场景中使用。
为了方便理解,首先对本申请实施例中涉及的词语进行解释:
手部模型:在VR游戏中用来代表玩家双手的模型,由定位控制器的定位结果来改变手部模型自身的位置和方向。
攀爬点:在游戏场景中摆放的透明的碰撞包围盒,用于标明角色手部模型可以进行攀爬抓取的区域。
测量空间:由VR硬件设备厂商生产的定位系统所标定的真实物理空间,通常为几平米到几十平米。测量空间用于获取VR头盔和定位控制器在空间中的旋转和移动。
角色空间:VR游戏中代表玩家模型自身的局部空间。通常坐标原点在玩家模型脚底或者腰部。
世界坐标系:对整个VR游戏场景中所有物体进行统一标定的唯一的坐标系。
请参阅图4所示，本发明一个实施例提供的VR场景下的攀爬操作处理方法，可以包括如下步骤：
101、当VR场景下的手部模型抓取到攀爬点时，获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量，交互控制器的攀爬运动控制手部模型在VR场景下进行攀爬操作，手部模型属于角色模型的一部分。
在本发明实施例中，该交互控制器中设置有传感器，当用户手持该交互控制器进行攀爬操作时，用户移动该交互控制器，该交互控制器中的传感器检测到该交互控制器被移动，该交互控制器获取到在测量空间中的第一位置坐标，并将该位置坐标发送给处理装置，处理装置将该第一位置坐标转换为在角色空间中的世界坐标系的第二位置坐标。其中，测量空间是由VR硬件设备厂商生产的定位系统所标定的真实物理空间，通常为几平米到几十平米，测量空间用于获取VR显示器（例如VR头盔）和交互控制器在该真实物理空间中的旋转和移动。
本发明实施例中，交互控制器可以由用户手持操作，用户可以握起交互控制器进行攀爬运动，从而交互控制器从静止状态进入攀爬运动状态，本发明实施例中交互控制器在攀爬运动过程中从攀爬开始时刻（如第一时刻）及在该第一时刻之后的每一个时刻，交互控制器都可以产生一个在测量空间中的第一位置，处理装置获取不同时刻交互控制器产生的第一位置信息，然后将该第一位置转换为在VR场景中的第二位置。其中，攀爬开始时刻是指交互控制器从静止状态进入攀爬运动状态时的第一个时刻，该时刻可以是VR场景的图像帧，可以每隔一个图像帧就获取到交互控制器产生的一个位置，本发明实施例中以当前攀爬时刻为第N个时刻为例，则需要获取从攀爬开始时刻到当前攀爬时刻中每一个时刻分别产生的第二位置。需要说明的是，本发明实施例中，交互控制器由用户手持，从而控制进行攀爬运动，处理装置根据交互控制器在测量空间的第一位置变化控制手部模型在VR场景中的第二位置变化，在VR场景中实现手部模型的攀爬操作，交互控制器的攀爬运动完全由用户来控制，本发明实施例中不需要设置攀爬区域，也不需要提前规划好攀爬路线。另外本发明实施例中手部模型属于角色模型的一部分，手部模型在VR场景的移动用于模拟交互控制器进行的攀爬操作，手部模型在VR场景下可以显示为一个手部的造型，也可以是代替手部进行攀爬操作的一个游戏道具，例如手套或者钩子等，角色模型可以不出现在VR场景中，角色模型上挂载的虚拟相机镜头的视角是用户头戴VR显示器时观看VR场景的视角，随着手部模型的移动，虚拟相机镜头的视角也需要移动，从而使用户体验到攀爬过程中的角色模型的移动。
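为便于理解步骤101中移动向量的获取过程，下面给出一个示意性的Python代码草图：在攀爬开始时刻保存交互控制器在测量空间中的起始坐标，之后每一帧用当前坐标减去起始坐标即得到移动向量。其中的ClimbTracker类以及controller.get_position()等接口名称均为本文为举例而假设，并非任何真实VR SDK的实际API。
```python
import numpy as np


class ClimbTracker:
    """记录攀爬开始时刻的控制器坐标，并在每一帧计算移动向量（示意实现）。"""

    def __init__(self, controller):
        self.controller = controller      # 假设的交互控制器对象
        self.start_pos = None             # 攀爬开始时刻在测量空间中的坐标

    def on_climb_start(self):
        # 攀爬开始时刻：保存交互控制器在测量空间中的起始坐标
        self.start_pos = np.array(self.controller.get_position(), dtype=float)

    def movement_vector(self):
        # 当前攀爬时刻：当前坐标减去起始坐标，得到测量空间中的移动向量
        current_pos = np.array(self.controller.get_position(), dtype=float)
        return current_pos - self.start_pos
```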
在本发明的一些实施例中,除了执行前述的步骤101之外,本发明实施例提供的VR场景下的攀爬操作处理方法,还包括如下步骤:
A1、通过交互控制器的移动控制手部模型在角色空间的移动;
A2、检测手部模型移动后是否与攀爬点配置的碰撞包围盒进行碰撞,当手部模型碰撞到碰撞包围盒时,确定手部模型触碰到攀爬点;
A3、检测交互控制器上设置的攀爬按键是否被触发,当攀爬按键被触发时,确定手部模型抓取到攀爬点;
A4、对交互控制器开始进行攀爬运动进行计时,记录交互控制器在测量空间中不同时刻的位置。
其中，在步骤101执行之前可以先执行步骤A1，请结合图2和图3进行理解，图2为攀爬点示意图，图3为手部模型攀爬场景示意图。用户移动交互控制器，该交互控制器的移动可以控制手部模型在角色空间的移动，手部模型在角色空间下移动时，攀爬点30带有碰撞包围盒40，碰撞包围盒40被设置在角色空间中的不同位置，当手部模型碰撞到碰撞包围盒40时，就可以确定手部模型触碰到攀爬点，当手部模型没有碰撞到碰撞包围盒40时，确定手部模型没有触碰到攀爬点，此时用户可以继续控制交互控制器移动。手部模型触碰到攀爬点30时执行步骤A3，检测交互控制器上设置的攀爬按键是否被触发，当攀爬按键被触发时，确定手部模型抓取到攀爬点，是否进行攀爬运动由用户控制交互控制器来确定，例如用户将攀爬按键触发，则确定手部模型抓取到攀爬点，手部模型抓取到攀爬点时，就可以对交互控制器开始进行攀爬运动进行计时，记录交互控制器在测量空间中不同时刻的位置，接下来再执行步骤101。
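把上述步骤A1至A4的判断流程串起来，大致可以写成如下示意性的Python伪代码（其中bounding_box.contains()、is_climb_button_pressed()等碰撞检测与按键读取接口均为假设，仅用于说明逻辑顺序）：
```python
def update_hand_grab(hand_model, climb_points, controller, tracker):
    """每帧执行：A1驱动手部模型，A2碰撞检测，A3检测攀爬按键，A4开始计时记录（示意流程）。"""
    # A1：手部模型的位置由交互控制器的移动来驱动
    hand_model.set_position(controller.get_position())

    # A2：检测手部模型是否进入某个攀爬点配置的碰撞包围盒
    touched_point = None
    for point in climb_points:
        if point.bounding_box.contains(hand_model.position):
            touched_point = point
            break

    # A3：触碰到攀爬点且攀爬按键被触发时，判定为抓取成功
    if touched_point is not None and controller.is_climb_button_pressed():
        # A4：保存攀爬开始时刻的控制器坐标，开始计时记录
        tracker.on_climb_start()
        return touched_point
    return None
```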
进一步的,在本发明的一些实施例中,当攀爬按键被触发时,本发明实施例提供的VR场景下的攀爬操作处理方法,还包括如下步骤:
B1、固定手部模型在角色空间下的位置保持不变。
其中，固定玩家攀爬手对应手部模型在VR场景下的位置，使得该手部模型不再随玩家手中交互控制器位置的改变而变化，如果玩家手中交互控制器的攀爬按键被松开，则恢复手部模型对玩家手中交互控制器位置改变的响应。
在一个应用场景中，交互控制器上设置有攀爬按键，当用户手持交互控制器进行攀爬动作时，用户双手用力握住交互控制器，该攀爬按键被触发，此时表明用户想要将手固定到该位置，在角色空间中，手部模型就会固定到A点，等待继续移动，当玩家手中的交互控制器的攀爬按键被松开时，对应到角色空间中，手部模型可以从A点开始移动，准备从A点移动到另一点（如B点）。
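手部模型“按下即固定、松开即恢复”的行为可以用一个简单的开关来示意（follow_controller为假设的属性名，仅用于说明状态切换的思路）：
```python
def on_climb_button_changed(pressed, hand_model):
    # 攀爬按键按下：固定手部模型在角色空间中的位置，不再跟随控制器移动；
    # 攀爬按键松开：恢复手部模型对控制器位置变化的响应
    hand_model.follow_controller = not pressed
```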
在本发明的一些实施例中,当VR场景下的手部模型抓取到攀爬点时,本发明实施例提供的VR场景下的攀爬操作处理方法,还包括如下步骤:
C1、通过交互控制器发送攀爬点抓取成功的振动反馈消息。
其中,玩家通过手部模型触碰并抓取角色空间中摆放的攀爬点,通过交互控制器内置马达的力学振动给予玩家成功抓取的反馈,使得玩家能够通过交互控制器的振动确定当前已经抓紧攀爬点,可以进行后续的攀爬运动。
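抓取成功后的振动反馈大致可以按如下方式触发（trigger_haptic_pulse为假设的接口名，实际的调用方式取决于所使用的VR设备SDK）：
```python
def notify_grab_success(controller):
    # 通过交互控制器内置马达发出一次短促振动，提示用户已抓紧攀爬点（示意调用）
    controller.trigger_haptic_pulse(duration_ms=50, strength=0.8)
```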
102、根据角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向对攀爬点配置的初始移动缩放系数进行旋转矫正，得到矫正后的移动缩放系数。
在本发明实施例中，用户通过移动交互控制器来控制手部模型在角色空间中实现攀爬动作，该手部模型属于角色模型的一部分，则角色模型也需要随着手部模型的移动产生移动，手部模型移动到VR场景中的哪个攀爬点由用户控制的交互控制器来决定，手部模型抓取到攀爬点，该攀爬点的方向和角色模型的方向并不相同，为了使用配置在攀爬点的移动缩放系数，需要使用攀爬点的方向和角色模型的方向对攀爬点配置的初始移动缩放系数进行旋转矫正，得到矫正后的移动缩放系数。其中，攀爬点配置的初始移动缩放系数为预先配置，根据攀爬点的方向和角色模型的方向进行旋转校正后得到矫正后的移动缩放系数。以VR场景具体为VR游戏场景为例，为了适应游戏场景地形复杂的情形，可以在攀爬点配置初始移动缩放系数，用于实现玩家手部在真实空间中移动幅度很大、但在游戏中移动幅度可以很小的效果。如果不做缩放，比如按照1:1的关系还原交互控制器的移动幅度，就会出现玩家手部大幅度前后移动，导致游戏镜头大幅度前后移动，镜头的大幅度移动在一些地形中会造成镜头穿墙的问题。
在本发明的一些实施例中,步骤102根据角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向对攀爬点配置的初始移动缩放系数进行旋转矫正,包括:
D1、计算角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向之间的旋转方向差;
D2、使用旋转方向差对攀爬点配置的初始移动缩放系数进行旋转矫正。
其中，对攀爬点配置的初始移动缩放系数需要进行旋转矫正。由于攀爬点的方向和角色模型的方向并不相同，为了使用配置在攀爬点的移动缩放系数，需要计算攀爬点的方向和角色模型的方向之间的旋转方向差，例如，使用角色模型在世界坐标系的方向旋转值减去攀爬点在世界坐标系的方向旋转值即可得到旋转方向差，将攀爬点配置的初始移动缩放系数按旋转方向差进行旋转，可以得到旋转矫正后的移动缩放系数。
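旋转矫正的计算可以示意如下。为简化说明，这里只演示绕竖直轴（偏航角）旋转的情形；实际实现中通常使用完整的旋转矩阵或四元数，代码中的函数名与参数均为举例假设：
```python
import numpy as np


def correct_scale_factor(initial_scale, character_yaw_deg, climb_point_yaw_deg):
    """D1：计算角色模型方向与攀爬点方向之间的旋转方向差；
    D2：用该方向差对攀爬点配置的XYZ初始移动缩放系数做旋转矫正（示意实现）。"""
    # 旋转方向差 = 角色模型方向 - 攀爬点方向
    delta = np.radians(character_yaw_deg - climb_point_yaw_deg)

    # 仅绕竖直轴（Y轴）旋转时的旋转矩阵
    cos_d, sin_d = np.cos(delta), np.sin(delta)
    rotation = np.array([[cos_d, 0.0, sin_d],
                         [0.0,   1.0, 0.0],
                         [-sin_d, 0.0, cos_d]])

    # 将初始移动缩放系数按旋转方向差进行旋转，得到矫正后的移动缩放系数
    return rotation @ np.asarray(initial_scale, dtype=float)
```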
103、使用矫正后的移动缩放系数和移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置,虚拟相机镜头的位置调整改变VR场景的显示。
在本发明实施例中,交互控制器从攀爬开始时刻到当前攀爬时刻产生的移动向量是在测量空间得到的移动向量,该移动向量需要按照矫正后的移动缩放系数进行转换才能用于控制虚拟相机镜头的位置调整,避免出现用户手部大幅度前后移动,导致虚拟相机镜头大幅度前后移动,镜头的大幅度移动在一些地形中会造成镜头穿墙的问题,使用矫正后的移动缩放系数和移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置,矫正后的移动缩放系数和移动向量可以共同确定虚拟相机镜头在世界坐标系中的位置,该虚拟相机镜头的位置调整能够符合VR场景下的显示缩放要求,虚拟相机镜头的位置调整改变VR场景的显示,例如显示VR场景下的手部模型和攀爬点,以及显示VR场景的其它道具等。
在本发明的一些实施例中，步骤103使用矫正后的移动缩放系数和移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置，包括：
E1、使用矫正后的移动缩放系数对交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量进行缩放处理,得到缩放后的移动向量。
E2、使用缩放后的移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置。
其中,用交互控制器的移动向量实时值乘以攀爬点配置的旋转矫正后的移动缩放系数可以得到缩放后的移动向量实时值,该缩放后的移动向量用于调整虚拟相机镜头在世界坐标系中的位置。
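也就是按XYZ三个分量把移动向量与矫正后的缩放系数逐分量相乘，例如（示意代码，沿用上文假设的向量表示）：
```python
import numpy as np


def scale_movement_vector(move_vector, corrected_scale):
    # E1：对测量空间中的移动向量按XYZ三个方向分别缩放（逐分量相乘）
    return np.asarray(move_vector, dtype=float) * np.asarray(corrected_scale, dtype=float)
```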
进一步的,在本发明的一些实施例中,步骤E2使用缩放后的移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置,包括:
E21、对缩放后的移动向量进行取反计算,得到虚拟相机镜头需要移动的最终向量;
E22、按照最终向量调整角色模型的移动,通过角色模型的移动驱动虚拟相机镜头在世界坐标系中的位置调整。
其中,得到缩放后的移动向量后,需要移动角色模型及游戏虚拟相机镜头的位置,使得跟随玩家的手部移动发生改变。首先对缩放后的移动向量取反,得到虚拟相机镜头需要移动的最终向量。这是为了模拟攀爬体验,需要将玩家手部的移动转化成虚拟相机镜头的反向移动。比如玩家手部在测量空间中从上到下的移动,需要转化成虚拟相机镜头在游戏场景中反向的从下到上的移动,来模拟向上攀爬的体验。最后根据取反后移动向量得到的最终向量来移动角色模型及游戏虚拟相机镜头的位置,镜头渲染的VR场景随之改变,从而完成整个攀爬过程。
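取反与镜头位置更新可以示意如下（character.move_by()为假设的接口，虚拟相机镜头挂载在角色模型上并随之移动）：
```python
def apply_climb_movement(character, scaled_move_vector):
    """E21：对缩放后的移动向量取反得到最终向量；E22：按最终向量移动角色模型，
    从而驱动其挂载的虚拟相机镜头完成位置调整（示意实现）。"""
    # 玩家手部向下移动时，镜头应反向向上移动，因此先对移动向量取反
    final_vector = -scaled_move_vector

    # 移动角色模型；挂载在角色模型上的虚拟相机镜头随之更新位置
    character.move_by(final_vector)
```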
通过以上实施例对本发明实施例的描述可知，在本发明实施例中，当VR场景下的手部模型抓取到攀爬点时，获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量，交互控制器的攀爬运动控制手部模型在VR场景下进行攀爬操作，手部模型属于角色模型的一部分，然后根据角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向对攀爬点配置的初始移动缩放系数进行旋转矫正，得到矫正后的移动缩放系数，最后使用矫正后的移动缩放系数和移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置，虚拟相机镜头的位置调整改变VR场景的显示。本发明实施例中手部模型的移动完全由用户操作的交互控制器来控制，用户操作交互控制器的移动由用户来决定，不需要受到攀爬区域和攀爬路线的约束，解决手部模型模拟的攀爬操作与用户实际攀爬动作不匹配的问题，实现对用户攀爬动作的准确模拟。交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量表示了用户在真实世界中进行攀爬动作移动的方向和移动距离，对于VR场景下的攀爬点设置有初始移动缩放系数，该初始移动缩放系数需要根据角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向之间的方向差异进行矫正，使用矫正后的移动缩放系数和移动向量可以调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置，因此本发明实施例中虚拟相机镜头的调整不依赖于攀爬区域和攀爬路线所定义的移动路线，任意VR场景下的攀爬操作都可以实现，让用户体验到接近真实物理的攀爬运动。
为便于更好的理解和实施本发明实施例的上述方案,下面举例相应的应用场景来进行具体说明。
下面以VR场景具体为VR游戏场景为例进行说明,在使用交互控制器的VR游戏中,针对山峰,梯子,房屋等任意场景的攀爬需求,通过模拟真实物理攀爬操作来增强玩家在游戏中的沉浸感。请参阅图5所示,本发明实施例中在3D游戏场景中设置攀爬点,并在攀爬点配置玩家沿其XYZ轴三个方向移动距离的初始移动缩放系数,当手部模型触碰并抓取攀爬点时,保持手部模型在游戏场景中的位置不变,记录和变换玩家双手在测量空间中的移动向量,根据攀爬点和角色模型的旋转方向差和攀爬点上配置的移动缩放系数对移动向量进行变换,利用变换后的向量实时修改角色模型和游戏虚拟相机镜头在世界坐标系中位置,游戏虚拟相机镜头位置的改变导致玩家在VR头盔中看到的游戏场景发生变化,通过此方法来让玩家在VR游戏中体验到接近真实物理攀爬的操作体验。
基本体验流程如下：玩家佩戴VR头盔，双手握取交互控制器处于VR定位系统的测量空间中，转动头部带动VR头盔旋转和移动来控制游戏中虚拟相机镜头的移动和旋转，通过双手持有的交互控制器控制游戏中对应手部模型的移动和旋转，攀爬操作通常需要双手，但是本方法对单双手都适用。手部模型触碰并抓取场景中设置的攀爬点，通过交互控制器内置马达的力学振动给予玩家成功抓取的反馈，力学振动是指手部模型抓取成功后通过定位控制器反馈给玩家的手上的振动。
本发明实施例中只需要摆放攀爬点并且在攀爬点中配置玩家在不同方向移动距离的移动缩放系数。不需要攀爬区域的设置,任何攀爬点只要双手可以触碰到都可以进行抓取攀爬,没有只能在连线的攀爬区域之间进行攀爬的限制。游戏虚拟相机镜头由玩家双手移动的距离和攀爬点设定的不同方向移动缩放系数来控制,没有预先设定的移动路线的限制。
本发明实施例中攀爬握取精度通过控制攀爬点碰撞包围盒的位置和形状来保证。攀爬点碰撞包围盒和场景越贴合,则游戏引擎物理系统对攀爬时玩家双手是否碰撞到攀爬点的检测越符合玩家心理预期,体验效果越好。玩家手部紧握并向任意目标方向移动,所谓目标方向是游戏中策划引导玩家的方向。完成一次攀爬需要一只手触碰并抓取攀爬点,向目标方向移动后完成第二次攀爬,双手交替操作完成整个攀爬过程。手部模型的移动方向由玩家手持交互控制器的移动方向和移动距离来确定。
请参阅图5所示,主要包括如下的攀爬操作计算流程:
步骤1:手部与攀爬点碰撞检测。
攀爬点带有碰撞包围盒被设置在场景中不同位置,玩家通过移动手部握取的交互控制器来控制游戏中手部模型的移动,当手部模型移动进入到攀爬点碰撞包围盒时,游戏引擎的物理系统会检测到碰撞并触发进入事件通知。当手部模型移动退出碰撞包围盒时,同样会触发退出通知。
其中，定位系统是利用传感器来获取VR头盔和交互控制器在物理空间中的位置和旋转信息的硬件设备。VR头盔也叫VR眼镜，玩家戴在头部，在眼前屏幕上显示虚拟场景，定位系统能够实时捕获VR头盔的位置和方向并反馈到虚拟世界中。手部模型是在VR游戏中用来代表玩家双手的模型，由交互控制器的定位结果来改变手部模型自身在虚拟场景中的位置和方向。攀爬操作和手部相关，使用手部模型或者通过手部持有其他模型进行攀爬操作都可看作同一类型。
步骤2：攀爬按键被按下的状态监测。
收到手部模型进入攀爬点的碰撞包围盒事件后,开始监测位于玩家手中交互控制器上的攀爬按键是否被玩家手动按下。此时按键被玩家按下,处理装置接收到交互控制器产生的信号,则确定攀爬开始。
步骤3:攀爬开始。
首先,触发玩家攀爬手中握取的交互控制器的马达进行振动,告知玩家已经握牢攀爬点;其次,固定玩家攀爬手对应手部模型的位置,使得该手部模型不再随玩家手中交互控制器位置的改变而变化;最后通过硬件设备提供的接口获取此时玩家攀爬手中握取的交互控制器在测量空间中的坐标值并进行保存。
步骤4:攀爬按键被按下状态的持续监测。
如果玩家手中交互控制器的攀爬按键被松开,则恢复手部模型对玩家手中交互控制器位置改变的响应。如果攀爬按键持续保持在被按下状态,则进入攀爬过程。
步骤5:攀爬过程。
首先,通过硬件设备提供的接口实时不间断地获取玩家攀爬手中握取的交互控制器在测量空间中的坐标值,并与步骤3攀爬开始时保存的起始坐标值进行减法计算,获得自攀爬开始时刻在测量空间中玩家手中握取的交互控制器移动向量的实时值。
其次，对攀爬点配置的沿XYZ轴方向的移动缩放系数进行旋转矫正。由于攀爬点的方向和角色模型的方向并不相同，为了使用配置在攀爬点的缩放系数，需要计算攀爬点方向和角色模型方向的旋转方向差，例如，减数是攀爬点模型在世界坐标系中的方向，被减数是角色模型在世界坐标系中的方向，则旋转方向差=被减数-减数，将攀爬点配置的移动缩放系数按旋转方向差进行旋转，得到旋转矫正后的移动缩放系数。
最后,用手部移动向量实时值乘以攀爬点配置的旋转矫正后的XYZ三个方向的移动幅度缩放值,得到缩放矫正后的移动向量实时值。
步骤6:移动角色模型及游戏虚拟相机镜头的位置。
得到缩放矫正后的移动向量实时值后,需要移动角色模型及游戏虚拟相机镜头的位置,使得跟随玩家的手部移动发生改变。
首先,对缩放矫正后的移动向量实时值取反,得到角色模型和游戏虚拟相机镜头需要移动的最终值。这是为了模拟攀爬体验,需要将玩家手部的移动转化成游戏虚拟相机镜头的反向移动。比如,玩家手部在测量空间中从上到下的移动,需要转化成游戏虚拟相机镜头在游戏场景中反向的从下到上的移动,来模拟向上攀爬的体验。
然后，根据取反后的移动向量实时值，移动角色模型及游戏虚拟相机镜头的位置，镜头渲染的游戏场景随之改变，从而完成整个攀爬过程。
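把步骤5和步骤6合并起来，攀爬过程中每一帧的计算大致如下（示意性的Python草图，沿用前文示意代码中假设的接口与函数，不代表任何具体游戏引擎的实现）：
```python
def per_frame_climb_update(controller, tracker, climb_point, character):
    """攀爬过程中每帧执行一次：步骤5计算缩放矫正后的移动向量，步骤6移动角色与镜头。"""
    if not controller.is_climb_button_pressed():
        return  # 攀爬按键被松开，退出攀爬过程（对应步骤4）

    # 步骤5：实时移动向量 = 当前坐标 - 攀爬开始时保存的起始坐标
    move_vector = tracker.movement_vector()

    # 步骤5：按角色模型方向与攀爬点方向的旋转方向差，矫正攀爬点配置的移动缩放系数
    corrected_scale = correct_scale_factor(climb_point.initial_scale,
                                           character.yaw_deg,
                                           climb_point.yaw_deg)

    # 步骤5：用矫正后的缩放系数对移动向量做逐分量缩放
    scaled_vector = scale_movement_vector(move_vector, corrected_scale)

    # 步骤6：取反后移动角色模型及其挂载的虚拟相机镜头
    apply_climb_movement(character, scaled_vector)
```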
通过前述6个步骤的举例说明可知,测量获得的数据是玩家在真实物理空间中双手的位移向量,真实空间中玩家所戴的头盔并没有移动。但是攀爬体验需要的是镜头移动,将交互控制器的移动向量取反后应用给虚拟相机镜头,比如玩家手部从上到下的一次运动,在游戏里表现为镜头从下到上的运动。镜头位置改变,玩家看到的游戏场景才会发生变化,镜头向上运动,玩家才有往上爬的体验。本发明实施例解决了任意VR游戏场景下的攀爬操作实现,通过摆放攀爬点并在攀爬点中配置玩家在不同方向移动距离的缩放系数,可以有效抑制攀爬过程中镜头穿墙和玩家推拉整个游戏场景的不良体验,让玩家体验到接近真实物理的攀爬操作。
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本发明并不受所描述的动作顺序的限制,因为依据本发明,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于优选实施例,所涉及的动作和模块并不一定是本发明所必须的。
为便于更好的实施本发明实施例的上述方案,下面还提供用于实施上述方案的相关装置。
请参阅图6所示,本发明实施例提供的一种VR场景下的攀爬操作处理装置300,可以包括:移动向量获取模块301、缩放系数校正模块302和显示调整模块303,其中,
移动向量获取模块301,用于当所述VR场景下的手部模型抓取到攀爬点时,获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量,所述交互控制器的攀爬运动控制所述手部模型在所述VR场景下进行攀爬操作,所述手部模型属于角色模型的一部分;
缩放系数校正模块302,用于根据所述角色模型在世界坐标系中的方向和所述攀爬点在所述世界坐标系中的方向对所述攀爬点配置的初始移动缩放系数进行旋转矫正,得到矫正后的移动缩放系数;
显示调整模块303,用于使用所述矫正后的移动缩放系数和所述移动向量调整所述角色模型挂载的虚拟相机镜头在所述世界坐标系中的位置,所述虚拟相机镜头的位置调整改变所述VR场景的显示。
在本发明的一些实施例中,所述缩放系数校正模块302,具体用于计算所述角色模型在世界坐标系中的方向和所述攀爬点在所述世界坐标系中的方向之间的旋转方向差;使用所述旋转方向差对所述攀爬点配置的初始移动缩放系数进行旋转矫正。
在本发明的一些实施例中,请参阅图7所示,所述显示调整模块303,包括:
移动向量缩放模块3031,用于使用所述矫正后的移动缩放系数对所述交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量进行缩放处理,得到缩放后的移动向量;
镜头位置调整模块3032,用于使用所述缩放后的移动向量调整所述角色模型挂载的虚拟相机镜头在所述世界坐标系中的位置。
进一步的,在本发明的一些实施例中,所述镜头位置调整模块3032,具体用于对所述缩放后的移动向量进行取反计算,得到虚拟相机镜头需要移动的最终向量;按照所述最终向量调整所述角色模型的移动,通过所述角色模型的移动驱动所述虚拟相机镜头在所述世界坐标系中的位置调整。
在本发明的一些实施例中,请参阅图8所示,所述VR场景下的攀爬操作处理装置300,还包括:
手部模型控制模块304，用于通过交互控制器的移动控制所述手部模型在所述VR场景下的移动；
碰撞检测模块305,用于检测所述手部模型移动后是否与攀爬点配置的碰撞包围盒进行碰撞,当所述手部模型碰撞到所述碰撞包围盒时,确定所述手部模型触碰到所述攀爬点;
抓取检测模块306,用于检测所述交互控制器上设置的攀爬按键是否被触发,当所述攀爬按键被触发时,确定所述手部模型抓取到所述攀爬点;
位置记录模块307,用于对所述交互控制器开始进行攀爬运动进行计时,记录所述交互控制器在所述测量空间中不同时刻的位置。
进一步的,在本发明的一些实施例中,请参阅图9所示,当所述攀爬按键被触发时,所述VR场景下的攀爬操作处理装置300,还包括:
锁定模块308,用于当所述手部模型抓取到所述攀爬点时,将所述手部模型在所述VR场景下的位置保持不变。
在本发明的一些实施例中,请参阅图10所示,相对于图6所示,所述VR场景下的攀爬操作处理装置300,还包括:振动反馈模块309,用于当所述VR场景下的手部模型抓取到攀爬点时,通过所述交互控制器发送攀爬点抓取成功的振动反馈消息。
通过以上对本发明实施例的描述可知，在本发明实施例中，当VR场景下的手部模型抓取到攀爬点时，获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量，交互控制器的攀爬运动控制手部模型在VR场景下进行攀爬操作，手部模型属于角色模型的一部分，然后根据角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向对攀爬点配置的初始移动缩放系数进行旋转矫正，得到矫正后的移动缩放系数，最后使用矫正后的移动缩放系数和移动向量调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置，虚拟相机镜头的位置调整改变VR场景的显示。本发明实施例中手部模型的移动完全由用户操作的交互控制器来控制，用户操作交互控制器的移动由用户来决定，不需要受到攀爬区域和攀爬路线的约束，解决手部模型模拟的攀爬操作与用户实际攀爬动作不匹配的问题，实现对用户攀爬动作的准确模拟。交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量表示了用户在真实世界中进行攀爬动作移动的方向和移动距离，对于VR场景下的攀爬点设置有初始移动缩放系数，该初始移动缩放系数需要根据角色模型在世界坐标系中的方向和攀爬点在世界坐标系中的方向之间的方向差异进行矫正，使用矫正后的移动缩放系数和移动向量可以调整角色模型挂载的虚拟相机镜头在世界坐标系中的位置，因此本发明实施例中虚拟相机镜头的调整不依赖于攀爬区域和攀爬路线所定义的移动路线，任意VR场景下的攀爬操作都可以实现，让用户体验到接近真实物理的攀爬运动。
图11是本发明实施例提供的一种VR场景下的攀爬操作处理装置（以下可以简称处理装置）的结构示意图，该处理装置应用于如图1示出的VR定位系统，交互控制器与该处理装置有线或者无线连接，VR显示装置与该处理装置进行无线或者有线连接。
该处理装置1100可因配置或性能不同而产生比较大的差异,可以包括一个或一个以上中央处理器(central processing units,CPU)1122(例如,一个或一个以上处理器)和存储器1132,一个或一个以上存储应用程序1142或数据1144的存储介质1130(例如一个或一个以上海量存储设备)。其中,存储器1132和存储介质1130可以是短暂存储或持久存储。存储在存储介质1130的程序可以包括一个或一个以上模块(图示没标出),每个模块可以包括对处理装置中的一系列指令操作。更进一步地,中央处理器1122可以设置为与存储介质1130通信,在处理装置1100上执行存储介质1130中的一系列指令操作。
处理装置1100还可以包括一个或一个以上电源1126,一个或一个以上有线或无线网络接口1150,一个或一个以上输入输出接口1158,和/或,一个或一个以上操作系统1141。
上述实施例中由处理装置所执行的VR场景下的攀爬操作处理方法步骤可以基于图11所示的处理装置结构。
具体的，中央处理器1122，用于当在VR场景下的手部模型抓取到攀爬点时，获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量，交互控制器的攀爬运动控制所述手部模型在所述VR场景下进行攀爬操作，所述手部模型属于角色模型的一部分；根据所述角色模型在角色空间中的方向和所述攀爬点在所述角色空间中的方向对所述攀爬点配置的初始移动缩放系数进行旋转矫正，得到矫正后的移动缩放系数；使用所述矫正后的移动缩放系数和所述移动向量调整所述角色模型挂载的虚拟相机镜头在所述角色空间中的位置；根据所述虚拟相机镜头的位置调整改变所述VR场景的显示。
可选的,中央处理器1122,用于计算所述角色模型在所述角色空间内世界坐标系上的方向和所述攀爬点在所述世界坐标系中的方向之间的旋转方向差;根据所述旋转方向差对所述攀爬点配置的初始移动缩放系数进行旋转矫正。
可选的,中央处理器1122,还用于使用所述矫正后的移动缩放系数对所述交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量进行缩放处理,得到缩放后的移动向量;根据所述缩放后的移动向量调整所述角色模型挂载的虚拟相机镜头在所述角色空间内世界坐标系中的位置。
可选的,中央处理器1122,用于对所述缩放后的移动向量进行取反计算,得到虚拟相机镜头需要移动的最终向量;按照所述最终向量调整所述角色模型的移动,通过所述角色模型的移动驱动所述虚拟相机镜头在所述世界坐标系中的位置调整。
可选的,中央处理器1122,还用于通过所述交互控制器的移动控制所述手部模型在所述角色空间中的移动;检测所述手部模型是否与设置于所述攀爬点的碰撞包围盒发生碰撞;当检测到所述手部模型碰撞到所述碰撞包围盒时,则确定所述手部模型触碰到所述攀爬点;检测所述攀爬按键是否被触发;当检测到所述攀爬按键被触发时,则确定所述手部模型抓取到所述攀爬点;记录在不同时刻所述交互控制器在所述测量空间中的位置。
可选的,中央处理器1122,还用于固定所述手部模型在所述角色空间下的位置保持不变。
可选的,网络接口1150,用于当所述VR场景下的手部模型抓取到攀爬点时,向所述交互控制器发送反馈信号,以使所述交互控制器根据所述反馈信号发出已成功抓取所述攀爬点的反馈消息。
另外需说明的是，以上所描述的装置实施例仅仅是示意性的，其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的，作为单元显示的部件可以是或者也可以不是物理单元，即可以位于一个地方，或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。另外，本发明提供的装置实施例附图中，模块之间的连接关系表示它们之间具有通信连接，具体可以实现为一条或多条通信总线或信号线。本领域普通技术人员在不付出创造性劳动的情况下，即可以理解并实施。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到本发明可借助软件加必需的通用硬件的方式来实现,当然也可以通过专用硬件包括专用集成电路、专用CPU、专用存储器、专用元器件等来实现。一般情况下,凡由计算机程序完成的功能都可以很容易地用相应的硬件来实现,而且,用来实现同一功能的具体硬件结构也可以是多种多样的,例如模拟电路、数字电路或专用电路等。但是,对本发明而言更多情况下软件程序实现是更佳的实施方式。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在可读取的存储介质中,如计算机的软盘、U盘、移动硬盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁碟或者光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本发明各个实施例所述的方法。

Claims (16)

  1. 一种虚拟现实VR场景下的攀爬操作处理方法,其特征在于,包括:
    当在VR场景下的手部模型抓取到攀爬点时,获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量,所述交互控制器的攀爬运动控制所述手部模型在所述VR场景下进行攀爬操作,所述手部模型属于角色模型的一部分;
    根据所述角色模型在角色空间中的方向和所述攀爬点在所述角色空间中的方向对所述攀爬点配置的初始移动缩放系数进行旋转矫正,得到矫正后的移动缩放系数;
    使用所述矫正后的移动缩放系数和所述移动向量调整所述角色模型挂载的虚拟相机镜头在所述角色空间中的位置;
    根据所述虚拟相机镜头的位置调整改变所述VR场景的显示。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述角色模型在角色空间中的方向和所述攀爬点在角色空间中的方向对所述攀爬点配置的初始移动缩放系数进行旋转矫正,包括:
    计算所述角色模型在所述角色空间内世界坐标系上的方向和所述攀爬点在所述世界坐标系中的方向之间的旋转方向差;
    根据所述旋转方向差对所述攀爬点配置的初始移动缩放系数进行旋转矫正。
  3. 根据权利要求1所述的方法,其特征在于,所述使用所述矫正后的移动缩放系数和所述移动向量调整所述角色模型挂载的虚拟相机镜头在所述角色空间中的位置,包括:
    使用所述矫正后的移动缩放系数对所述交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量进行缩放处理,得到缩放后的移动向量;
    根据所述缩放后的移动向量调整所述角色模型挂载的虚拟相机镜头在所述角色空间内世界坐标系中的位置。
  4. 根据权利要求3所述的方法，其特征在于，所述根据所述缩放后的移动向量调整所述角色模型挂载的虚拟相机镜头在所述世界坐标系中的位置，包括：
    对所述缩放后的移动向量进行取反计算,得到虚拟相机镜头需要移动的最终向量;
    按照所述最终向量调整所述角色模型的移动,通过所述角色模型的移动驱动所述虚拟相机镜头在所述世界坐标系中的位置调整。
  5. 根据权利要求1所述的方法,其特征在于,所述交互控制器上设置有攀爬按键,所述方法还包括:
    通过所述交互控制器的移动控制所述手部模型在所述角色空间中的移动;
    检测所述手部模型是否与设置于所述攀爬点的碰撞包围盒发生碰撞;
    当检测到所述手部模型碰撞到所述碰撞包围盒时,则确定所述手部模型触碰到所述攀爬点;
    检测所述攀爬按键是否被触发;
    当检测到所述攀爬按键被触发时,则确定所述手部模型抓取到所述攀爬点;
    记录在不同时刻所述交互控制器在所述测量空间中的位置。
  6. 根据权利要求5所述的方法,其特征在于,当所述攀爬按键被触发时,所述方法还包括:
    固定所述手部模型在所述角色空间下的位置保持不变。
  7. 根据权利要求1至6中任一项所述的方法,其特征在于,当所述VR场景下的手部模型抓取到攀爬点时,所述方法还包括:
    向所述交互控制器发送反馈信号,以使所述交互控制器根据所述反馈信号发出已成功抓取所述攀爬点的反馈消息。
  8. 一种虚拟现实VR场景下的攀爬操作处理装置,其特征在于,包括:
    移动向量获取模块,用于当在所述VR场景下的手部模型抓取到攀爬点时,获取交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量,所述交互控制器的攀爬运动控制所述手部模型在所述VR场景下进行攀爬操作,所述手部模型属于角色模型的一部分;
    缩放系数校正模块，用于根据所述角色模型在角色空间中的方向和所述攀爬点在所述角色空间中的方向对所述攀爬点配置的初始移动缩放系数进行旋转矫正，得到矫正后的移动缩放系数；
    显示调整模块,用于根据所述矫正后的移动缩放系数和所述移动向量调整所述角色模型挂载的虚拟相机镜头在所述世界坐标系中的位置,根据所述虚拟相机镜头的位置调整改变所述VR场景的显示。
  9. 根据权利要求8所述的装置,其特征在于,所述缩放系数校正模块,具体用于计算所述角色模型在所述角色空间内世界坐标系上的方向和所述攀爬点在所述角色空间中的方向之间的旋转方向差;根据所述旋转方向差对所述攀爬点配置的初始移动缩放系数进行旋转矫正。
  10. 根据权利要求8所述的装置,其特征在于,所述显示调整模块,包括:
    移动向量缩放模块,用于使用所述矫正后的移动缩放系数对所述交互控制器在测量空间中从攀爬开始时刻到当前攀爬时刻产生的移动向量进行缩放处理,得到缩放后的移动向量;
    镜头位置调整模块,用于根据所述缩放后的移动向量调整所述角色模型挂载的虚拟相机镜头在所述角色空间内世界坐标系中的位置。
  11. 根据权利要求10所述的装置,其特征在于,所述镜头位置调整模块,具体用于对所述缩放后的移动向量进行取反计算,得到虚拟相机镜头需要移动的最终向量;按照所述最终向量调整所述角色模型的移动,通过所述角色模型的移动驱动所述虚拟相机镜头在所述世界坐标系中的位置调整。
  12. 根据权利要求8所述的装置,其特征在于,所述交互控制器上设置有攀爬按键,所述VR场景下的攀爬操作处理装置,还包括:
    手部模型控制模块,用于通过所述交互控制器的移动控制所述手部模型在所述角色空间中的移动;
    碰撞检测模块,用于检测所述手部模型是否与设置于所述攀爬点的碰撞包围盒进行碰撞,当所述手部模型碰撞到所述碰撞包围盒时,确定所述手部模型触碰到所述攀爬点;
    抓取检测模块,用于检测所述交互控制器上设置的攀爬按键是否被触发,当所述攀爬按键被触发时,确定所述手部模型抓取到所述攀爬点;
    位置记录模块,用于记录在不同时刻所述交互控制器在所述测量空间中的位置。
  13. 根据权利要求12所述的装置,其特征在于,当所述攀爬按键被触发时,所述VR场景下的攀爬操作处理装置,还包括:
    锁定模块,用于当所述手部模型抓取到所述攀爬点时,固定所述手部模型在所述VR场景下的位置保持不变。
  14. 根据权利要求8至13中任一项所述的装置，其特征在于，所述VR场景下的攀爬操作处理装置，还包括：振动反馈模块，用于当所述VR场景下的手部模型抓取到攀爬点时，向所述交互控制器发送反馈信号，以使所述交互控制器根据所述反馈信号发出已成功抓取所述攀爬点的反馈消息。
  15. 一种虚拟现实VR场景下的攀爬操作处理装置,其特征在于,包括:
    存储器,用于存储计算机可执行程序代码;
    处理器,与所述存储器耦合;
    其中所述程序代码包括指令,当所述处理器执行所述指令时,所述指令使所述虚拟现实VR场景下的攀爬操作处理装置执行权利要求1至7中任一项所述的方法。
  16. 一种计算机可读存储介质,其特征在于,包括指令,当其在计算机上运行时,使得计算机执行如权利要求1至7任一项所述的方法。
PCT/CN2017/114616 2016-12-07 2017-12-05 一种vr场景下的攀爬操作处理方法、装置及可读存储介质 WO2018103635A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611116880.4A CN106582012B (zh) 2016-12-07 2016-12-07 一种vr场景下的攀爬操作处理方法和装置
CN201611116880.4 2016-12-07

Publications (1)

Publication Number Publication Date
WO2018103635A1 true WO2018103635A1 (zh) 2018-06-14

Family

ID=58596530

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/114616 WO2018103635A1 (zh) 2016-12-07 2017-12-05 一种vr场景下的攀爬操作处理方法、装置及可读存储介质

Country Status (2)

Country Link
CN (1) CN106582012B (zh)
WO (1) WO2018103635A1 (zh)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110548280A (zh) * 2019-09-11 2019-12-10 珠海金山网络游戏科技有限公司 一种虚拟相机的控制方法及装置
CN110652723A (zh) * 2018-06-29 2020-01-07 深圳市掌网科技股份有限公司 一种在球体游戏室产生力反馈效果的互动系统
CN110652722A (zh) * 2018-06-29 2020-01-07 深圳市掌网科技股份有限公司 一种在球体游戏室产生力反馈效果的互动系统
CN111652908A (zh) * 2020-04-17 2020-09-11 国网山西省电力公司晋中供电公司 一种虚拟现实场景的操作碰撞检测方法
CN111773724A (zh) * 2020-07-31 2020-10-16 网易(杭州)网络有限公司 一种跨越虚拟障碍的方法和装置
CN113457136A (zh) * 2021-06-29 2021-10-01 完美世界(北京)软件科技发展有限公司 游戏动画的生成方法及装置、存储介质、终端
CN114327076A (zh) * 2022-01-04 2022-04-12 上海三一重机股份有限公司 作业机械与作业环境的虚拟交互方法、装置及系统
CN114495611A (zh) * 2020-11-11 2022-05-13 郑州畅想高科股份有限公司 一种基于vr的机械间检修模拟培训方法
CN115509360A (zh) * 2022-10-11 2022-12-23 云宝宝大数据产业发展有限责任公司 基于元宇宙虚拟现实vr交互系统

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106582012B (zh) * 2016-12-07 2018-12-11 腾讯科技(深圳)有限公司 一种vr场景下的攀爬操作处理方法和装置
CN107281752A (zh) * 2017-06-16 2017-10-24 苏州蜗牛数字科技股份有限公司 一种在vr游戏中设置智能虚拟导游的方法
CN109688343A (zh) * 2017-10-18 2019-04-26 深圳市掌网科技股份有限公司 增强现实演播室的实现方法和装置
CN108196669B (zh) * 2017-12-14 2021-04-02 网易(杭州)网络有限公司 游戏角色模型的修正方法、装置、处理器及头戴式显示设备
CN111246095B (zh) 2020-01-17 2021-04-27 腾讯科技(深圳)有限公司 控制镜头运动的方法、装置、设备及存储介质
CN111768474B (zh) * 2020-05-15 2021-08-20 完美世界(北京)软件科技发展有限公司 动画生成方法、装置、设备
CN111694432B (zh) * 2020-06-11 2023-04-07 济南大学 一种基于虚拟手交互的虚拟手位置矫正方法及系统
CN111729311B (zh) * 2020-06-22 2024-05-10 苏州幻塔网络科技有限公司 攀爬跳跃方法、装置、计算机设备及计算机可读存储介质
CN111784850B (zh) * 2020-07-03 2024-02-02 深圳市瑞立视多媒体科技有限公司 基于虚幻引擎的物体抓取仿真方法及相关设备
CN111882943A (zh) * 2020-08-17 2020-11-03 阿呆科技(北京)有限公司 一种基于虚拟现实登山运动的戒毒康复训练系统
CN113018865B (zh) * 2021-03-31 2022-07-29 腾讯科技(深圳)有限公司 攀爬线生成方法、装置、计算机设备及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103608844A (zh) * 2011-06-22 2014-02-26 微软公司 全自动动态关节连接的模型校准
CN103886124A (zh) * 2012-12-21 2014-06-25 达索系统德尔米亚公司 虚拟对象的方位校正
CN105027190A (zh) * 2013-01-03 2015-11-04 美达公司 用于虚拟或增强介导视觉的射出空间成像数字眼镜
CN105894566A (zh) * 2015-12-01 2016-08-24 乐视致新电子科技(天津)有限公司 模型渲染方法及装置
CN106096188A (zh) * 2016-06-28 2016-11-09 王勇 攀岩模拟定线方法及其系统
CN106582012A (zh) * 2016-12-07 2017-04-26 腾讯科技(深圳)有限公司 一种vr场景下的攀爬操作处理方法和装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020036617A1 (en) * 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
JP4384697B2 (ja) * 2008-03-26 2009-12-16 株式会社コナミデジタルエンタテインメント ゲーム装置、ゲーム処理方法、ならびに、プログラム
US20140282275A1 (en) * 2013-03-15 2014-09-18 Qualcomm Incorporated Detection of a zooming gesture

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103608844A (zh) * 2011-06-22 2014-02-26 微软公司 全自动动态关节连接的模型校准
CN103886124A (zh) * 2012-12-21 2014-06-25 达索系统德尔米亚公司 虚拟对象的方位校正
CN105027190A (zh) * 2013-01-03 2015-11-04 美达公司 用于虚拟或增强介导视觉的射出空间成像数字眼镜
CN105894566A (zh) * 2015-12-01 2016-08-24 乐视致新电子科技(天津)有限公司 模型渲染方法及装置
CN106096188A (zh) * 2016-06-28 2016-11-09 王勇 攀岩模拟定线方法及其系统
CN106582012A (zh) * 2016-12-07 2017-04-26 腾讯科技(深圳)有限公司 一种vr场景下的攀爬操作处理方法和装置

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110652723A (zh) * 2018-06-29 2020-01-07 深圳市掌网科技股份有限公司 一种在球体游戏室产生力反馈效果的互动系统
CN110652722A (zh) * 2018-06-29 2020-01-07 深圳市掌网科技股份有限公司 一种在球体游戏室产生力反馈效果的互动系统
CN110548280A (zh) * 2019-09-11 2019-12-10 珠海金山网络游戏科技有限公司 一种虚拟相机的控制方法及装置
CN110548280B (zh) * 2019-09-11 2023-02-17 珠海金山数字网络科技有限公司 一种虚拟相机的控制方法及装置
CN111652908A (zh) * 2020-04-17 2020-09-11 国网山西省电力公司晋中供电公司 一种虚拟现实场景的操作碰撞检测方法
CN111773724A (zh) * 2020-07-31 2020-10-16 网易(杭州)网络有限公司 一种跨越虚拟障碍的方法和装置
CN111773724B (zh) * 2020-07-31 2024-04-26 网易(上海)网络有限公司 一种跨越虚拟障碍的方法和装置
CN114495611A (zh) * 2020-11-11 2022-05-13 郑州畅想高科股份有限公司 一种基于vr的机械间检修模拟培训方法
CN113457136A (zh) * 2021-06-29 2021-10-01 完美世界(北京)软件科技发展有限公司 游戏动画的生成方法及装置、存储介质、终端
CN114327076A (zh) * 2022-01-04 2022-04-12 上海三一重机股份有限公司 作业机械与作业环境的虚拟交互方法、装置及系统
CN115509360A (zh) * 2022-10-11 2022-12-23 云宝宝大数据产业发展有限责任公司 基于元宇宙虚拟现实vr交互系统
CN115509360B (zh) * 2022-10-11 2023-10-20 云宝宝大数据产业发展有限责任公司 基于元宇宙虚拟现实vr交互系统

Also Published As

Publication number Publication date
CN106582012B (zh) 2018-12-11
CN106582012A (zh) 2017-04-26

Similar Documents

Publication Publication Date Title
WO2018103635A1 (zh) 一种vr场景下的攀爬操作处理方法、装置及可读存储介质
US10282882B2 (en) Augmented reality simulation continuum
CN107132917B (zh) 用于虚拟现实场景中的手型显示方法及装置
WO2016123816A1 (zh) 一种射击游戏的瞄准方法及装置
US11173362B2 (en) Analysis apparatus, analysis method, and recording medium
WO2019019968A1 (zh) 虚拟角色的位移控制方法、装置和存储介质
CN108369478A (zh) 用于交互反馈的手部跟踪
JP6723738B2 (ja) 情報処理装置、情報処理方法及びプログラム
CN103501869A (zh) 手动和基于相机的游戏控制
JP2012239761A5 (zh)
US20190332182A1 (en) Gesture display method and apparatus for virtual reality scene
WO2018103656A1 (zh) 一种vr场景下的道具运动处理方法、装置及存储介质
CN108355347B (zh) 交互控制方法、装置、电子设备及存储介质
CN102004840A (zh) 一种基于计算机实现虚拟拳击的方法和系统
WO2015098251A1 (ja) 情報処理装置、記録媒体および情報処理方法
JP2018514836A (ja) バーチャル及びオーグメンテッドリアリティ環境におけるコントローラ可視化
WO2017012362A1 (zh) 调整虚拟物件在虚拟空间中姿态角的方法及装置
CN111784850B (zh) 基于虚幻引擎的物体抓取仿真方法及相关设备
WO2023078272A1 (zh) 虚拟对象显示方法、装置、电子设备及可读介质
CN107102725B (zh) 一种基于体感手柄进行虚拟现实移动的控制方法及系统
KR101348419B1 (ko) 영상컨텐츠를 제공하는 가상 골프 시뮬레이션 장치 및 그 방법
US20230162458A1 (en) Information processing apparatus, information processing method, and program
JP2015002911A (ja) 運動解析装置および運動解析プログラム
JP6772424B2 (ja) 投影装置、投影方法及びプログラム
WO2018074054A1 (ja) 表示制御装置、表示制御方法及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17879024

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17879024

Country of ref document: EP

Kind code of ref document: A1