WO2019153750A1 - Method, apparatus, device and storage medium for switching the viewing angle of a virtual environment - Google Patents

Method, apparatus, device and storage medium for switching the viewing angle of a virtual environment

Info

Publication number
WO2019153750A1
Authority
WO
WIPO (PCT)
Prior art keywords
view
jump
viewing angle
jump element
scale
Prior art date
Application number
PCT/CN2018/106552
Other languages
English (en)
French (fr)
Inventor
仇蒙
汪俊明
潘佳绮
张雅
张书婷
肖庆华
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Priority to EP18904652.7A priority Critical patent/EP3751522A4/en
Publication of WO2019153750A1 publication Critical patent/WO2019153750A1/zh
Priority to US16/886,111 priority patent/US11256384B2/en
Priority to US17/541,350 priority patent/US11703993B2/en

Classifications

    • A63F13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • G06T19/006: Mixed reality
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426: Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/525: Changing parameters of virtual cameras
    • A63F13/5252: Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room, or displaying a rear-mirror view in a car-driving game
    • A63F13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F13/537: Controlling output signals based on the game progress involving additional visual information provided to the game scene, using indicators, e.g. showing the condition of a game character on screen
    • A63F13/837: Special adaptations for executing a specific game genre or game mode: shooting of targets
    • A63F13/92: Video game devices specially adapted to be hand-held while playing
    • G06F3/0414: Digitisers, e.g. for touch screens or touch pads, using force sensing means to determine a position
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/04855: Interaction with scrollbars
    • G06F3/04883: Inputting data by handwriting on a touch-screen or digitiser, e.g. gesture or text
    • G06T15/20: Perspective computation
    • G06T19/003: Navigation within 3D models or images
    • G06T2200/24: Indexing scheme for image data processing or generation involving graphical user interfaces [GUIs]

Definitions

  • The present application relates to the field of human-computer interaction, and in particular, to a method, apparatus, device, and storage medium for performing perspective switching on a three-dimensional virtual environment.
  • Many applications on terminals support three-dimensional virtual environments, such as virtual reality applications, three-dimensional map applications, military simulation applications, first-person shooter (FPS) games, and multiplayer online battle arena (MOBA) games.
  • the terminal uses the first person perspective of the virtual character to display the environment picture in the virtual environment.
  • Typically, the user interface of an FPS game displays a left-hand joystick and a right-hand screen-sliding area.
  • The user can use the left-hand joystick to change the position of the virtual character in the three-dimensional virtual environment, and slide the screen with the right hand to change the viewing direction of the virtual character in the three-dimensional virtual environment.
  • For example, if the user slides the right finger to the left, the viewing direction of the virtual character rotates to the left in the three-dimensional virtual environment.
  • The embodiments of the present application provide a method, apparatus, device, and storage medium for performing view switching on a virtual environment, which can solve the problem that a wide-range viewing-angle switch requires the user to slide the screen with the right hand multiple times, making the switch slow.
  • the technical solution is as follows:
  • a method for performing perspective switching on a virtual environment comprising:
  • displaying a first user interface, where the first user interface includes an environment picture and a viewing-angle switching area when the virtual object observes the virtual environment in a first viewing-angle direction, and the viewing-angle switching area includes at least one view jump element for performing a view jump;
  • receiving a view jump signal triggered on a target view jump element, the target view jump element being one of the at least one view jump element;
  • determining a second viewing-angle direction corresponding to the target view jump element; and
  • displaying a second user interface, the second user interface including an environment picture when the virtual object observes the virtual environment in the second viewing-angle direction.
  • a viewing-angle switching apparatus configured to implement view switching of a virtual environment, the apparatus including:
  • a display module configured to display a first user interface, where the first user interface includes an environment picture and a viewing-angle switching area when the virtual object observes the virtual environment in a first viewing-angle direction, and the viewing-angle switching area includes at least one view jump element for performing a view jump;
  • a receiving module configured to receive a view jump signal triggered on a target view jump element, where the target view jump element is one of the at least one view jump element;
  • a processing module configured to determine a second viewing angle direction corresponding to the target viewing angle jump element
  • the display module is further configured to display a second user interface, where the second user interface includes an environment screen when the virtual object observes the virtual environment in a second view direction.
  • an electronic device for implementing view switching of a virtual environment, the device comprising a processor and a memory, the memory storing at least one instruction, the instruction being loaded and executed by the processor to implement the method for switching the viewing angle of a virtual environment as described above.
  • a computer-readable storage medium having stored therein at least one instruction, the instruction being loaded and executed by a processor to implement the method for switching the viewing angle of a virtual environment as described above.
  • a computer program product storing at least one instruction, the instruction being loaded and executed by a processor to implement the method for switching the viewing angle of a virtual environment as described above.
  • By receiving a view jump signal triggered on a target view jump element in the viewing-angle switching area and displaying the second user interface according to the second viewing-angle direction corresponding to that element, the embodiments solve the problem of low human-computer interaction efficiency in the related art, where the viewing direction of the virtual object is changed by sliding the screen with the right hand. The viewing angle of the virtual object is switched by clicking the target view jump element rather than by sliding the screen, which increases the switching speed of the viewing angle and improves the efficiency of wide-range viewing-angle switching.
  • FIG. 1 is a schematic diagram of a display interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application
  • FIG. 2 is a schematic diagram of a world coordinate system and a local coordinate system of a camera model in a three-dimensional virtual environment provided by an exemplary embodiment of the present application;
  • FIG. 3 is a schematic diagram of rotation of a camera model around a vector R according to an exemplary embodiment of the present application
  • FIG. 4 is a schematic diagram of rotation of a camera model around a vector U according to an exemplary embodiment of the present application
  • FIG. 5 is a schematic diagram of rotation of a camera model around a vector L according to an exemplary embodiment of the present application
  • FIG. 6 is a perspective view of a camera model in a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 7 is a schematic diagram of a view conversion of a camera model provided by an exemplary embodiment of the present application.
  • FIG. 8 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application.
  • FIG. 9 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application.
  • FIG. 10 is a flowchart of a method for performing perspective switching on a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 11 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 12 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 13 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 14 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 15 is a flowchart of a method for performing perspective switching on a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 16 is a flowchart of a method for performing perspective switching on a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 17 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 18 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 19 is a flowchart of a method for performing perspective switching on a virtual environment according to an exemplary embodiment of the present application.
  • FIG. 20 is a structural block diagram of a view switching device according to an exemplary embodiment of the present application.
  • FIG. 21 is a structural block diagram of an electronic device according to an exemplary embodiment of the present application.
  • Virtual environment: a virtual environment that is displayed (or provided) when an application is running on a terminal.
  • The virtual environment can be a real-world simulation environment, a semi-simulated, semi-fictional environment, or a purely fictional environment.
  • The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the following embodiments are exemplified by the virtual environment being a three-dimensional virtual environment, but are not limited thereto.
  • the virtual environment is also used for a virtual environment battle between at least two virtual characters.
  • The virtual environment is further used for battle between at least two virtual characters using virtual firearms.
  • The virtual environment is further used for battle between at least two virtual characters using virtual firearms within a target area, where the target area continuously shrinks as time passes in the virtual environment.
  • The following description takes the case where the virtual environment is a three-dimensional virtual environment as an example.
  • Virtual object: a movable object in a three-dimensional virtual environment.
  • The movable object may be a virtual character, a virtual creature, an anime character, or the like, such as a character, an animal, a plant, an oil drum, a wall, or a rock displayed in a three-dimensional virtual environment.
  • the virtual object is a three-dimensional model created based on animated bone techniques.
  • Each virtual object has its own shape and volume in a three-dimensional virtual environment, occupying a part of the space in the three-dimensional virtual environment.
  • A virtual object includes a center point. When the virtual object is a regular object, the center point is the geometric center of the virtual object; when the virtual object is an irregular object, the center point may be preset according to the motion state of the virtual object.
  • Viewing direction: the direction of observation when the three-dimensional virtual environment is viewed from the first-person perspective or the third-person perspective of the virtual object.
  • World coordinate system refers to the three-dimensional coordinate system established for the three-dimensional virtual environment.
  • the world coordinate system is used to measure the coordinates, direction vector, motion vector and other parameters of the virtual object in the three-dimensional virtual environment.
  • The world coordinate system includes three axes: the X axis, the Y axis, and the Z axis.
  • the plane in which the X and Z axes are located is the horizontal plane, that is, the reference ground, and the upper half of the Y axis represents the height relative to the ground.
  • The posture parameters of the virtual object in the three-dimensional virtual environment include three-dimensional coordinates (X, Y, Z), where X and Z represent the coordinates of the virtual object relative to the ground, and Y represents the height of the virtual object relative to the ground.
  • Camera model: a three-dimensional model located around a virtual object in a three-dimensional virtual environment.
  • When the first-person perspective is adopted, the camera model is located near the head of the virtual object or at the head of the virtual object.
  • When the third-person perspective is adopted, the camera model is located behind the virtual object 110, and the three-dimensional virtual environment is observed from the perspective of the camera model.
  • FIG. 1 is a schematic diagram of an interface of a three-dimensional virtual environment provided by an exemplary embodiment of the present application.
  • The display interface 100 of the three-dimensional virtual environment includes a virtual object 110, a viewing-angle switching area 120, an environment picture 130 of the three-dimensional virtual environment observed from the perspective of the virtual object 110, and a virtual left-hand joystick 140, where the virtual left-hand joystick 140 is an optional control element by which the user can move the virtual object 110 forward, backward, to the left, and to the right.
  • The three-dimensional virtual environment refers to a virtual environment generated by the application during running. It enables the user to enter a multimedia three-dimensional virtual world, in which the user can control a virtual object through an operating device or an operation interface, observe objects, characters, and scenery in the three-dimensional virtual environment from the perspective of the virtual object, or make the virtual object interact with the objects, characters, and scenery in the three-dimensional virtual environment, for example, by operating a virtual soldier to attack a target enemy.
  • The virtual object 110 is a three-dimensional model built in the three-dimensional virtual environment, and the environment picture of the three-dimensional virtual environment displayed in the display interface 100 is the scene observed from the viewing angle of the virtual object 110.
  • the displayed environmental image 130 of the three-dimensional virtual environment includes the ground 134, the sky 135, the horizon 133, the hill 131, and the building 132.
  • In the third-person perspective, the virtual object 110 is displayed in the display interface 100; in the first-person perspective, the virtual object 110 is not displayed in the display interface 100.
  • the perspective of the virtual object 110 can be simulated (or implemented) by establishing a camera model in a three-dimensional virtual environment.
  • When the first-person perspective is adopted, the camera model is located near the head of the virtual object or at the head of the virtual object.
  • When the third-person perspective is adopted, the camera model is located behind the virtual object 110, and the three-dimensional virtual environment is observed from the perspective of the camera model.
  • the three-dimensional virtual environment has a world coordinate system of the X-axis, the Y-axis, and the Z-axis, and the virtual models in the world coordinate system all have corresponding position coordinates.
  • The position coordinates of the camera model are defined by a reference point P with coordinates (X1, Y1, Z1) in the world coordinate system, which may be the physical center point of the camera model.
  • The camera model in the three-dimensional virtual environment can be described by a vertically upward vector U (Up), a horizontally rightward vector R (Right), a forward vector L (Look), and the reference point P.
  • The motion of the camera model may include the following actions: rotation around the vector R, rotation around the vector U, rotation around the vector L, translation along the vector R, lifting along the vector U, and translation along the vector L. In the embodiments of the present application, the viewing-angle switching of the virtual object mainly involves rotation around the vector R and/or rotation around the vector U, as in the sketch below.
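  • For illustration only (not part of the original patent text), a minimal sketch of these two rotations using Rodrigues' rotation formula; the Vec3 type and all names are illustrative assumptions:

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// Minimal 3D vector type for this sketch.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
    fun cross(o: Vec3) = Vec3(y * o.z - z * o.y, z * o.x - x * o.z, x * o.y - y * o.x)
}

// Rodrigues' formula: rotate v around the unit axis k by `angle` radians.
fun rotateAround(v: Vec3, k: Vec3, angle: Double): Vec3 =
    v * cos(angle) + k.cross(v) * sin(angle) + k * (k.dot(v) * (1 - cos(angle)))

// Camera basis vectors U (Up), R (Right), L (Look) and reference point P.
class CameraModel(var u: Vec3, var r: Vec3, var look: Vec3, var p: Vec3) {
    // Rotation around R (pitch): U and L rotate, R stays fixed.
    fun rotateAroundR(angle: Double) {
        u = rotateAround(u, r, angle)
        look = rotateAround(look, r, angle)
    }
    // Rotation around U (yaw): R and L rotate, U stays fixed.
    fun rotateAroundU(angle: Double) {
        r = rotateAround(r, u, angle)
        look = rotateAround(look, u, angle)
    }
}
```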
  • Fig. 3 shows a schematic diagram of the rotation of the camera model around the vector R.
  • the left side of the figure shows the camera model without rotation, and the right side of the figure is the camera model rotated around the vector R.
  • Figure 4 shows a schematic diagram of the rotation of the camera model around the vector U.
  • the left side of the figure shows the camera model without rotation, and the right side of the figure is the camera model rotated around the vector U.
  • Fig. 5 shows a schematic diagram of the rotation of the camera model around the vector L.
  • the left side of the figure shows the camera model without rotation, and the right side of the figure is the camera model rotated around the vector L.
  • The implementation of the camera model usually covers the rotation of the camera model, the scaling of the camera model, and the vibration of the camera model.
  • The properties of the camera model include the viewing angle, the viewing distance, the wheel movement speed, the wheel movement acceleration, the viewpoint of the virtual object in the three-dimensional virtual environment, and the height of the virtual object.
  • The modes and parameters associated with the camera model can be defined as follows:
  • Projection: the projection method is divided into perspective mode and orthographic mode.
  • Perspective: the camera model renders observable objects with full perspective, producing a near-large, far-small effect. The parameter of the perspective mode is the field of view (Field of View).
  • Orthographic: the camera model renders observable objects uniformly, without perspective. The parameter of the orthographic mode is the size of the field of view (Size).
  • Clipping Planes: the range rendered by the camera model, defined by the nearest point (Near) and the farthest point (Far).
  • Viewport Rect: four values that control the position and size of the camera model's view in the display window, using the screen coordinate system: x (horizontal start position), y (vertical start position), w (width), and h (height).
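  • As a hedged illustration of the parameters listed above (field names and default values are assumptions, not values from the patent), they can be gathered into a single parameter holder:

```kotlin
// Illustrative holder for the camera-model modes and parameters described above.
enum class Projection { PERSPECTIVE, ORTHOGRAPHIC }

data class CameraParams(
    val projection: Projection = Projection.PERSPECTIVE,
    val fieldOfViewDeg: Double = 60.0,  // perspective mode: Field of View
    val orthoSize: Double = 5.0,        // orthographic mode: Size
    val nearClip: Double = 0.3,         // Clipping Planes: nearest point (Near)
    val farClip: Double = 1000.0,       // Clipping Planes: farthest point (Far)
    // Viewport Rect in screen coordinates: start position and extent.
    val viewportX: Double = 0.0,
    val viewportY: Double = 0.0,
    val viewportW: Double = 1.0,
    val viewportH: Double = 1.0,
)
```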
  • The camera model 300 includes a fixed viewing angle α and a fixed viewing distance d.
  • The environment picture of the three-dimensional virtual environment displayed in the display interface is the part of the three-dimensional virtual environment covered by the viewing angle α and the viewing distance d of the camera model.
  • The direction of the viewing angle α is the direction pointed to by the central axis of the viewing angle α, which is the direction of the vector L.
  • the framing transformation of the camera model is to transform the camera model to the origin of the world coordinate system, and rotate the camera model so that the optical axis is aligned with the positive direction of the z-axis of the world coordinate system.
  • All objects in the three-dimensional virtual environment, such as objects A, B, C, and D in FIG. 7, are transformed together with the camera model to ensure that the field of view of the camera model remains unchanged.
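  • The framing transformation described above corresponds to the standard view matrix built from the camera basis and reference point; a minimal sketch follows, with the Vec3 type repeated locally so the snippet is self-contained (illustrative, not from the patent):

```kotlin
data class Vec3(val x: Double, val y: Double, val z: Double) {
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
}

// Row-major 4x4 view matrix: moves the camera at P to the origin and aligns
// its look axis L with the positive z-axis (rows are R, U, L).
fun viewMatrix(r: Vec3, u: Vec3, l: Vec3, p: Vec3): Array<DoubleArray> = arrayOf(
    doubleArrayOf(r.x, r.y, r.z, -r.dot(p)),
    doubleArrayOf(u.x, u.y, u.z, -u.dot(p)),
    doubleArrayOf(l.x, l.y, l.z, -l.dot(p)),
    doubleArrayOf(0.0, 0.0, 0.0, 1.0),
)
```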
  • FIG. 8 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the terminal 800 includes an operating system 820 and an application 822.
  • Operating system 820 is the underlying software that provides application 822 with secure access to computer hardware.
  • Application 822 is an application that supports a virtual environment.
  • the application 822 is an application that supports a three-dimensional virtual environment.
  • The application 822 can be a virtual reality application, a three-dimensional map application, a military simulation application, a third-person shooter (TPS) game, a first-person shooter (FPS) game, a MOBA game, or a multiplayer gun-battle survival game.
  • the application 822 can be a stand-alone version of an application, such as a stand-alone version of a 3D gaming application.
  • FIG. 9 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 900 includes a first terminal 920, a server 940, and a second terminal 960.
  • the first terminal 920 installs and runs an application that supports the virtual environment.
  • the application may be any one of a virtual reality application, a three-dimensional map application, a military simulation application, a TPS game, an FPS game, a MOBA game, a multiplayer gun battle survival game.
  • The first terminal 920 is a terminal used by the first user, and the first user uses the first terminal 920 to control a first virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated persona or an anime character.
  • the first terminal 920 is connected to the server 940 via a wireless network or a wired network.
  • the server 940 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center.
  • Server 940 is used to provide background services for applications that support a three-dimensional virtual environment.
  • the server 940 undertakes the main computing work, the first terminal 920 and the second terminal 960 undertake the secondary computing work; or the server 940 undertakes the secondary computing work, and the first terminal 920 and the second terminal 960 undertake the main computing work;
  • the server 940, the first terminal 920, and the second terminal 960 use a distributed computing architecture for collaborative computing.
  • the second terminal 960 installs and runs an application that supports the virtual environment.
  • the application can be any of a virtual reality application, a three-dimensional map application, a military simulation application, an FPS game, a MOBA game, a multiplayer gun battle survival game.
  • The second terminal 960 is a terminal used by the second user, and the second user uses the second terminal 960 to control a second virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an anime character.
  • The first avatar and the second avatar are in the same virtual environment.
  • The first avatar and the second avatar may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights.
  • The first avatar and the second avatar may also belong to different teams, different organizations, or two mutually hostile groups.
  • the applications installed on the first terminal 920 and the second terminal 960 are the same, or the applications installed on the two terminals are the same type of applications of different control system platforms.
  • The first terminal 920 may generally refer to one of a plurality of terminals, and the second terminal 960 may generally refer to another of the plurality of terminals; this embodiment is illustrated using only the first terminal 920 and the second terminal 960.
  • The terminal types of the first terminal 920 and the second terminal 960 are the same or different, and the terminal types include at least one of: a game console, a desktop computer, a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and a laptop portable computer.
  • the following embodiments are exemplified by the fact that the terminal is a mobile terminal.
  • FIG. 10 is a flowchart of a method for performing perspective switching on a virtual environment provided by an exemplary embodiment of the present application. The method may be applied to the terminal in the embodiment of FIG. 8 or FIG. 9, and includes:
  • Step 1001 displaying a first user interface.
  • the terminal displays a first user interface, where the first user interface includes an environment image and a view switching area when the virtual object observes the three-dimensional virtual environment in a first view direction, and the view switch area includes at least one view jump element for performing a view jump.
  • the view jump element is an element control in the view switching area.
  • the first viewing angle direction is a viewing angle direction according to a first person perspective of the virtual object or a third person perspective around the virtual object.
  • the view jump element includes a jump element represented by a first direction scale, and the first direction scale is used to represent all view directions or partial view directions in a 360 degree horizontal view direction centered on the position of the virtual object.
  • the first user interface 1100 includes a virtual object 1110 , an environment screen 1130 , a view switching area 1120 , and a virtual left hand stick 1140 .
  • The environment picture 1130 in the first user interface 1100 is the environment picture when the virtual object observes the three-dimensional virtual environment in the first viewing-angle direction, and includes a hill 1131 directly in front of the virtual object 1110, a building 1132 in the northeast direction of the virtual object 1110, the ground 1134, the sky 1135, and the horizon 1133.
  • The view switching area 1120 is superimposed on the environment picture 1130 and is an elongated area located at the top edge of the environment picture 1130, which includes the view jump element represented by the first direction scale.
  • The first direction scale is used to indicate all the viewing-angle directions in the 360-degree horizontal viewing direction centered on the position of the virtual object: with 0 degrees as due north and proceeding clockwise, 45 degrees is the northeast direction, 90 degrees is the east direction, 135 degrees is the southeast direction, 180 degrees is the south direction, 225 degrees is the southwest direction, 270 degrees is the west direction, and 315 degrees is the northwest direction (see the sketch below).
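  • A minimal sketch of this scale-to-direction mapping (illustrative only; assumes scale values are degrees clockwise from due north):

```kotlin
// Map a tapped scale value to a compass label; non-cardinal values fall
// through to a plain bearing description.
fun bearingLabel(degrees: Int): String {
    val d = ((degrees % 360) + 360) % 360  // normalize into [0, 360)
    return when (d) {
        0 -> "north"
        45 -> "northeast"
        90 -> "east"
        135 -> "southeast"
        180 -> "south"
        225 -> "southwest"
        270 -> "west"
        315 -> "northwest"
        else -> "$d degrees clockwise from north"
    }
}
```

  • For example, a tap on the 30-degree mark yields "30 degrees clockwise from north", matching the "30 degrees east of due north" example used later in this description.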
  • The view jump element includes a jump element represented by a second direction scale, and the second direction scale is used to indicate all or part of the viewing-angle directions between a first bottom-view direction and a second top-view direction of the virtual object.
  • the first user interface 1200 includes a virtual object 1210, an environment screen 1230, a view switching area 1220, and a virtual left hand stick 1240.
  • The environment picture 1230 in the first user interface 1200 is the environment picture when the virtual object 1210 observes the three-dimensional virtual environment in the first viewing-angle direction, and includes a hill 1231 far away from the virtual object 1210, a factory 1232 closer to the virtual object 1210, the ground 1234, the sky 1235, and the horizon 1233, where the viewing angle of the virtual object is in the -45-degree top-view direction.
  • The view switching area 1220 is superimposed on the environment picture 1230 and is an elongated area located at the right edge of the environment picture 1230, which includes a view jump element represented by the second direction scale. In this exemplary embodiment, the second direction scale is used to indicate all the viewing-angle directions between the first bottom-view direction and the second top-view direction of the virtual object: the vertically upward direction is the first bottom-view direction, corresponding to an angle of 90 degrees, and the vertically downward direction is the second top-view direction, corresponding to an angle of -90 degrees (see the sketch below).
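  • A minimal sketch of mapping a touch on this vertical scale strip to a pitch angle in the [-90, 90] degree range described above (the strip geometry and names are illustrative assumptions):

```kotlin
// touchY is the touch position; stripTop/stripBottom bound the scale strip.
// The top of the strip maps to +90 (bottom view, looking straight up) and the
// bottom maps to -90 (top view, looking straight down).
fun pitchFromTouch(touchY: Float, stripTop: Float, stripBottom: Float): Double {
    val t = ((touchY - stripTop) / (stripBottom - stripTop)).coerceIn(0f, 1f)
    return 90.0 - 180.0 * t  // t = 0 at the top (+90), t = 1 at the bottom (-90)
}
```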
  • the view jump element includes a jump element represented by the first sub-area, and each of the first sub-areas is used to represent one of the 360-degree horizontal view directions centered on the position of the virtual object.
  • the first user interface 1300 includes a virtual object 1310, an environment screen 1330, a view switching area 1320, and a virtual left hand stick 1340.
  • The environment picture 1330 in the first user interface 1300 is the environment picture when the virtual object 1310 observes the three-dimensional virtual environment in the first viewing-angle direction, and includes a hill 1331 directly in front of the virtual object 1310, a building 1332 in the northeast direction of the virtual object 1310, the ground 1334, the sky 1335, and the horizon 1333.
  • The view switching area 1320 is superimposed on the environment picture 1330 and is an elongated area located at the top edge of the environment picture 1330, which includes view jump elements represented by a plurality of first sub-areas, each first sub-area corresponding to one viewing-angle direction in the 360-degree horizontal viewing direction centered on the position of the virtual object, where each first sub-area may indicate its corresponding viewing-angle direction by an icon or text.
  • the view jump element includes a jump element represented by the second sub-area, and each of the second sub-areas is used to represent a view direction between the first look-up direction of the virtual object and the second top view direction.
  • the first user interface 1400 includes a virtual object 1410, an environment screen 1430, a view switching area 1420, and a virtual left hand stick 1440.
  • The environment picture 1430 in the first user interface 1400 is the environment picture when the virtual object 1410 observes the three-dimensional virtual environment in the first viewing-angle direction, and includes a hill 1431 far away from the virtual object 1410, a factory 1432 closer to the virtual object 1410, the ground 1434, the sky 1435, and the horizon 1433, where the viewing angle of the virtual object is in the -45-degree top-view direction.
  • The view switching area 1420 is superimposed on the environment picture 1430 and is an elongated area located at the right edge of the environment picture 1430, which includes view jump elements represented by a plurality of second sub-areas, each second sub-area corresponding to a viewing-angle direction between the first bottom-view direction and the second top-view direction of the virtual object, where each second sub-area may indicate its corresponding viewing-angle direction by an icon or text.
  • The embodiments of FIG. 11, FIG. 12, FIG. 13, and FIG. 14 may be implemented separately, or the embodiments of FIG. 11 and FIG. 12 may be implemented in combination, or the embodiments of FIG. 13 and FIG. 14 may be implemented in combination, or the embodiments of FIG. 11 and FIG. 14 may be implemented in combination, or the embodiments of FIG. 12 and FIG. 13 may be implemented in combination.
  • Step 1002 Receive a view jump signal triggered on the target view jump element.
  • The terminal receives a view jump signal triggered on the target view jump element; the view jump signal is a signal generated by the user clicking the target view jump element, where the target view jump element is one of the at least one view jump element.
  • The user clicks the target view jump element in the view switching area 1120, such as the 30-degree mark on the scale, to generate a view jump signal, and the terminal receives the view jump signal.
  • The user clicks the target view jump element in the view switching area 1220, such as the 0-degree mark on the scale, to generate a view jump signal, and the terminal receives the view jump signal.
  • The user clicks a target view jump element in the view switching area 1320, such as the first sub-area 6, to generate a view jump signal, and the terminal receives the view jump signal.
  • The user clicks a target view jump element in the view switching area 1420, such as the second sub-area 4, to generate a view jump signal, and the terminal receives the view jump signal.
  • Step 1003 Determine a second view direction corresponding to the target view jump element.
  • After acquiring the view jump signal, the terminal determines the second viewing-angle direction corresponding to the target view jump element according to the view jump signal.
  • The terminal determines, according to the view jump signal generated by the user clicking the 30-degree scale in the view switching area 1120, that the second viewing-angle direction is 30 degrees east of due north.
  • The terminal determines, according to the view jump signal generated by the user clicking the 0-degree scale in the view switching area 1220, that the second viewing-angle direction is the head-up 0-degree direction.
  • The terminal determines, according to the view jump signal generated by the user clicking the first sub-area 6 in the view switching area 1320 and the pre-stored direction corresponding to the first sub-area 6, that the second viewing-angle direction is 30 degrees east of due north.
  • The terminal determines, according to the view jump signal generated by the user clicking the second sub-area 4 in the view switching area 1420 and the pre-stored direction corresponding to the second sub-area 4, that the second viewing-angle direction is the head-up 0-degree direction.
  • Step 1004 displaying a second user interface.
  • After obtaining the second viewing-angle direction, the terminal displays a second user interface, where the second user interface includes the environment picture when the virtual object observes the three-dimensional virtual environment in the second viewing-angle direction.
  • The terminal determines that the second viewing-angle direction is 30 degrees east of due north and displays the second user interface 1150.
  • The second user interface 1150 includes the environment picture when the virtual object 1110 observes the three-dimensional virtual environment in the direction 30 degrees east of due north.
  • The terminal determines that the second viewing-angle direction is the head-up 0-degree direction and displays the second user interface 1250.
  • The second user interface 1250 includes the environment picture 1260 when the virtual object 1210 observes the three-dimensional virtual environment in the head-up 0-degree direction, where the environment picture 1260 includes the hill 1231 far ahead of the virtual object 1210, the factory 1232 closer in front of the virtual object 1210, the sky 1235, the ground 1234, and the horizon 1233.
  • The terminal determines that the second viewing-angle direction is 30 degrees east of due north and displays the second user interface 1350.
  • The second user interface 1350 includes the environment picture 1360 when the virtual object 1310 observes the three-dimensional virtual environment in the direction 30 degrees east of due north, where the environment picture 1360 includes the building 1332 directly in front of the virtual object 1310, the hill 1331 to the west of the virtual object 1310, the sky 1335, the ground 1334, and the horizon 1333.
  • The terminal determines that the second viewing-angle direction is the head-up 0-degree direction and displays the second user interface 1450.
  • The second user interface 1450 includes the environment picture 1460 when the virtual object 1410 observes the three-dimensional virtual environment in the head-up 0-degree direction, where the environment picture 1460 includes the hill 1431 far ahead of the virtual object 1410, the factory 1432 closer in front of the virtual object 1410, the sky 1435, the ground 1434, and the horizon 1433.
  • In summary, in this embodiment, the view jump signal triggered on the target view jump element in the view switching area is received, and the second user interface is displayed according to the second viewing-angle direction corresponding to the target view jump element. This solves the problem of low human-computer interaction efficiency caused by changing the viewing direction of the virtual object by sliding the screen with the right hand: the viewing direction of the virtual object is changed by clicking the target view jump element instead of sliding the screen with the right hand, which speeds up viewing-angle switching and improves the efficiency of human-computer interaction.
  • FIG. 15 is a flowchart of a method for performing perspective switching on a virtual environment provided by an exemplary embodiment of the present application. The method may be applied to the terminal in the embodiment of FIG. 8 or FIG. 9, and includes:
  • Step 1501 displaying a first user interface.
  • the terminal displays a first user interface, where the first user interface includes an environment image and a view switching area when the virtual object observes the three-dimensional virtual environment in a first view direction, and the view switch area includes at least one view jump element for performing a view jump.
  • the view jump element is an element control in the view switching area.
  • Step 1502 Receive a view jump signal triggered on the target view jump element.
  • The terminal receives a view jump signal triggered on the target view jump element; the view jump signal is a signal generated by the user clicking the target view jump element, where the target view jump element is one of the at least one view jump element.
  • A complete click event consists of at least one finger-press action (ACTION_DOWN) and one finger-lift action (ACTION_UP), and may also include at least one finger-slide action (ACTION_MOVE), where each complete click event starts with ACTION_DOWN.
  • Each action has a corresponding touch coordinate on the touch screen.
  • When the user's finger presses on the touch screen, the terminal records the starting coordinate of the pressed area; when the user lifts the finger, the terminal obtains the ending coordinate of the click operation. The terminal then determines whether the starting coordinate, the ending coordinate, and the sliding coordinates between them are all located within the target view jump element. If all the coordinates of the click operation fall within the target view jump element and the sliding distance does not exceed a threshold, the terminal determines that the control targeted by the click operation is the target view jump element (see the sketch below).
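  • A hedged sketch of this tap detection using Android's MotionEvent constants; the element bounds, threshold value, and callback are illustrative assumptions, not names from the patent:

```kotlin
import android.graphics.RectF
import android.view.MotionEvent
import kotlin.math.hypot

class JumpElementTapDetector(
    private val elementBounds: RectF,           // screen rect of the target jump element
    private val slideThresholdPx: Float = 24f,  // maximum travel still counted as a click
    private val onJump: () -> Unit,
) {
    private var startX = 0f
    private var startY = 0f
    private var insideSoFar = false

    fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {   // record the starting coordinate
                startX = event.x
                startY = event.y
                insideSoFar = elementBounds.contains(event.x, event.y)
            }
            MotionEvent.ACTION_MOVE ->     // every sliding coordinate must stay inside
                insideSoFar = insideSoFar && elementBounds.contains(event.x, event.y)
            MotionEvent.ACTION_UP -> {     // ending coordinate: decide whether it was a click
                val slide = hypot(event.x - startX, event.y - startY)
                if (insideSoFar && elementBounds.contains(event.x, event.y) &&
                    slide <= slideThresholdPx) {
                    onJump()               // treat as a view jump on this element
                    return true
                }
            }
        }
        return false
    }
}
```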
  • Step 1503 In the preset correspondence, query a second view direction corresponding to the target view jump element.
  • After acquiring the view jump signal, the terminal queries, in a preset correspondence, the second viewing-angle direction corresponding to the target view jump element, where the preset correspondence stores the viewing-angle direction corresponding to each of the at least one view jump element.
  • The terminal stores a correspondence table between each view jump element and its corresponding viewing-angle direction, and the terminal obtains the viewing-angle direction corresponding to the target view jump element by querying the correspondence table, where the correspondence table may be as shown in Table 1.
  • The terminal stores a viewing-angle direction corresponding to each first direction scale and can query the viewing-angle direction corresponding to the target scale; for example, if the target scale is 30 degrees, the corresponding direction is 30 degrees east of due north.
  • The terminal stores a viewing-angle direction corresponding to each second direction scale and can query the viewing-angle direction corresponding to the target scale; for example, if the target scale is 0 degrees, the corresponding direction is head-up 0 degrees.
  • The terminal stores a viewing-angle direction corresponding to each first sub-area and can query the viewing-angle direction corresponding to the target sub-area; for example, if the target sub-area is the first sub-area 6, the corresponding direction is 30 degrees east of due north.
  • The terminal stores a viewing-angle direction corresponding to each second sub-area and can query the viewing-angle direction corresponding to the target sub-area; for example, if the target sub-area is the second sub-area 4, the corresponding direction is head-up 0 degrees. A code sketch of such a correspondence follows.
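  • A minimal sketch of such a preset correspondence as a lookup table (the element identifiers and the ViewDirection type are illustrative assumptions):

```kotlin
// A viewing-angle direction as a yaw (degrees clockwise from due north) and a
// pitch (degrees above or below head-up 0 degrees).
data class ViewDirection(val yawDeg: Double, val pitchDeg: Double)

// Preset correspondence from jump-element identifiers to directions, mirroring
// the examples above.
val presetCorrespondence: Map<String, ViewDirection> = mapOf(
    "first-scale-30" to ViewDirection(yawDeg = 30.0, pitchDeg = 0.0),  // 30 degrees east of due north
    "second-scale-0" to ViewDirection(yawDeg = 0.0, pitchDeg = 0.0),   // head-up 0 degrees
    "first-sub-area-6" to ViewDirection(yawDeg = 30.0, pitchDeg = 0.0),
    "second-sub-area-4" to ViewDirection(yawDeg = 0.0, pitchDeg = 0.0),
)

// Query performed in step 1503; null means no direction is stored for the id.
fun secondViewDirection(targetElementId: String): ViewDirection? =
    presetCorrespondence[targetElementId]
```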
  • Step 1504: a second user interface is displayed.
  • After obtaining the second viewing-angle direction, the terminal displays a second user interface, where the second user interface includes the environment picture when the virtual object observes the three-dimensional virtual environment in the second viewing-angle direction.
  • The terminal determines that the second viewing-angle direction is 30 degrees east of due north and displays the second user interface 1150.
  • The second user interface 1150 includes the environment picture when the virtual object 1110 observes the three-dimensional virtual environment in the direction 30 degrees east of due north.
  • The terminal determines that the second viewing-angle direction is the head-up 0-degree direction and displays the second user interface 1250.
  • The second user interface 1250 includes the environment picture 1260 when the virtual object 1210 observes the three-dimensional virtual environment in the head-up 0-degree direction, where the environment picture 1260 includes the hill 1231 far ahead of the virtual object 1210, the factory 1232 closer in front of the virtual object 1210, the sky 1235, the ground 1234, and the horizon 1233.
  • the terminal determines that the second viewing angle is 30 degrees north to the east, and displays the second user interface 1350.
  • the second user interface 1350 includes the virtual object 1310 and observes the three-dimensional direction 30 degrees to the north.
  • the environment screen 1360 in the virtual environment wherein the environment screen 1360 includes a building 1332 directly in front of the virtual object 1310, a hill 1331, a sky 1335, a land 1334, and a horizon 1333 which are westward of the virtual object 1310.
  • the terminal determines that the second viewing angle is 0 degrees of the head-up, and displays the second user interface 1450.
  • the second user interface 1450 includes the virtual object 1410 when the three-dimensional virtual environment is observed in the 0 degree direction of the head-up.
  • the environment screen 1460 wherein the environment screen 1460 includes a hill 1431 that is far ahead of the virtual object 1410, a building 1432 that is closer to the front of the virtual object 1410, a sky 1435, a ground 1434, and a horizon 1433.
  • When the view switching method of this embodiment is applied to a server, step 1501 and step 1502 are performed in the terminal and step 1503 is performed by the server. After acquiring the view jump signal, the terminal sends the view jump signal to the server; the server queries, in a locally stored correspondence, the second view direction corresponding to the target view jump element according to the view jump signal and sends the second view direction to the terminal; the terminal then displays the second user interface according to the second view direction sent by the server.
  • In summary, in the embodiments of this application, by receiving the view jump signal triggered on the target view jump element in the view switching area and displaying the second display interface according to the second view direction corresponding to the target view jump element, the problem of low human-computer interaction efficiency caused by changing the view direction of the virtual object by sliding the screen with the right hand is solved: because the view direction of the virtual object is switched by tapping the target view jump element rather than by sliding the screen with the right hand, the speed of view switching is increased, which improves the efficiency of human-computer interaction.
  • Further, in the embodiments of this application, the second view direction is determined by querying the second view direction corresponding to the target view jump element, and the second user interface is then displayed. Because obtaining the second view direction by querying the correspondence is fast, the speed of view switching is increased.
  • FIG. 16 is a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of the present application. The method may be applied to the terminal in the embodiments of FIG. 8 and FIG. 9 or to the server in the embodiment of FIG. 9; in this embodiment, the method is described as applied to the terminal. The method includes:
  • Step 1601: Display a first user interface.
  • The terminal displays a first user interface, where the first user interface includes the environment screen when the virtual object observes the three-dimensional virtual environment in a first view direction and a view switching area, and the view switching area includes at least one view jump element for performing a view jump; the view jump element is an element control in the view switching area.
  • Exemplarily, as shown in FIG. 17, the first user interface 1700 includes a virtual object 1710, an environment screen 1730, a view switching area 1720, and a virtual left-hand joystick 1740.
  • The environment screen 1730 in the first user interface 1700 is the environment screen when the virtual object 1710 observes the three-dimensional virtual environment in the first view direction, and includes a hill 1731 directly in front of the virtual object 1710, a building 1732 to the northeast of the virtual object 1710, the ground 1734, the sky 1735, and the horizon 1733.
  • The view switching area 1720 is superimposed on the environment screen 1730 and is an elongated area located at the top edge of the environment screen 1730. The reference view jump element is located in the central area of the view switching area 1720, and a slider 1721 is displayed in the view switching area 1720; the user can press the slider 1721 and slide it to the area where the target view jump element is located.
  • Exemplarily, as shown in FIG. 18, the first user interface 1800 includes a virtual object 1810, an environment screen 1830, a view switching area 1820, and a virtual left-hand joystick 1840.
  • The environment screen 1830 in the first user interface 1800 is the environment screen when the virtual object 1810 observes the three-dimensional virtual environment in the first view direction, and includes a hill 1831 far from the virtual object 1810, a factory 1832 closer to the virtual object 1810, the ground 1834, the sky 1835, and the horizon 1833.
  • The view switching area 1820 is superimposed on the environment screen 1830 and is an elongated area located at the right edge of the environment screen 1830. The reference view jump element is located in the central area of the view switching area 1820, and a slider 1821 is displayed in the view switching area 1820; the user can press the slider 1821 and slide it to the area where the target view jump element is located.
  • Step 1602: Receive a view jump signal triggered on the target view jump element.
  • The terminal receives a view jump signal triggered on the target view jump element; the view switching signal is a signal generated by the user sliding to the target view jump element, where the target view jump element is one of the at least one view jump element.
  • Exemplarily, as shown in FIG. 17, the user presses the slider 1721 in the view switching area 1720 and slides it from the reference view jump element located in the central area to the area where the target view jump element is located. A view jump signal is triggered when the user presses the slider 1721 and is generated continuously as the user slides the slider 1721, until the slider 1721 stops in the target jump area.
  • Exemplarily, as shown in FIG. 18, the user presses the slider 1821 in the view switching area 1820 and slides it from the reference view jump element located in the central area to the area where the target view jump element is located. A view jump signal is triggered when the user presses the slider 1821 and is generated continuously as the user slides the slider 1821, until the slider 1821 stops in the target jump area.
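  • A minimal sketch of this continuous signal generation (the event names follow Android's touch actions; the helper names are assumptions):

```python
def on_slider_drag(events, scale_at, emit_jump_signal):
    """events: (action, x) pairs for one press-drag-release sequence on the
    scale bar; scale_at(x) maps a slider position to the scale beneath it.
    A jump signal is emitted on the press and then continuously while the
    slider moves, until the drag stops in the target jump area."""
    last_scale = None
    for action, x in events:
        if action in ("ACTION_DOWN", "ACTION_MOVE", "ACTION_UP"):
            scale = scale_at(x)
            if scale != last_scale:
                emit_jump_signal(scale)
                last_scale = scale
```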
  • Step 1603: Determine an offset value and an offset direction between the target view jump element and the reference view jump element.
  • The terminal determines an offset value and an offset direction between the target view jump element and the reference view jump element. Optionally, the reference view jump element is the view jump element in the central area of the view switching area, and the terminal can calculate the offset value from the coordinates of the target view jump element and the coordinates of the reference view jump element, and obtain the offset direction.
  • Exemplarily, as shown in FIG. 17, the terminal calculates the distance D1 between the reference view jump element and the target view jump element according to the reference point coordinates of the target view jump element and of the reference view jump element, obtaining an offset value of D1; the offset direction is the direction corresponding to the target view jump element. The reference point of the reference view jump element may be the center point of the reference view jump element, and the reference point of the target view jump element may be the center point of the target view jump element.
  • Exemplarily, as shown in FIG. 18, the terminal calculates the distance D2 between the reference view jump element and the target view jump element according to the reference point coordinates of the target view jump element and of the reference view jump element, obtaining an offset value of D2; the offset direction is the direction corresponding to the target view jump element. The reference points may likewise be the center points of the respective elements.
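  • A minimal sketch of this computation (the sign convention for the offset direction on a horizontal bar is an assumption for illustration):

```python
import math

def offset_between(target_center, reference_center):
    """Offset value and offset direction between the target and reference
    view jump elements, using their center points as reference points.
    For a horizontal switching area, the direction is taken as +1 when the
    target lies to the right of the reference element and -1 to the left."""
    dx = target_center[0] - reference_center[0]
    dy = target_center[1] - reference_center[1]
    d = math.hypot(dx, dy)          # D1 (or D2 on the vertical bar)
    direction = 1 if dx >= 0 else -1
    return d, direction
```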
  • Step 1604: Determine an offset angle of the first view direction according to the offset value and the offset direction.
  • The terminal determines the offset angle of the first view direction according to the offset value and the offset direction between the target view jump element and the reference view jump element.
  • Exemplarily, as shown in FIG. 17, the elongated view switching area 1720 corresponds to all view directions in the 360-degree horizontal range centered on the position of the virtual object, but only some of the view directions are displayed at any time. For example, with due north as 0 degrees, when the view direction of the virtual object 1710 is due north, the view directions displayed in the view switching area 1720 run from 300 degrees to 0 degrees and from 0 degrees to 90 degrees. If the reference point of the target view jump element lies between 0 degrees and 90 degrees and the length of the view switching area 1720 is L1, the offset angle is 90*1/2*D1/L1.
  • Exemplarily, as shown in FIG. 18, the elongated view switching area 1820 corresponds to all view directions between the first look-up direction and the second look-down direction centered on the position of the virtual object, where the vertically upward direction is the first look-up direction, corresponding to an angle of 90 degrees, and the vertically downward direction is the second look-down direction, corresponding to an angle of -90 degrees. Only some of the view directions are displayed at any time. For example, with the head-up direction as 0 degrees, when the view direction of the virtual object 1810 is head-up, the view directions displayed in the view switching area 1820 run from -45 degrees to 45 degrees. If the reference point of the target view jump element lies between 0 degrees and 45 degrees and the length of the view switching area 1820 is L2, the offset angle is 90*1/2*D2/L2.
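  • The two examples share one mapping from slider offset to angle. A minimal sketch using the formula exactly as stated above (quoted from the text, not re-derived here):

```python
def offset_angle(d, bar_length):
    """Offset angle for an offset value d on a switching area of the given
    length, per the embodiments above: 90 * 1/2 * D / L (D1/L1 for the
    horizontal bar of FIG. 17, D2/L2 for the vertical bar of FIG. 18)."""
    return 90.0 * 0.5 * d / bar_length
```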
  • Step 1605: Offset the first view direction by the offset angle to obtain the second view direction.
  • The terminal rotates the first view direction toward the offset direction by the offset angle to obtain the second view direction.
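  • For the horizontal case this rotation is a simple heading update; a minimal sketch (same sign convention as the earlier offset sketch):

```python
def second_view_direction(first_heading_deg, offset_deg, direction):
    """Rotate the first view direction (heading in degrees, clockwise from
    due north) toward the offset direction by the offset angle; the result
    is the second view direction, wrapped into [0, 360)."""
    return (first_heading_deg + direction * offset_deg) % 360.0
```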
  • Step 1606: Display a second user interface.
  • After obtaining the second view direction, the terminal displays a second user interface, where the second user interface includes the environment screen when the virtual object observes the three-dimensional virtual environment in the second view direction.
  • Exemplarily, as shown in FIG. 17, after determining the second view direction, the terminal displays the second user interface 1750, which includes the environment screen 1760 when the virtual object 1710 observes the three-dimensional virtual environment in the second view direction. The environment screen 1760 includes a building 1732 directly in front of the virtual object 1710, a hill 1731 to the west of directly ahead, the sky 1735, the ground 1734, and the horizon 1733.
  • Exemplarily, as shown in FIG. 18, after determining the second view direction, the terminal displays the second user interface 1850, which includes the environment screen 1860 when the virtual object 1810 observes the three-dimensional virtual environment in the second view direction. The environment screen 1860 includes a hill 1831 far ahead of the virtual object 1810, a building 1832 closer ahead of the virtual object 1810, the sky 1835, the ground 1834, and the horizon 1833.
  • In summary, in the embodiments of this application, by receiving the view jump signal triggered on the target view jump element in the view switching area and displaying the second display interface according to the second view direction corresponding to the target view jump element, the problem of low human-computer interaction efficiency caused by changing the view direction of the virtual object by sliding the screen with the right hand is solved: because the view direction of the virtual object is switched by operating the target view jump element rather than by sliding the screen with the right hand, the speed of view switching is increased, which improves the efficiency of human-computer interaction.
  • Further, in the embodiments of this application, the offset value and the offset direction between the target view jump element and the reference view jump element are determined, the offset angle of the first view direction is determined according to the offset value and the offset direction, and the first view direction is then offset by the offset angle to obtain the second view direction, which improves the accuracy of calculating the second view direction.
  • The jump elements displayed in the view switching area may cover all view directions of the virtual object or only some of them; displaying the jump elements in the view switching area may be implemented through the embodiment of FIG. 19.
  • FIG. 19 is a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of the present application. The method may be applied to the terminal in the embodiments of FIG. 8 and FIG. 9 or to the server in the embodiment of FIG. 9; in this embodiment, the method is described as applied to the terminal. The method includes:
  • Step 1901: Display a first user interface.
  • The terminal displays a first user interface, where the first user interface includes the environment screen when the virtual object observes the three-dimensional virtual environment in a first view direction and a view switching area, and the view switching area includes at least one view jump element for performing a view jump; the view jump element is an element control in the view switching area.
  • Step 1902: Receive a view jump signal triggered on the target view jump element.
  • The terminal receives a view jump signal triggered on the target view jump element; the view switching signal is a signal generated by the user tapping and/or sliding the target view jump element, where the target view jump element is one of the at least one view jump element.
  • Step 1903: Determine a second view direction corresponding to the target view jump element.
  • The terminal determines the second view direction corresponding to the target view jump element according to the view jump signal.
  • Optionally, the terminal determines the second view direction corresponding to the target view jump element using the methods in the embodiments of FIG. 15 and FIG. 16.
  • Step 1904: Display the target view jump element corresponding to the second view direction in the central area of the view switching area.
  • After determining the second view direction, the terminal changes the display positions of the view jump elements in the view switching area according to the second view direction, and displays the target view jump element corresponding to the second view direction in the central area of the view switching area.
  • After performing step 1904, the terminal may perform step 1905a or step 1905b. Generally, steps 1905a and 1905b cannot be implemented at the same time; however, when the display interface shows two view switching areas, the terminal may execute 1905a and 1905b in succession, and the order of execution is not limited.
  • Exemplarily, as shown in FIG. 11, when the terminal determines that the second view direction is 30 degrees east of north, the target view jump element corresponding to the second view direction, such as the 30-degree scale, is displayed in the central area of the view switching area 1120.
  • Exemplarily, as shown in FIG. 12, when the terminal determines that the second view direction is the 0-degree head-up direction, the target view jump element corresponding to the second view direction, such as the 0-degree scale, is displayed in the central area of the view switching area 1220.
  • Exemplarily, as shown in FIG. 13, when the terminal determines that the second view direction is 30 degrees east of north, the target view jump element corresponding to the second view direction, such as first sub-area 6, is displayed in the central area of the view switching area 1320.
  • Exemplarily, as shown in FIG. 14, when the terminal determines that the second view direction is the 0-degree head-up direction, the target view jump element corresponding to the second view direction, such as second sub-area 4, is displayed in the central area of the view switching area 1420.
  • Step 1905a: Display the other view jump elements in the other areas of the view switching area according to the relative positional relationship between the other view jump elements and the target view jump element. A sketch of this repositioning follows the examples below.
  • The terminal displays the other view jump elements in the other areas of the view switching area according to the relative positional relationship between the other view jump elements and the target view jump element in the view switching area.
  • Exemplarily, as shown in FIG. 13, first sub-area 1 to first sub-area 7 displayed in the view switching area 1320 correspond to all view directions in the 360-degree horizontal range centered on the virtual object. Before the view switch, the view direction is the direction corresponding to first sub-area 4; after the view switch, the view direction is the direction corresponding to first sub-area 6. After the switch, first sub-area 6 is displayed in the central area of the view switching area 1320, and first sub-area 1 and first sub-area 2, originally located at the left edge of the view switching area, are displayed at the right edge of the view switching area.
  • Exemplarily, as shown in FIG. 14, second sub-area 1 to second sub-area 7 in the view switching area 1420 correspond to all view directions between the first look-up direction and the second look-down direction of the virtual object. Before the view switch, the view direction is the direction corresponding to second sub-area 6; after the view switch, the view direction is the direction corresponding to second sub-area 4. After the switch, second sub-area 4 is displayed in the central area of the view switching area 1420, and second sub-area 3 and second sub-area 4, originally located at the left edge of the view switching area, are displayed at the right edge of the view switching area.
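  • As referenced above, a minimal sketch of recentring the ring of sub-areas (function and list names are illustrative, not from the patent):

```python
def recenter(sub_areas, target):
    """Rotate the ring of sub-areas so the target one sits in the central
    slot; elements pushed past the left edge reappear at the right edge,
    matching the FIG. 13 example (sub-area 6 centered, 1 and 2 wrapping
    to the right edge)."""
    n = len(sub_areas)
    center = n // 2
    shift = (sub_areas.index(target) - center) % n
    return sub_areas[shift:] + sub_areas[:shift]

# e.g. recenter([1, 2, 3, 4, 5, 6, 7], 6) -> [3, 4, 5, 6, 7, 1, 2]
```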
  • Step 1905b: Hide view jump elements in the view switching area according to the relative positional relationship between the other view jump elements and the target view jump element. A sketch of this hiding follows the examples below.
  • The terminal hides the other view jump elements in the other areas of the view switching area according to the relative positional relationship between the other view jump elements and the target view jump element in the view switching area.
  • Exemplarily, the scales in the view switching area 1520 correspond to all view directions in the 360-degree horizontal range centered on the position of the virtual object, but only some view directions are displayed. For example, before the view switch, the view direction is 0 degrees; after the view switch, the view direction is 30 degrees. After the switch, the 30-degree scale is displayed in the center of the view switching area 1520, and the northwest scale and the 330-degree scale, originally located at the left edge of the view switching area, are hidden.
  • Exemplarily, the scales in the view switching area 1620 correspond to all view directions between the first look-up direction and the second look-down direction of the virtual object, but only some view directions are displayed. For example, before the view switch, the view direction is -45 degrees; after the view switch, the view direction is 0 degrees. After the switch, the 0-degree scale is displayed in the center of the view switching area 1620, and the -60-degree, -75-degree, and -90-degree scales, originally located at the lower edge of the view switching area, are hidden.
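  • As referenced above, a minimal sketch of hiding the scales that fall outside the displayed window (the names and window convention are assumptions):

```python
def visible_scales(all_scales, target, window):
    """Return the scales kept visible: a window of `window` consecutive
    scales on the circular ruler, centered on the target scale; every
    other scale is hidden. E.g. recentring the ruler on the 30-degree
    scale drops the 330-degree and northwest scales off the left edge."""
    n = len(all_scales)
    i = all_scales.index(target)
    half = window // 2
    return [all_scales[(i + k) % n] for k in range(-half, half + 1)]
```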
  • After step 1905a or step 1905b, a second user interface is displayed.
  • After obtaining the second view direction, the terminal displays a second user interface, where the second user interface includes the environment screen when the virtual object observes the three-dimensional virtual environment in the second view direction and the view switching area superimposed on the environment screen; the target view jump element is displayed in the central area of the view switching area.
  • In summary, in the embodiments of this application, by receiving the view jump signal triggered on the target view jump element in the view switching area and displaying the second display interface according to the second view direction corresponding to the target view jump element, the problem of low human-computer interaction efficiency caused by changing the view direction of the virtual object by sliding the screen with the right hand is solved: because the view direction of the virtual object is switched by tapping the target view jump element rather than by sliding the screen with the right hand, the speed of view switching is increased, which improves the efficiency of human-computer interaction.
  • Further, because the target view jump element is displayed in the central area of the view switching area, the second user interface clearly reflects the second view direction of the virtual object, which improves the convenience of view switching.
  • FIG. 20 is a structural block diagram of a view switching apparatus provided by an exemplary embodiment of the present application, which is used to implement view switching in a virtual environment. The apparatus can be applied to the terminal in the embodiments of FIG. 8 and FIG. 9 or to the server in the embodiment of FIG. 9.
  • The apparatus includes a display module 2010, a receiving module 2020, and a processing module 2030.
  • The display module 2010 is configured to display a first user interface, where the first user interface includes the environment screen when the virtual object observes the three-dimensional virtual environment in a first view direction and a view switching area, and the view switching area includes at least one view jump element for performing a view jump.
  • The receiving module 2020 is configured to receive a view jump signal triggered on a target view jump element, where the target view jump element is one of the at least one view jump element.
  • the processing module 2030 is configured to determine a second viewing angle direction corresponding to the target viewing angle jump element.
  • the display module 2010 is further configured to display a second user interface, where the second user interface includes an environment image when the virtual object observes the three-dimensional virtual environment in a second viewing angle direction.
  • Optionally, the view jump elements include jump elements represented by a first direction scale, and the first direction scale is used to represent all or some of the view directions in the 360-degree horizontal range centered on the position of the virtual object.
  • Optionally, the view jump elements include jump elements represented by a second direction scale, and the second direction scale is used to represent all or some of the view directions between the first look-up direction and the second look-down direction of the virtual object.
  • Optionally, the view jump elements include jump elements represented by first sub-areas, and each first sub-area is used to represent one view direction in the 360-degree horizontal range centered on the position of the virtual object.
  • Optionally, the view jump elements include jump elements represented by second sub-areas, and each second sub-area is used to represent one view direction between the first look-up direction and the second look-down direction of the virtual object.
  • Optionally, the view switching area is superimposed on the environment screen.
  • Optionally, the view switching area is an elongated area located at a preset edge of the environment screen, the preset edge including at least one of the top edge, the bottom edge, the left edge, and the right edge.
  • Optionally, the processing module 2030 is further configured to query, in a preset correspondence, the second view direction corresponding to the target view jump element, where the preset correspondence stores the view direction corresponding to each of the at least one view jump element.
  • Optionally, the processing module 2030 is further configured to determine an offset value and an offset direction between the target view jump element and the reference view jump element, where the reference view jump element is the view jump element corresponding to the first view direction; determine an offset angle of the first view direction according to the offset value and the offset direction; and obtain the second view direction after offsetting the first view direction by the offset angle.
  • Optionally, the display module 2010 is further configured to change the display positions of the view jump elements in the view switching area according to the second view direction.
  • Optionally, the view jump elements include jump elements represented by a first direction scale.
  • Optionally, the display module 2010 is further configured to display, in the central area of the view switching area, the target view jump element corresponding to the second view direction; display the other view jump elements in the other areas of the view switching area according to the relative positional relationship between the other view jump elements and the target view jump element; or hide view jump elements in the view switching area according to the relative positional relationship between the other view jump elements and the target view jump element.
  • Optionally, the view switching area includes a first scale bar represented by a first direction scale and a slider located on the first scale bar; the view jump signal is a signal generated by moving the slider on the first scale bar; the second view direction is the view direction corresponding to the scale indicated by the moved slider on the first scale bar; the first direction scale is used to represent all or some of the view directions in the 360-degree horizontal range centered on the position of the virtual object.
  • Optionally, the view switching area includes a second scale bar represented by a second direction scale and a slider located on the second scale bar; the view jump signal is a signal generated by moving the slider on the second scale bar; the second view direction is the view direction corresponding to the scale indicated by the moved slider on the second scale bar; the second direction scale is used to represent all or some of the view directions between the first look-up direction and the second look-down direction of the virtual object.
  • FIG. 21 is a structural block diagram of an electronic device 2100 provided by an exemplary embodiment of the present application, which is used to implement view switching in a virtual environment.
  • The electronic device 2100 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • The electronic device 2100 may also be referred to as a terminal, a user device, a portable terminal, a laptop terminal, a desktop terminal, or the like.
  • the electronic device 2100 includes a processor 2101 and a memory 2102.
  • The processor 2101 can include one or more processing cores, such as a 4-core processor or an 8-core processor.
  • The processor 2101 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • The processor 2101 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state.
  • In some embodiments, the processor 2101 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display.
  • In some embodiments, the processor 2101 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
  • Memory 2102 can include one or more computer-readable storage media, which can be non-transitory. Memory 2102 can also include high-speed random access memory, as well as non-volatile memory such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 2102 is configured to store at least one instruction for execution by the processor 2101 to implement the method for view switching in a virtual environment provided by the method embodiments of the present application.
  • the electronic device 2100 can also optionally include: a peripheral device interface 2103 and at least one peripheral device.
  • the processor 2101, the memory 2102, and the peripheral device interface 2103 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 2103 via a bus, signal line or circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 2104, a touch display screen 2105, a camera 2106, an audio circuit 2107, a positioning component 2108, and a power source 2109.
  • The peripheral device interface 2103 can be used to connect at least one I/O (Input/Output)-related peripheral device to the processor 2101 and the memory 2102.
  • In some embodiments, the processor 2101, the memory 2102, and the peripheral device interface 2103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 2101, the memory 2102, and the peripheral device interface 2103 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the RF circuit 2104 is configured to receive and transmit an RF (Radio Frequency) signal, also referred to as an electromagnetic signal.
  • the RF circuit 2104 communicates with the communication network and other communication devices via electromagnetic signals.
  • the radio frequency circuit 2104 converts the electrical signal into an electromagnetic signal for transmission, or converts the received electromagnetic signal into an electrical signal.
  • the radio frequency circuit 2104 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like.
  • the radio frequency circuit 2104 can communicate with other terminals via at least one wireless communication protocol.
  • the wireless communication protocols include, but are not limited to, the World Wide Web, a metropolitan area network, an intranet, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 2104 may further include an NFC (Near Field Communication) related circuit, which is not limited in this application.
  • the display 2105 is for displaying a UI (User Interface).
  • the UI can include graphics, text, icons, video, and any combination thereof.
  • When the display 2105 is a touch display, the display 2105 also has the ability to capture touch signals on or above the surface of the display 2105.
  • The touch signal can be input to the processor 2101 as a control signal for processing.
  • At this time, the display 2105 can also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard.
  • In some embodiments, there may be one display screen 2105, disposed on the front panel of the electronic device 2100; in other embodiments, there may be at least two display screens 2105, disposed on different surfaces of the electronic device 2100 or in a folded design.
  • In still other embodiments, the display screen 2105 can be a flexible display screen disposed on a curved surface or a folded surface of the electronic device 2100. The display screen 2105 can even be set to a non-rectangular irregular pattern, that is, a shaped screen.
  • the display 2105 can be made of a material such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • Camera component 2106 is used to capture images or video.
  • Generally, the camera assembly 2106 includes a front camera and a rear camera.
  • Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal.
  • In some embodiments, there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blur function through fusion of the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting through fusion of the main camera and the wide-angle camera, or other fused shooting functions.
  • camera assembly 2106 can also include a flash.
  • the flash can be a monochrome temperature flash or a two-color temperature flash.
  • the two-color temperature flash is a combination of a warm flash and a cool flash that can be used for light compensation at different color temperatures.
  • the audio circuit 2107 can include a microphone and a speaker.
  • The microphone is used to collect sound waves of the user and the environment and convert the sound waves into electrical signals that are input to the processor 2101 for processing, or input to the radio frequency circuit 2104 for voice communication.
  • In some embodiments, there may be multiple microphones, disposed at different parts of the electronic device 2100.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is then used to convert electrical signals from the processor 2101 or the RF circuit 2104 into sound waves.
  • the speaker can be a conventional film speaker or a piezoelectric ceramic speaker.
  • the audio circuit 2107 can also include a headphone jack.
  • the positioning component 2108 is configured to locate the current geographic location of the electronic device 2100 to implement navigation or LBS (Location Based Service).
  • The positioning component 2108 can be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
  • the power source 2109 is used to power various components in the electronic device 2100.
  • the power source 2109 can be an alternating current, a direct current, a disposable battery, or a rechargeable battery.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • a wired rechargeable battery is a battery that is charged by a wired line
  • a wireless rechargeable battery is a battery that is charged by a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • electronic device 2100 also includes one or more sensors 2110.
  • the one or more sensors 2110 include, but are not limited to, an acceleration sensor 2111, a gyro sensor 2112, a pressure sensor 2113, a fingerprint sensor 2114, an optical sensor 2115, and a proximity sensor 2116.
  • the acceleration sensor 2111 can detect the magnitude of the acceleration on the three coordinate axes of the coordinate system established by the electronic device 2100.
  • the acceleration sensor 2111 can be used to detect components of gravity acceleration on three coordinate axes.
  • the processor 2101 can control the touch display 2105 to display the user interface in a landscape view or a portrait view according to the gravity acceleration signal collected by the acceleration sensor 2111.
  • the acceleration sensor 2111 can also be used for the acquisition of game or user motion data.
  • the gyro sensor 2112 can detect the body direction and the rotation angle of the electronic device 2100, and the gyro sensor 2112 can cooperate with the acceleration sensor 2111 to collect the 3D action of the user on the electronic device 2100. Based on the data collected by the gyro sensor 2112, the processor 2101 can implement functions such as motion sensing (such as changing the UI according to the user's tilting operation), image stabilization at the time of shooting, game control, and inertial navigation.
  • the pressure sensor 2113 may be disposed on a side border of the electronic device 2100 and/or a lower layer of the touch display screen 2105.
  • When the pressure sensor 2113 is disposed on the side frame of the electronic device 2100, the user's holding signal to the electronic device 2100 can be detected, and the processor 2101 performs left/right hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 2113.
  • When the pressure sensor 2113 is disposed at the lower layer of the touch display screen 2105, the processor 2101 controls the operability controls on the UI according to the user's pressure operation on the touch display screen 2105.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 2114 is configured to collect the fingerprint of the user, and the processor 2101 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 2114, or the fingerprint sensor 2114 identifies the identity of the user according to the collected fingerprint. Upon identifying that the identity of the user is a trusted identity, the processor 2101 authorizes the user to perform related sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying and changing settings, and the like.
  • the fingerprint sensor 2114 can be disposed on the front, back or side of the electronic device 2100. When the physical button or vendor logo is provided on the electronic device 2100, the fingerprint sensor 2114 can be integrated with the physical button or the manufacturer logo.
  • Optical sensor 2115 is used to collect ambient light intensity.
  • In some embodiments, the processor 2101 can control the display brightness of the touch display 2105 based on the ambient light intensity collected by the optical sensor 2115: when the ambient light intensity is high, the display brightness of the touch display screen 2105 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 2105 is decreased.
  • the processor 2101 can also dynamically adjust the shooting parameters of the camera assembly 2106 according to the ambient light intensity acquired by the optical sensor 2115.
  • Proximity sensor 2116 also referred to as a distance sensor, is typically disposed on the front panel of electronic device 2100.
  • the proximity sensor 2116 is used to collect the distance between the user and the front of the electronic device 2100.
  • When the proximity sensor 2116 detects that the distance between the user and the front of the electronic device 2100 gradually decreases, the processor 2101 controls the touch display 2105 to switch from the bright-screen state to the off-screen state; when the proximity sensor 2116 detects that the distance between the user and the front of the electronic device 2100 gradually increases, the processor 2101 controls the touch display screen 2105 to switch from the off-screen state to the bright-screen state.
  • A person skilled in the art can understand that the structure shown in FIG. 21 does not constitute a limitation on the electronic device 2100, which may include more or fewer components than illustrated, combine some components, or adopt a different component arrangement.
  • the application further provides a computer readable storage medium, where the storage medium stores at least one instruction, at least one program, a code set or a set of instructions, the at least one instruction, the at least one program, the code set or The instruction set is loaded and executed by the processor to implement the method for performing perspective switching on the virtual environment provided by the foregoing method embodiment.
  • the present application also provides a computer program product comprising instructions that, when executed on a computer, cause the computer to perform the method of viewing angle switching for a virtual environment as described in the above aspects.
  • "A plurality", as referred to herein, means two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: A exists alone, both A and B exist, and B exists alone.
  • The character "/" generally indicates an "or" relationship between the associated objects.
  • A person of ordinary skill in the art may understand that all or some of the steps of the foregoing embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium.
  • The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method, device, and storage medium for view switching in a virtual environment. The method includes: displaying a first user interface, the first user interface including an environment screen when a virtual object observes a three-dimensional virtual environment in a first view direction and a view switching area, the view switching area including at least one view jump element for performing a view jump (1001); receiving a view jump signal triggered on a target view jump element, the target view jump element being one of the at least one view jump element (1002); determining a second view direction corresponding to the target view jump element (1003); and displaying a second user interface, the second user interface including an environment screen when the virtual object observes the three-dimensional virtual environment in the second view direction (1004). Because the method switches the view direction of the virtual object by tapping the target view jump element rather than by sliding the screen with the right hand, it increases the speed of view switching and thereby improves the efficiency of human-computer interaction.

Description

Method, apparatus, device, and storage medium for view switching in a virtual environment

This application claims priority to Chinese Patent Application No. 201810135907.7, entitled "Method, apparatus, device, and storage medium for view switching in a three-dimensional virtual environment", filed on February 9, 2018, the entire contents of which are incorporated herein by reference.
Technical Field

This application relates to the field of human-computer interaction, and in particular to a method, apparatus, device, and storage medium for view switching in a three-dimensional virtual environment.

Background

On terminals such as smartphones and tablet computers, there are many applications with a three-dimensional virtual environment, such as virtual reality applications, three-dimensional map applications, military simulation applications, first-person shooting games (FPS), and multiplayer online battle arena games (MOBA).

In the related art, the terminal displays the environment screen of the virtual environment from the first-person view of a virtual character. Taking an FPS game on a smartphone as an example, the user interface of the FPS game provides a left-hand joystick and right-hand screen sliding: the user can use the left-hand joystick to change the position of the virtual character in the three-dimensional virtual environment, and slide the screen with the right hand to change the view direction of the virtual character in the three-dimensional virtual environment. For example, when the user slides a right-hand finger to the left, the view direction of the virtual character in the three-dimensional virtual environment rotates to the left.

Because the operation area for right-hand screen sliding is limited, when the user performs a large-range view switch, several right-hand slides are needed before the switch is complete, so the efficiency of human-computer interaction is low.
Summary

The embodiments of this application provide a method, apparatus, device, and storage medium for view switching in a virtual environment, which can solve the problem in the related art that a large-range view switch requires several right-hand slides of the screen, making view switching slow. The technical solutions are as follows:

In one aspect, a method for view switching in a virtual environment is provided, the method including:

displaying a first user interface, the first user interface including an environment screen when a virtual object observes the virtual environment in a first view direction and a view switching area, the view switching area including at least one view jump element for performing a view jump;

receiving a view jump signal triggered on a target view jump element, the target view jump element being one of the at least one view jump element;

determining a second view direction corresponding to the target view jump element; and

displaying a second user interface, the second user interface including an environment screen when the virtual object observes the virtual environment in the second view direction.

In one aspect, a view switching apparatus is provided, the apparatus being used to implement view switching in a virtual environment and including:

a display module configured to display a first user interface, the first user interface including an environment screen when a virtual object observes the virtual environment in a first view direction and a view switching area, the view switching area including at least one view jump element for performing a view jump;

a receiving module configured to receive a view jump signal triggered on a target view jump element, the target view jump element being one of the at least one view jump element;

a processing module configured to determine a second view direction corresponding to the target view jump element;

the display module being further configured to display a second user interface, the second user interface including an environment screen when the virtual object observes the virtual environment in the second view direction.

In one aspect, an electronic device is provided, the device being used to implement view switching in a virtual environment and including a processor and a memory, the memory storing at least one instruction that is loaded and executed by the processor to implement the method for view switching in a virtual environment described above.

In one aspect, a computer-readable storage medium is provided, the storage medium storing at least one instruction that is loaded and executed by a processor to implement the method for view switching in a virtual environment described above.

In one aspect, a computer program product is provided, the computer program product storing at least one instruction that is loaded and executed by a processor to implement the method for view switching in a virtual environment described above.

The beneficial effects brought by the technical solutions provided in the embodiments of this application include at least the following:

By receiving the view jump signal triggered on the target view jump element in the view switching area and displaying the second display interface according to the second view direction corresponding to the target view jump element, the solutions solve the problem in the related art of low human-computer interaction efficiency caused by changing the view direction of the virtual object by sliding the screen with the right hand. Because the view direction of the virtual object is switched by tapping the target view jump element rather than by sliding the screen with the right hand, the speed of view switching is increased, thereby improving the efficiency of human-computer interaction when the user changes the view direction over a large range.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of this application more clearly, the drawings required for describing the embodiments are briefly introduced below. Evidently, the drawings described below are only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.

FIG. 1 is a schematic diagram of a display interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 2 is a schematic diagram of the world coordinate system of a three-dimensional virtual environment and the local coordinate system of a camera model provided by an exemplary embodiment of this application;

FIG. 3 is a schematic diagram of the rotation of a camera model about the vector R provided by an exemplary embodiment of this application;

FIG. 4 is a schematic diagram of the rotation of a camera model about the vector U provided by an exemplary embodiment of this application;

FIG. 5 is a schematic diagram of the rotation of a camera model about the vector L provided by an exemplary embodiment of this application;

FIG. 6 is a schematic diagram of the field of view of a camera model in a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 7 is a schematic diagram of the view transform of a camera model provided by an exemplary embodiment of this application;

FIG. 8 is a schematic diagram of an implementation environment provided by an exemplary embodiment of this application;

FIG. 9 is a schematic diagram of an implementation environment provided by an exemplary embodiment of this application;

FIG. 10 is a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of this application;

FIG. 11 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 12 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 13 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 14 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 15 is a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of this application;

FIG. 16 is a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of this application;

FIG. 17 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 18 is a schematic diagram of a user interface of a three-dimensional virtual environment provided by an exemplary embodiment of this application;

FIG. 19 is a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of this application;

FIG. 20 is a structural block diagram of a view switching apparatus provided by an exemplary embodiment of this application;

FIG. 21 is a structural block diagram of an electronic device provided by an exemplary embodiment of this application.
Detailed Description

To make the objectives, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.

First, several terms involved in the embodiments of this application are explained:

Virtual environment: the virtual environment displayed (or provided) by an application when it runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment; the following embodiments use a three-dimensional virtual environment as an example, without limitation. Optionally, the virtual environment is also used for a virtual-environment battle between at least two virtual characters. Optionally, the virtual environment is also used for a battle between at least two virtual characters using virtual firearms. Optionally, the virtual environment is also used for a battle between at least two virtual characters using virtual firearms within a target area that keeps shrinking as time passes in the virtual environment. In the embodiments of this application, the virtual environment being a three-dimensional virtual environment is used as an example.

Virtual object: a movable object in the three-dimensional virtual environment. The movable object may be a virtual character, a virtual animal, an anime character, and the like, for example, a character, an animal, a plant, an oil drum, a wall, or a stone displayed in the three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in the three-dimensional virtual environment. Generally, a virtual object includes a center point: when the virtual object is a regular object, the center point is its geometric center; when the virtual object is an irregular object, the center point may be preset according to the motion state of the virtual object.

View direction: the observation direction when observing in the three-dimensional virtual environment from the first-person view or third-person view of the virtual object.

World coordinate system: the three-dimensional coordinate system established for the three-dimensional virtual environment, used to measure parameters such as the coordinates, direction vectors, and motion vectors of virtual objects in the three-dimensional virtual environment. Generally, this coordinate system includes three axes: the X axis, the Y axis, and the Z axis. Optionally, the plane containing the X axis and the Z axis is the horizontal plane, that is, the reference ground, and the upper half of the Y axis represents height relative to the ground. The posture parameters of a virtual object in the three-dimensional virtual environment include the three-dimensional coordinates (X, Y, Z), where X and Z represent the coordinates of the virtual object relative to the ground and Y represents its height relative to the ground.

Camera model: a three-dimensional model located around the virtual object in the three-dimensional virtual environment. When a first-person view is adopted, the camera model is located near or at the head of the virtual object; when a third-person view is adopted, the camera model is located behind the virtual object 110, and the three-dimensional virtual environment is observed through the view of the camera model.
Referring to FIG. 1, which shows a schematic interface diagram of a three-dimensional virtual environment provided by an exemplary embodiment of this application. As shown in FIG. 1, the display interface 100 of the three-dimensional virtual environment includes a virtual object 110, a view switching area 120, an environment screen 130 of the three-dimensional virtual environment observed from the view of the virtual object 110, and a virtual left-hand joystick 140. The virtual left-hand joystick 140 is an optional control element; through it, the user can control the virtual object 110 to move forward, backward, left, and right.

The three-dimensional virtual environment refers to a virtual environment generated while an application is running. It allows the user to enter a multimedia three-dimensional virtual world, in which the user can control a virtual object through an operating device or an operation interface, observe objects, characters, and scenery in the three-dimensional virtual environment from the view of the virtual object, or interact with them through the virtual object, for example, attack a target enemy by operating a virtual soldier.

The virtual object 110 is a three-dimensional model built in the three-dimensional virtual environment, and the environment screen of the three-dimensional virtual environment displayed in the display interface 100 consists of the objects observed from the view of the virtual object 110. Exemplarily, as shown in FIG. 1, observed from a third-person view located behind the virtual object 110, the displayed environment screen 130 includes the ground 134, the sky 135, the horizon 133, a hill 131, and a factory building 132. In the third-person view, the virtual object 110 is displayed in the display interface 100; in the first-person view, the virtual object 110 is not displayed in the display interface 100.

In a possible implementation, the view of the virtual object 110 can be simulated (or implemented) by building a camera model in the three-dimensional virtual environment. When a first-person view is adopted, the camera model is located near or at the head of the virtual object; when a third-person view is adopted, the camera model is located behind the virtual object 110, and the three-dimensional virtual environment is observed through the view of the camera model.
As shown in FIG. 2, the three-dimensional virtual environment has a world coordinate system with X, Y, and Z axes, and every virtual model in the world coordinate system has corresponding position coordinates. For example, the position coordinates of the camera model are the coordinates (X1, Y1, Z1) of a reference point P in the world coordinate system, where the reference point P may be the physical center point from which the camera model is built. Meanwhile, the camera model in the three-dimensional virtual environment can be described by the vertically upward vector U (Up), the horizontally rightward vector R (Right), the straight-ahead vector L (Look), and the reference point P. The motion of the camera model may include the following actions: rotation about the vector R, rotation about the vector U, rotation about the vector L, panning along the vector R, rising and falling along the vector U, and translation along the vector L. In the embodiments of this application, the view switching of a virtual object mainly involves rotation about the vector R and/or rotation about the vector U.

FIG. 3 shows a schematic diagram of the rotation of the camera model about the vector R; the left side of the figure shows the camera model before rotation, and the right side shows the camera model after rotation about the vector R.

FIG. 4 shows a schematic diagram of the rotation of the camera model about the vector U; the left side of the figure shows the camera model before rotation, and the right side shows the camera model after rotation about the vector U.

FIG. 5 shows a schematic diagram of the rotation of the camera model about the vector L; the left side of the figure shows the camera model before rotation, and the right side shows the camera model after rotation about the vector L.
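The rotations in FIG. 3 to FIG. 5 turn the camera's basis vectors about R, U, and L respectively. As standard background (not taken from the patent), any of them can be written with Rodrigues' rotation formula: rotating a vector $\mathbf{v}$ by an angle $\theta$ about a unit axis $\mathbf{u}$ gives

$$\mathbf{v}' = \mathbf{v}\cos\theta + (\mathbf{u}\times\mathbf{v})\sin\theta + \mathbf{u}\,(\mathbf{u}\cdot\mathbf{v})(1-\cos\theta).$$

For example, a horizontal view switch rotates L (and R) about U by the offset angle, while a pitch switch rotates L (and U) about R.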
The implementation of a camera model can usually be developed from the rotation of the camera model, the zooming of the camera model, and the shaking of the camera model. The attributes of the camera model include the view angle, the viewing distance, the wheel movement speed, the wheel movement acceleration, the observation point of the virtual object in the three-dimensional virtual environment, and the height of the virtual object.

The modes and parameters related to the camera model can be defined as follows:

Projection: the projection mode is divided into perspective and orthographic modes.

Perspective: the camera model renders observable objects in a fully perspective manner, producing a near-large, far-small effect. The perspective-mode parameter includes the field of view.

Orthographic: the camera model renders observable objects uniformly, without a sense of perspective. The orthographic-mode parameter is the size of the field of view.

Clipping Planes: the range rendered by the camera model, including the nearest point (Near) and the farthest point (Far).

Viewport Rect: four values that control the position and size of the camera model's view on the display screen, using the screen coordinate system. It includes three parameters: x (start of the horizontal position), y (start of the vertical position), and w (width).

As shown in FIG. 6, the camera model 300 has a fixed view angle α and a fixed viewing distance d; the environment screen of the three-dimensional virtual environment displayed in the display interface is the local part of the three-dimensional virtual environment contained within the view angle and viewing distance of the camera model. In the embodiments of this application, the direction of the view angle α is the direction pointed to by the central axis of the view angle α; optionally, the direction of the view angle α is the direction of the vector L.

As shown in FIG. 7, the view transform of the camera model transforms the camera model to the origin of the world coordinate system and rotates the camera model so that its optical axis is aligned with the positive z-axis direction of the world coordinate system. All objects in the three-dimensional virtual environment, such as objects A, B, C, and D in FIG. 7, are transformed along with the camera model to keep the camera model's field of view constant.
Referring to FIG. 8, which shows a structural block diagram of a terminal provided by an exemplary embodiment of this application. The terminal 800 includes an operating system 820 and an application 822.

The operating system 820 is basic software that provides the application 822 with secure access to the computer hardware.

The application 822 is an application supporting a virtual environment. Optionally, the application 822 is an application supporting a three-dimensional virtual environment. The application 822 may be any of a virtual reality application, a three-dimensional map application, a military simulation application, a third-person shooting game (TPS), a first-person shooting game (FPS), a MOBA game, and a multiplayer gunfight survival game. The application 822 may be a standalone application, such as a standalone 3D game application.
Referring to FIG. 9, which shows a structural block diagram of a computer system provided by an exemplary embodiment of this application. The computer system 900 includes a first terminal 920, a server 940, and a second terminal 960.

The first terminal 920 has installed and runs an application supporting a virtual environment. The application may be any of a virtual reality application, a three-dimensional map application, a military simulation application, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first terminal 920 is a terminal used by a first user, who uses the first terminal 920 to control a first virtual object in the virtual environment to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated character or an anime character.

The first terminal 920 is connected to the server 940 through a wireless or wired network.

The server 940 includes at least one of a single server, multiple servers, a cloud computing platform, and a virtualization center. The server 940 provides background services for applications supporting a three-dimensional virtual environment. Optionally, the server 940 undertakes the primary computing work while the first terminal 920 and the second terminal 960 undertake the secondary computing work; or the server 940 undertakes the secondary computing work while the first terminal 920 and the second terminal 960 undertake the primary computing work; or the server 940, the first terminal 920, and the second terminal 960 perform collaborative computing using a distributed computing architecture.

The second terminal 960 has installed and runs an application supporting a virtual environment. The application may be any of a virtual reality application, a three-dimensional map application, a military simulation application, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second terminal 960 is a terminal used by a second user, who uses the second terminal 960 to control a second virtual object in the virtual environment to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an anime character.

Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permission. Optionally, the first virtual character and the second virtual character may also belong to different teams, different organizations, or two mutually hostile groups.

Optionally, the applications installed on the first terminal 920 and the second terminal 960 are the same, or the applications installed on the two terminals are the same type of application on different control-system platforms. The first terminal 920 may generally refer to one of multiple terminals, and the second terminal 960 may generally refer to one of multiple terminals; this embodiment uses only the first terminal 920 and the second terminal 960 as examples. The terminal types of the first terminal 920 and the second terminal 960 are the same or different, and include at least one of a game console, a desktop computer, a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and a laptop computer. The following embodiments use a mobile terminal as an example.
Referring to FIG. 10, which shows a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of this application. The method may be applied to the terminal in the embodiments of FIG. 8 and FIG. 9 or to the server in the embodiment of FIG. 9; in this embodiment, the method is described as applied to a terminal. The method includes:

Step 1001: Display a first user interface.

The terminal displays a first user interface, which includes the environment screen when a virtual object observes the three-dimensional virtual environment in a first view direction and a view switching area; the view switching area includes at least one view jump element for performing a view jump, where a view jump element is an element control in the view switching area. Optionally, the first view direction is the direction of observation from the first-person view of the virtual object or from a third-person view located around the virtual object.

Optionally, the view jump elements include jump elements represented by a first direction scale, and the first direction scale is used to represent all or some of the view directions in the 360-degree horizontal range centered on the position of the virtual object.

Exemplarily, as shown in FIG. 11, the first user interface 1100 includes a virtual object 1110, an environment screen 1130, a view switching area 1120, and a virtual left-hand joystick 1140. The environment screen 1130 in the first user interface 1100 is the environment screen when the virtual object observes the three-dimensional virtual environment in the first view direction, and includes a hill 1131 directly in front of the virtual object 1110, a factory building 1132 to the northeast of the virtual object 1110, the ground 1134, the sky 1135, and the horizon 1133. The view switching area 1120 is superimposed on the environment screen 1130 and is an elongated area at the top edge of the environment screen 1130, containing view jump elements represented by a first direction scale. In this exemplary embodiment, the first direction scale represents all view directions in the 360-degree horizontal range centered on the position of the virtual object: with due north as 0 degrees and proceeding clockwise, 45 degrees is northeast, 90 degrees is due east, 135 degrees is southeast, 180 degrees is due south, 225 degrees is southwest, 270 degrees is due west, and 315 degrees is northwest.

Optionally, the view jump elements include jump elements represented by a second direction scale, and the second direction scale is used to represent all or some of the view directions between the first look-up direction and the second look-down direction of the virtual object.

Exemplarily, as shown in FIG. 12, the first user interface 1200 includes a virtual object 1210, an environment screen 1230, a view switching area 1220, and a virtual left-hand joystick 1240. The environment screen 1230 in the first user interface 1200 is the environment screen when the virtual object 1210 observes the three-dimensional virtual environment in the first view direction, and includes a hill 1231 far from the virtual object 1210, a factory building 1232 closer to the virtual object 1210, the ground 1234, the sky 1235, and the horizon 1233; the view of the virtual object is in the -45-degree look-down direction. The view switching area 1220 is superimposed on the environment screen 1230 and is an elongated area at the right edge of the environment screen 1230, containing view jump elements represented by a second direction scale. In this exemplary embodiment, the second direction scale represents all view directions between the first look-up direction and the second look-down direction of the virtual object: the vertically upward direction is the first look-up direction, corresponding to 90 degrees, and the vertically downward direction is the second look-down direction, corresponding to -90 degrees.

Optionally, the view jump elements include jump elements represented by first sub-areas, and each first sub-area is used to represent one view direction in the 360-degree horizontal range centered on the position of the virtual object.

Exemplarily, as shown in FIG. 13, the first user interface 1300 includes a virtual object 1310, an environment screen 1330, a view switching area 1320, and a virtual left-hand joystick 1340. The environment screen 1330 in the first user interface 1300 is the environment screen when the virtual object 1310 observes the three-dimensional virtual environment in the first view direction, and includes a hill 1331 directly in front of the virtual object 1310, a factory building 1332 to the northeast of the virtual object 1310, the ground 1334, the sky 1335, and the horizon 1333. The view switching area 1320 is superimposed on the environment screen 1330 and is an elongated area at the top edge of the environment screen 1330, containing view jump elements represented by multiple first sub-areas, each first sub-area corresponding to one view direction in the 360-degree horizontal range centered on the position of the virtual object; each first sub-area may indicate its corresponding view direction through a marker or text.

Optionally, the view jump elements include jump elements represented by second sub-areas, and each second sub-area is used to represent one view direction between the first look-up direction and the second look-down direction of the virtual object.

Exemplarily, as shown in FIG. 14, the first user interface 1400 includes a virtual object 1410, an environment screen 1430, a view switching area 1420, and a virtual left-hand joystick 1440. The environment screen 1430 in the first user interface 1400 is the environment screen when the virtual object 1410 observes the three-dimensional virtual environment in the first view direction, and includes a hill 1431 far from the virtual object 1410, a factory building 1432 closer to the virtual object 1410, the ground 1434, the sky 1435, and the horizon 1433; the view of the virtual object is in the -45-degree look-down direction. The view switching area 1420 is superimposed on the environment screen 1430 and is an elongated area at the right edge of the environment screen 1430, containing view jump elements represented by multiple second sub-areas, each second sub-area corresponding to one view direction between the first look-up direction and the second look-down direction of the virtual object; each second sub-area may indicate its corresponding view direction through a marker or text.

It should be noted that the exemplary embodiments of FIG. 11, FIG. 12, FIG. 13, and FIG. 14 may each be implemented separately, or the embodiments of FIG. 11 and FIG. 12 may be combined, or the embodiments of FIG. 13 and FIG. 14 may be combined, or the embodiments of FIG. 11 and FIG. 14 may be combined, or the embodiments of FIG. 12 and FIG. 13 may be combined.

Step 1002: Receive a view jump signal triggered on a target view jump element.

The terminal receives a view jump signal triggered on a target view jump element; the view switching signal is a signal generated by the user tapping the target view jump element, where the target view jump element is one of the at least one view jump element.

Exemplarily, as shown in FIG. 11, the user taps a target view switching element in the view switching area 1120, such as the 30-degree scale on the ruler, generating a view jump signal, and the terminal receives the view jump signal.

Exemplarily, as shown in FIG. 12, the user taps a target view switching element in the view switching area 1220, such as the 0-degree scale on the ruler, generating a view jump signal, and the terminal receives the view jump signal.

Exemplarily, as shown in FIG. 13, the user taps a target view switching element in the view switching area 1320, such as first sub-area 6, generating a view jump signal, and the terminal receives the view jump signal.

Exemplarily, as shown in FIG. 14, the user taps a target view switching element in the view switching area 1420, such as second sub-area 4, generating a view jump signal, and the terminal receives the view jump signal.

Step 1003: Determine a second view direction corresponding to the target view jump element.

After acquiring the view jump signal, the terminal determines the second view direction corresponding to the target view jump element according to the view jump signal.

Exemplarily, as shown in FIG. 11, according to the view jump signal generated by the user tapping the 30-degree scale in the view switching area 1120, the terminal determines that the second view direction is 30 degrees east of north.

Exemplarily, as shown in FIG. 12, according to the view jump signal generated by the user tapping the 0-degree scale in the view switching area 1220, the terminal determines that the second view direction is the 0-degree head-up direction.

Exemplarily, as shown in FIG. 13, according to the view jump signal generated by the user tapping first sub-area 6 in the view switching area 1320, and according to the prestored direction corresponding to first sub-area 6, the terminal determines that the second view direction is 30 degrees east of north.

Exemplarily, as shown in FIG. 14, according to the view jump signal generated by the user tapping second sub-area 4 in the view switching area 1420, and according to the prestored direction corresponding to second sub-area 4, the terminal determines that the second view direction is the 0-degree head-up direction.

Step 1004: Display a second user interface.

After obtaining the second view direction, the terminal displays a second user interface, which includes the environment screen when the virtual object observes the three-dimensional virtual environment in the second view direction.

Exemplarily, as shown in FIG. 11, the terminal determines that the second view direction is 30 degrees east of north and displays the second user interface 1150, which includes the environment screen 1160 when the virtual object 1110 observes the three-dimensional virtual environment in the direction 30 degrees east of north; the environment screen 1160 includes the factory building 1132 directly in front of the virtual object 1110, the hill 1131 to the west of directly ahead, the sky 1135, the ground 1134, and the horizon 1133.

Exemplarily, as shown in FIG. 12, the terminal determines that the second view direction is the 0-degree head-up direction and displays the second user interface 1250, which includes the environment screen 1260 when the virtual object 1210 observes the three-dimensional virtual environment in the 0-degree head-up direction; the environment screen 1260 includes the hill 1231 farther directly ahead of the virtual object 1210, the factory building 1232 closer directly ahead, the sky 1235, the ground 1234, and the horizon 1233.

Exemplarily, as shown in FIG. 13, the terminal determines that the second view direction is 30 degrees east of north and displays the second user interface 1350, which includes the environment screen 1360 when the virtual object 1310 observes the three-dimensional virtual environment in the direction 30 degrees east of north; the environment screen 1360 includes the factory building 1332 directly in front of the virtual object 1310, the hill 1331 to the west of directly ahead, the sky 1335, the ground 1334, and the horizon 1333.

Exemplarily, as shown in FIG. 14, the terminal determines that the second view direction is the 0-degree head-up direction and displays the second user interface 1450, which includes the environment screen 1460 when the virtual object 1410 observes the three-dimensional virtual environment in the 0-degree head-up direction; the environment screen 1460 includes the hill 1431 farther directly ahead of the virtual object 1410, the factory building 1432 closer directly ahead, the sky 1435, the ground 1434, and the horizon 1433.

In summary, in the embodiments of this application, by receiving the view jump signal triggered on the target view jump element in the view switching area and displaying the second display interface according to the second view direction corresponding to the target view jump element, the problem in the related art of low human-computer interaction efficiency caused by changing the view direction of the virtual object by sliding the screen with the right hand is solved: because the view direction of the virtual object is switched by tapping the target view jump element rather than through the right-hand joystick, the speed of view switching is increased, thereby improving the efficiency of human-computer interaction.
Referring to FIG. 15, which shows a flowchart of a method for view switching in a virtual environment provided by an exemplary embodiment of this application. The method may be applied to the terminal in the embodiments of FIG. 8 and FIG. 9 or to the server in the embodiment of FIG. 9; in this embodiment, the method is described as applied to a terminal. The method includes:

Step 1501: Display a first user interface.

The terminal displays a first user interface, which includes the environment screen when a virtual object observes the three-dimensional virtual environment in a first view direction and a view switching area; the view switching area includes at least one view jump element for performing a view jump, where a view jump element is an element control in the view switching area.

Step 1502: Receive a view jump signal triggered on a target view jump element.

The terminal receives a view jump signal triggered on a target view jump element; the view switching signal is a signal generated by the user tapping the target view jump element, where the target view jump element is one of the at least one view jump element.

Exemplarily, a complete tap event consists of at least one finger-press action (ACTION_DOWN) and one finger-lift action (ACTION_UP), and may also include at least one finger-slide action (ACTION_MOVE); every complete tap event starts with ACTION_DOWN. Each action can have a corresponding touch coordinate on the touch screen.

When the user's finger presses on the touch display screen, the terminal starts to record the starting coordinate corresponding to the pressed area; when the user lifts the finger, the terminal obtains the ending coordinate of the tap operation. The terminal determines whether the starting coordinate, the ending coordinate, and the sliding coordinates between them are located within the target jump element; if all the coordinates corresponding to the user's tap operation are within the target jump element and the sliding distance exceeds the threshold, the control corresponding to the user's tap operation is determined to be the target jump element.

Step 1503: In a preset correspondence, query the second view direction corresponding to the target view jump element.

After acquiring the view jump signal, the terminal queries, in the preset correspondence, the second view direction corresponding to the target view jump element, where the preset correspondence stores the view direction corresponding to each of the at least one view jump element.

Exemplarily, the terminal stores a correspondence table of each view jump element and its corresponding view direction; by querying the correspondence table, the terminal obtains the view direction corresponding to the target view jump element. The correspondence table may be as shown in Table 1.

Table 1
Scale value | View direction
0 | Due north
15 | 15 degrees east of north
30 | 30 degrees east of north
45 | Northeast

Exemplarily, as shown in FIG. 11, the terminal stores the view direction corresponding to each first direction scale, and the terminal can query the view direction corresponding to the target scale; for example, if the target scale is 30 degrees, the corresponding direction is 30 degrees east of north.

Exemplarily, as shown in FIG. 12, the terminal stores the view direction corresponding to each second direction scale, and the terminal can query the view direction corresponding to the target scale; for example, if the target scale is 0 degrees, the corresponding direction is the 0-degree head-up direction.

Exemplarily, as shown in FIG. 13, the terminal stores the view direction corresponding to each first sub-area, and the terminal can query the view direction corresponding to the target sub-area; for example, if the target sub-area is first sub-area 6, the corresponding direction is 30 degrees east of north.

Exemplarily, as shown in FIG. 14, the terminal stores the view direction corresponding to each second sub-area, and the terminal can query the view direction corresponding to the target sub-area; for example, if the target sub-area corresponds to 0 degrees, the corresponding direction is the 0-degree head-up direction.

In step 1504, a second user interface is displayed.

After obtaining the second view direction, the terminal displays a second user interface, which includes the environment screen when the virtual object observes the three-dimensional virtual environment in the second view direction.

Exemplarily, as shown in FIG. 11, the terminal determines that the second view direction is 30 degrees east of north and displays the second user interface 1150, which includes the environment screen 1160 when the virtual object 1110 observes the three-dimensional virtual environment in the direction 30 degrees east of north; the environment screen 1160 includes the factory building 1132 directly in front of the virtual object 1110, the hill 1131 to the west of directly ahead, the sky 1135, the ground 1134, and the horizon 1133.

Exemplarily, as shown in FIG. 12, the terminal determines that the second view direction is the 0-degree head-up direction and displays the second user interface 1250, which includes the environment screen 1260 when the virtual object 1210 observes the three-dimensional virtual environment in the 0-degree head-up direction; the environment screen 1260 includes the hill 1231 farther directly ahead of the virtual object 1210, the factory building 1232 closer directly ahead, the sky 1235, the ground 1234, and the horizon 1233.

Exemplarily, as shown in FIG. 13, the terminal determines that the second view direction is 30 degrees east of north and displays the second user interface 1350, which includes the environment screen 1360 when the virtual object 1310 observes the three-dimensional virtual environment in the direction 30 degrees east of north; the environment screen 1360 includes the factory building 1332 directly in front of the virtual object 1310, the hill 1331 to the west of directly ahead, the sky 1335, the ground 1334, and the horizon 1333.

Exemplarily, as shown in FIG. 14, the terminal determines that the second view direction is the 0-degree head-up direction and displays the second user interface 1450, which includes the environment screen 1460 when the virtual object 1410 observes the three-dimensional virtual environment in the 0-degree head-up direction; the environment screen 1460 includes the hill 1431 farther directly ahead of the virtual object 1410, the factory building 1432 closer directly ahead, the sky 1435, the ground 1434, and the horizon 1433.

When the view switching method of this embodiment is applied to a server, step 1501 and step 1502 are performed in the terminal and step 1503 is performed by the server: after acquiring the view jump signal, the terminal sends the view jump signal to the server; the server queries, in a locally stored correspondence, the second view direction corresponding to the target view jump element according to the view jump signal and sends the second view direction to the terminal; the terminal displays the second user interface according to the second view direction sent by the server.

In summary, in the embodiments of this application, by receiving the view jump signal triggered on the target view jump element in the view switching area and displaying the second display interface according to the second view direction corresponding to the target view jump element, the problem in the related art of low human-computer interaction efficiency caused by changing the view direction of the virtual object by sliding the screen with the right hand is solved: because the view direction of the virtual object is switched by tapping the target view jump element rather than by sliding the screen with the right hand, the speed of view switching is increased, thereby improving the efficiency of human-computer interaction.

Further, in the embodiments of this application, the second view direction is determined by querying the second view direction corresponding to the target view jump element, and the second user interface is then displayed; because obtaining the second view direction by querying the correspondence is fast, the speed of view switching is increased.
请参考图16,其示出了本申请一个示例性实施例提供的用于对虚拟环境进行视角切换的方法的流程图,该方法可应用于图8、图9实施例中的终端或图9实施例中的服务器中,本实施例中以该方法应用于终端中进行说明,该方法包括:
步骤1601,显示第一用户界面。
终端显示第一用户界面,第一用户界面包括虚拟对象以第一视角方向观察三维虚拟环境时的环境画面和视角切换区域,视角切换区域包括至少一个用于进行视角跳转的视角跳转元素,其中,视角跳转元素是视角切换区域中的元素 控件。
示例性的,如图17所示,第一用户界面1700包括虚拟对象1710、环境画面1730、视角切换区域1720以及虚拟左手摇杆1740。其中,第一用户界面1700中的环境画面1730为虚拟对象1710以第一视角方向观察三维虚拟环境时的环境画面,其包括虚拟对象1710正前方的小山1731、虚拟对象1710东北方向的厂房1732、大地1734、天空1735以及地平线1733。视角切换区域1720叠加在环境画面1730上显示,是位于环境画面1730的顶部边缘的长条形区域,参考视角跳转元素是位于视角切换区域1720的中央区域,视角切换区域1720中显示有滑块1721,用户按压滑块1721可滑动至目标视角跳转元素所在的区域。
示例性的,如图18所示,第一用户界面1800包括虚拟对象1810、环境画面1830、视角切换区域1820以及虚拟左手摇杆1840。其中,第一用户界面1800中的环境画面1830为虚拟对象1810以第一视角方向观察三维虚拟环境时的环境画面,其包括离虚拟对象1810较远的小山1831、离虚拟对象1810较近的厂房1832、大地1834、天空1835以及地平线1833。视角切换区域1820叠加在环境画面1830上显示,是位于环境画面1830的右部边缘的长条形区域,参考视角跳转元素是位于视角切换区域1820的中央区域,视角切换区域1820中显示有滑块1821,用户按压滑块1821可滑动至目标视角跳转元素所在的区域。
步骤1602,接收在目标视角跳转元素上触发的视角跳转信号。
终端接收在目标视角跳转元素上触发的视角跳转信号，视角跳转信号是用户滑动目标视角跳转元素产生的信号，其中，目标视角跳转元素是至少一个视角跳转元素中的一个。
示例性的，如图17所示，用户按压视角切换区域1720中的滑块1721，将其由位于中央区域的参考视角跳转元素滑动至目标视角跳转元素所在的区域，当用户按下滑块1721时触发视角跳转信号，随着用户滑动滑块1721不断产生视角跳转信号，直到滑块1721停止在目标视角跳转元素所在的区域。
示例性的，如图18所示，用户按压视角切换区域1820中的滑块1821，将其由位于中央区域的参考视角跳转元素滑动至目标视角跳转元素所在的区域，当用户按下滑块1821时触发视角跳转信号，随着用户滑动滑块1821不断产生视角跳转信号，直到滑块1821停止在目标视角跳转元素所在的区域。
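示例性的，滑块式交互中，视角跳转信号可在按下时触发并随滑动持续产生，下面给出一个示意性Kotlin草图（SliderBar、onJumpSignal回调等名称均为本文引入的假设，并非本申请限定的实现）：

```kotlin
import android.view.MotionEvent

// 示意性草图：用户按下滑块时触发一次视角跳转信号，随滑动（ACTION_MOVE）持续产生信号，
// 抬起时滑块停在目标视角跳转元素所在位置。SliderBar、onJumpSignal 等名称为假设。
class SliderBar(
    private val barLengthPx: Float,               // 视角切换区域（刻度条）的长度
    private val onJumpSignal: (Float) -> Unit     // 回调：参数为滑块相对刻度条起点的位置
) {
    fun onTouchEvent(event: MotionEvent): Boolean =
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN,
            MotionEvent.ACTION_MOVE -> {
                // 将触摸横坐标限制在刻度条范围内，并上报一次视角跳转信号
                val pos = event.x.coerceIn(0f, barLengthPx)
                onJumpSignal(pos)
                true
            }
            MotionEvent.ACTION_UP -> true         // 抬起：滑块停在当前（目标）位置
            else -> false
        }
}
```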
步骤1603,确定目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向。
终端确定目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向。可选的，参考视角跳转元素是视角切换区域中央区域的视角跳转元素，终端可通过目标视角跳转元素的坐标和参考视角跳转元素的坐标计算得到偏移值和偏移方向。
示例性的,如图17所示,终端根据目标视角跳转元素的参考点坐标以及参考视角跳转元素的参考点坐标,计算参考视角跳转元素以及目标视角跳转元素之间的距离D1,得到偏移值为D1,偏移方向为目标视角跳转元素所对应的方向,其中,参考视角跳转元素的参考点可以是参考视角跳转元素的中心点,目标视角跳转元素的参考点可以是目标视角跳转元素的中心点。
示例性的,如图18所示,终端根据目标视角跳转元素的参考点坐标以及参考视角跳转元素的参考点坐标,计算参考视角跳转元素以及目标视角跳转元素之间的距离D2,得到偏移值为D2,偏移方向为目标视角跳转元素所对应的方向,其中,参考视角跳转元素的参考点可以是参考视角跳转元素的中心点,目标视角跳转元素的参考点可以是目标视角跳转元素的中心点。
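示例性的，偏移值和偏移方向可由两个参考点坐标直接计算得到，下面给出一个示意性Kotlin草图（Offset类型及以左右方向表示偏移方向的方式为本文引入的假设，并非本申请限定的实现）：

```kotlin
import android.graphics.PointF
import kotlin.math.abs

// 示意性草图：由目标视角跳转元素与参考视角跳转元素的参考点（例如各自的中心点）坐标，
// 计算偏移值（距离）与偏移方向。Offset 类型与方向的表示方式为本文引入的假设。
data class Offset(val distance: Float, val towardPositive: Boolean)

// 以水平长条形视角切换区域为例：偏移方向取决于目标参考点在参考视角跳转元素的左侧还是右侧
fun horizontalOffset(target: PointF, reference: PointF): Offset =
    Offset(
        distance = abs(target.x - reference.x),    // 偏移值，对应正文中的 D1
        towardPositive = target.x > reference.x    // true 表示向右侧（顺时针方向）偏移
    )
```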
步骤1604,根据偏移值和偏移方向确定第一视角方向的偏移角度。
终端根据目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向确定第一视角方向的偏移角度。
示例性的，如图17所示，视角切换区域1720采用长条形区域对应以虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向，但是在显示的时候，只显示部分视角方向，例如，以正北方为0度，当虚拟对象1710的视角方向为正北方时，视角切换区域1720显示的视角方向为300度至0度，以及0度至90度，若目标视角跳转元素的参考点位于0度至90度之间，且视角切换区域1720的长度为L1，则偏移角度为90*1/2*D1/L1。
示例性的，如图18所示，视角切换区域1820采用长条形区域对应以虚拟对象所在位置为中心的第一仰视方向和第二俯视方向之间的全部视角方向，其中，竖直向上的方向为第一仰视方向，该方向对应的角度为90度，竖直向下的方向为第二俯视方向，该方向对应的角度为-90度。但是在显示的时候，只显示部分视角方向，例如，以平视的方向为0度，当虚拟对象1810的视角方向为平视时，视角切换区域1820显示的视角方向为-45度至45度，若目标视角跳转元素的参考点位于0度至45度之间，且视角切换区域1820的长度为L2，则偏移角度为90*1/2*D2/L2。
步骤1605,按照偏移角度对第一视角方向进行偏移后,得到第二视角方向。
终端将第一视角方向向偏移方向转动偏移角度，得到第二视角方向。
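示例性的，按照上文给出的换算关系"偏移角度为90*1/2*D/L"，可将偏移值换算为偏移角度并叠加到第一视角方向上，下面是一个示意性Kotlin草图（以方位角表示视角方向、取模归一化等处理为本文引入的假设，并非本申请限定的实现）：

```kotlin
// 示意性草图：按正文给出的换算关系 偏移角度 = 90 * 1/2 * D / L，
// 将偏移值 D 换算为偏移角度（L 为视角切换区域的长度），属正文公式的直接转写。
fun offsetAngle(d: Float, regionLength: Float): Float =
    90f * 0.5f * d / regionLength

// 以水平360度方位角为例（假设）：firstAzimuth 为第一视角方向的方位角（度），
// towardPositive 表示向顺时针（偏东）方向偏移；取模归一化为本文假设的处理方式
fun secondDirection(
    firstAzimuth: Float,
    d: Float,
    regionLength: Float,
    towardPositive: Boolean
): Float {
    val angle = offsetAngle(d, regionLength)
    val second = if (towardPositive) firstAzimuth + angle else firstAzimuth - angle
    return (second % 360f + 360f) % 360f   // 归一化到 [0, 360) 度
}
```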
步骤1606,显示第二用户界面。
在得到第二视角方向后,终端显示第二用户界面,第二用户界面包括虚拟对象以第二视角方向观察三维虚拟环境时的环境画面。
示例性的，如图17所示，终端确定第二视角方向后，显示第二用户界面1750，第二用户界面1750包括虚拟对象1710在第二视角方向观察到三维虚拟环境时的环境画面1760，其中，环境画面1760包括虚拟对象1710正前方的厂房1732、虚拟对象1710正前方偏西的小山1731、天空1735、大地1734以及地平线1733。
示例性的,如图18所示,终端确定第二视角方向后,显示第二用户界面1850,第二用户界面1850包括虚拟对象1810在第二视角方向观察到三维虚拟环境时的环境画面1860,其中,环境画面1860包括虚拟对象1810正前方较远的小山1831、虚拟对象1810正前方较近的厂房1832、天空1835、大地1834以及地平线1833。
综上所述，本申请实施例中，通过接收视角切换区域内的目标视角跳转元素上触发的视角跳转信号，根据目标视角跳转元素对应的第二视角方向显示第二用户界面，解决了相关技术中通过右手滑动屏幕改变虚拟对象的视角方向所带来的人机交互效率较低的问题，由于不通过右手滑动屏幕而是通过点击目标视角跳转元素切换虚拟对象的视角方向，提高了视角切换速度，从而提高了人机交互效率。
进一步的,本申请实施例中,通过确定目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向,从而根据偏移值和偏移方向确定第一视角方向的偏移角度,进而按照偏移角度对第一视角方向进行偏移后,得到第二视角方向,提高了计算第二视角方向的准确度。
视角切换区域显示的跳转元素可以对应虚拟对象的全部视角方向，也可以仅对应虚拟对象的部分视角方向，视角切换区域中跳转元素的显示方式可通过图19的实施例实现。
请参考图19，其示出了本申请一个示例性实施例提供的用于对虚拟环境进行视角切换的方法的流程图，该方法可应用于图8、图9实施例中的终端或图9实施例中的服务器中，本实施例中以该方法应用于终端中进行说明，该方法包括：
步骤1901,显示第一用户界面。
终端显示第一用户界面,第一用户界面包括虚拟对象以第一视角方向观察三维虚拟环境时的环境画面和视角切换区域,视角切换区域包括至少一个用于进行视角跳转的视角跳转元素,其中,视角跳转元素是视角切换区域中的元素控件。
步骤1902,接收在目标视角跳转元素上触发的视角跳转信号。
终端接收在目标视角跳转元素上触发的视角跳转信号，视角跳转信号是用户点击和/或滑动目标视角跳转元素产生的信号，其中，目标视角跳转元素是至少一个视角跳转元素中的一个。
步骤1903,确定与目标视角跳转元素对应的第二视角方向。
终端根据视角跳转信号确定与目标视角跳转元素对应的第二视角方向。
可选的，终端通过图15和图16实施例中的方法确定与目标视角跳转元素对应的第二视角方向。
步骤1904,在视角切换区域的中央区域显示与第二视角方向对应的目标视角跳转元素。
在确定了第二视角方向后，终端根据第二视角方向改变视角切换区域中的视角跳转元素的显示位置，在视角切换区域的中央区域显示与第二视角方向对应的目标视角跳转元素。在执行步骤1904之后，终端可执行步骤1905a或步骤1905b。通常步骤1905a和步骤1905b不会同时执行，但是当用户界面显示两个视角切换区域时，终端可先后执行步骤1905a和步骤1905b，执行的顺序不加限定。
示例性的,如图11所示,终端在确定第二视角方向为北偏东30度时,将第二视角方向对应的目标视角跳转元素,比如刻度30度显示在视角切换区域1120的中心区域。
示例性的，如图12所示，终端在确定第二视角方向为平视的0度时，将第二视角方向对应的目标视角跳转元素，比如刻度0度显示在视角切换区域1220的中心区域。
示例性的，如图13所示，终端在确定第二视角方向为北偏东30度时，将第二视角方向对应的目标视角跳转元素，比如第一子区域6显示在视角切换区域1320的中心区域。
示例性的，如图14所示，终端在确定第二视角方向为平视的0度时，将第二视角方向对应的目标视角跳转元素，比如第二子区域4显示在视角切换区域1420的中心区域。
步骤1905a，根据其它视角跳转元素与目标视角跳转元素之间的相对位置关系，在视角切换区域的其它区域上显示其它视角跳转元素。
终端根据视角切换区域中其它视角跳转元素与目标视角跳转元素之间的相对位置关系,在视角切换区域的其它区域上显示其它视角跳转元素。
示例性的，如图13所示，视角切换区域1320中显示的第一子区域1至第一子区域7分别对应以虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向。在视角切换之前，视角方向为第一子区域4对应的方向，在视角切换之后，视角方向为第一子区域6对应的方向，视角切换后，第一子区域6显示在视角切换区域1320的中心区域，原先位于视角切换区域左边缘的第一子区域1和第一子区域2显示在视角切换区域右边缘。
示例性的，如图14所示，视角切换区域1420中的第二子区域1至第二子区域7分别对应以虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向。在视角切换之前，视角方向为第二子区域6对应的方向，在视角切换之后，视角方向为第二子区域4对应的方向，视角切换后，第二子区域4显示在视角切换区域1420的中心区域，其它第二子区域根据与第二子区域4之间的相对位置关系相应调整显示位置。
步骤1905b，根据其它视角跳转元素与目标视角跳转元素之间的相对位置关系，在视角切换区域上隐藏其它视角跳转元素。
终端根据视角切换区域中其它视角跳转元素与目标视角跳转元素之间的相对位置关系,在视角切换区域的其它区域上隐藏其它视角跳转元素。
示例性的，如图11所示，视角切换区域1120中的刻度分别对应以虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向，但是只显示部分视角方向，比如以虚拟对象所在位置为中心的90度的视角方向。在视角切换之前，视角方向为0度的刻度对应的方向，在视角切换之后，视角方向为30度的刻度对应的方向，视角切换后，30度的刻度显示在视角切换区域1120的中心区域，原先位于视角切换区域左边缘的西北刻度和330度的刻度被隐藏。
示例性的，如图12所示，视角切换区域1220中的刻度分别对应以虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向，但是只显示部分视角方向，比如只显示以虚拟对象的第一仰视方向至第二俯视方向之间的90度的视角方向。在视角切换之前，视角方向为-45度的刻度对应的方向，在视角切换之后，视角方向为0度的刻度对应的方向，视角切换后，0度的刻度显示在视角切换区域1220的中心区域，原先位于视角切换区域下边缘的-60度的刻度、-75度的刻度和-90度的刻度被隐藏。
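示例性的，"只显示部分视角方向"可以理解为以目标刻度为中心截取一个可见窗口，窗口之外的刻度被隐藏，下面给出一个示意性Kotlin草图（可见窗口宽度90度、刻度间隔15度等参数均为本文引入的假设，并非本申请限定的实现）：

```kotlin
// 示意性草图：视角切换后，以目标刻度为中心、在给定可见窗口宽度内计算应显示的刻度，
// 窗口之外的刻度被隐藏。刻度间隔15度、窗口宽度90度等参数均为本文引入的假设。
fun visibleScales(
    centerDegree: Int,         // 目标视角跳转元素对应的刻度（显示在中央区域）
    visibleSpan: Int = 90,     // 可见窗口宽度（度），即中心两侧各45度
    step: Int = 15             // 相邻刻度之间的间隔（度）
): List<Int> {
    val half = visibleSpan / 2
    return (-half..half step step)
        .map { ((centerDegree + it) % 360 + 360) % 360 }   // 360度环形刻度取模回绕
}

// 例如 visibleScales(30) 返回 [345, 0, 15, 30, 45, 60, 75]：
// 原先以0度为中心时可见的西北刻度（315度）和330度刻度不在列表中，即被隐藏。
```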
步骤1906,显示第二用户界面。
在得到第二视角方向后，终端显示第二用户界面，第二用户界面包括虚拟对象以第二视角方向观察三维虚拟环境时的环境画面以及叠加在环境画面上的视角切换区域，其中，目标视角跳转元素显示在视角切换区域的中央区域。
综上所述，本申请实施例中，通过接收视角切换区域内的目标视角跳转元素上触发的视角跳转信号，根据目标视角跳转元素对应的第二视角方向显示第二用户界面，解决了相关技术中通过右手滑动屏幕改变虚拟对象的视角方向所带来的人机交互效率较低的问题，由于不通过右手滑动屏幕而是通过点击目标视角跳转元素切换虚拟对象的视角方向，提高了视角切换速度，从而提高了人机交互效率。
进一步的，本申请实施例中，通过将目标视角跳转元素显示在视角切换区域的中央区域，使第二用户界面能够清楚地反映虚拟对象的第二视角方向，提高了视角切换的便利性。
请参考图20,其示出了本申请一个示例性实施例提供的视角切换装置的结构框图,该装置用于实现对虚拟环境进行视角切换,该装置可应用于图8、图9实施例中的终端或图9实施例中的服务器中,该装置包括显示模块2010、接收模块2020以及处理模块2030。
显示模块2010,用于显示第一用户界面,第一用户界面包括虚拟对象以第一视角方向观察三维虚拟环境时的环境画面和视角切换区域,视角切换区域包括至少一个用于进行视角跳转的视角跳转元素。
接收模块2020,用于接收在目标视角跳转元素上触发的视角跳转信号,目标视角跳转元素是至少一个视角跳转元素中的一个。
处理模块2030,用于确定与目标视角跳转元素对应的第二视角方向。
显示模块2010，还用于显示第二用户界面，第二用户界面包括虚拟对象以第二视角方向观察三维虚拟环境时的环境画面。
在一个可选的实施例中,视角跳转元素包括采用第一方向刻度表示的跳转元素,第一方向刻度用于表示以虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向或部分视角方向。
在一个可选的实施例中,视角跳转元素包括采用第二方向刻度表示的跳转元素,第二方向刻度用于表示以虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向或部分视角方向。
在一个可选的实施例中,视角跳转元素包括采用第一子区域表示的跳转元素,每个第一子区域用于表示以虚拟对象所在位置为中心的360度水平视角方向中的一个视角方向。
在一个可选的实施例中,视角跳转元素包括采用第二子区域表示的跳转元素,每个第二子区域用于表示以虚拟对象的第一仰视方向至第二俯视方向之间的一个视角方向。
在一个可选的实施例中,视角切换区域叠加在环境画面上显示。
在一个可选的实施例中,视角切换区域是位于环境画面的预定边缘的长条形区域,预定边缘包括顶部边缘、底部边缘、左侧边缘和右侧边缘中的至少一个。
在一个可选的实施例中,
处理模块2030，还用于在预设对应关系中，查询与目标视角跳转元素对应的第二视角方向；其中，预设对应关系中存储有至少一个视角跳转元素各自对应的视角方向。
在一个可选的实施例中,
处理模块2030,还用于确定目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向,参考视角跳转元素是第一视角方向对应的视角跳转元素;根据偏移值和偏移方向确定第一视角方向的偏移角度;按照偏移角度对第一视角方向进行偏移后,得到第二视角方向。
在一个可选的实施例中,
显示模块2010,还用于根据第二视角方向改变视角切换区域中的视角跳转元素的显示位置。
在一个可选的实施例中,视角跳转元素包括采用第一方向刻度表示的跳转元素;
显示模块2010，还用于在视角切换区域的中央区域显示与第二视角方向对应的目标视角跳转元素；根据其它视角跳转元素与目标视角跳转元素之间的相对位置关系，在视角切换区域的其它区域上显示其它视角跳转元素；或者，根据其它视角跳转元素与目标视角跳转元素之间的相对位置关系，在视角切换区域上隐藏其它视角跳转元素。
在一个可选的实施例中,视角切换区域包括采用第一方向刻度表示的第一刻度条和位于第一刻度条上的滑块;视角跳转信号是将滑块在第一刻度条上进行移动的信号;第二视角方向是移动后的滑块在第一刻度条上指示的刻度所对应的视角方向;其中,第一方向刻度用于表示以虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向或部分视角方向。
在一个可选的实施例中,视角切换区域包括采用第二方向刻度表示的第二刻度条和位于第二刻度条上的滑块;视角跳转信号是将滑块在第二刻度条上进行移动的信号;第二视角方向是移动后的滑块在第二刻度条上指示的刻度所对应的视角方向;其中,第二方向刻度用于表示以虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向或部分视角方向。
请参考图21,其示出了本申请一个示例性实施例提供的电子设备2100的结构框图,该设备用于实现对虚拟环境进行视角切换。该电子设备2100可以是:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。电子设备2100还可能被称为终端、用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,电子设备2100包括有:处理器2101和存储器2102。
处理器2101可以包括一个或多个处理核心，比如4核心处理器、8核心处理器等。处理器2101可以采用DSP（Digital Signal Processing，数字信号处理）、FPGA（Field-Programmable Gate Array，现场可编程门阵列）、PLA（Programmable Logic Array，可编程逻辑阵列）中的至少一种硬件形式来实现。处理器2101也可以包括主处理器和协处理器，主处理器是用于对在唤醒状态下的数据进行处理的处理器，也称CPU（Central Processing Unit，中央处理器）；协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中，处理器2101可以集成有GPU（Graphics Processing Unit，图像处理器），GPU用于负责显示屏所需要显示的内容的渲染和绘制。在一些实施例中，处理器2101还可以包括AI（Artificial Intelligence，人工智能）处理器，该AI处理器用于处理有关机器学习的计算操作。
存储器2102可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是非暂态的。存储器2102还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器2102中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器2101所执行以实现本申请中方法实施例提供的用于对三维虚拟环境进行视角切换的方法。
在一些实施例中,电子设备2100还可选包括有:外围设备接口2103和至少一个外围设备。处理器2101、存储器2102和外围设备接口2103之间可以通过总线或信号线相连。各个外围设备可以通过总线、信号线或电路板与外围设备接口2103相连。具体地,外围设备包括:射频电路2104、触摸显示屏2105、摄像头2106、音频电路2107、定位组件2108和电源2109中的至少一种。
外围设备接口2103可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器2101和存储器2102。在一些实施例中,处理器2101、存储器2102和外围设备接口2103被集成在同一芯片或电路板上;在一些其他实施例中,处理器2101、存储器2102和外围设备接口2103中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路2104用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路2104通过电磁信号与通信网络以及其他通信设备进行通信。射频电路2104将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。可选地,射频电路2104包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。射频电路2104可以通过至少一种无线通信协议来与其它终端进行通信。该无线通信协议包括但不限于:万维网、城域网、内联网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路2104还可以包括NFC(Near Field Communication,近距离无线通信)有关的电路,本申请对此不加以限定。
显示屏2105用于显示UI（User Interface，用户界面）。该UI可以包括图形、文本、图标、视频及其它们的任意组合。当显示屏2105是触摸显示屏时，显示屏2105还具有采集在显示屏2105的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器2101进行处理。此时，显示屏2105还可以用于提供虚拟按钮和/或虚拟键盘，也称软按钮和/或软键盘。在一些实施例中，显示屏2105可以为一个，设置在电子设备2100的前面板；在另一些实施例中，显示屏2105可以为至少两个，分别设置在电子设备2100的不同表面或呈折叠设计；在再一些实施例中，显示屏2105可以是柔性显示屏，设置在电子设备2100的弯曲表面上或折叠面上。甚至，显示屏2105还可以设置成非矩形的不规则图形，也即异形屏。显示屏2105可以采用LCD（Liquid Crystal Display，液晶显示屏）、OLED（Organic Light-Emitting Diode，有机发光二极管）等材质制备。
摄像头组件2106用于采集图像或视频。可选地,摄像头组件2106包括前置摄像头和后置摄像头。通常,前置摄像头设置在终端的前面板,后置摄像头设置在终端的背面。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其它融合拍摄功能。在一些实施例中,摄像头组件2106还可以包括闪光灯。闪光灯可以是单色温闪光灯,也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,可以用于不同色温下的光线补偿。
音频电路2107可以包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器2101进行处理,或者输入至射频电路2104以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在电子设备2100的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器2101或射频电路2104的电信号转换为声波。扬声器可以是传统的薄膜扬声器,也可以是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅可以将电信号转换为人类可听见的声波,也可以将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路2107还可以包括耳机插孔。
定位组件2108用于定位电子设备2100的当前地理位置，以实现导航或LBS（Location Based Service，基于位置的服务）。定位组件2108可以是基于美国的GPS（Global Positioning System，全球定位系统）、中国的北斗系统或俄罗斯的格洛纳斯系统的定位组件。
电源2109用于为电子设备2100中的各个组件进行供电。电源2109可以是交流电、直流电、一次性电池或可充电电池。当电源2109包括可充电电池时,该可充电电池可以是有线充电电池或无线充电电池。有线充电电池是通过有线线路充电的电池,无线充电电池是通过无线线圈充电的电池。该可充电电池还可以用于支持快充技术。
在一些实施例中,电子设备2100还包括有一个或多个传感器2110。该一个或多个传感器2110包括但不限于:加速度传感器2111、陀螺仪传感器2112、压力传感器2113、指纹传感器2114、光学传感器2115以及接近传感器2116。
加速度传感器2111可以检测以电子设备2100建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器2111可以用于检测重力加速度在三个坐标轴上的分量。处理器2101可以根据加速度传感器2111采集的重力加速度信号,控制触摸显示屏2105以横向视图或纵向视图进行用户界面的显示。加速度传感器2111还可以用于游戏或者用户的运动数据的采集。
陀螺仪传感器2112可以检测电子设备2100的机体方向及转动角度,陀螺仪传感器2112可以与加速度传感器2111协同采集用户对电子设备2100的3D动作。处理器2101根据陀螺仪传感器2112采集的数据,可以实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器2113可以设置在电子设备2100的侧边框和/或触摸显示屏2105的下层。当压力传感器2113设置在电子设备2100的侧边框时，可以检测用户对电子设备2100的握持信号，由处理器2101根据压力传感器2113采集的握持信号进行左右手识别或快捷操作。当压力传感器2113设置在触摸显示屏2105的下层时，由处理器2101根据用户对触摸显示屏2105的压力操作，实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器2114用于采集用户的指纹，由处理器2101根据指纹传感器2114采集到的指纹识别用户的身份，或者，由指纹传感器2114根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时，由处理器2101授权该用户执行相关的敏感操作，该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器2114可以被设置在电子设备2100的正面、背面或侧面。当电子设备2100上设置有物理按键或厂商Logo时，指纹传感器2114可以与物理按键或厂商Logo集成在一起。
光学传感器2115用于采集环境光强度。在一个实施例中，处理器2101可以根据光学传感器2115采集的环境光强度，控制触摸显示屏2105的显示亮度。具体地，当环境光强度较高时，调高触摸显示屏2105的显示亮度；当环境光强度较低时，调低触摸显示屏2105的显示亮度。在另一个实施例中，处理器2101还可以根据光学传感器2115采集的环境光强度，动态调整摄像头组件2106的拍摄参数。
接近传感器2116,也称距离传感器,通常设置在电子设备2100的前面板。接近传感器2116用于采集用户与电子设备2100的正面之间的距离。在一个实施例中,当接近传感器2116检测到用户与电子设备2100的正面之间的距离逐渐变小时,由处理器2101控制触摸显示屏2105从亮屏状态切换为息屏状态;当接近传感器2116检测到用户与电子设备2100的正面之间的距离逐渐变大时,由处理器2101控制触摸显示屏2105从息屏状态切换为亮屏状态。
本领域技术人员可以理解，图21中示出的结构并不构成对电子设备2100的限定，可以包括比图示更多或更少的组件，或者组合某些组件，或者采用不同的组件布置。
本申请还提供一种计算机可读存储介质，所述存储介质中存储有至少一条指令、至少一段程序、代码集或指令集，所述至少一条指令、所述至少一段程序、所述代码集或指令集由处理器加载并执行以实现上述方法实施例提供的用于对虚拟环境进行视角切换的方法。
本申请还提供了一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述各方面所述的用于对虚拟环境进行视角切换的方法。
应当理解的是,在本文中提及的“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本发明的较佳实施例,并不用以限制本发明,凡在本发明的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (28)

  1. 一种用于对虚拟环境进行视角切换的方法,其特征在于,所述方法包括:
    显示第一用户界面,所述第一用户界面包括虚拟对象以第一视角方向观察三维虚拟环境时的环境画面和视角切换区域,所述视角切换区域包括至少一个用于进行视角跳转的视角跳转元素;
    接收在目标视角跳转元素上触发的视角跳转信号,所述目标视角跳转元素是所述至少一个视角跳转元素中的一个;
    确定与所述目标视角跳转元素对应的第二视角方向;
    显示第二用户界面,所述第二用户界面包括所述虚拟对象以第二视角方向观察所述三维虚拟环境时的环境画面。
  2. 根据权利要求1所述的方法,其特征在于,所述视角跳转元素包括采用第一方向刻度表示的跳转元素,所述第一方向刻度用于表示以所述虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向或部分视角方向。
  3. 根据权利要求1所述的方法,其特征在于,所述视角跳转元素包括采用第二方向刻度表示的跳转元素,所述第二方向刻度用于表示以所述虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向或部分视角方向。
  4. 根据权利要求1所述的方法,其特征在于,所述视角跳转元素包括采用第一子区域表示的跳转元素,每个所述第一子区域用于表示以所述虚拟对象所在位置为中心的360度水平视角方向中的一个视角方向。
  5. 根据权利要求1所述的方法,其特征在于,所述视角跳转元素包括采用第二子区域表示的跳转元素,每个所述第二子区域用于表示以所述虚拟对象的第一仰视方向至第二俯视方向之间的一个视角方向。
  6. 根据权利要求1所述的方法,其特征在于,所述视角切换区域叠加在所述环境画面上显示。
  7. 根据权利要求1所述的方法,其特征在于,所述视角切换区域是位于所述环境画面的预定边缘的长条形区域,所述预定边缘包括顶部边缘、底部边缘、左侧边缘和右侧边缘中的至少一个。
  8. 根据权利要求1至7任一所述的方法,其特征在于,所述确定与所述目标视角跳转元素对应的第二视角方向,包括:
    在预设对应关系中,查询与所述目标视角跳转元素对应的第二视角方向;
    其中,所述预设对应关系中存储有所述至少一个视角跳转元素各自对应的视角方向。
  9. 根据权利要求1至7任一所述的方法,其特征在于,所述确定与所述目标视角跳转元素对应的第二视角方向,包括:
    确定所述目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向,所述参考视角跳转元素是所述第一视角方向对应的视角跳转元素;
    根据所述偏移值和所述偏移方向确定所述第一视角方向的偏移角度;
    按照所述偏移角度对所述第一视角方向进行偏移后,得到所述第二视角方向。
  10. 根据权利要求1至7任一所述的方法,其特征在于,所述方法还包括:
    根据所述第二视角方向改变所述视角切换区域中的所述视角跳转元素的显示位置。
  11. 根据权利要求10所述的方法,其特征在于,所述视角跳转元素包括采用第一方向刻度表示的跳转元素;
    所述根据所述第二视角方向改变所述视角切换区域中的所述视角跳转元素的显示位置,包括:
    在所述视角切换区域的中央区域显示与所述第二视角方向对应的所述目标视角跳转元素;
    根据其它视角跳转元素与所述目标视角跳转元素之间的相对位置关系，在所述视角切换区域的其它区域上显示所述其它视角跳转元素；或者，根据其它视角跳转元素与所述目标视角跳转元素之间的相对位置关系，在所述视角切换区域上隐藏所述其它视角跳转元素。
  12. 根据权利要求1所述的方法,其特征在于,所述视角切换区域包括采用第一方向刻度表示的第一刻度条和位于所述第一刻度条上的滑块;
    所述视角跳转信号是将所述滑块在所述第一刻度条上进行移动的信号;
    所述第二视角方向是移动后的滑块在所述第一刻度条上指示的刻度所对应的视角方向;
    其中,所述第一方向刻度用于表示以所述虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向或部分视角方向。
  13. 根据权利要求1所述的方法,其特征在于,所述视角切换区域包括采用第二方向刻度表示的第二刻度条和位于所述第二刻度条上的滑块;
    所述视角跳转信号是将所述滑块在所述第二刻度条上进行移动的信号;
    所述第二视角方向是移动后的滑块在所述第二刻度条上指示的刻度所对应的视角方向;
    其中,所述第二方向刻度用于表示以所述虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向或部分视角方向。
  14. 一种视角切换装置,其特征在于,所述装置用于实现对虚拟环境进行视角切换,所述装置包括:
    显示模块,用于显示第一用户界面,所述第一用户界面包括虚拟对象以第一视角方向观察虚拟环境时的环境画面和视角切换区域,所述视角切换区域包括至少一个用于进行视角跳转的视角跳转元素;
    接收模块,用于接收在目标视角跳转元素上触发的视角跳转信号,所述目标视角跳转元素是所述至少一个视角跳转元素中的一个;
    处理模块,用于确定与所述目标视角跳转元素对应的第二视角方向;
    所述显示模块,还用于显示第二用户界面,所述第二用户界面包括所述虚拟对象以第二视角方向观察所述虚拟环境时的环境画面。
  15. 根据权利要求14所述的装置，其特征在于，所述视角跳转元素包括采用第一方向刻度表示的跳转元素，所述第一方向刻度用于表示以所述虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向或部分视角方向。
  16. 根据权利要求14所述的装置,其特征在于,所述视角跳转元素包括采用第二方向刻度表示的跳转元素,所述第二方向刻度用于表示以所述虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向或部分视角方向。
  17. 根据权利要求14所述的装置,其特征在于,所述视角跳转元素包括采用第一子区域表示的跳转元素,每个所述第一子区域用于表示以所述虚拟对象所在位置为中心的360度水平视角方向中的一个视角方向。
  18. 根据权利要求14所述的装置,其特征在于,所述视角跳转元素包括采用第二子区域表示的跳转元素,每个所述第二子区域用于表示以所述虚拟对象的第一仰视方向至第二俯视方向之间的一个视角方向。
  19. 根据权利要求14所述的装置,其特征在于,所述视角切换区域叠加在所述环境画面上显示。
  20. 根据权利要求14所述的装置,其特征在于,所述视角切换区域是位于所述环境画面的预定边缘的长条形区域,所述预定边缘包括顶部边缘、底部边缘、左侧边缘和右侧边缘中的至少一个。
  21. 根据权利要求14至20任一所述的装置,其特征在于,
    所述处理模块,还用于在预设对应关系中,查询与所述目标视角跳转元素对应的第二视角方向;其中,所述预设对应关系中存储有所述至少一个视角跳转元素各自对应的视角方向。
  22. 根据权利要求14至20任一所述的装置,其特征在于,
    所述处理模块，还用于确定所述目标视角跳转元素与参考视角跳转元素之间的偏移值和偏移方向，所述参考视角跳转元素是所述第一视角方向对应的视角跳转元素；根据所述偏移值和所述偏移方向确定所述第一视角方向的偏移角度；按照所述偏移角度对所述第一视角方向进行偏移后，得到所述第二视角方向。
  23. 根据权利要求14至20任一所述的装置,其特征在于,
    所述显示模块,还用于根据所述第二视角方向改变所述视角切换区域中的所述视角跳转元素的显示位置。
  24. 根据权利要求23所述的装置,其特征在于,所述视角跳转元素包括采用第一方向刻度表示的跳转元素;
    所述显示模块,还用于在所述视角切换区域的中央区域显示与所述第二视角方向对应的所述目标视角跳转元素;根据其它视角跳转元素与所述目标视角跳转元素之间的相对位置关系,在所述视角切换区域的其它区域上显示所述其它视角跳转元素;或者,根据其它视角跳转元素与所述目标视角跳转元素之间的相对位置关系,在所述视角切换区域上隐藏所述其它视角跳转元素。
  25. 根据权利要求14所述的装置,其特征在于,所述视角切换区域包括采用第一方向刻度表示的第一刻度条和位于所述第一刻度条上的滑块;
    所述视角跳转信号是将所述滑块在所述第一刻度条上进行移动的信号;
    所述第二视角方向是移动后的滑块在所述第一刻度条上指示的刻度所对应的视角方向;
    其中,所述第一方向刻度用于表示以所述虚拟对象所在位置为中心的360度水平视角方向中的全部视角方向或部分视角方向。
  26. 根据权利要求14所述的装置,其特征在于,所述视角切换区域包括采用第二方向刻度表示的第二刻度条和位于所述第二刻度条上的滑块;
    所述视角跳转信号是将所述滑块在所述第二刻度条上进行移动的信号;
    所述第二视角方向是移动后的滑块在所述第二刻度条上指示的刻度所对应的视角方向;
    其中,所述第二方向刻度用于表示以所述虚拟对象的第一仰视方向至第二俯视方向之间的全部视角方向或部分视角方向。
  27. 一种电子设备，其特征在于，所述设备用于实现对虚拟环境进行视角切换，所述设备包括处理器和存储器，所述存储器中存储有至少一条指令，所述指令由所述处理器加载并执行以实现如权利要求1至13任一所述的用于对虚拟环境进行视角切换的方法。
  28. 一种计算机可读存储介质,其特征在于,所述存储介质中存储有至少一条指令,所述指令由处理器加载并执行以实现如权利要求1至13任一所述的用于对虚拟环境进行视角切换的方法。
PCT/CN2018/106552 2018-02-09 2018-09-19 用于对虚拟环境进行视角切换的方法、装置、设备及存储介质 WO2019153750A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP18904652.7A EP3751522A4 (en) 2018-02-09 2018-09-19 METHOD, APPARATUS AND DEVICE FOR VIRTUAL ENVIRONMENT VIEW SWITCHING, AND MEDIA
US16/886,111 US11256384B2 (en) 2018-02-09 2020-05-28 Method, apparatus and device for view switching of virtual environment, and storage medium
US17/541,350 US11703993B2 (en) 2018-02-09 2021-12-03 Method, apparatus and device for view switching of virtual environment, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810135907.7A CN108376424A (zh) 2018-02-09 2018-02-09 用于对三维虚拟环境进行视角切换的方法、装置、设备及存储介质
CN201810135907.7 2018-02-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/886,111 Continuation US11256384B2 (en) 2018-02-09 2020-05-28 Method, apparatus and device for view switching of virtual environment, and storage medium

Publications (1)

Publication Number Publication Date
WO2019153750A1 true WO2019153750A1 (zh) 2019-08-15

Family

ID=63017620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/106552 WO2019153750A1 (zh) 2018-02-09 2018-09-19 用于对虚拟环境进行视角切换的方法、装置、设备及存储介质

Country Status (4)

Country Link
US (2) US11256384B2 (zh)
EP (1) EP3751522A4 (zh)
CN (1) CN108376424A (zh)
WO (1) WO2019153750A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111913635A (zh) * 2020-08-14 2020-11-10 广东申义实业投资有限公司 三维全景图片显示方法、装置、移动终端及存储介质
CN112330823A (zh) * 2020-11-05 2021-02-05 腾讯科技(深圳)有限公司 虚拟道具的显示方法、装置、设备及可读存储介质

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108376424A (zh) 2018-02-09 2018-08-07 腾讯科技(深圳)有限公司 用于对三维虚拟环境进行视角切换的方法、装置、设备及存储介质
CN111277866B (zh) 2018-12-04 2022-05-10 华为技术有限公司 一种控制vr视频播放的方法及相关装置
CN109675310A (zh) 2018-12-19 2019-04-26 网易(杭州)网络有限公司 一种游戏中虚拟镜头控制的方法及装置
CN109847354B (zh) 2018-12-19 2020-05-22 网易(杭州)网络有限公司 一种游戏中虚拟镜头控制的方法及装置
CN109966738B (zh) 2019-02-22 2020-07-31 网易(杭州)网络有限公司 信息处理方法、处理装置、电子设备和存储介质
CN110139090A (zh) * 2019-05-22 2019-08-16 北京光启元数字科技有限公司 一种视角处理方法及其处理系统
CN110393916B (zh) 2019-07-26 2023-03-14 腾讯科技(深圳)有限公司 视角转动的方法、装置、设备及存储介质
CN110413171B (zh) * 2019-08-08 2021-02-09 腾讯科技(深圳)有限公司 控制虚拟对象进行快捷操作的方法、装置、设备及介质
CN110533756B (zh) * 2019-08-29 2021-10-29 腾讯科技(深圳)有限公司 贴附型装饰物的设置方法、装置、设备及存储介质
CN111061362A (zh) * 2019-11-21 2020-04-24 珠海剑心互动娱乐有限公司 自适应视角方法、装置、计算设备和存储介质
CN111158469A (zh) * 2019-12-12 2020-05-15 广东虚拟现实科技有限公司 视角切换方法、装置、终端设备及存储介质
CN114097248B (zh) * 2019-12-30 2023-03-28 华为技术有限公司 一种视频流处理方法、装置、设备及介质
CN111282266B (zh) * 2020-02-14 2021-08-03 腾讯科技(深圳)有限公司 三维虚拟环境中的技能瞄准方法、装置、终端及存储介质
CN111773657B (zh) * 2020-08-11 2024-06-04 网易(杭州)网络有限公司 一种游戏中视角切换的方法及装置、电子设备、存储介质
CN114095785B (zh) * 2020-08-25 2023-11-24 上海哔哩哔哩科技有限公司 视频播放方法、装置及计算机设备
CN112230837A (zh) * 2020-11-03 2021-01-15 成都景中教育软件有限公司 一种三维动态几何系统中视角变换方法
CN112906228A (zh) * 2021-02-26 2021-06-04 视伴科技(北京)有限公司 一种浏览赛事活动虚拟场馆的方法及装置
CN113238656B (zh) * 2021-05-25 2024-04-30 北京达佳互联信息技术有限公司 三维图像的显示方法、装置、电子设备及存储介质
CN113313840A (zh) * 2021-06-15 2021-08-27 周永奇 实时虚拟系统及实时虚拟交互方法
CN113641443B (zh) * 2021-08-31 2023-10-24 腾讯科技(深圳)有限公司 界面元素的显示方法、装置、设备及可读存储介质
CN113835521B (zh) * 2021-09-02 2022-11-25 北京城市网邻信息技术有限公司 场景视角的切换方法、装置、电子设备及可读介质
CN114463528A (zh) * 2022-02-14 2022-05-10 深圳须弥云图空间科技有限公司 目标视角播放方法及装置
KR20230171663A (ko) * 2022-06-14 2023-12-21 이루고월드 주식회사 다중 장비를 통한 3d 가상세계 접속 장치 및 그 방법
CN115631320B (zh) * 2022-12-23 2023-02-28 腾讯科技(深圳)有限公司 预计算单元格显示方法、预计算单元格生成方法及装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793060A (zh) * 2014-02-14 2014-05-14 杨智 一种用户交互系统和方法
CN106445118A (zh) * 2016-09-06 2017-02-22 网易(杭州)网络有限公司 虚拟现实交互方法及装置
CN106774893A (zh) * 2016-12-15 2017-05-31 飞狐信息技术(天津)有限公司 一种虚拟现实交互方法及虚拟现实设备
CN107648847A (zh) * 2017-08-22 2018-02-02 网易(杭州)网络有限公司 信息处理方法及装置、存储介质、电子设备
CN108376424A (zh) * 2018-02-09 2018-08-07 腾讯科技(深圳)有限公司 用于对三维虚拟环境进行视角切换的方法、装置、设备及存储介质

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5835692A (en) * 1994-11-21 1998-11-10 International Business Machines Corporation System and method for providing mapping notation in interactive video displays
GB9800397D0 (en) * 1998-01-09 1998-03-04 Philips Electronics Nv Virtual environment viewpoint control
AU6362899A (en) * 1998-10-08 2000-04-26 Cyberworld, International Corp. Systems and methods for displaying three dimensional representations and an associated separate web window
AU2001258670A1 (en) * 2000-05-16 2001-11-26 Nokia Corporation A method and apparatus to browse and access downloaded contextual information
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
US8649554B2 (en) * 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US20200125244A1 (en) * 2009-12-03 2020-04-23 Innoventions, Inc. Context-based graphical view navigation guidance system
US8913009B2 (en) * 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
EP2393056A1 (en) * 2010-06-02 2011-12-07 Layar B.V. Acquiring, ranking and displaying points of interest for use in an augmented reality service provisioning system and graphical user interface for displaying such ranked points of interests
WO2011155118A1 (ja) * 2010-06-07 2011-12-15 パナソニック株式会社 オブジェクト選択装置、オブジェクト選択プログラム、及びオブジェクト選択方法
US20140078144A1 (en) * 2012-09-14 2014-03-20 Squee, Inc. Systems and methods for avatar creation
CN104102407B (zh) * 2013-04-08 2018-04-17 上海联影医疗科技有限公司 一种参数输入方法及系统
US10558353B2 (en) * 2015-11-18 2020-02-11 Samsung Electronics Co., Ltd. System and method for 360-degree video navigation
CN105677177A (zh) * 2015-12-31 2016-06-15 联想(北京)有限公司 显示方法及显示设备
US10628019B2 (en) * 2016-08-22 2020-04-21 Samsung Electronics Co., Ltd. Electronic device and method for rendering 360-degree multimedia content
CN107132981B (zh) * 2017-03-27 2019-03-19 网易(杭州)网络有限公司 游戏画面的显示控制方法及装置、存储介质、电子设备
EP3418839B1 (de) * 2017-06-22 2021-10-13 Enisco by Forcam GmbH Verfahren zur überwachung einer automatisierungsanlage
US10402043B1 (en) * 2017-08-10 2019-09-03 Gopro, Inc. Systems and methods for indicating highlights within spherical videos
CN107803028B (zh) * 2017-09-30 2019-03-08 网易(杭州)网络有限公司 信息处理方法、装置、电子设备及存储介质
CN107754305A (zh) * 2017-10-13 2018-03-06 网易(杭州)网络有限公司 信息处理方法及装置、存储介质、电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793060A (zh) * 2014-02-14 2014-05-14 杨智 一种用户交互系统和方法
CN106445118A (zh) * 2016-09-06 2017-02-22 网易(杭州)网络有限公司 虚拟现实交互方法及装置
CN106774893A (zh) * 2016-12-15 2017-05-31 飞狐信息技术(天津)有限公司 一种虚拟现实交互方法及虚拟现实设备
CN107648847A (zh) * 2017-08-22 2018-02-02 网易(杭州)网络有限公司 信息处理方法及装置、存储介质、电子设备
CN108376424A (zh) * 2018-02-09 2018-08-07 腾讯科技(深圳)有限公司 用于对三维虚拟环境进行视角切换的方法、装置、设备及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3751522A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111913635A (zh) * 2020-08-14 2020-11-10 广东申义实业投资有限公司 三维全景图片显示方法、装置、移动终端及存储介质
CN112330823A (zh) * 2020-11-05 2021-02-05 腾讯科技(深圳)有限公司 虚拟道具的显示方法、装置、设备及可读存储介质
CN112330823B (zh) * 2020-11-05 2023-06-16 腾讯科技(深圳)有限公司 虚拟道具的显示方法、装置、设备及可读存储介质

Also Published As

Publication number Publication date
EP3751522A1 (en) 2020-12-16
EP3751522A4 (en) 2022-01-12
CN108376424A (zh) 2018-08-07
US11256384B2 (en) 2022-02-22
US11703993B2 (en) 2023-07-18
US20220091725A1 (en) 2022-03-24
US20200333941A1 (en) 2020-10-22

Similar Documents

Publication Publication Date Title
WO2019153750A1 (zh) 用于对虚拟环境进行视角切换的方法、装置、设备及存储介质
CN108619721B (zh) 虚拟场景中的距离信息显示方法、装置及计算机设备
CN109529319B (zh) 界面控件的显示方法、设备及存储介质
JP7076880B2 (ja) 仮想環境における仮想オブジェクトの姿勢決定方法、装置及び媒体
WO2019201047A1 (zh) 在虚拟环境中进行视角调整的方法、装置及可读存储介质
WO2019153824A1 (zh) 虚拟对象控制方法、装置、计算机设备及存储介质
CN108710525B (zh) 虚拟场景中的地图展示方法、装置、设备及存储介质
CN111035918B (zh) 基于虚拟环境的侦察界面显示方法、装置及可读存储介质
WO2019205881A1 (zh) 虚拟环境中的信息显示方法、装置、设备及存储介质
CN110917616B (zh) 虚拟场景中的方位提示方法、装置、设备及存储介质
CN108786110B (zh) 虚拟环境中的瞄准镜显示方法、设备及存储介质
CN110694273A (zh) 控制虚拟对象使用道具的方法、装置、终端及存储介质
JP2022509634A (ja) 仮想環境における仮想アイテムの観察方法、装置およびコンピュータプログラム
CN111603770B (zh) 虚拟环境画面的显示方法、装置、设备及介质
CN109821237B (zh) 视角转动的方法、装置、设备及存储介质
CN111273780B (zh) 基于虚拟环境的动画播放方法、装置、设备及存储介质
JP7186901B2 (ja) ホットスポットマップの表示方法、装置、コンピュータ機器および読み取り可能な記憶媒体
CN111589141B (zh) 虚拟环境画面的显示方法、装置、设备及介质
WO2019184782A1 (zh) 虚拟场景中的对象控制方法、装置及计算机设备
WO2020156252A1 (zh) 在虚拟环境中建造建筑物的方法、装置、设备及存储介质
CN113577765B (zh) 用户界面的显示方法、装置、设备及存储介质
JP2023139033A (ja) 視点回転の方法、装置、端末およびコンピュータプログラム
CN108744511B (zh) 虚拟环境中的瞄准镜显示方法、设备及存储介质
JP2022524802A (ja) 仮想環境におけるスコープの適用方法及び装置並びにコンピュータ装置及びプログラム
JP2024509064A (ja) 位置マークの表示方法及び装置、機器並びにコンピュータプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18904652

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018904652

Country of ref document: EP

Effective date: 20200909