WO2020207204A1 - Method and apparatus for observing virtual items in a virtual environment, and readable storage medium - Google Patents

Method and apparatus for observing virtual items in a virtual environment, and readable storage medium

Info

Publication number
WO2020207204A1
Authority
WO
WIPO (PCT)
Prior art keywords
wheelbase
rotation
virtual
state
virtual object
Prior art date
Application number
PCT/CN2020/079651
Other languages
English (en)
French (fr)
Inventor
杨槿
Original Assignee
腾讯科技(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to BR112021018341A (BR112021018341A2, pt)
Priority to SG11202104919UA (en)
Priority to EP20787691.3A (EP3954444A4, en)
Priority to AU2020256776 (AU2020256776B2, en)
Priority to JP2021529272A (JP7145331B2, ja)
Priority to KR1020217020054A (KR102602074B1, ko)
Priority to CA3133001A (CA3133001A1, en)
Publication of WO2020207204A1 (zh)
Priority to US17/326,160 (US11766613B2, en)

Classifications

    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F13/2145: Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/5252: Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A63F13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A63F13/837: Shooting of targets
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • A63F13/53: Controlling the output signals based on the game progress, involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F2300/8076: Features of games using an electronically generated display, specially adapted for executing a specific type of game: shooting
    • A63F2300/8082: Features of games using an electronically generated display, specially adapted for executing a specific type of game: virtual reality
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the embodiments of the present application relate to the field of computers, and in particular, to a method, device, and readable storage medium for observing virtual items in a virtual environment.
  • in the process of picking up a virtual item, the virtual object first needs to observe the position of the virtual item through a downward observation operation and move next to the virtual item according to that position; then, after the target virtual item is selected, the pick-up operation is performed.
  • a method, device and readable storage medium for observing virtual items in a virtual environment are provided.
  • a method for observing virtual items in a virtual environment is provided, executed by a computer device, the method including:
  • displaying a first environment interface, the first environment interface including a picture of the virtual environment being rotated and observed with a first rotation wheelbase, the first rotation wheelbase being the distance between the observation point of the virtual object and the rotation center;
  • receiving a viewing angle rotation operation, the viewing angle rotation operation being used to rotate the observation point around the rotation center; when the operating parameters of the viewing angle rotation operation meet the item observation condition, adjusting the first rotation wheelbase to a second rotation wheelbase by adjusting the position of the rotation center; and
  • displaying a second environment interface, the second environment interface including a picture of the virtual item in the virtual environment being rotated and observed with the second rotation wheelbase.
  • a device for observing virtual items in a virtual environment includes:
  • the display module is configured to display a first environment interface, the first environment interface including a picture of the virtual environment being rotated and observed with a first rotation wheelbase, the first rotation wheelbase being the distance between the observation point of the virtual object and the rotation center;
  • a receiving module configured to receive a viewing angle rotation operation, where the viewing angle rotation operation is used to rotate the observation point around the rotation center;
  • an adjustment module configured to adjust the first rotation wheelbase to the second rotation wheelbase by adjusting the position of the rotation center when the operating parameters of the viewing angle rotation operation meet the item observation condition;
  • the display module is further configured to display a second environment interface, and the second environment interface includes an image when the virtual item in the virtual environment is rotated and observed with the second rotation wheelbase.
  • a computer device is provided, including a memory and a processor, the memory storing computer-readable instructions;
  • when the computer-readable instructions are executed by the processor, the processor performs the method for observing virtual items in a virtual environment provided in the embodiments of the present application.
  • one or more non-volatile storage media storing computer-readable instructions are provided; when the computer-readable instructions are executed by one or more processors, the one or more processors perform the method for observing virtual items in a virtual environment provided in the embodiments of the present application.
  • a computer program product is provided; when the computer program product runs on a computer, the computer performs the method for observing virtual items in a virtual environment provided in the above embodiments of the present application.
  • Fig. 1 is a schematic diagram of an observation mode of a camera model provided by an exemplary embodiment of the present application
  • FIG. 2 is a schematic diagram of the corresponding relationship between the rotation wheelbase and the rotation range provided by an exemplary embodiment of the present application;
  • Fig. 3 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Fig. 4 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application.
  • Fig. 5 is a flowchart of a method for observing virtual items in a virtual environment provided by an exemplary embodiment of the present application
  • FIG. 6 is a schematic diagram of an interface of an item observation condition provided based on the embodiment shown in FIG. 5;
  • FIG. 7 is a schematic diagram of an interface of another object observation condition provided based on the embodiment shown in FIG. 5;
  • FIG. 8 is a schematic diagram of an interface of a method for observing virtual items in a virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 9 is a flowchart of a method for observing virtual items in a virtual environment provided by another exemplary embodiment of the present application.
  • FIG. 10 is a schematic diagram of the correspondence between the rotation wheelbases of the virtual object in the standing state and in the driving state, provided based on the embodiment shown in FIG. 9;
  • FIG. 11 is a flowchart of a method for observing virtual items in a virtual environment provided by another exemplary embodiment of the present application.
  • Fig. 12 is a flowchart of a method for observing virtual items in a virtual environment provided by another exemplary embodiment of the present application.
  • Fig. 13 is a structural block diagram of a device for observing virtual items in a virtual environment provided by an exemplary embodiment of the present application
  • FIG. 14 is a structural block diagram of a device for observing virtual items in a virtual environment provided by another exemplary embodiment of the present application.
  • Fig. 15 is a structural block diagram of a terminal provided by another exemplary embodiment of the present application.
  • Rotation wheelbase refers to the distance between the observation point and the rotation center when rotating and observing the virtual environment.
  • A camera model is simulated at the observation point and captures pictures of the virtual environment to realize the observation of the virtual environment; that is, the rotation wheelbase refers to the distance between the camera model and the rotation center of the camera model.
  • The relative position of the camera model and the virtual object is kept unchanged, and the position of the rotation center is adjusted to change the rotation wheelbase.
  • The change of the rotation wheelbase is mainly used to adjust the rotation flexibility of the camera model: when the rotation wheelbase is smaller, the rotation flexibility is higher, and under the same screen sliding operation the camera model rotates through a larger angle, so the observed range of the virtual environment is wider.
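  • The wheelbase-to-flexibility relationship can be made concrete with a minimal sketch (Python); the sensitivity constant and the linear swipe-to-arc mapping below are illustrative assumptions, since the patent does not specify a formula:

```python
import math

def rotation_angle_deg(slide_px: float, wheelbase: float,
                       sensitivity: float = 0.01) -> float:
    """Map a screen swipe to a camera rotation angle.

    Assumed model: the swipe is converted to an arc length in world
    units, and angle = arc / radius, so a smaller rotation wheelbase
    yields a larger angle for the same swipe (higher flexibility).
    """
    arc = slide_px * sensitivity
    return math.degrees(arc / wheelbase)

# The same 200 px swipe rotates the camera further on the shorter wheelbase.
print(round(rotation_angle_deg(200, 1.0), 1))  # 114.6
print(round(rotation_angle_deg(200, 2.5), 1))  # 45.8
```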
  • Camera model refers to the three-dimensional model located around the virtual object in the virtual environment.
  • when the first-person perspective is adopted, the camera model is located near or at the head of the virtual object; when the third-person perspective is adopted, the camera model can be located behind the virtual object and bound to the virtual object, or it can be located at any position at a preset distance from the virtual object.
  • the camera model can observe the virtual object in the three-dimensional virtual environment from different angles.
  • the camera model is located behind the virtual object (such as the head and shoulders of the virtual person).
  • the camera model is not actually displayed in the three-dimensional virtual environment, that is, the camera model cannot be recognized in the three-dimensional virtual environment displayed on the user interface.
  • in the embodiments of the present application, the camera model being located at any position at a preset distance from the virtual object is taken as an example for description.
  • one virtual object corresponds to one camera model, and the camera model can rotate with the virtual object as the rotation center; for example, any point of the virtual object can serve as the rotation center around which the camera model rotates.
  • during rotation, the camera model not only rotates in angle but is also displaced, while the distance between the camera model and the rotation center remains unchanged; that is, the camera model rotates on the surface of a sphere with the rotation center as the sphere center, where the rotation center can be any point of the virtual object, such as the head or the torso, or any point around the virtual object.
  • when the camera model rotates, the viewing angle direction of the camera model is the direction in which the line perpendicular to the tangent plane of the sphere at the camera model's position points toward the virtual object.
  • the camera model can also observe the virtual object at a preset angle in different directions of the virtual object.
  • a point is determined in the virtual object 11 as the rotation center 12, and the camera model rotates around the rotation center 12.
  • the camera model is configured with an initial position, which is a position above and behind the virtual object (for example, a position behind the head).
  • the initial position is position 13, and when the camera model rotates to position 14 or position 15, the viewing angle direction of the camera model changes with the rotation of the camera model.
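  • The spherical rotation described above can be sketched as follows (Python); the coordinate convention, field names, and initial values are assumptions for illustration, not taken from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class OrbitCamera:
    """Camera model kept on a sphere around the rotation center."""
    center: tuple        # rotation center (x, y, z); y is up
    wheelbase: float     # sphere radius = rotation wheelbase
    yaw: float = 0.0     # horizontal rotation, radians
    pitch: float = 0.0   # vertical rotation, radians

    def position(self) -> tuple:
        """Point on the sphere; the view direction is back toward center."""
        cx, cy, cz = self.center
        return (cx + self.wheelbase * math.cos(self.pitch) * math.sin(self.yaw),
                cy + self.wheelbase * math.sin(self.pitch),
                cz + self.wheelbase * math.cos(self.pitch) * math.cos(self.yaw))

cam = OrbitCamera(center=(0.0, 1.6, 0.0), wheelbase=2.0)  # behind-the-head start
cam.yaw = math.radians(45)  # apply a viewing angle rotation operation
print(cam.position())
```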
  • in FIG. 2, the position of the nose of the virtual object 200 is taken as the position of the camera model 210 (using the first-person perspective as an example), and the camera model 210 rotates around the rotation center 221.
  • the first rotation wheelbase of the camera model 210 before the wheelbase adjustment is the distance between the camera model 210 and the rotation center 221, and the rotation range before the adjustment is shown as the dotted spherical range in FIG. 2.
  • after the adjustment, the second rotation wheelbase is the distance between the camera model 210 and the rotation center 222, and the rotation range of the camera model 210 after the adjustment is shown as the solid spherical range in FIG. 2.
  • the terminal in this application can be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on.
  • An application program supporting a virtual environment is installed and running in the terminal, for example, an application program supporting a three-dimensional virtual environment.
  • the application program can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, and MOBA games.
  • the application program may be a stand-alone version application program, such as a stand-alone version of a 3D game program, or a network online version application program.
  • Fig. 3 shows a structural block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • the electronic device 300 includes an operating system 320 and an application program 322.
  • the operating system 320 is basic software that provides the application 322 with secure access to computer hardware.
  • the application program 322 is an application program supporting a virtual environment.
  • the application program 322 is an application program supporting a three-dimensional virtual environment.
  • the application 322 can be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a third-person shooting game (TPS), a first-person shooting game (FPS), a MOBA (Multiplayer Online Battle Arena) game, or a multiplayer gun battle survival game.
  • the application program 322 may be a stand-alone application program, such as a stand-alone 3D game program.
  • Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 400 includes: a first device 420, a server 440, and a second device 460.
  • the first device 420 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, MOBA games, or multiplayer gun battle survival games.
  • the first device 420 is a device used by the first user.
  • the first user uses the first device 420 to control the first virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, or throwing.
  • the first virtual object is a first virtual character, such as a simulated character or an animation character.
  • the first device 420 is connected to the server 440 through a wireless network or a wired network.
  • the server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 440 is used to provide background services for applications supporting the three-dimensional virtual environment.
  • the server 440 is responsible for the main calculation work, and the first device 420 and the second device 460 are responsible for the secondary calculation work; or, the server 440 is responsible for the secondary calculation work, and the first device 420 and the second device 460 are responsible for the main calculation work;
  • the server 440, the first device 420, and the second device 460 adopt a distributed computing architecture to perform collaborative computing.
  • the second device 460 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, and multiplayer gun battle survival games.
  • the second device 460 is a device used by the second user.
  • the second user uses the second device 460 to control the second virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, or throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an animation character.
  • first virtual character and the second virtual character are in the same virtual environment.
  • first virtual character and the second virtual character may belong to the same team, the same organization, have a friend relationship, or have temporary communication permissions.
  • first virtual character and the second virtual character may also belong to different teams, different organizations, or two groups that are hostile.
  • the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of applications on different control system platforms.
  • the first device 420 may generally refer to one of multiple devices, and the second device 460 may generally refer to one of multiple devices. This embodiment only uses the first device 420 and the second device 460 as examples.
  • the device types of the first device 420 and the second device 460 are the same or different; the device types include at least one of: game consoles, desktop computers, smart phones, tablet computers, e-book readers, MP3 players, MP4 players, and laptop computers. In the following embodiments, the device being a desktop computer is taken as an example.
  • the number of the above-mentioned devices can be more or less. For example, there may be only one device, or there may be dozens or hundreds of devices, or more.
  • the embodiment of the present application does not limit the quantity and type of equipment.
  • the method for observing virtual items in the virtual environment provided in the embodiments of the present application will be described.
  • the application of the method in a terminal is taken as an example for description. As shown in FIG. 5, the method includes:
  • Step 501: A first environment interface is displayed, and the first environment interface includes a picture of the virtual environment being rotated and observed with a first rotation wheelbase.
  • the first rotation axis distance is the distance between the observation point of the virtual object and the rotation center.
  • the observation point realizes the observation of the virtual environment by capturing images of the virtual environment through the camera model, that is, the first rotation axis distance is the distance between the camera model and the center of rotation.
  • the virtual environment can be observed with the camera model and the virtual object rotating synchronously, that is, the camera model is bound to the position of the virtual object, and while the camera model rotates, the virtual object rotates synchronously with it; alternatively, the camera model can rotate independently to observe the virtual environment, that is, the virtual object keeps its position and facing direction unchanged while the camera model rotates to observe the virtual environment.
  • the first rotation wheelbase is a default wheelbase when observing the virtual environment.
  • the motion state of the virtual object includes any one of a standing state, a running state, a walking state, a driving state, a riding state, or a swimming state.
  • in different motion states, the corresponding first rotation wheelbase has a different wheelbase length.
  • the driving state and the riding state can also be combined into one state; for example, the riding state and the driving state are combined into the driving state.
  • the person perspective when observing the virtual environment includes any one of a first person perspective or a third person perspective.
  • the wheelbase length of the first rotation wheelbase corresponding to different person perspectives is different.
  • the corresponding first rotation wheelbase has different wheelbase lengths for different combinations of motion states and person perspectives.
  • see Table 1 for the correspondence between the person perspective, the motion state, and the wheelbase length of the first rotation wheelbase:
  • x1 and y1 represent the horizontal and vertical wheelbase when the virtual object is in the standing state and observes the virtual environment from the first-person perspective;
  • x2 and y2 represent the horizontal and vertical wheelbase when the virtual object is in the standing state and observes the virtual environment from the third-person perspective;
  • x3 and y3 represent the horizontal and vertical wheelbase when the virtual object is in the driving state and observes the virtual environment from the first-person perspective;
  • x4 and y4 represent the horizontal and vertical wheelbase when the virtual object is in the driving state and observes the virtual environment from the third-person perspective;
  • x5 and y5 represent the horizontal and vertical wheelbase when the virtual object is in the swimming state and observes the virtual environment from the first-person perspective;
  • x6 and y6 represent the horizontal and vertical wheelbase when the virtual object is in the swimming state and observes the virtual environment from the third-person perspective.
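  • A sketch of such a lookup is shown below (Python); the numeric values are invented placeholders, since Table 1 leaves x1..y6 symbolic:

```python
# (perspective, motion state) -> (horizontal, vertical) first rotation
# wheelbase; all numbers are illustrative stand-ins for x1..y6.
FIRST_WHEELBASE = {
    ("first_person", "standing"): (1.0, 1.5),  # (x1, y1)
    ("third_person", "standing"): (2.0, 1.5),  # (x2, y2)
    ("first_person", "driving"):  (1.2, 1.2),  # (x3, y3)
    ("third_person", "driving"):  (2.5, 1.2),  # (x4, y4)
    ("first_person", "swimming"): (1.5, 1.8),  # (x5, y5)
    ("third_person", "swimming"): (3.0, 1.8),  # (x6, y6)
}

def first_rotation_wheelbase(perspective: str, state: str) -> tuple:
    return FIRST_WHEELBASE[(perspective, state)]

print(first_rotation_wheelbase("third_person", "driving"))  # (2.5, 1.2)
```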
  • Step 502 Receive a viewing angle rotation operation.
  • the viewing angle rotation operation is used to rotate the observation point around the center of rotation.
  • the implementation manner of the viewing angle rotation operation includes at least one of the following manners:
  • the method is applied to a mobile terminal including a touch display screen, and the viewing angle rotation operation is realized by sliding on the touch display screen;
  • the sliding operation on the touch display screen may be performed on the left half of the touch display screen, on the right half of the touch display screen, or at any position on the touch display screen; the specific position of the sliding operation is determined according to the setting parameters that control the viewing angle rotation.
  • the method is applied to a mobile terminal including a gyroscope, and the rotation of the gyroscope is controlled by swinging the terminal to realize the viewing angle rotation operation;
  • an input operation is performed through an external input device to realize the viewing angle rotation operation, such as inputting a direction control signal through a keyboard to realize the viewing angle rotation operation.
  • Step 503: When the operating parameter of the viewing angle rotation operation meets the item observation condition, the first rotation wheelbase is adjusted to the second rotation wheelbase by adjusting the position of the rotation center.
  • the item observation condition is used to indicate a condition corresponding to the observation of the virtual item in the virtual environment.
  • the operating parameter includes an operating angle, and when the operating angle is within the target angle range, the first rotation axis distance is adjusted to the second rotation axis distance by adjusting the position of the rotation center.
  • the operating parameter also includes the operating distance; when the operating angle is within the target angle range and the operating distance reaches the target distance, the first rotation wheelbase is adjusted to the second rotation wheelbase by adjusting the position of the rotation center.
  • the operating parameters also include whether a virtual item exists within a preset distance range of the virtual object in the virtual environment; when the operating angle is within the preset angle range and a virtual item exists within the preset distance range of the virtual object in the virtual environment, the first rotation wheelbase is adjusted to the second rotation wheelbase by adjusting the position of the rotation center.
  • optionally, the virtual item must be located within a preset distance range in the facing direction of the virtual object; that is, when the operating angle is within the preset angle range and a virtual item exists within the preset distance range in the facing direction of the virtual object, the first rotation wheelbase is adjusted to the second rotation wheelbase by adjusting the position of the rotation center, where the virtual item is a virtual item to be picked up.
  • schematically, the first environment interface 600 includes a virtual object 610.
  • when the terminal receives a viewing angle rotation operation whose operating angle is within the preset angle range, it is determined whether a virtual item exists within the facing direction range 620 of the virtual object 610 and within the preset distance range 630 of the virtual object 610; as shown in FIG. 6, a virtual item 640 exists at the intersection of the direction range 620 and the preset distance range 630.
  • the virtual item 640 is treated as the item to be picked up, and the position of the rotation center is adjusted according to the viewing angle rotation operation.
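  • A minimal check combining these conditions might look as follows (Python); every threshold and the angle convention are assumptions for illustration:

```python
import math

def meets_item_observation_condition(op_angle_deg: float,
                                     object_xz: tuple, facing_deg: float,
                                     item_xz: tuple,
                                     max_op_angle: float = 30.0,
                                     facing_half_angle: float = 45.0,
                                     max_distance: float = 5.0) -> bool:
    """True when the slide angle is in the preset range AND an item lies
    within the preset distance range in the object's facing direction."""
    if op_angle_deg > max_op_angle:         # operating angle out of range
        return False
    dx = item_xz[0] - object_xz[0]
    dz = item_xz[1] - object_xz[1]
    if math.hypot(dx, dz) > max_distance:   # item outside the distance range
        return False
    bearing = math.degrees(math.atan2(dx, dz))
    delta = (bearing - facing_deg + 180.0) % 360.0 - 180.0
    return abs(delta) <= facing_half_angle  # item inside the facing range

print(meets_item_observation_condition(10.0, (0, 0), 0.0, (1.0, 3.0)))  # True
```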
  • the operating parameters also include the rotation end position of the camera model; when the operating angle is within the target angle range and the rotation end position is within a preset position range, the first rotation wheelbase is adjusted to the second rotation wheelbase by adjusting the position of the rotation center.
  • optionally, the preset position range is the range above the horizontal plane corresponding to the rotation center; that is, when the rotation end position of the camera model is within that range, the first rotation wheelbase is adjusted to the second rotation wheelbase by adjusting the position of the rotation center.
  • a virtual object 710 is included in the first environment interface 700.
  • the rotation end position of the camera model 720 is determined; as shown in FIG. 7, the horizontal position corresponding to the rotation center 730 of the camera model 720 is a plane 740, and when the rotation end position of the camera model 720 is above the plane 740, the position of the rotation center of the camera model 720 is adjusted according to the rotation operation.
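  • The end-position test of FIG. 7 reduces to a height comparison against the plane through the rotation center; a sketch follows (Python, with y assumed to be the up axis):

```python
def end_position_above_center_plane(camera_end: tuple,
                                    rotation_center: tuple) -> bool:
    """True when the camera model's rotation end position lies above the
    horizontal plane through the rotation center (y is the up axis)."""
    return camera_end[1] > rotation_center[1]

# Looking down at the ground swings the camera up over the plane,
# which is the case that triggers the wheelbase adjustment here.
print(end_position_above_center_plane((0.0, 2.1, -1.0), (0.0, 1.6, 0.0)))  # True
```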
  • Step 504: A second environment interface is displayed, and the second environment interface includes a picture of the virtual item in the virtual environment being rotated and observed with the second rotation wheelbase.
  • the second rotation axis distance is the distance between the observation point and the adjusted rotation center.
  • the second rotation axis distance is the distance between the camera model and the adjusted rotation center.
  • in summary, in the method for observing virtual items in a virtual environment provided in this embodiment, the operating parameters of the viewing angle rotation operation are matched against the item observation conditions to determine whether the viewing angle after the rotation operation is a viewing angle for observing a virtual item; when it is, the rotation wheelbase between the observation point and the rotation center is adjusted by adjusting the position of the rotation center, thereby adjusting the rotation flexibility of the observation point. By monitoring the viewing angle rotation operation, the rotation flexibility matching the observation of virtual items is switched to in real time, and the normal wheelbase is distinguished from the wheelbase used when observing virtual items, so that virtual items are observed in a more flexible way with a wider field of view.
  • the operating angle of the viewing angle rotation operation is used to determine whether the viewing angle rotation operation meets the object observation conditions.
  • the item observation conditions include the operating angle being within the target angle range, the operating distance reaching the target distance, and the rotation end position being within the preset position range.
  • the method provided by this embodiment determines whether a virtual item to be picked up exists within the preset distance range of the virtual object, and triggers the adjustment of the rotation wheelbase only when such an item exists, thereby avoiding false touches that would trigger the adjustment of the rotation wheelbase for item observation when no virtual item is near the virtual object.
  • FIG. 8 is a schematic diagram of an interface of a method for observing virtual items in a virtual environment provided by an exemplary embodiment of the present application.
  • the third-person perspective is taken as an example for illustration.
  • the first environment interface 810 includes the virtual object 811, and the current first environment interface 810 observes the virtual environment with the first rotation wheelbase; after a slide of a first distance on the terminal display screen, the first environment interface 820 is displayed, in which half of a cabinet 821 is shown.
  • when a vertical sliding operation is performed on the terminal display screen and the vertical sliding operation meets the item observation condition, for example, the vertical sliding operation is within the preset angle range 812, the position of the rotation center 822 is adjusted so that the first rotation wheelbase 823 (the distance between the camera model 825 and the rotation center 826) is adjusted to the second rotation wheelbase 824, and a second environment interface 830 is displayed; the second environment interface 830 includes the picture when the virtual item 831 in the virtual environment is observed.
  • after a further slide, the second environment interface 840 is displayed, in which the complete cabinet 821 is shown. According to FIG. 8, under the same sliding distance, the viewing angle rotation angle before and after the adjustment is different, that is, the flexibility of the viewing angle rotation is different.
  • after the first rotation wheelbase is adjusted to the second rotation wheelbase, the flexibility of viewing angle rotation is improved and the field of view during rotation is wider; that is, adjusting the first rotation wheelbase to the second rotation wheelbase serves to adjust the flexibility of viewing angle rotation.
  • optionally, the adjustment of the first rotation wheelbase to the second rotation wheelbase can be used to adjust the flexibility of viewing angle rotation from low to high, or from high to low.
  • FIG. 9 is a flowchart of a method for observing virtual items in a virtual environment provided by another exemplary embodiment of the present application; the application of the method in a terminal is taken as an example for description. As shown in FIG. 9, the method includes:
  • Step 901: A first environment interface is displayed, and the first environment interface includes a picture of the virtual environment being rotated and observed with the first rotation wheelbase.
  • the first rotation axis distance is the distance between the observation point of the virtual object and the rotation center.
  • the observation point realizes the observation of the virtual environment by capturing images of the virtual environment through the camera model, that is, the first rotation axis distance is the distance between the camera model and the center of rotation.
  • the manner of rotating and observing the virtual environment with the first rotation wheelbase has been described in detail in step 501, and will not be repeated here.
  • Step 902 Receive a viewing angle rotation operation.
  • the viewing angle rotation operation is used to rotate the observation point around the center of rotation.
  • Step 903 When the operating parameter of the viewing angle rotation operation meets the object observation condition, determine the motion state of the virtual object.
  • the motion state includes any one of a standing state, a running state, a walking state, a driving state, a riding state, or a swimming state.
  • the manner of determining whether the viewing angle rotation operation meets the item observation condition has been described in detail in step 503, and will not be repeated here.
  • Step 904 Determine the wheelbase length corresponding to the motion state of the virtual object.
  • the manner of determining the wheelbase length corresponding to the motion state includes any one of the following manners:
  • in a first manner, the terminal stores the correspondence between the motion state and the wheelbase length; schematically, the correspondence is shown in Table 2 below:
  • p 1 represents the horizontal wheelbase in the standing state
  • q 1 represents the vertical wheelbase in the standing state
  • p 2 represents the horizontal wheelbase in the driving state
  • q 2 represents the vertical wheelbase in the driving state
  • p 3 represents the horizontal wheelbase in the swimming state
  • q 3 represents the vertical wheelbase in the swimming state.
  • the motion state of the virtual object includes any one of the standing state, the driving state, or the swimming state; taking the standing state as the above-mentioned target state for illustration: when the virtual object is in the driving state, a first wheelbase coefficient corresponding to the driving state is determined, and the product of the first wheelbase coefficient and the reference wheelbase length is determined as the wheelbase length corresponding to the driving state; or, when the virtual object is in the swimming state, a second wheelbase coefficient corresponding to the swimming state is determined, and the product of the second wheelbase coefficient and the reference wheelbase length is determined as the wheelbase length corresponding to the swimming state.
  • the terminal may store the corresponding relationship between the wheelbase coefficient and the motion state, or obtain the corresponding relationship from the server. Schematically, the corresponding relationship is shown in Table 3 below:
  • the second rotation wheelbase in the standing state is (x, y), the second rotation wheelbase corresponding to the driving state is (0.7x, 0.8y), and the second rotation wheelbase corresponding to the swimming state is (1.5x, 1.2y).
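  • The coefficient scheme of Tables 2 and 3 can be sketched directly (Python); the coefficient pairs follow the schematic (0.7x, 0.8y) and (1.5x, 1.2y) values above, with the standing state as the target state:

```python
# Wheelbase coefficients per motion state, relative to the standing state.
WHEELBASE_COEFFICIENTS = {
    "standing": (1.0, 1.0),
    "driving":  (0.7, 0.8),
    "swimming": (1.5, 1.2),
}

def second_rotation_wheelbase(state: str, reference: tuple) -> tuple:
    """Wheelbase length = wheelbase coefficient * reference wheelbase length."""
    cx, cy = WHEELBASE_COEFFICIENTS[state]
    return (cx * reference[0], cy * reference[1])

result = second_rotation_wheelbase("driving", (2.0, 1.5))
print(tuple(round(v, 3) for v in result))  # (1.4, 1.2)
```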
  • as shown in FIG. 10, in the standing state the horizontal rotation wheelbase 1011 corresponding to the virtual object 1010 is x (the distance between the rotation center 1013 and the camera model 1014), and the vertical rotation wheelbase 1012 is y (the distance between the rotation center 1013 and the ground); in the driving state the horizontal rotation wheelbase 1021 is 1.25x (the distance between the rotation center 1023 and the camera model 1024), and the vertical rotation wheelbase 1022 is 0.8y (the distance between the rotation center 1023 and the ground).
  • Step 905: Adjust the position of the rotation center according to the wheelbase length, and adjust the first rotation wheelbase to the second rotation wheelbase.
  • the second rotation axis distance is the distance between the observation point and the adjusted rotation center.
  • the second rotation axis distance is the distance between the camera model and the adjusted rotation center.
  • Step 906: A second environment interface is displayed, and the second environment interface includes a picture of the virtual item in the virtual environment being rotated and observed with the second rotation wheelbase.
  • in summary, in the method for observing virtual items in a virtual environment provided in this embodiment, the operating parameters of the viewing angle rotation operation are matched against the item observation conditions to determine whether the viewing angle after the rotation operation is a viewing angle for observing a virtual item; when it is, the rotation wheelbase between the observation point and the rotation center is adjusted by adjusting the position of the rotation center, thereby adjusting the rotation flexibility of the observation point. By monitoring the viewing angle rotation operation, the rotation flexibility matching the observation of virtual items is switched to in real time, and the normal wheelbase is distinguished from the wheelbase used when observing virtual items, so that virtual items are observed in a more flexible way with a wider field of view.
  • the method provided in this embodiment sets a different second rotation wheelbase for each motion state of the virtual object and adapts the second rotation wheelbase to the motion state; for example, in the driving state the virtual object moves faster and the required flexibility of viewing angle rotation is lower, so a second rotation wheelbase matching the flexibility requirement of the driving state is adapted, avoiding the mismatch that would arise from using the standing-state second rotation wheelbase in the driving state.
  • FIG. 11 is a flowchart of a method for observing virtual items in a virtual environment provided by another exemplary embodiment of the present application. The method is applied to the terminal as an example for description, and the method includes:
  • Step 1101: A first environment interface is displayed, and the first environment interface includes a picture of the virtual environment being rotated and observed with the first rotation wheelbase.
  • the first rotation axis distance is the distance between the observation point of the virtual object and the rotation center.
  • the observation point realizes the observation of the virtual environment by capturing images of the virtual environment through the camera model, that is, the first rotation axis distance is the distance between the camera model and the center of rotation.
  • the manner of rotating and observing the virtual environment with the first rotation wheelbase has been described in detail in step 501, and will not be repeated here.
  • Step 1102 Receive a viewing angle rotation operation.
  • the viewing angle rotation operation is used to rotate the observation point around the center of rotation.
  • Step 1103 When the operating parameter of the viewing angle rotation operation meets the object observation condition, determine the motion state of the virtual object.
  • the motion state includes any one of a standing state, a running state, a walking state, a driving state, a riding state, or a swimming state.
  • the manner of determining whether the viewing angle rotation operation meets the item observation condition has been described in detail in step 503, and will not be repeated here.
  • Step 1104 Using the wheelbase length corresponding to the target state as the reference wheelbase length, determine the wheelbase coefficient corresponding to the perspective of the person and the motion state of the virtual object.
  • optionally, any one of the motion states can serve as the target state.
  • the wheelbase length corresponding to the standing state is used as the reference wheelbase length.
  • the wheelbase length corresponding to the target state in the target person's perspective is used as the reference wheelbase length, for example, the wheelbase length in the standing state and the first-person perspective is used as the reference wheelbase length.
  • the terminal stores the corresponding relationship between the perspective, motion state, and wheelbase coefficient of the person, or the terminal obtains the corresponding relationship from the server.
  • the corresponding relationship is shown in Table 4 below:
  • Step 1105 Determine the product of the wheelbase coefficient and the reference wheelbase length as the wheelbase length corresponding to the motion state of the virtual object.
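  • Since the body of Table 4 did not survive extraction, the sketch below uses invented coefficients keyed by (person perspective, motion state) purely to illustrate the step 1104 and step 1105 computation:

```python
# Invented stand-ins for the Table 4 wheelbase coefficients; the
# standing / first-person pair is the reference (coefficient 1.0).
TABLE_4 = {
    ("first_person", "standing"): (1.0, 1.0),
    ("third_person", "standing"): (1.2, 1.0),
    ("first_person", "driving"):  (0.7, 0.8),
    ("third_person", "driving"):  (0.9, 0.8),
    ("first_person", "swimming"): (1.5, 1.2),
    ("third_person", "swimming"): (1.7, 1.2),
}

def wheelbase_length(perspective: str, state: str, reference: tuple) -> tuple:
    """Step 1105: wheelbase coefficient * reference wheelbase length."""
    cx, cy = TABLE_4[(perspective, state)]
    return (cx * reference[0], cy * reference[1])
```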
  • optionally, the wheelbase length can also be customized through an interface: the terminal displays a wheelbase adjustment interface that includes a preview of the wheelbase, and the user can adjust the wheelbase length by dragging or by numerical input; the wheelbase lengths under different motion states and different person perspectives can also be adjusted by selecting the motion state and the person perspective.
  • Step 1106: Adjust the position of the rotation center according to the wheelbase length, and adjust the first rotation wheelbase to the second rotation wheelbase.
  • the second rotation axis distance is the distance between the observation point and the adjusted rotation center.
  • the second rotation axis distance is the distance between the camera model and the adjusted rotation center.
  • Step 1107 Display a second environment interface.
  • the second environment interface includes an image when the virtual item in the virtual environment is rotated and observed with the second rotation wheelbase.
  • optionally, the second rotation wheelbase is restored to the first rotation wheelbase by adjusting the position of the rotation center; for example, when an upward sliding operation is received on the terminal display screen and the upward sliding operation is within the preset angle range, the second rotation wheelbase is restored to the first rotation wheelbase by adjusting the position of the rotation center, as sketched below.
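  • A minimal sketch of this two-way toggle (Python), assuming slides are reported with a direction and an angle, and with an invented threshold name:

```python
class WheelbaseController:
    """Switches between the normal (first) and item-observation (second)
    rotation wheelbase in response to qualifying slides."""

    def __init__(self, first: tuple, second: tuple):
        self.first, self.second = first, second
        self.current = first

    def on_slide(self, direction: str, angle_deg: float,
                 preset_max_angle: float = 30.0) -> tuple:
        if angle_deg <= preset_max_angle:   # slide within the preset range
            if direction == "down":
                self.current = self.second  # observe items on the ground
            elif direction == "up":
                self.current = self.first   # restore the normal wheelbase
        return self.current

ctl = WheelbaseController(first=(2.0, 1.5), second=(1.4, 1.2))
print(ctl.on_slide("down", 10.0))  # (1.4, 1.2)
print(ctl.on_slide("up", 10.0))    # (2.0, 1.5)
```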
  • in summary, in the method for observing virtual items in a virtual environment provided in this embodiment, the operating parameters of the viewing angle rotation operation are matched against the item observation conditions to determine whether the viewing angle after the rotation operation is a viewing angle for observing a virtual item; when it is, the rotation wheelbase between the observation point and the rotation center is adjusted by adjusting the position of the rotation center, thereby adjusting the rotation flexibility of the observation point. By monitoring the viewing angle rotation operation, the rotation flexibility matching the observation of virtual items is switched to in real time, and the normal wheelbase is distinguished from the wheelbase used when observing virtual items, so that virtual items are observed in a more flexible way with a wider field of view.
  • the method provided in this embodiment adapts the second rotation wheelbase to different person perspectives, avoiding the mismatch that would arise if the same second rotation wheelbase were used for both the first-person and third-person perspectives, whose senses of immersion in the virtual environment differ.
  • FIG. 12 is an overall flowchart of a method for observing virtual items in a virtual environment provided by an exemplary embodiment of the present application.
  • the method is applied to a terminal as an example for description, as shown in FIG. 12.
  • the method includes:
  • Step 1201: Determine whether the included angle p between the sliding operation and the horizontal direction is not greater than a.
  • this judgment determines whether the sliding operation meets the item observation condition.
  • Step 1202: When p > a, the quick-pickup camera scheme is not activated; the viewing angle rotates in the direction of p, and the rotation axis values remain fixed.
  • in this case, the viewing angle rotation does not indicate a need to observe a virtual item in the virtual environment.
  • Step 1203: When p ≤ a, the quick-pickup camera scheme is enabled; the viewing angle rotates in the direction of p, and the state of the virtual object is determined.
  • determining the state of the virtual object includes determining the motion state of the virtual object, and the motion state includes any one of a standing state, a driving state, or a swimming state.
  • Step 1204: When the virtual object is in the standing state, the rotation axis value in the first-person perspective is (x1, y1), and the rotation axis value in the third-person perspective is (x2, y2).
  • Step 1205: When the virtual object is in the driving state, the rotation axis value in the first-person perspective is (x3, y3), and the rotation axis value in the third-person perspective is (x4, y4).
  • Step 1206: When the virtual object is in the swimming state, the rotation axis value in the first-person perspective is (x5, y5), and the rotation axis value in the third-person perspective is (x6, y6).
  • Step 1207: Detect the rotation value, and rotate the viewing angle by the corresponding range on the corresponding rotation axis.
  • the viewing angle is rotated according to the viewing angle rotation operation.
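  • The FIG. 12 flow can be condensed into a few lines (Python); the threshold a and the axis values (x1, y1)..(x6, y6) are symbolic in the patent, so the numbers here are placeholders:

```python
A_DEG = 30.0  # threshold `a` for the included angle p (placeholder value)

ROTATION_AXIS_VALUES = {  # (perspective, state) -> (x_i, y_i), placeholders
    ("first_person", "standing"): (1.0, 1.5),  # (x1, y1)
    ("third_person", "standing"): (2.0, 1.5),  # (x2, y2)
    ("first_person", "driving"):  (0.7, 1.2),  # (x3, y3)
    ("third_person", "driving"):  (1.4, 1.2),  # (x4, y4)
    ("first_person", "swimming"): (1.5, 1.8),  # (x5, y5)
    ("third_person", "swimming"): (3.0, 1.8),  # (x6, y6)
}

def handle_slide(p_deg: float, perspective: str, state: str,
                 current_axis: tuple) -> tuple:
    """Steps 1201-1206: choose the rotation axis values for this slide."""
    if p_deg > A_DEG:
        return current_axis  # step 1202: quick-pickup scheme not activated
    # step 1203: scheme enabled; pick values by state and perspective
    return ROTATION_AXIS_VALUES[(perspective, state)]

# Step 1207 then rotates the viewing angle on the selected rotation axis.
```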
  • in summary, in the method for observing virtual items in a virtual environment provided in this embodiment, the operating parameters of the viewing angle rotation operation are matched against the item observation conditions to determine whether the viewing angle after the rotation operation is a viewing angle for observing a virtual item; when it is, the rotation wheelbase between the observation point and the rotation center is adjusted by adjusting the position of the rotation center, thereby adjusting the rotation flexibility of the observation point. By monitoring the viewing angle rotation operation, the rotation flexibility matching the observation of virtual items is switched to in real time, and the normal wheelbase is distinguished from the wheelbase used when observing virtual items, so that virtual items are observed in a more flexible way with a wider field of view.
  • the steps in the embodiments of the present application are not necessarily executed in the order indicated by the step numbers. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they can be executed in other orders. Moreover, at least part of the steps in each embodiment may include multiple sub-steps or multiple stages; these sub-steps or stages are not necessarily completed at the same moment but may be executed at different moments, and they are not necessarily executed sequentially, but may be executed in turn or alternately with at least part of the other steps or of the sub-steps or stages of other steps.
  • FIG. 13 is a structural block diagram of a device for observing virtual items in a virtual environment provided by an exemplary embodiment of the present application.
  • the device may be implemented in a terminal; as shown in FIG. 13, the device includes: a display module 1310, a receiving module 1320, and an adjustment module 1330;
  • the display module 1310 is configured to display a first environment interface, the first environment interface including a picture of the virtual environment being rotated and observed with a first rotation wheelbase, the first rotation wheelbase being the distance between the observation point of the virtual object and the rotation center;
  • the receiving module 1320 is configured to receive a viewing angle rotation operation, where the viewing angle rotation operation is used to rotate the observation point around the rotation center;
  • the adjustment module 1330 is configured to adjust the first rotation wheelbase to the second rotation wheelbase by adjusting the position of the rotation center when the operating parameters of the viewing angle rotation operation meet the article observation conditions;
  • the display module 1310 is further configured to display a second environment interface, and the second environment interface includes an image when the virtual item in the virtual environment is rotated and observed with the second rotation wheelbase.
  • the adjustment module 1330 includes:
  • the determining unit 1331 is configured to determine the motion state of the virtual object
  • the determining unit 1331 is further configured to determine the wheelbase length corresponding to the motion state of the virtual object, where the wheelbase length is the wheelbase length of the second rotating wheelbase;
  • the adjustment unit 1332 is configured to adjust the position of the rotation center according to the wheelbase length, and adjust the first rotation wheelbase to the second rotation wheelbase.
  • optionally, the determining unit 1331 is further configured to use the wheelbase length corresponding to the target state as the reference wheelbase length, and to determine the wheelbase coefficient corresponding to the motion state of the virtual object;
  • the determining unit 1331 is further configured to determine the product of the wheelbase coefficient and the reference wheelbase length as the wheelbase length corresponding to the motion state of the virtual object.
  • the motion state is any one of a standing state, a driving state, or a swimming state
  • the standing state is the target state
  • the determining unit 1331 is further configured to determine a first wheelbase coefficient corresponding to the driving state when the virtual object is in the driving state, and to determine the product of the first wheelbase coefficient and the reference wheelbase length as the wheelbase length corresponding to the driving state;
  • the determining unit 1331 is further configured to determine a second wheelbase coefficient corresponding to the swimming state when the virtual object is in the swimming state, and to determine the product of the second wheelbase coefficient and the reference wheelbase length as the wheelbase length corresponding to the swimming state.
  • the determining unit 1331 is further configured to determine the person perspective used when observing the virtual environment, and to determine the wheelbase coefficient corresponding to the person perspective and the motion state of the virtual object.
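  • Where the person perspective is also taken into account, the lookup can be keyed by both the motion state and the perspective, with the standing state under the first-person perspective as the reference. The coefficient values follow the worked example in the description; all names are illustrative assumptions.

        PERSPECTIVE_STATE_COEFFICIENTS = {
            ("standing", "first_person"): (1.0, 1.0),   # reference
            ("standing", "third_person"): (0.5, 0.85),
            ("driving",  "first_person"): (0.75, 0.8),
            ("driving",  "third_person"): (2.25, 0.85),
            ("swimming", "first_person"): (0.5, 0.65),
            ("swimming", "third_person"): (1.0, 1.2),
        }

        def wheelbase_for(state: str, perspective: str, reference: tuple) -> tuple:
            cx, cy = PERSPECTIVE_STATE_COEFFICIENTS[(state, perspective)]
            return (cx * reference[0], cy * reference[1])

        print(wheelbase_for("swimming", "third_person", (1.0, 1.0)))  # (1.0, 1.2)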
  • the operating parameters include an operating angle;
  • the adjustment module 1330 is further configured to adjust the first rotation wheelbase to the second rotation wheelbase by adjusting the position of the rotation center when the operating angle is within a target angle range.
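  • A sketch of the operating-angle test on a slide gesture follows. The operating angle is taken here as the deviation of the slide vector from the vertical axis, and the threshold value is an assumption, not one fixed by the application.

        import math

        TARGET_ANGLE = 30.0  # assumed allowed deviation from vertical, in degrees

        def operating_angle_ok(dx: float, dy: float) -> bool:
            # Angle between the slide vector (dx, dy) and the vertical direction.
            deviation = math.degrees(math.atan2(abs(dx), abs(dy)))
            return deviation <= TARGET_ANGLE

        print(operating_angle_ok(10.0, 100.0))   # True: a nearly vertical slide
        print(operating_angle_ok(100.0, 10.0))   # False: a nearly horizontal slide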
  • the operating parameters further include whether a virtual item exists within a preset distance range of the virtual object in the virtual environment;
  • the adjustment module 1330 is further configured to adjust the first rotation wheelbase to the second rotation wheelbase by adjusting the position of the rotation center when the operating angle is within the target angle range and a virtual item exists within the preset distance range of the virtual object in the virtual environment.
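  • The item-presence condition can then be combined with the angle test; the description additionally intersects this with the facing-direction range of the virtual object, as in FIG. 6. The preset distance and all names below are illustrative assumptions.

        import math

        PRESET_DISTANCE = 5.0  # assumed preset distance range

        def item_within_preset_distance(object_pos, item_positions, max_dist=PRESET_DISTANCE):
            # True if any item to be picked up lies within the preset distance.
            ox, oy = object_pos
            return any(math.hypot(ix - ox, iy - oy) <= max_dist for ix, iy in item_positions)

        print(item_within_preset_distance((0.0, 0.0), [(3.0, 4.0)]))    # True: distance 5.0
        print(item_within_preset_distance((0.0, 0.0), [(30.0, 40.0)]))  # False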
  • the observation point observes the virtual environment through a camera model
  • the operating parameters further include a rotation end position of the camera model;
  • the adjustment module 1330 is further configured to adjust the first rotation wheelbase to the second rotation wheelbase by adjusting the position of the rotation center when the operating angle is within the target angle range and the rotation end position is within a preset position range.
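  • A sketch of the rotation-end-position variant: the wheelbase is only switched when the camera model's end position lies within the preset position range, here taken as the range above the horizontal plane through the rotation center, which corresponds to looking down at items as in FIG. 7. Field names are illustrative assumptions.

        def end_position_in_range(camera_end_y: float, rotation_center_y: float) -> bool:
            # Above the horizontal plane through the rotation center, i.e. the camera looks downward.
            return camera_end_y > rotation_center_y

        def should_switch_wheelbase(angle_ok: bool, camera_end_y: float, rotation_center_y: float) -> bool:
            return angle_ok and end_position_in_range(camera_end_y, rotation_center_y)

        print(should_switch_wheelbase(True, camera_end_y=2.0, rotation_center_y=1.2))  # True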
  • In summary, the method and apparatus for observing virtual items in a virtual environment provided in the foregoing embodiments determine, by matching the operating parameters of the viewing angle rotation operation against the item observation conditions, whether the viewing angle after the rotation operation is a viewing angle for observing a virtual item. When it is, the rotation wheelbase between the observation point and the rotation center is adjusted by adjusting the position of the rotation center, thereby adjusting the rotational flexibility of the observation point: the rotational flexibility matching item observation is switched in real time by monitoring the viewing angle rotation operation, and the ordinary wheelbase is distinguished from the wheelbase used when observing virtual items, so that virtual items can be observed in a more flexible manner and with a broader field of view.
  • FIG. 15 shows a structural block diagram of a terminal 1500 provided by an exemplary embodiment of the present invention.
  • the terminal 1500 may be a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, or a desktop computer.
  • the terminal 1500 may also be called user equipment, a portable terminal, a laptop terminal, a desktop terminal, or other names.
  • the terminal 1500 includes a processor 1501 and a memory 1502.
  • the processor 1501 may include one or more processing cores, where "plurality" means at least two, such as a 4-core processor or an 8-core processor.
  • the processor 1501 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1501 may also include a main processor and a coprocessor.
  • the main processor is a processor for processing data in the wake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state.
  • the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display.
  • the processor 1501 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1502 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1502 is used to store at least one instruction, and the at least one instruction is executed by the processor 1501 to implement the method for observing virtual items in a virtual environment provided by the method embodiments of the present application.
  • the terminal 1500 may optionally further include: a peripheral device interface 1503 and at least one peripheral device.
  • the processor 1501, the memory 1502, and the peripheral device interface 1503 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1503 through a bus, a signal line or a circuit board.
  • the peripheral device includes at least one of a radio frequency circuit 1504, a touch display screen 1505, a camera 1506, an audio circuit 1507, a positioning component 1508, and a power supply 1509.
  • the peripheral device interface 1503 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1501 and the memory 1502.
  • in some embodiments, the processor 1501, the memory 1502, and the peripheral device interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral device interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1504 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1504 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on.
  • the radio frequency circuit 1504 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, the generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
  • the display screen 1505 is used to display a UI (User Interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1505 also has the ability to collect touch signals on or above the surface of the display screen 1505.
  • the touch signal may be input to the processor 1501 as a control signal for processing.
  • the display screen 1505 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • in some embodiments, there may be one display screen 1505, disposed on the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, respectively disposed on different surfaces of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display screen disposed on a curved surface or a folding surface of the terminal 1500. The display screen 1505 may even be set in a non-rectangular irregular shape, that is, a special-shaped screen.
  • the display screen 1505 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • the camera assembly 1506 is used to collect images or videos.
  • the camera assembly 1506 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1506 may also include a flash.
  • the flash may be a single color temperature flash or a dual color temperature flash. A dual color temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
  • the audio circuit 1507 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1501 for processing, or input them to the radio frequency circuit 1504 to implement voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively disposed at different parts of the terminal 1500.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is used to convert the electrical signal from the processor 1501 or the radio frequency circuit 1504 into sound waves.
  • the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1507 may also include a headphone jack.
  • the positioning component 1508 is used to locate the current geographic position of the terminal 1500 to implement navigation or LBS (Location Based Service).
  • the positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of Europe.
  • the power supply 1509 is used to supply power to various components in the terminal 1500.
  • the power source 1509 may be alternating current, direct current, disposable batteries or rechargeable batteries.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • a wired rechargeable battery is a battery charged through a wired line, and a wireless rechargeable battery is a battery charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 1500 further includes one or more sensors 1510.
  • the one or more sensors 1510 include, but are not limited to: an acceleration sensor 1511, a gyroscope sensor 1512, a pressure sensor 1513, a fingerprint sensor 1514, an optical sensor 1515, and a proximity sensor 1516.
  • the acceleration sensor 1511 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1500. For example, the acceleration sensor 1511 can be used to detect the components of the gravitational acceleration on three coordinate axes.
  • the processor 1501 may control the touch screen 1505 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1511.
  • the acceleration sensor 1511 may also be used for the collection of game or user motion data.
  • the gyroscope sensor 1512 can detect the body direction and rotation angle of the terminal 1500, and the gyroscope sensor 1512 can cooperate with the acceleration sensor 1511 to collect the user's 3D actions on the terminal 1500. Based on the data collected by the gyroscope sensor 1512, the processor 1501 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1513 may be disposed on the side frame of the terminal 1500 and/or the lower layer of the touch screen 1505.
  • when the pressure sensor 1513 is disposed on the side frame of the terminal 1500, a holding signal of the user on the terminal 1500 can be detected, and the processor 1501 performs left/right hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 1513;
  • when the pressure sensor 1513 is disposed at the lower layer of the touch display screen 1505, the processor 1501 controls the operable controls on the UI according to the user's pressure operation on the touch display screen 1505.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1514 is used to collect the user's fingerprint.
  • the processor 1501 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user's identity according to the collected fingerprint.
  • when the user's identity is identified as a trusted identity, the processor 1501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1514 may be provided on the front, back or side of the terminal 1500. When a physical button or a manufacturer logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1515 is used to collect the ambient light intensity.
  • the processor 1501 may control the display brightness of the touch screen 1505 according to the intensity of the ambient light collected by the optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1505 is decreased.
  • the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 according to the ambient light intensity collected by the optical sensor 1515.
  • the proximity sensor 1516, also called a distance sensor, is usually disposed on the front panel of the terminal 1500.
  • the proximity sensor 1516 is used to collect the distance between the user and the front of the terminal 1500.
  • when the proximity sensor 1516 detects that the distance between the user and the front of the terminal 1500 gradually decreases, the processor 1501 controls the touch display screen 1505 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1516 detects that the distance between the user and the front of the terminal 1500 gradually increases, the processor 1501 controls the touch display screen 1505 to switch from the off-screen state to the bright-screen state.
  • those skilled in the art can understand that the structure shown in FIG. 15 does not constitute a limitation on the terminal 1500, which may include more or fewer components than shown in the figure, combine some components, or adopt a different component arrangement.
  • An embodiment of the present application also provides a computer device including a memory and a processor.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded by the processor to implement the method for observing virtual items in a virtual environment described in any one of FIG. 5, FIG. 9, and FIG. 11.
  • An embodiment of the present application also provides a computer-readable storage medium that stores at least one instruction, at least one program, code set, or instruction set, the at least one instruction, the at least one program, the The code set or instruction set is loaded and executed by the processor to implement the method for observing virtual items in the virtual environment described in any one of FIGS. 5, 9 and 11.
  • This application also provides a computer program product, which when the computer program product runs on a computer, causes the computer to execute the method for observing virtual items in a virtual environment provided by the foregoing method embodiments.
  • All or part of the steps of the foregoing embodiments may be implemented by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium.
  • the medium may be a computer-readable storage medium included in the memory in the foregoing embodiment; or may be a computer-readable storage medium that exists alone and is not assembled into the terminal.
  • the computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for observing virtual items in a virtual environment described in any one of FIG. 5, FIG. 9, and FIG. 11.
  • the computer-readable storage medium may include: read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM) or external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)

Abstract

This application discloses a method and an apparatus for observing virtual items in a virtual environment, and a readable storage medium. The method includes: displaying a first environment interface, the first environment interface including a picture of the virtual environment being observed in rotation with a first rotation wheelbase; receiving a viewing angle rotation operation; when the operating parameters of the viewing angle rotation operation meet the item observation conditions, adjusting the first rotation wheelbase to a second rotation wheelbase by adjusting the position of the rotation center; and displaying a second environment interface, the second environment interface including a picture of the virtual items in the virtual environment being observed in rotation with the second rotation wheelbase.

Description

虚拟环境中虚拟物品的观察方法、装置及可读存储介质
本申请要求于2019年04月11日提交中国专利局,申请号为201910290579.2,申请名称为“虚拟环境中虚拟物品的观察方法、装置及可读存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及计算机领域,特别涉及一种虚拟环境中虚拟物品的观察方法、装置及可读存储介质。
背景技术
在包括虚拟环境的应用程序中,通常需要通过控制虚拟环境中的虚拟对象在虚拟环境中进行活动,如:步行、驾驶、游泳、作战、捡拾物品等,其中,捡拾物品是指控制虚拟对象在虚拟环境中对虚拟物品进行捡拾操作。
相关技术中,在对虚拟物品进行捡拾的过程中,首先虚拟对象需要通过向下观察的操作观察虚拟物品所处的位置,并根据虚拟物品所处的位置移动至虚拟物品周侧,并对目标虚拟物品进行选择后,实现对虚拟物品的捡拾操作。
然而,通过上述方式在虚拟物品的捡拾过程中,经常存在视角转动的准确度较低的问题。
发明内容
根据本申请提供的各种实施例,提供了一种虚拟环境中虚拟物品的观察方法、装置及可读存储介质。
一方面,提供了一种虚拟环境中虚拟物品的观察方法,由计算机设备执行,所述方法包括:
显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离;
接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;及
显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
另一方面,提供了一种虚拟环境中虚拟物品的观察装置,所述装置包括:
显示模块,用于显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离;
接收模块,用于接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
调整模块,用于当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;
所述显示模块,还用于显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
另一方面,提供了一种计算机设备,包括存储器和处理器,所述存储器中存储有计算机可读指令,所述计算机可读指令被所述处理器执行时,使得所述处理器执行如本申请实施例中提供的虚拟环境中虚拟物品的观察方法。
另一方面,一个或多个存储有计算机可读指令的非易失性存储介质,所述计算机可读指令被一个或多个处理器执行时,使得一个或多个处理器执行如本申请实施例中提供的虚拟环境中虚拟物品的观察方法。
另一方面,提供了一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得计算机执行如上述本申请实施例中提供的虚拟环境中虚拟物品的观察方法。
本申请的一个或多个实施例的细节在下面的附图和描述中提出。本申请的其它特征、目的和优点将从说明书、附图以及权利要求书变得明显。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请一个示例性实施例提供的摄像机模型的观察方式的示意图;
图2是本申请一个示例性实施例提供的旋转轴距与旋转范围对应关系示意图;
图3是本申请一个示例性实施例提供的终端的结构框图;
图4是本申请一个示例性实施例提供的实施环境示意图;
图5是本申请一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的流程图;
图6是基于图5示出的实施例提供的物品观察条件的界面示意图;
图7是基于图5示出的实施例提供的另一个物品观察条件的界面示意图;
图8是本申请一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的界面示意图;
图9是本申请另一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的流程图;
图10是基于图9示出的实施例提供的站立状态与驾驶状态的虚拟对象对应的旋转轴的对应关系示意图;
图11是本申请另一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的流程图;
图12是本申请另一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的流程图;
图13是本申请一个示例性实施例提供的虚拟环境中虚拟物品的观察装置的结构框图;
图14是本申请另一个示例性实施例提供的虚拟环境中虚拟物品的观察装置的结构框图;及
图15是本申请另一个示例性实施例提供的终端的结构框图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
首先,对本申请实施例中涉及的名词进行简单介绍:
旋转轴距:是指对虚拟环境进行旋转观察时,观察点与旋转中心之间的距离。可选地,在对虚拟环境进行观察时,是通过摄像机模型在观察点对虚拟环境进行模拟拍摄,采集虚拟环境中的画面实现对虚拟环境的观察的,即该旋转轴距是指摄像机模型与该摄像机模型的旋转中心之间的距离。可选地,该旋转 轴距在变化的过程中,是通过保持摄像机模型与虚拟对象的相对位置不变,而调整旋转中心的位置对该旋转轴距进行变化的,该旋转轴距的变化主要用于对摄像机模型的旋转灵活程度进行调整,示意性的,当旋转轴距较小时,旋转灵活程度较高,在相同的屏幕滑动操作下,摄像机模型旋转的角度较大,对虚拟环境进行观察的范围较广。
摄像机模型:是指在虚拟环境中位于虚拟对象周围的三维模型,当采用第一人称视角对虚拟环境进行观察时,该摄像机模型位于虚拟对象的头部附近或者位于虚拟对象的头部,当采用第三人称视角对虚拟环境进行观察时,该摄像机模型可以位于虚拟对象的后方并与虚拟对象进行绑定,也可以位于与虚拟对象相距预设距离的任意位置,通过该摄像机模型可以从不同角度对位于三维虚拟环境中的虚拟对象进行观察,可选地,该第三人称视角为第一人称的过肩视角时,摄像机模型位于虚拟对象(比如虚拟人物的头肩部)的后方。可选地,该摄像机模型在三维虚拟环境中不会进行实际显示,即,在用户界面显示的三维虚拟环境中无法识别到该摄像机模型。
对该摄像机模型位于与虚拟对象相距预设距离的任意位置为例进行说明,可选地,一个虚拟对象对应一个摄像机模型,该摄像机模型可以以虚拟对象为旋转中心进行旋转,如:以虚拟对象的任意一点为旋转中心对摄像机模型进行旋转,摄像机模型在旋转过程中的不仅在角度上有转动,还在位移上有偏移,旋转时摄像机模型与该旋转中心之间的距离保持不变,即,将摄像机模型在以该旋转中心作为球心的球体表面进行旋转,其中,虚拟对象的任意一点可以是虚拟对象的头部、躯干、或者虚拟对象周围的任意一点,本申请实施例对此不加以限定。可选地,摄像机模型在对虚拟对象进行观察时,该摄像机模型的视角方向为该摄像机模型所在球面的切面上的垂线指向虚拟对象的方向。
可选地,该摄像机模型还可以在虚拟对象的不同方向以预设的角度对虚拟对象进行观察。
示意性的,请参考图1,在虚拟对象11中确定一点作为旋转中心12,摄像机模型围绕该旋转中心12进行旋转,可选地,该摄像机模型配置有一个初始位置,该初始位置为虚拟对象后上方的位置(比如脑部的后方位置)。示意性的,如图1所示,该初始位置为位置13,当摄像机模型旋转至位置14或者位置15时,摄像机模型的视角方向随摄像机模型的转动而进行改变。
结合图1,对旋转轴距的调整进行说明,如图2所示,虚拟对象200的鼻子所在的位置为摄像机模型210所在的位置(以第一人称视角为例),旋转中心 221位于如图2所示的位置,摄像机模型210围绕该旋转中心221进行旋转,则该摄像机模型210在轴距调整前的第一旋转轴距为摄像机模型210到旋转中心221之间的距离,该摄像机模型210在轴距调整前的旋转范围如图2的虚线球形范围所示。将该旋转中心221的位置调整至旋转中心222的位置后,第二旋转轴距为摄像机模型210到旋转中心222之间的距离,该摄像机模型210在轴距调整后的旋转范围如图2所示的实线球形范围所示。
本申请中的终端可以是台式计算机、膝上型便携计算机、手机、平板电脑、电子书阅读器、MP3(Moving Picture Experts GroupAudio Layer III,动态影像专家压缩标准音频层面3)播放器、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器等等。该终端中安装和运行有支持虚拟环境的应用程序,比如支持三维虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏中的任意一种。可选地,该应用程序可以是单机版的应用程序,比如单机版的3D游戏程序,也可以是网络联机版的应用程序。
图3示出了本申请一个示例性实施例提供的电子设备的结构框图。该电子设备300包括:操作系统320和应用程序322。
操作系统320是为应用程序322提供对计算机硬件的安全访问的基础软件。
应用程序322是支持虚拟环境的应用程序。可选地,应用程序322是支持三维虚拟环境的应用程序。该应用程序322可以是虚拟现实应用程序、三维地图程序、军事仿真程序、第三人称射击游戏(Third-Personal Shooting Game,TPS)、第一人称射击游戏(First-person shooting game,FPS)、MOBA游戏、多人枪战类生存游戏中的任意一种。该应用程序322可以是单机版的应用程序,比如单机版的3D游戏程序。
图4示出了本申请一个示例性实施例提供的计算机系统的结构框图。该计算机系统400包括:第一设备420、服务器440和第二设备460。
第一设备420安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏或者多人枪战类生存游戏中的任意一种。第一设备420是第一用户使用的设备,第一用户使用第一设备420控制位于虚拟环境中的第一虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击或者投掷中的至少一种。示意性的,第一虚拟对象是第一虚拟人物,比如仿真人物角色或动漫人物角色。
第一设备420通过无线网络或有线网络与服务器440相连。
服务器440包括一台服务器、多台服务器、云计算平台和虚拟化中心中的至少一种。服务器440用于为支持三维虚拟环境的应用程序提供后台服务。可选地,服务器440承担主要计算工作,第一设备420和第二设备460承担次要计算工作;或者,服务器440承担次要计算工作,第一设备420和第二设备460承担主要计算工作;或者,服务器440、第一设备420和第二设备460三者之间采用分布式计算架构进行协同计算。
第二设备460安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第二设备460是第二用户使用的设备,第二用户使用第二设备460控制位于虚拟环境中的第二虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第二虚拟对象是第二虚拟人物,比如仿真人物角色或动漫人物角色。
可选地,第一虚拟人物和第二虚拟人物处于同一虚拟环境中。可选地,第一虚拟人物和第二虚拟人物可以属于同一个队伍、同一个组织、具有好友关系或具有临时性的通讯权限。可选地,第一虚拟人物和第二虚拟人物也可以属于不同队伍、不同组织、或具有敌对性的两个团体。
可选地,第一设备420和第二设备460上安装的应用程序是相同的,或两个设备上安装的应用程序是不同控制系统平台的同一类型应用程序。第一设备420可以泛指多个设备中的一个,第二设备460可以泛指多个设备中的一个,本实施例仅以第一设备420和第二设备460来举例说明。第一设备420和第二设备460的设备类型相同或不同,该设备类型包括:游戏主机、台式计算机、智能手机、平板电脑、电子书阅读器、MP3播放器、MP4播放器和膝上型便携计算机中的至少一种。以下实施例以设备是台式计算机来举例说明。
本领域技术人员可以知晓,上述设备的数量可以更多或更少。比如上述设备可以仅为一个,或者上述设备为几十个或几百个,或者更多数量。本申请实施例对设备的数量和设备类型不加以限定。
结合上述名词简介以及实施环境说明,对本申请实施例提供的虚拟环境中虚拟物品的观察方法进行说明,以该方法应用于终端中为例进行说明,如图5所示,该方法包括:
步骤501,显示第一环境界面,该第一环境界面中包括以第一旋转轴距对虚 拟环境进行旋转观察时的画面。
可选地,该第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离。可选地,该观察点是通过摄像机模型对虚拟环境进行画面采集实现对虚拟环境的观察的,即该第一旋转轴距为摄像机模型与旋转中心之间的距离。
可选地,在通过摄像机模型对虚拟环境进行旋转观察的过程中,可以是采用摄像机模型与虚拟对象同步旋转的方式对虚拟环境进行观察,即该摄像机模型与虚拟对象位置绑定,在摄像机模型旋转的同时,虚拟对象跟随摄像机模型的旋转进行同步旋转;该摄像机模型还可以采用单独旋转的方式对虚拟环境进行观察,即虚拟对象保持位置以及面对方向不变,摄像机模型通过旋转对虚拟环境进行观察。
可选地,该第一旋转轴距为对虚拟环境进行观察时的默认轴距。
可选地,该虚拟对象所处的运动状态包括站立状态、跑步状态、步行状态、驾驶状态、乘坐状态或者游泳状态中的任意一种。可选地,根据虚拟对象所处运动状态不同,对应的第一旋转轴距的轴距长不同。其中,驾驶状态和乘坐状态也可以合并实现为一种状态,如,乘坐状态与驾驶状态合并实现为驾驶状态。
可选地,对该虚拟环境进行观察时的人称视角包括第一人称视角或者第三人称视角中的任意一种。可选地,不同的人称视角对应的第一旋转轴距的轴距长不同。
可选地,结合虚拟对象所处的运动状态,以及对虚拟环境进行观察时的人称视角,针对不同运动状态以及人称视角的组合,对应的第一旋转轴距的轴距长不同。示意性的,人称视角、运动状态与第一旋转轴距的轴距长之间的对应关系请参考下表一:
表一
运动状态 第一人称视角 第三人称视角
站立状态 (x 1,y 1) (x 2,y 2)
驾驶状态 (x 3,y 3) (x 4,y 4)
游泳状态 (x 5,y 5) (x 6,y 6)
其中,x 1表示虚拟对象处于站立状态,且以第一人称视角观察虚拟环境时的水平方向轴距,y 1表示虚拟对象处于站立状态,且以第一人称视角观察虚拟 环境时的垂直方向轴距,x 2表示虚拟对象处于站立状态,且以第三人称视角观察虚拟环境时的水平方向轴距,y 2表示虚拟对象处于站立状态,且以第三人称视角观察虚拟环境时的垂直方向轴距,x 3表示虚拟对象处于驾驶状态,且以第一人称视角观察虚拟环境时的水平方向轴距,y 3表示虚拟对象处于驾驶状态,且以第一人称视角观察虚拟环境时的垂直方向轴距,x 4表示虚拟对象处于驾驶状态,且以第三人称视角观察虚拟环境时的水平方向轴距,y 4表示虚拟对象处于驾驶状态,且以第三人称视角观察虚拟环境时的垂直方向轴距,x 5表示虚拟对象处于游泳状态,且以第一人称视角观察虚拟环境时的水平方向轴距,y 5表示虚拟对象处于游泳状态,且以第一人称视角观察虚拟环境时的垂直方向轴距,x 6表示虚拟对象处于游泳状态,且以第三人称视角观察虚拟环境时的水平方向轴距,y 6表示虚拟对象处于游泳状态,且以第三人称视角观察虚拟环境时的垂直方向轴距。
步骤502,接收视角旋转操作。
可选地,该视角旋转操作用于将观察点围绕旋转中心进行旋转。
可选地,该视角旋转操作的实现方式包括如下方式中的至少一种:
第一,该方法应用于包括触摸显示屏的移动终端中,通过在触摸显示屏中进行滑动实现该视角旋转操作;
可选地,该在触摸显示屏上的滑动操作可以是在触摸显示屏左半侧屏幕进行的滑动操作,也可以是在触摸显示屏上的右半侧屏幕进行的滑动操作,还可以是在触摸显示屏的任意位置进行的滑动操作,滑动操作的具体位置根据控制该视角旋转的设置参数确定。
第二,该方法应用于包括陀螺仪的移动终端中,通过对终端进行摆动控制陀螺仪的旋转,实现该视角旋转操作;
第三,该方法应用于台式电脑、便携式膝上笔记本电脑中时,通过外部输入设备进行输入操作,实现该视角旋转操作,如:通过键盘输入方向控制信号,以实现该视角旋转操作。
步骤503,当视角旋转操作的操作参数符合物品观察条件时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。
可选地,该物品观察条件用于表示对虚拟环境中的虚拟物品进行观察对应的条件。
可选地,该操作参数包括操作角度,当操作角度在目标角度范围内时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。
可选地,该操作参数中还包括操作距离,当操作角度在目标角度范围内,且操作距离达到目标距离时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。
可选地,该操作参数中还包括虚拟对象在虚拟环境中的预设距离范围内的虚拟物品存在情况,当操作角度在预设角度范围内,且虚拟对象在虚拟环境中的预设距离范围内存在虚拟物品时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。可选地,该虚拟物品位于虚拟对象面对方向的预设距离范围内,也即,当操作角度在预设角度范围内,且虚拟对象所面对的方向范围内,以及虚拟对象的预设距离范围内存在虚拟物品时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距,其中,该虚拟物品为待捡拾的虚拟物品。示意性的,请参考图6,在第一环境界面600中包括虚拟对象610,当终端接收到视角旋转操作,且视角旋转操作的操作角度在预设角度范围内时,确定虚拟对象610所面对的方向范围620内,以及虚拟对象610的预设距离范围630内是否存在虚拟物品,如图6所示,在方向范围620以及预设距离范围630的交集中存在虚拟物品640,该虚拟物品640为待捡拾的物品,则根据该视角旋转操作对旋转中心的位置进行调整。
可选地,上述观察点通过摄像机模型对虚拟环境进行观察时,该操作参数中还包括摄像机模型的旋转终止位置,当操作角度在目标角度范围内,且旋转终止位置在预设位置范围内时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。可选地,该预设位置范围为旋转中心对应的水平位置以上的范围,在该水平位置以上的范围进行观察时,表示该摄像机模型处于由上向下进行观察的状态,即表示该摄像机模型处于对虚拟物品进行观察的状态,则当操作角度在目标角度范围内,且旋转终止位置在该旋转中心对应的水平位置以上的范围内时,通过对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。示意性的,请参考图7,在第一环境界面700中包括虚拟对象710,当终端接收到视角旋转操作,且视角旋转操作的操作角度在预设角度范围内时,确定摄像机模型720的旋转终止位置,如图7所示,该摄像机模型720的旋转中心点730对应的水平位置为平面740,摄像机模型720的旋转终止位置位于该平面740之上,则根据该视角旋转操作对旋转中心的位置进行调整。
步骤504,显示第二环境界面,该第二环境界面中包括以第二旋转轴距对虚拟环境中的虚拟物品进行旋转观察时的画面。
可选地,该第二旋转轴距为观察点与调整后的旋转中心之间的距离。可选地,该第二旋转轴距为摄像机模型与调整后的旋转中心之间的距离。
综上所述,本实施例提供的虚拟环境中虚拟物品的观察方法,通过对视角旋转操作的操作参数与物品观察条件进行匹配,确定该视角旋转操作后的观察视角是否为对虚拟物品进行观察的视角,当该视角旋转操作后的观察视角为对虚拟物品进行观察的视角时,则通过调整旋转中心的位置对观察点与旋转中心之间的旋转轴距进行调整,从而调整观察点的旋转灵活程度,将与观察虚拟物品匹配的旋转灵活度通过对视角旋转操作的监控进行实时切换,区分普通轴距与对虚拟物品进行观察时的轴距,以更灵活视野更广阔的方式对虚拟物品进行观察。
本实施例提供的方法,通过视角旋转操作的操作角度确定该视角旋转操作是否符合物品观察条件,符合物品观察条件包括操作角度在目标角度范围内、操作距离达到目标距离、旋转终止位置在预设位置范围内或者虚拟对象在虚拟环境中的预设距离范围内存在虚拟物品中的至少一种,避免用户在水平方向或斜向的视角旋转操作,由于在垂直方向上也存在操作位移,导致误触旋转轴距的调整的问题。
本实施例提供的方法,通过确定虚拟对象在虚拟环境的预设距离范围内是否存在待捡拾的虚拟物品,并当虚拟对象的预设距离范围内存在虚拟物品时触发旋转轴距的调整,避免了虚拟对象周侧并不存在虚拟物品而触发调整至对虚拟物品进行观察的旋转轴距的误触问题。
示意性的,图8是本申请一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的界面示意图,以第三人称视角为例进行说明,如图8所示,第一环境界面810中包括虚拟对象811,当前第一环境界面810中以第一旋转轴距对虚拟环境进行观察,当在终端显示屏中横向滑动第一距离后,显示第一环境界面820,该第一环境界面820中显示有一半柜子821;当在终端显示屏中纵向滑动后,且该纵向滑动操作符合物品观察条件,如:纵向滑动操作在预设角度范围812以内时,调整旋转中心822的位置以将第一旋转轴距823(摄像机模型825与旋转中心826之间的距离)调整至第二旋转轴距824,并显示第二环境界面830,该第二环境界面830是以第二旋转轴距对虚拟环境中的虚拟物品831进行观察时的画面,当在终端显示屏中横向滑动第一距离后,显示第二环境界面840,该第二环境界面840中显示有完整的柜子821。根据图8以及上述说明可知,将第一旋转轴距调整为第二旋转轴距后,同样的滑动操作下,视角旋转角度不同,也即视角旋转的灵活度不同,如图8所示,由第一旋转轴距调整为第二旋转轴距后,视角旋转的灵活度提高,视角旋转时的视野更广阔。即,将第一旋转轴距调整至第二旋转轴距用于对视角旋转的灵活度进行调整。可选地,该将第一旋转轴距调整至第二旋转轴距可以用于将视角旋转的灵活度由低调高,也可以用于将视角旋转的灵活度由高调低。
在一个可选的实施例中,根据虚拟对象所处的运动状态不同,对应的第二旋转轴距也不同,图9是本申请另一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的流程图,以该方法应用于终端中为例进行说明,如图9所示,该方法包括:
步骤901,显示第一环境界面,该第一环境界面中包括以第一旋转轴距对虚拟环境进行旋转观察时的画面。
可选地,该第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离。可选地,该观察点是通过摄像机模型对虚拟环境进行画面采集实现对虚拟环境的观察的,即该第一旋转轴距为摄像机模型与旋转中心之间的距离。
可选地,通过该第一旋转轴距对虚拟环境进行旋转观察的观察方式在上述步骤501中已进行了详细说明,此处不再赘述。
步骤902,接收视角旋转操作。
可选地,该视角旋转操作用于将观察点围绕旋转中心进行旋转。
可选地,该视角旋转操作的实现方式已在上述步骤502中进行了详细说明,此处不再赘述。
步骤903,当视角旋转操作的操作参数符合物品观察条件时,确定虚拟对象所处的运动状态。
可选地,该运动状态包括站立状态、跑步状态、步行状态、驾驶状态、乘坐状态或者游泳状态中的任意一种。
可选地,确定视角旋转操作是否符合物品观察条件的方式在上述步骤503中已进行了详细说明,此处不再赘述。
步骤904,确定与虚拟对象所处的运动状态对应的轴距长。
可选地,确定与运动状态对应的轴距长的方式包括如下方式中的任意一种:
第一,获取运动状态与轴距长的对应关系,并根据运动状态在该对应关系中确定对应的轴距长;可选地,该对应关系可以是终端中存储的,也可以是终端从服务器中获取的,示意性的,该对应关系如下表二所示:
表二
运动状态 第二旋转轴距
站立状态 (p 1,q 1)
驾驶状态 (p 2,q 2)
游泳状态 (p 3,q 3)
其中,p 1表示站立状态下的水平方向轴距,q 1表示站立状态下的垂直方向轴距,p 2表示驾驶状态下的水平方向轴距,q 2表示驾驶状态下的垂直方向轴距,p 3表示游泳状态下的水平方向轴距,q 3表示游泳状态下的垂直方向轴距。
第二,以目标状态对应的轴距长为基准轴距长,确定虚拟对象所处的运动状态与目标状态对应的轴距系数,并将轴距系数与基准轴距长之积确定为虚拟对象所处的运动状态对应的轴距长。可选地,虚拟对象的运动状态包括站立状态、驾驶状态或者游泳状态中的任意一种,以上述目标状态为站立状态为例进行说明,当虚拟对象处于驾驶状态时,确定驾驶状态对应的第一轴距系数,将第一轴距系数与基准轴距长之积确定为驾驶状态对应的轴距长;或,当虚拟对象处于游泳状态时,确定游泳状态对应的第二轴距系数,将第二轴距系数与基准轴距长之积确定为游泳状态对应的轴距长。可选地,终端可以存储有轴距系数与运动状态之间的对应关系,也可以从服务器中获取该对应关系。示意性的,该对应关系如下表三所示:
表三
运动状态 轴距系数
站立状态 (1,1)
驾驶状态 (1.25,0.8)
游泳状态 (1.5,1.2)
其中,当站立状态的第二旋转轴距为(x,y)时,驾驶状态对应的第二旋转轴距为(1.25x,0.8y),游泳状态对应的第二旋转轴距为(1.5x,1.2y)。
示意性的,请参考图10,当虚拟对象1010处于站立状态时,该虚拟对象1010对应的横向转动轴1011轴距为x(旋转中心1013与摄像机模型1014之间的距离),纵向转动轴1012轴距为y(旋转中心1013与地面之间的距离),当该虚拟对象1010处于驾驶状态时,该驾驶状态对应的横向转动轴1021轴距为1.25x(旋转中心1023与摄像机模型1024之间的距离),纵向转动轴1022轴距为0.8y(旋转中心1023与地面之间的距离)。
步骤905,根据轴距长对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。
可选地,该第二旋转轴距为观察点与调整后的旋转中心之间的距离。可选地,该第二旋转轴距为摄像机模型与调整后的旋转中心之间的距离。
步骤906,显示第二环境界面,该第二环境界面中包括以第二旋转轴距对虚拟环境中的虚拟物品进行旋转观察时的画面。
综上所述,本实施例提供的虚拟环境中虚拟物品的观察方法,通过对视角旋转操作的操作参数与物品观察条件进行匹配,确定该视角旋转操作后的观察视角是否为对虚拟物品进行观察的视角,当该视角旋转操作后的观察视角为对虚拟物品进行观察的视角时,则通过调整旋转中心的位置对观察点与旋转中心之间的旋转轴距进行调整,从而调整观察点的旋转灵活程度,将与观察虚拟物品匹配的旋转灵活度通过对视角旋转操作的监控进行实时切换,区分普通轴距与对虚拟物品进行观察时的轴距,以更灵活视野更广阔的方式对虚拟物品进行观察。
本实施例提供的方法,针对虚拟对象不同的运动状态,设置不同的第二旋转轴距,在不同的运动状态下,适配不同的第二旋转轴距,如:在驾驶状态下,虚拟对象的运动速度较快,需要视角的旋转灵活度较低,故根据该驾驶状态的灵活度需要,适配对应的第二旋转轴距,避免站立状态下的第二旋转轴距与驾驶状态不适配的问题。
在一个可选的实施例中,该第二旋转轴距根据人称视角的不同也不同,图11是本申请另一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的流程图,以该方法应用于终端中为例进行说明,该方法包括:
步骤1101,显示第一环境界面,该第一环境界面中包括以第一旋转轴距对虚拟环境进行旋转观察时的画面。
可选地,该第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离。可选地,该观察点是通过摄像机模型对虚拟环境进行画面采集实现对虚拟环境的观察的,即该第一旋转轴距为摄像机模型与旋转中心之间的距离。
可选地,通过该第一旋转轴距对虚拟环境进行旋转观察的观察方式在上述步骤501中已进行了详细说明,此处不再赘述。
步骤1102,接收视角旋转操作。
可选地,该视角旋转操作用于将观察点围绕旋转中心进行旋转。
可选地,该视角旋转操作的实现方式已在上述步骤502中进行了详细说明,此处不再赘述。
步骤1103,当视角旋转操作的操作参数符合物品观察条件时,确定虚拟对象所处的运动状态。
可选地,该运动状态包括站立状态、跑步状态、步行状态、驾驶状态、乘坐状态或者游泳状态中的任意一种。
可选地,确定视角旋转操作是否符合物品观察条件的方式在上述步骤503中已进行了详细说明,此处不再赘述。
步骤1104,以目标状态对应的轴距长为基准轴距长,确定与人称视角以及虚拟对象所处的运动状态对应的轴距系数。
可选地,运动状态中的任一状态都可以实现为目标状态,示意性的,以站立状态对应的轴距长为基准轴距长。
可选地,以目标状态在目标人称视角下对应的轴距长作为基准轴距长,如:以站立状态以及第一人称视角下的轴距长作为基准轴距长。
可选地,终端中存储有该人称视角、运动状态以及轴距系数的对应关系,或,终端从服务器中获取该对应关系,示意性的,该对应关系如下表四所示:
表四
运动状态 第一人称视角 第三人称视角
站立状态 (1,1) (0.5,0.85)
驾驶状态 (0.75,0.8) (2.25,0.85)
游泳状态 (0.5,0.65) (1,1.2)
步骤1105,将轴距系数与基准轴距长之积确定为虚拟对象所处的运动状态对应的轴距长。
结合上表四,当站立状态及第一人称视角的第二旋转轴距为(x,y)时,站立状态及第三人称视角的第二旋转轴距为(0.5x,0.85y),驾驶状态及第一人称视角的第二旋转轴距为(0.75x,0.8y),驾驶状态及第三人称视角的第二旋转轴距为(2.25x,0.85y),游泳状态及第一人称视角的第二旋转轴距为(0.5x,0.65y),游泳状态及第三人称视角的第二旋转轴距为(x,1.2y)。
可选地,该轴距长还可以通过界面显示的方式进行自定义调节,即终端显示轴距调整界面,该轴距调整界面中包括预览轴距,用户可以通过拖动操作或数值输入操作对该轴距的长度进行调节,还可以通过对运动状态以及人称视角的选择,对不同运动状态以及不同人称视角下的轴距长度进行调节。
步骤1106,根据轴距长对旋转中心的位置进行调整,将第一旋转轴距调整为第二旋转轴距。
可选地,该第二旋转轴距为观察点与调整后的旋转中心之间的距离。可选地,该第二旋转轴距为摄像机模型与调整后的旋转中心之间的距离。
步骤1107,显示第二环境界面,该第二环境界面中包括以第二旋转轴距对虚拟环境中的虚拟物品进行旋转观察时的画面。
可选地,当在该第二环境界面上接收到视角旋转操作,且该视角旋转操作符合转动轴距恢复条件时,通过对旋转中心的位置进行调整,将第二旋转轴距恢复至第一旋转轴距。如:在终端显示屏中接受到向上的滑动操作,且该向上的滑动操作在预设角度范围内时,通过对旋转中心的位置进行调整,将第二旋转轴距恢复至第一旋转轴距。
综上所述,本实施例提供的虚拟环境中虚拟物品的观察方法,通过对视角旋转操作的操作参数与物品观察条件进行匹配,确定该视角旋转操作后的观察视角是否为对虚拟物品进行观察的视角,当该视角旋转操作后的观察视角为对虚拟物品进行观察的视角时,则通过调整旋转中心的位置对观察点与旋转中心之间的旋转轴距进行调整,从而调整观察点的旋转灵活程度,将与观察虚拟物品匹配的旋转灵活度通过对视角旋转操作的监控进行实时切换,区分普通轴距与对虚拟物品进行观察时的轴距,以更灵活视野更广阔的方式对虚拟物品进行观察。
本实施例提供的方法,通过对不同人称视角下的第二旋转轴距进行适配,避免在第三人称视角以及第一人称视角下,虚拟环境的沉浸度不同,而第二旋转轴距相同导致的轴距不匹配的问题。
示意性的,请参考图12,是本申请一个示例性实施例提供的虚拟环境中虚拟物品的观察方法的整体流程图,以该方法应用于终端中为例进行说明,如图12所示,该方法包括:
步骤1201,判断滑动操作在水平方向上的夹角p是否不大于a。
可选地,该判断操作即判断滑动操作是否符合物品观察条件。
步骤1202,当p>a时,不启用快速拾取镜头方案,视角转动方向为p,转动轴固定。
可选地,当p>a时,即该滑动操作不符合物品观察条件,视角转动并不表示需要对虚拟环境中的虚拟物品进行观察。
步骤1203,当p≤a时,启用快速拾取镜头方案,视角转动方向为p,确定虚拟对象所处的状态。
可选地,确定虚拟对象所处的状态包括确定该虚拟对象所处的运动状态,该运动状态包括站立状态、驾驶状态或者游泳状态中的任意一种。
步骤1204,当虚拟对象处于站立状态,则确定第一人称视角下的转动轴值为(x 1,y 1),第三人称视角下的转动轴值为(x 2,y 2)。
步骤1205,当虚拟对象处于驾驶状态,则确定第一人称视角下的转动轴值为(x 3,y 3),第三人称视角下的转动轴值为(x 4,y 4)。
步骤1206,当虚拟对象处于游泳状态,则确定第一人称视角下的转动轴值为(x 5,y 5),第三人称视角下的转动轴值为(x 6,y 6)。
步骤1207,检测转动值,在对应的转动轴上转动对应的视角范围。
可选地,在调整后的转动轴值的转动轴下,对根据视角旋转操作对视角进行转动。
综上所述,本实施例提供的虚拟环境中虚拟物品的观察方法,通过对视角旋转操作的操作参数与物品观察条件进行匹配,确定该视角旋转操作后的观察视角是否为对虚拟物品进行观察的视角,当该视角旋转操作后的观察视角为对虚拟物品进行观察的视角时,则通过调整旋转中心的位置对观察点与旋转中心之间的旋转轴距进行调整,从而调整观察点的旋转灵活程度,将与观察虚拟物品匹配的旋转灵活度通过对视角旋转操作的监控进行实时切换,区分普通轴距与对虚拟物品进行观察时的轴距,以更灵活视野更广阔的方式对虚拟物品进行观察。
应该理解的是,本申请各实施例中的各个步骤并不是必然按照步骤标号指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,这些步骤可以以其它的顺序执行。而且,各实施例中至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,这些子步骤或者阶段的执行顺序也不必然是依次进行,而是可以与其它步骤或者其它步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
图13是本申请一个示例性实施例提供的虚拟环境中虚拟物品的观察装置的结构框图,该装置可以实现在终端中,如图13所示,该装置包括:显示模块1310、接收模块1320以及调整模块1330;
显示模块1310,用于显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对 象的观察点与旋转中心之间的距离;
接收模块1320,用于接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
调整模块1330,用于当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;
所述显示模块1310,还用于显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
在一个可选的实施例中,如图14所示,所述调整模块1330,包括:
确定单元1331,用于确定所述虚拟对象所处的运动状态;
所述确定单元1331,还用于确定与所述虚拟对象所处的所述运动状态对应的轴距长,所述轴距长为所述第二旋转轴距的轴距长;
调整单元1332,用于根据所述轴距长对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
在一个可选的实施例中,所述确定单元1331,还用于以目标状态对应的所述轴距长为基准轴距长,确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数;
所述确定单元1331,还用于将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长。
在一个可选的实施例中,所述运动状态为站立状态、驾驶状态或者游泳状态中的任意一种,所述站立状态为所述目标状态;
所述确定单元1331,还用于当所述虚拟对象处于所述驾驶状态时,确定所述驾驶状态对应的第一轴距系数;将所述第一轴距系数与所述基准轴距长之积确定为所述驾驶状态对应的所述轴距长;
或,
所述确定单元1331,还用于当所述虚拟对象处于所述游泳状态时,确定所述游泳状态对应的第二轴距系数;将所述第二轴距系数与所述基准轴距长之积确定为所述游泳状态对应的所述轴距长。
在一个可选的实施例中,所述确定单元1331,还用于确定对所述虚拟环境进行观察时的人称视角;确定与所述人称视角以及所述虚拟对象所处的所述运动状态对应的所述轴距系数。
在一个可选的实施例中,所述操作参数包括操作角度;
所述调整模块1330,还用于当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
在一个可选的实施例中,所述操作参数还包括所述虚拟对象在所述虚拟环境中的预设距离范围内的虚拟物品存在情况;
所述调整模块1330,还用于当所述操作角度在所述目标角度范围内,且所述虚拟对象在所述虚拟环境中的所述预设距离范围内存在所述虚拟物品时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
在一个可选的实施例中,所述观察点通过摄像机模型对所述虚拟环境进行观察,所述操作参数还包括所述摄像机模型的旋转终止位置;
所述调整模块1330,还用于当所述操作角度在所述目标角度范围内,且所述旋转终止位置在预设位置范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
综上所述,本实施例提供的虚拟环境中虚拟物品的观察装置,通过对视角旋转操作的操作参数与物品观察条件进行匹配,确定该视角旋转操作后的观察视角是否为对虚拟物品进行观察的视角,当该视角旋转操作后的观察视角为对虚拟物品进行观察的视角时,则通过调整旋转中心的位置对观察点与旋转中心之间的旋转轴距进行调整,从而调整观察点的旋转灵活程度,将与观察虚拟物品匹配的旋转灵活度通过对视角旋转操作的监控进行实时切换,区分普通轴距与对虚拟物品进行观察时的轴距,以更灵活视野更广阔的方式对虚拟物品进行观察。
图15示出了本发明一个示例性实施例提供的终端1500的结构框图。该终端1500可以是:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。终端1500还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,终端1500包括有:处理器1501和存储器1502。
处理器1501可以包括一个或多个处理核心,“多个”是指至少两个。比如4核心处理器、8核心处理器等。处理器1501可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable GateArray,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一 种硬件形式来实现。处理器1501也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1501可以在集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器1501还可以包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器1502可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是非暂态的。存储器1502还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1502中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1501所执行以实现本申请中方法实施例提供的虚拟环境中虚拟物品的观察方法。
在一些实施例中,终端1500还可选包括有:外围设备接口1503和至少一个外围设备。处理器1501、存储器1502和外围设备接口1503之间可以通过总线或信号线相连。各个外围设备可以通过总线、信号线或电路板与外围设备接口1503相连。具体地,外围设备包括:射频电路1504、触摸显示屏1505、摄像头1506、音频电路1507、定位组件1508和电源1509中的至少一种。
外围设备接口1503可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1501和存储器1502。在一些实施例中,处理器1501、存储器1502和外围设备接口1503被集成在同一芯片或电路板上;在一些其他实施例中,处理器1501、存储器1502和外围设备接口1503中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1504用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1504通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1504将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。可选地,射频电路1504包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。射频电路1504可以通过至少一种无线通信协议来与其它终端进行通信。该无线通信协议包括但不限于:万维网、城域网、内联网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1504还可以包括NFC(Near Field  Communication,近距离无线通信)有关的电路,本申请对此不加以限定。
显示屏1505用于显示UI(User Interface,用户界面)。该UI可以包括图形、文本、图标、视频及其它们的任意组合。当显示屏1505是触摸显示屏时,显示屏1505还具有采集在显示屏1505的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器1501进行处理。此时,显示屏1505还可以用于提供虚拟按钮和/或虚拟键盘,也称软按钮和/或软键盘。在一些实施例中,显示屏1505可以为一个,设置终端1500的前面板;在另一些实施例中,显示屏1505可以为至少两个,分别设置在终端1500的不同表面或呈折叠设计;在再一些实施例中,显示屏1505可以是柔性显示屏,设置在终端1500的弯曲表面上或折叠面上。甚至,显示屏1505还可以设置成非矩形的不规则图形,也即异形屏。显示屏1505可以采用LCD(Liquid Crystal Display,液晶显示屏)、OLED(Organic Light-Emitting Diode,有机发光二极管)等材质制备。
摄像头组件1506用于采集图像或视频。可选地,摄像头组件1506包括前置摄像头和后置摄像头。通常,前置摄像头设置在终端的前面板,后置摄像头设置在终端的背面。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其它融合拍摄功能。在一些实施例中,摄像头组件1506还可以包括闪光灯。闪光灯可以是单色温闪光灯,也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,可以用于不同色温下的光线补偿。
音频电路1507可以包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1501进行处理,或者输入至射频电路1504以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在终端1500的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1501或射频电路1504的电信号转换为声波。扬声器可以是传统的薄膜扬声器,也可以是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅可以将电信号转换为人类可听见的声波,也可以将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1507还可以包括耳机插孔。
定位组件1508用于定位终端1500的当前地理位置,以实现导航或LBS(Location Based Service,基于位置的服务)。定位组件1508可以是基于美国 的GPS(Global Positioning System,全球定位系统)、中国的北斗系统或俄罗斯的伽利略系统的定位组件。
电源1509用于为终端1500中的各个组件进行供电。电源1509可以是交流电、直流电、一次性电池或可充电电池。当电源1509包括可充电电池时,该可充电电池可以是有线充电电池或无线充电电池。有线充电电池是通过有线线路充电的电池,无线充电电池是通过无线线圈充电的电池。该可充电电池还可以用于支持快充技术。
在一些实施例中,终端1500还包括有一个或多个传感器1510。该一个或多个传感器1510包括但不限于:加速度传感器1511、陀螺仪传感器1512、压力传感器1513、指纹传感器1514、光学传感器1515以及接近传感器1516。
加速度传感器1511可以检测以终端1500建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1511可以用于检测重力加速度在三个坐标轴上的分量。处理器1501可以根据加速度传感器1511采集的重力加速度信号,控制触摸显示屏1505以横向视图或纵向视图进行用户界面的显示。加速度传感器1511还可以用于游戏或者用户的运动数据的采集。
陀螺仪传感器1512可以检测终端1500的机体方向及转动角度,陀螺仪传感器1512可以与加速度传感器1511协同采集用户对终端1500的3D动作。处理器1501根据陀螺仪传感器1512采集的数据,可以实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器1513可以设置在终端1500的侧边框和/或触摸显示屏1505的下层。当压力传感器1513设置在终端1500的侧边框时,可以检测用户对终端1500的握持信号,由处理器1501根据压力传感器1513采集的握持信号进行左右手识别或快捷操作。当压力传感器1513设置在触摸显示屏1505的下层时,由处理器1501根据用户对触摸显示屏1505的压力操作,实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器1514用于采集用户的指纹,由处理器1501根据指纹传感器1514采集到的指纹识别用户的身份,或者,由指纹传感器1514根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时,由处理器1501授权该用户执行相关的敏感操作,该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器1514可以被设置终端1500的正面、背面或侧 面。当终端1500上设置有物理按键或厂商Logo时,指纹传感器1514可以与物理按键或厂商Logo集成在一起。
光学传感器1515用于采集环境光强度。在一个实施例中,处理器1501可以根据光学传感器1515采集的环境光强度,控制触摸显示屏1505的显示亮度。具体地,当环境光强度较高时,调高触摸显示屏1505的显示亮度;当环境光强度较低时,调低触摸显示屏1505的显示亮度。在另一个实施例中,处理器1501还可以根据光学传感器1515采集的环境光强度,动态调整摄像头组件1506的拍摄参数。
接近传感器1516,也称距离传感器,通常设置在终端1500的前面板。接近传感器1516用于采集用户与终端1500的正面之间的距离。在一个实施例中,当接近传感器1516检测到用户与终端1500的正面之间的距离逐渐变小时,由处理器1501控制触摸显示屏1505从亮屏状态切换为息屏状态;当接近传感器1516检测到用户与终端1500的正面之间的距离逐渐变大时,由处理器1501控制触摸显示屏1505从息屏状态切换为亮屏状态。
本领域技术人员可以理解,图15中示出的结构并不构成对终端1500的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本申请实施例还提供一种计算机设备,该计算机设备包括存储器和处理器,存储器中存储有至少一条指令、至少一段程序、代码集或指令集,至少一条指令、至少一段程序、代码集或指令集由处理器加载并实现上述如图5、图9以及图11任一所述的虚拟环境中虚拟物品的观察方法。
本申请实施例还提供一种计算机可读存储介质,该可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如图5、图9以及图11任一所述的虚拟环境中虚拟物品的观察方法。
本申请还提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述各个方法实施例提供的虚拟环境中虚拟物品的观察方法。
本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,该计算机可读存储介质可以是上述实施例中的存储器中所包含的计算机可读存储介质;也可以是单独存在,未装配入终端中的计算机可读存储介质。该计算机可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如图5、图9以及图11任一所述的虚拟环境中虚拟物品的观察方法。
可选地,该计算机可读存储介质可以包括:只读存储器(ROM)、可编程ROM(PROM)、电可编程ROM(EPROM)、电可擦除可编程ROM(EEPROM)或闪存。易失性存储器可包括随机存取存储器(RAM)或者外部高速缓冲存储器。作为说明而非局限,RAM以多种形式可得,诸如静态RAM(SRAM)、动态RAM(DRAM)、同步DRAM(SDRAM)、双数据率SDRAM(DDRSDRAM)、增强型SDRAM(ESDRAM)、同步链路(Synchlink)DRAM(SLDRAM)、存储器总线(Rambus)直接RAM(RDRAM)、直接存储器总线动态RAM(DRDRAM)、以及存储器总线动态RAM(RDRAM)等。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的较佳实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (25)

  1. 一种虚拟环境中虚拟物品的观察方法,由计算机设备执行,包括:
    显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离;
    接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
    当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;及
    显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
  2. 根据权利要求1所述的方法,其特征在于,所述通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距,包括:
    确定所述虚拟对象所处的运动状态;
    确定与所述虚拟对象所处的所述运动状态对应的轴距长,所述轴距长为所述第二旋转轴距的轴距长;及
    根据所述轴距长对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  3. 根据权利要求2所述的方法,其特征在于,所述确定与所述虚拟对象所处的所述运动状态对应的轴距长,包括:
    以目标状态对应的所述轴距长为基准轴距长,确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数;及
    将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长。
  4. 根据权利要求3所述的方法,其特征在于,所述运动状态为驾驶状态或者游泳状态中的任意一种,所述目标状态为站立状态;
    所述确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数,将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长,包括:
    当所述虚拟对象处于驾驶状态时,确定所述驾驶状态对应的第一轴距系数;将所述第一轴距系数与所述基准轴距长之积确定为所述驾驶状态对应的所述轴距长;或,当所述虚拟对象处于游泳状态时,确定所述游泳状态对应的第二轴距系数;将所述第二轴距系数与所述基准轴距长之积确定为所述游泳状态对应的所述轴距长。
  5. 根据权利要求3所述的方法,其特征在于,所述确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数,包括:
    确定对所述虚拟环境进行观察时的人称视角;
    确定与所述人称视角以及所述虚拟对象所处的所述运动状态对应的所述轴距系数。
  6. 根据权利要求1至5任一所述的方法,其特征在于,所述操作参数包括操作角度;
    所述当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距,包括:
    当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  7. 根据权利要求6所述的方法,其特征在于,所述操作参数还包括所述虚拟对象在所述虚拟环境中的预设距离范围内的虚拟物品存在情况;
    所述当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距,包括:
    当所述操作角度在所述目标角度范围内,且所述虚拟对象在所述虚拟环境中的所述预设距离范围内存在所述虚拟物品时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  8. 根据权利要求6所述的方法,其特征在于,所述观察点通过摄像机模型对所述虚拟环境进行观察,所述操作参数还包括所述摄像机模型的旋转终止位置;
    所述当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距,包括:
    当所述操作角度在所述目标角度范围内,且所述旋转终止位置在预设位置范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  9. 一种虚拟环境中虚拟物品的观察装置,所述装置包括:
    显示模块,用于显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离;
    接收模块,用于接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
    调整模块,用于当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;及
    所述显示模块,还用于显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
  10. 一种计算机设备,包括存储器和处理器,所述存储器中存储有计算机可读指令,所述计算机可读指令被所述处理器执行时,使得所述处理器执行以下步骤:
    显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离;
    接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
    当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;及
    显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
  11. 根据权利要求10所述的计算机设备,其特征在于,所述通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距,包括:
    确定所述虚拟对象所处的运动状态;
    确定与所述虚拟对象所处的所述运动状态对应的轴距长,所述轴距长为所述第二旋转轴距的轴距长;及
    根据所述轴距长对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  12. 根据权利要求11所述的计算机设备,其特征在于,所述确定与所述虚拟对象所处的所述运动状态对应的轴距长,包括:
    以目标状态对应的所述轴距长为基准轴距长,确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数;及
    将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长。
  13. 根据权利要求12所述的计算机设备,其特征在于,所述运动状态为驾驶状态或者游泳状态中的任意一种,所述目标状态为站立状态;
    所述确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数,将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长,包括:
    当所述虚拟对象处于驾驶状态时,确定所述驾驶状态对应的第一轴距系数;将所述第一轴距系数与所述基准轴距长之积确定为所述驾驶状态对应的所述轴距长;或,当所述虚拟对象处于游泳状态时,确定所述游泳状态对应的第二轴距系数;将所述第二轴距系数与所述基准轴距长之积确定为所述游泳状态对应的所述轴距长。
  14. 根据权利要求12所述的计算机设备,其特征在于,所述确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数,包括:
    确定对所述虚拟环境进行观察时的人称视角;
    确定与所述人称视角以及所述虚拟对象所处的所述运动状态对应的所述轴距系数。
  15. 根据权利要求10至14任一所述的计算机设备,其特征在于,所述操作参数包括操作角度;
    所述当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距,包括:
    当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  16. 根据权利要求15所述的计算机设备,其特征在于,所述操作参数还包括所述虚拟对象在所述虚拟环境中的预设距离范围内的虚拟物品存在情况;
    所述当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距,包括:
    当所述操作角度在所述目标角度范围内,且所述虚拟对象在所述虚拟环境中的所述预设距离范围内存在所述虚拟物品时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  17. 根据权利要求15所述的计算机设备,其特征在于,所述观察点通过摄像机模型对所述虚拟环境进行观察,所述操作参数还包括所述摄像机模型的旋转终止位置;
    所述当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距,包括:
    当所述操作角度在所述目标角度范围内,且所述旋转终止位置在预设位置范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  18. 一个或多个存储有计算机可读指令的非易失性存储介质,所述计算机可读指令被一个或多个处理器执行时,使得一个或多个处理器执行以下步骤:
    显示第一环境界面,所述第一环境界面中包括以第一旋转轴距对所述虚拟环境进行旋转观察时的画面,所述第一旋转轴距为虚拟对象的观察点与旋转中心之间的距离;
    接收视角旋转操作,所述视角旋转操作用于将所述观察点围绕所述旋转中心进行旋转;
    当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距;及
    显示第二环境界面,所述第二环境界面中包括以所述第二旋转轴距对所述虚拟环境中的虚拟物品进行旋转观察时的画面。
  19. 根据权利要求18所述的存储介质,其特征在于,所述通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距,包括:
    确定所述虚拟对象所处的运动状态;
    确定与所述虚拟对象所处的所述运动状态对应的轴距长,所述轴距长为所述第二旋转轴距的轴距长;及
    根据所述轴距长对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  20. 根据权利要求19所述的存储介质,其特征在于,所述确定与所述虚拟对象所处的所述运动状态对应的轴距长,包括:
    以目标状态对应的所述轴距长为基准轴距长,确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数;及
    将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长。
  21. 根据权利要求20所述的存储介质,其特征在于,所述运动状态为驾驶状态或者游泳状态中的任意一种,所述目标状态为站立状态;
    所述确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数,将所述轴距系数与所述基准轴距长之积确定为所述虚拟对象所处的所述运动状态对应的所述轴距长,包括:
    当所述虚拟对象处于驾驶状态时,确定所述驾驶状态对应的第一轴距系数;将所述第一轴距系数与所述基准轴距长之积确定为所述驾驶状态对应的所述轴距长;或,当所述虚拟对象处于游泳状态时,确定所述游泳状态对应的第二轴距系数;将所述第二轴距系数与所述基准轴距长之积确定为所述游泳状态对应的所述轴距长。
  22. 根据权利要求20所述的存储介质,其特征在于,所述确定所述虚拟对象所处的所述运动状态与所述目标状态对应的轴距系数,包括:
    确定对所述虚拟环境进行观察时的人称视角;
    确定与所述人称视角以及所述虚拟对象所处的所述运动状态对应的所述轴距系数。
  23. 根据权利要求18至22任一所述的存储介质,其特征在于,所述操作参数包括操作角度;
    所述当所述视角旋转操作的操作参数符合物品观察条件时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为第二旋转轴距,包括:
    当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  24. 根据权利要求23所述的存储介质,其特征在于,所述操作参数还包括所述虚拟对象在所述虚拟环境中的预设距离范围内的虚拟物品存在情况;
    所述当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距,包括:
    当所述操作角度在所述目标角度范围内,且所述虚拟对象在所述虚拟环境中的所述预设距离范围内存在所述虚拟物品时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
  25. 根据权利要求23所述的存储介质,其特征在于,所述观察点通过摄像机模型对所述虚拟环境进行观察,所述操作参数还包括所述摄像机模型的旋转终止位置;
    所述当所述操作角度在目标角度范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距,包括:
    当所述操作角度在所述目标角度范围内,且所述旋转终止位置在预设位置范围内时,通过对所述旋转中心的位置进行调整,将所述第一旋转轴距调整为所述第二旋转轴距。
PCT/CN2020/079651 2019-04-11 2020-03-17 虚拟环境中虚拟物品的观察方法、装置及可读存储介质 WO2020207204A1 (zh)

Priority Applications (8)

Application Number Priority Date Filing Date Title
BR112021018341A BR112021018341A2 (pt) 2019-04-11 2020-03-17 Método e dispositivo para observar artigo virtual em ambiente virtual, e mídia de armazenamento legível
SG11202104919UA SG11202104919UA (en) 2019-04-11 2020-03-17 Method and device for observing virtual article in virtual environment, and readable storage medium
EP20787691.3A EP3954444A4 (en) 2019-04-11 2020-03-17 DEVICE AND METHOD FOR OBSERVING A VIRTUAL ARTICLE IN A VIRTUAL ENVIRONMENT, AND READABLE STORAGE MEDIA
AU2020256776A AU2020256776B2 (en) 2019-04-11 2020-03-17 Method and device for observing virtual article in virtual environment, and readable storage medium
JP2021529272A JP7145331B2 (ja) 2019-04-11 2020-03-17 仮想環境における仮想アイテムの観察方法、装置およびコンピュータプログラム
KR1020217020054A KR102602074B1 (ko) 2019-04-11 2020-03-17 가상 환경에서 가상 물체를 관찰하는 방법 및 디바이스, 및 판독 가능한 저장 매체
CA3133001A CA3133001A1 (en) 2019-04-11 2020-03-17 Method and device for observing virtual article in virtual environment, and readable storage medium
US17/326,160 US11766613B2 (en) 2019-04-11 2021-05-20 Method and apparatus for observing virtual item in virtual environment and readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910290579.2A CN110045827B (zh) 2019-04-11 2019-04-11 虚拟环境中虚拟物品的观察方法、装置及可读存储介质
CN201910290579.2 2019-04-11

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/326,160 Continuation US11766613B2 (en) 2019-04-11 2021-05-20 Method and apparatus for observing virtual item in virtual environment and readable storage medium

Publications (1)

Publication Number Publication Date
WO2020207204A1 true WO2020207204A1 (zh) 2020-10-15

Family

ID=67276864

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/079651 WO2020207204A1 (zh) 2019-04-11 2020-03-17 虚拟环境中虚拟物品的观察方法、装置及可读存储介质

Country Status (10)

Country Link
US (1) US11766613B2 (zh)
EP (1) EP3954444A4 (zh)
JP (1) JP7145331B2 (zh)
KR (1) KR102602074B1 (zh)
CN (1) CN110045827B (zh)
AU (1) AU2020256776B2 (zh)
BR (1) BR112021018341A2 (zh)
CA (1) CA3133001A1 (zh)
SG (1) SG11202104919UA (zh)
WO (1) WO2020207204A1 (zh)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110045827B (zh) 2019-04-11 2021-08-17 腾讯科技(深圳)有限公司 虚拟环境中虚拟物品的观察方法、装置及可读存储介质
CN110517346B (zh) * 2019-08-30 2021-06-18 腾讯科技(深圳)有限公司 虚拟环境界面的展示方法、装置、计算机设备及存储介质
US20220212107A1 (en) * 2020-03-17 2022-07-07 Tencent Technology (Shenzhen) Company Limited Method and Apparatus for Displaying Interactive Item, Terminal, and Storage Medium
CN111784844B (zh) * 2020-06-09 2024-01-05 北京五一视界数字孪生科技股份有限公司 观察虚拟对象的方法、装置、存储介质及电子设备
CN113304471B (zh) * 2020-08-26 2023-01-10 北京完美赤金科技有限公司 虚拟对象的展示方法、装置、设备
CN112774203B (zh) * 2021-01-22 2023-04-28 北京字跳网络技术有限公司 一种虚拟对象的位姿控制方法、装置以及计算机存储介质
CN113926187A (zh) * 2021-10-20 2022-01-14 腾讯科技(深圳)有限公司 虚拟场景中的对象控制方法、装置及终端设备
CN115454255B (zh) * 2022-10-09 2024-02-13 如你所视(北京)科技有限公司 物品展示的切换方法和装置、电子设备、存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0583060A2 (en) * 1992-07-24 1994-02-16 The Walt Disney Company Method and system for creating an illusion of three-dimensionality
US20080043022A1 (en) * 2006-08-18 2008-02-21 Nintendo Co., Ltd. Storage Medium storing game program and game apparatus
CN108717733A (zh) * 2018-06-07 2018-10-30 腾讯科技(深圳)有限公司 虚拟环境的视角切换方法、设备及存储介质
CN110045827A (zh) * 2019-04-11 2019-07-23 腾讯科技(深圳)有限公司 虚拟环境中虚拟物品的观察方法、装置及可读存储介质

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000308755A (ja) * 1999-04-27 2000-11-07 Konami Co Ltd 画像表示方法、記録媒体及びビデオゲーム装置
JP2001149643A (ja) 1999-09-16 2001-06-05 Sony Computer Entertainment Inc 3次元ゲームにおけるオブジェクト表示方法、情報記録媒体およびエンタテインメント装置
JP4144017B2 (ja) 1999-12-23 2008-09-03 株式会社セガ ビデオゲーム装置、画像表示装置及び方法、動画表示装置及び方法並びに記録媒体
JP3726239B1 (ja) * 2004-07-13 2005-12-14 有 上條 画像処理プログラム及び装置
US8882590B2 (en) 2006-04-28 2014-11-11 Nintendo Co., Ltd. Touch-controlled game character motion providing dynamically-positioned virtual control pad
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
JP5161256B2 (ja) 2010-03-31 2013-03-13 株式会社バンダイナムコゲームス プログラム、情報記憶媒体、及び、画像生成装置
JP5148652B2 (ja) * 2010-03-31 2013-02-20 株式会社バンダイナムコゲームス プログラム、情報記憶媒体及び画像生成システム
JP5622447B2 (ja) * 2010-06-11 2014-11-12 任天堂株式会社 情報処理プログラム、情報処理装置、情報処理システム及び情報処理方法
JP2014235596A (ja) 2013-06-03 2014-12-15 任天堂株式会社 情報処理プログラム、情報処理システム、情報処理装置、および情報処理方法
JP2015026172A (ja) 2013-07-25 2015-02-05 株式会社スクウェア・エニックス 画像処理プログラムおよび画像処理装置
US9824500B2 (en) 2016-03-16 2017-11-21 Microsoft Technology Licensing, Llc Virtual object pathing
CN106302679B (zh) 2016-08-08 2018-10-02 腾讯科技(深圳)有限公司 一种虚拟对象移动同步方法、客户端及服务器
JP6939801B2 (ja) 2016-09-14 2021-09-22 ソニーグループ株式会社 情報処理装置、情報処理方法およびプログラム
CN106569614A (zh) * 2016-11-11 2017-04-19 上海远鉴信息科技有限公司 虚拟现实中场景切换控制方法及系统
JP2019008346A (ja) 2017-06-20 2019-01-17 凸版印刷株式会社 画像処理装置、画像処理システム、画像処理方法及び画像処理プログラム
RU2670351C1 (ru) 2017-07-31 2018-10-22 Алексей Владимирович Лысенко Система и способ управления виртуальным объектом
CN108499105B (zh) * 2018-04-16 2022-02-25 腾讯科技(深圳)有限公司 在虚拟环境中进行视角调整的方法、装置及存储介质
CN108635857B (zh) 2018-05-18 2022-04-22 腾讯科技(深圳)有限公司 界面显示方法、装置、电子装置及计算机可读存储介质
CN108815851B (zh) * 2018-06-05 2020-06-30 腾讯科技(深圳)有限公司 在虚拟环境中射击时的界面显示方法、设备及存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0583060A2 (en) * 1992-07-24 1994-02-16 The Walt Disney Company Method and system for creating an illusion of three-dimensionality
US20080043022A1 (en) * 2006-08-18 2008-02-21 Nintendo Co., Ltd. Storage Medium storing game program and game apparatus
CN108717733A (zh) * 2018-06-07 2018-10-30 腾讯科技(深圳)有限公司 虚拟环境的视角切换方法、设备及存储介质
CN110045827A (zh) * 2019-04-11 2019-07-23 腾讯科技(深圳)有限公司 虚拟环境中虚拟物品的观察方法、装置及可读存储介质

Also Published As

Publication number Publication date
JP7145331B2 (ja) 2022-09-30
US20210268380A1 (en) 2021-09-02
CN110045827A (zh) 2019-07-23
EP3954444A1 (en) 2022-02-16
SG11202104919UA (en) 2021-06-29
JP2022509634A (ja) 2022-01-21
CN110045827B (zh) 2021-08-17
AU2020256776A1 (en) 2021-10-07
AU2020256776B2 (en) 2023-02-09
BR112021018341A2 (pt) 2021-11-23
EP3954444A4 (en) 2022-06-08
US11766613B2 (en) 2023-09-26
KR20210095931A (ko) 2021-08-03
CA3133001A1 (en) 2020-10-15
KR102602074B1 (ko) 2023-11-13

Similar Documents

Publication Publication Date Title
WO2020207204A1 (zh) 虚拟环境中虚拟物品的观察方法、装置及可读存储介质
US11151773B2 (en) Method and apparatus for adjusting viewing angle in virtual environment, and readable storage medium
US11703993B2 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
US11224810B2 (en) Method and terminal for displaying distance information in virtual scene
WO2019153836A1 (zh) 虚拟环境中虚拟对象的姿态确定方法、装置及介质
WO2019205881A1 (zh) 虚拟环境中的信息显示方法、装置、设备及存储介质
CN110917616B (zh) 虚拟场景中的方位提示方法、装置、设备及存储介质
CN111921197B (zh) 对局回放画面的显示方法、装置、终端及存储介质
WO2020151594A1 (zh) 视角转动的方法、装置、设备及存储介质
WO2021164315A1 (zh) 热点图的显示方法、装置、计算机设备及可读存储介质
CN109634413B (zh) 对虚拟环境进行观察的方法、设备及存储介质
CN110496392B (zh) 虚拟对象的控制方法、装置、终端及存储介质
CN110448908B (zh) 虚拟环境中瞄准镜的应用方法、装置、设备及存储介质
CN111026318A (zh) 基于虚拟环境的动画播放方法、装置、设备及存储介质
CN111273780A (zh) 基于虚拟环境的动画播放方法、装置、设备及存储介质
US20220291791A1 (en) Method and apparatus for determining selected target, device, and storage medium
CN111589141A (zh) 虚拟环境画面的显示方法、装置、设备及介质
CN109806583B (zh) 用户界面显示方法、装置、设备及系统
CN113289336A (zh) 在虚拟环境中标记物品的方法、装置、设备及介质
JP2024518182A (ja) 動作効果の表示方法と装置、コンピュータ機器及びコンピュータプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20787691

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021529272

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20217020054

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 3133001

Country of ref document: CA

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112021018341

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 2020256776

Country of ref document: AU

Date of ref document: 20200317

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020787691

Country of ref document: EP

Effective date: 20211111

ENP Entry into the national phase

Ref document number: 112021018341

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20210915

WWE Wipo information: entry into national phase

Ref document number: 521430450

Country of ref document: SA