CN112330823B - Virtual prop display method, device, equipment and readable storage medium


Info

Publication number
CN112330823B
Authority
CN
China
Prior art keywords
virtual
virtual prop
environment
prop
camera model
Prior art date
Legal status
Active
Application number
CN202011222394.7A
Other languages
Chinese (zh)
Other versions
CN112330823A (en)
Inventor
杨金昊
林凌云
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202011222394.7A
Publication of CN112330823A
Application granted
Publication of CN112330823B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/66: Methods for processing data by generating or executing the game program for rendering three dimensional images
    • G06T2219/00: Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20: Indexing scheme for editing of 3D models
    • G06T2219/2004: Aligning objects, relative positioning of parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a virtual prop display method, device, equipment and readable storage medium in the field of virtual environments. The method comprises the following steps: displaying a first environment picture; in response to receiving an accessory state adjustment operation, adjusting a virtual prop from a first accessory state to a second accessory state; and displaying a second environment picture. The virtual prop in the first accessory state is displayed in the first environment picture at a first observation angle, and when the virtual prop changes from the first accessory state to the second accessory state, the virtual prop in the second accessory state is displayed in the second environment picture at a second observation angle. Because the observation angle at which the virtual prop is displayed changes with the accessory state, the virtual prop does not occupy a large amount of space in the view, which improves the efficiency of observing the virtual environment from the perspective of the virtual object.

Description

Virtual prop display method, device, equipment and readable storage medium
Technical Field
The present invention relates to the field of virtual environments, and in particular, to a method, an apparatus, a device, and a readable storage medium for displaying virtual props.
Background
On terminals such as smartphones and tablet computers, there are many applications based on virtual environments, such as virtual reality applications, three-dimensional map programs, military simulation programs, third-person shooter (TPS) games, first-person shooter (FPS) games, and multiplayer online battle arena (MOBA) games. In such an application, the user can observe the virtual environment from the perspective of a first virtual object and control the first virtual object to use a virtual firearm to shoot at virtual items or other virtual objects located in the virtual environment.
In the related art, a virtual firearm includes a number of slots, each slot being used to assemble a different firearm accessory; after different accessories are assembled, the virtual firearm has different functions and a different appearance.
However, when firearm accessories are assembled on a virtual firearm, the firearm and its accessories occupy a large amount of space in the viewing perspective, which reduces the efficiency of observing the virtual environment from the perspective of the virtual object.
Disclosure of Invention
The application provides a virtual prop display method, device, equipment and readable storage medium, which can reduce the space occupied by a virtual firearm and its accessories in the observation perspective and improve the efficiency of observing the virtual environment from the perspective of the virtual object:
in one aspect, a method for displaying a virtual prop is provided, the method comprising:
displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates the accessories currently assembled on the virtual prop;
in response to receiving the accessory state adjustment operation, adjusting the virtual prop from the first accessory state to the second accessory state;
and displaying a second environment picture, wherein the second environment picture comprises virtual props observed at a second observation angle.
In another aspect, a display device for a virtual prop is provided, the device comprising:
the display module is used for displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates the accessories currently assembled on the virtual prop;
The adjusting module is used for adjusting the virtual prop from the first accessory state to the second accessory state in response to receiving the accessory state adjusting operation;
the display module is also used for displaying a second environment picture, and the second environment picture comprises virtual props observed at a second observation angle.
In another aspect, a computer device is provided, where the computer device includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, where the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement a method for displaying a virtual prop as provided in an embodiment of the present application.
In another aspect, a computer readable storage medium is provided, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored, where the at least one instruction, the at least one program, the set of codes, or the set of instructions are loaded and executed by a processor to implement a method for displaying a virtual prop as described above.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method for displaying a virtual prop according to any of the above embodiments.
The beneficial effects brought by the technical solutions provided in the present application include at least the following:
The virtual prop in the first accessory state is displayed in the first environment picture at a first observation angle, and when the virtual prop changes from the first accessory state to the second accessory state, the virtual prop in the second accessory state is displayed in the second environment picture at a second observation angle. Because the observation angle at which the virtual prop is displayed changes with the accessory state, the virtual prop does not occupy a large amount of space in the view, which improves the efficiency of observing the virtual environment from the perspective of the virtual object.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of the present application; for a person skilled in the art, other drawings may be obtained from these drawings without inventive effort.
FIG. 1 illustrates a schematic diagram of an environment screen as illustrated in an exemplary embodiment of the present application;
FIG. 2 illustrates a schematic diagram of another environment screen provided by an exemplary embodiment of the present application;
FIG. 3 illustrates a block diagram of an electronic device provided in an exemplary embodiment of the present application;
FIG. 4 illustrates a block diagram of a computer system provided in accordance with one exemplary embodiment of the present application;
FIG. 5 illustrates a flowchart of a method for displaying a virtual prop provided by an exemplary embodiment of the present application;
FIG. 6 illustrates a schematic diagram of another environment screen shown in an exemplary embodiment of the present application;
FIG. 7 illustrates a schematic diagram of a virtual prop in a first accessory state versus a second accessory state provided in accordance with an exemplary embodiment of the present application;
FIG. 8 illustrates a schematic diagram of another environment screen shown in an exemplary embodiment of the present application;
FIG. 9 illustrates a flow chart of determining a second viewing angle according to an exemplary embodiment of the present application;
FIG. 10 illustrates a schematic diagram of a method for placing virtual props in a Cartesian coordinate system according to an exemplary embodiment of the present application;
FIG. 11 illustrates a schematic diagram of a method of adjusting a second length provided in an exemplary embodiment of the present application;
FIG. 12 illustrates a schematic diagram of another method of adjusting a second length provided in an exemplary embodiment of the present application;
FIG. 13 illustrates a schematic diagram of a near cross-section and far cross-section positional relationship provided by an exemplary embodiment of the present application;
FIG. 14 illustrates a schematic diagram of viewing a virtual prop in a coordinate system provided by an exemplary embodiment of the present application;
FIG. 15 illustrates another schematic view of viewing virtual props in a coordinate system provided by an exemplary embodiment of the present application;
FIG. 16 illustrates another schematic view of viewing virtual props in a coordinate system provided by an exemplary embodiment of the present application;
FIG. 17 is a flowchart of a method for displaying a virtual prop according to an exemplary embodiment of the present application;
FIG. 18 illustrates a schematic view of a virtual firearm being viewed through different cameras provided in an exemplary embodiment of the present application;
FIG. 19 illustrates a schematic diagram of another environment screen shown in an exemplary embodiment of the present application;
FIG. 20 is a process diagram of a method for displaying virtual props according to an exemplary embodiment of the present application;
FIG. 21 illustrates a block diagram of a display device for a virtual prop provided in an exemplary embodiment of the present application;
FIG. 22 illustrates a block diagram of a display device of another virtual prop provided in an exemplary embodiment of the present application;
fig. 23 shows a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, the terms involved in the embodiments of the present application will be briefly described:
Virtual environment: the virtual environment displayed (or provided) by an application when it runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with a three-dimensional virtual environment. In the present application, the virtual environment is displayed in the environment picture.
Virtual object: a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, a cartoon character, and the like, such as the characters, animals, plants, oil drums, walls, and stones displayed in a three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional model created with skeletal animation techniques. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of its space.
In the present application, the virtual object holds a virtual prop. Optionally, virtual props include at least one of virtual weapon props, virtual accessory props, virtual backpack props, and virtual medicine props. Virtual weapon props include virtual firearms, virtual walking sticks, virtual bows, virtual swords, and the like; virtual accessory props include firearm accessories on virtual firearms, stick accessories on virtual walking sticks, and the like; virtual backpack props include a first backpack, a second backpack, a third backpack, and the like; virtual medicine props include bottled medicine, pills, and the like. The features of the virtual prop assembled by the virtual object are displayed within the virtual environment.
In this application, the virtual prop may be fitted with accessories. Optionally, the features added by an assembled accessory are displayed in the environment picture to remind the user that the virtual prop has been fitted with the accessory. In one example, the virtual prop is implemented as a virtual firearm, and the accessories corresponding to the virtual firearm include at least one of a sight, a muzzle attachment, a barrel, a laser sight, an under-barrel attachment, a magazine, a stock, and a side grip. Once assembled, the virtual prop corresponds to an accessory state. Optionally, the accessory state of the virtual prop indicates which virtual accessories the virtual prop is equipped with, and whether any virtual accessory is equipped at all.
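The description above implies a simple slot-based data model. The following is a minimal sketch of how such an accessory state could be represented; the language, type names, and slot list are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class SlotType(Enum):
    # Illustrative slot types mirroring the accessory list above.
    SIGHT = auto()
    MUZZLE = auto()
    BARREL = auto()
    LASER = auto()
    UNDER_BARREL = auto()
    MAGAZINE = auto()
    STOCK = auto()
    SIDE_GRIP = auto()

@dataclass
class AccessoryState:
    # Maps each slot to the id of the mounted accessory, or None.
    slots: dict = field(default_factory=dict)

    def is_equipped(self, slot: SlotType) -> bool:
        return self.slots.get(slot) is not None
```

Under this model, an accessory state answers both questions named above: which slots are filled, and with what.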
The terminals in this application may be desktop computers, laptop computers, mobile phones, tablet computers, e-book readers, MP3 players, MP4 players, and the like. An application supporting a virtual environment, such as an application supporting a three-dimensional virtual environment, is installed and runs in the terminal. The application may be any one of a virtual reality application, a three-dimensional map application, a military simulation application, a TPS game, an FPS game, or a MOBA game. Alternatively, the application may be a stand-alone application, such as a stand-alone 3D game, or a network online application.
The methods provided herein may be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, third-person shooter (TPS) games, multiplayer online battle arena (MOBA) games, and the like. The following embodiments are exemplified by application in games.
Games based on virtual environments often consist of one or more maps of a game world. The virtual environment in the game simulates scenes of the real world, and the user can control a virtual object in the game to walk, run, jump, shoot, fight, drive, switch virtual props, attack other virtual objects with virtual props, and perform other actions in the virtual environment, giving high interactivity. In such games, a player can change the appearance of a virtual prop by assembling accessories on it, and the changed appearance is displayed in the environment screen. The state in which the virtual object controlled by the player holds the virtual prop is also displayed in the environment screen.
Fig. 1 shows a schematic diagram of an environment screen according to an exemplary embodiment of the present application. Referring to fig. 1, an environment screen 100 includes a virtual prop 101 and part of a virtual object 102. In fig. 1, virtual prop 101 is implemented as a virtual firearm and virtual object 102 is implemented as the virtual character holding the virtual firearm. While the environment screen 100 displays virtual prop 101, virtual object 102 also performs corresponding actions in response to control operations received by the terminal. The environment screen 100 further includes an accessory switch control 103, which is used to add an accessory to virtual prop 101 and thereby change the accessory state of virtual prop 101.
Fig. 2 shows a schematic diagram of another environment screen provided in an exemplary embodiment of the present application. Referring to fig. 2, the environment screen 100 includes a virtual prop 101 equipped with an accessory 104 and part of a virtual object 102. The accessory 104 is mounted on virtual prop 101 in response to an operation received by the accessory switch control 103. As a comparison of fig. 1 and fig. 2 shows, virtual prop 101 equipped with accessory 104 occupies a larger screen area in environment screen 100 than virtual prop 101 without accessory 104. If the accessory state of virtual prop 101 in fig. 1 is the first accessory state and that in fig. 2 is the second accessory state, the screen area occupied by virtual prop 101 in the second accessory state is larger than the screen area occupied by virtual prop 101 in the first accessory state.
Fig. 3 shows a block diagram of an electronic device according to an exemplary embodiment of the present application. The electronic device 300 includes: an operating system 320 and application programs 322.
Operating system 320 is the underlying software that provides applications 322 with secure access to computer hardware.
The application 322 is an application supporting a virtual environment. Optionally, application 322 is an application that supports a three-dimensional virtual environment. The application 322 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, a multiplayer gunfight survival game, or a massively multiplayer online role-playing game (MMORPG). The application 322 may be a stand-alone application, such as a stand-alone 3D game program.
FIG. 4 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 400 includes: a first device 420, a server 440, and a second device 460.
The first device 420 installs and runs an application supporting a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map application, a military simulation application, a TPS game, an FPS game, a MOBA game, a multiplayer gunfight survival game, or an MMORPG game. The first device 420 is a device used by a first user to control a first virtual object located in the virtual environment to perform at least one of the following activities: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated human character or a cartoon character.
The first device 420 is connected to the server 440 via a wireless network or a wired network.
The server 440 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 440 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 440 takes on primary computing work, and the first device 420 and the second device 460 take on secondary computing work; alternatively, the server 440 performs the secondary computing job and the first device 420 and the second device 460 perform the primary computing job; alternatively, the server 440, the first device 420 and the second device 460 may perform collaborative computing using a distributed computing architecture.
The second device 460 installs and runs an application supporting a virtual environment. The application may be any one of a virtual reality application, a three-dimensional map application, a military simulation application, an FPS game, a MOBA game, a multiplayer gunfight survival game, or an MMORPG game. The second device 460 is a device used by a second user to control a second virtual object located in the virtual environment to perform at least one of the following activities: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated human character or a cartoon character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights; alternatively, they may belong to different teams, different organizations, or two mutually hostile parties.
Alternatively, the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of application for different control system platforms. The first device 420 may refer broadly to one of a plurality of devices and the second device 460 may refer broadly to one of a plurality of devices, the present embodiment being illustrated with only the first device 420 and the second device 460. The device types of the first device 420 and the second device 460 are the same or different, and the device types include: at least one of a game console, a desktop computer, a smart phone, a tablet computer, an electronic book reader, an MP4 player, and a laptop portable computer. The following embodiments are illustrated with the device being a desktop computer.
Those skilled in the art will appreciate that the number of devices described above may be greater or smaller. For example, there may be only one such device, or tens or hundreds of such devices, or more. The number and types of devices are not limited in the embodiments of the present application.
The method for displaying virtual props provided in the embodiments of the present application is described below with reference to the introduction of terms and the implementation environment above. Fig. 5 shows a flowchart of a method for displaying a virtual prop according to an exemplary embodiment of the present application. The method is described as applied to an electronic device, and comprises the following steps:
step 501, displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates an accessory to which the virtual prop is currently assembled.
In the embodiment of the application, the first environment picture includes the virtual object and the virtual prop it has assembled. In one example, the first environment picture is the picture displayed by the application program: it includes the virtual object controlled by the user, the user observes the virtual environment through the virtual object, and the first environment picture is the picture in which the virtual object observes the virtual environment.
In the embodiment of the application, the virtual prop is a virtual firearm held by the virtual object. The virtual prop corresponds to a first accessory state, which indicates the accessories currently assembled on it. In one example, the virtual prop is implemented as a virtual firearm whose corresponding accessories include at least one of a sight, a muzzle attachment, a barrel, a laser sight, an under-barrel attachment, a magazine, a stock, and a side grip. When the virtual prop is implemented as a virtual firearm, it has different slots, each matching a different type of accessory. Optionally, the information in the first accessory state includes whether the current virtual prop carries accessories and which accessories it is equipped with.
In this embodiment of the present application, the first observation angle is the angle at which the virtual environment is observed to produce the environment picture.
Step 502, in response to receiving an accessory status adjustment operation, adjusts the virtual prop from a first accessory status to a second accessory status.
In one example of the application, an accessory switching control corresponding to the accessory is additionally displayed superimposed on the environment screen. The accessory switching control is used to receive the trigger operation of the accessory state adjustment and to install the corresponding accessory on the virtual prop according to that operation. In one example, the user can operate the accessory switching control and select the second accessory state.
Referring to fig. 6, a first environment screen 600 includes a virtual object 601 and a virtual prop 602 held by the virtual object 601. Further, an accessory switch control 603 is included in the first environment interface 600. The accessory switch control 603 is used to trigger an accessory status adjustment operation.
Step 503, displaying a second environment picture, wherein the second environment picture includes virtual props observed at a second observation angle.
After the virtual prop is adjusted from the first accessory state to the second accessory state, the terminal displays a second environment picture that includes the virtual prop, according to the difference between the props indicated by the first accessory state and the second accessory state; the virtual prop in the second environment picture is displayed at a second observation angle.
In one example, the first accessory state indicates that the virtual firearm carries no accessory, and the second accessory state indicates that a sight is mounted on the virtual firearm. In this case, referring to fig. 7, in the first accessory state the virtual prop 701 has no accessory, and in the second accessory state the virtual prop 701 is fitted with the scope 702.
Referring to fig. 8, a first environment screen 800 includes a virtual object 801, a virtual prop 802 held by the virtual object 801, an accessory switching control 803, and a scope 804 mounted on virtual prop 802. The combination of virtual prop 802 and scope 804 corresponds to a second observation angle. Optionally, the second observation angle is different from the first observation angle. In one example, the screen area occupied by the virtual prop in the second accessory state in the second environment picture is equal to the screen area occupied by the virtual prop in the first accessory state in the first environment picture, even though the prop in the second accessory state is actually larger than the prop in the first accessory state.
In one example, the area of the second environment picture occupied by the virtual prop is reduced by displaying only part of the virtual prop corresponding to the second accessory state; in another example, it is reduced by rotating the virtual prop corresponding to the second accessory state. In both examples, the virtual prop corresponding to the second accessory state is displayed at the second observation angle.
In summary, in the method provided in this embodiment, the virtual prop in the first accessory state is displayed in the first environment picture at a first observation angle, and when the virtual prop changes from the first accessory state to the second accessory state, the virtual prop in the second accessory state is displayed in the second environment picture at a second observation angle. Because the observation angle at which the virtual prop is displayed changes with the accessory state, the virtual prop does not occupy a large amount of space in the view, which improves the efficiency of observing the virtual environment from the perspective of the virtual object.
In the application, the first observation angle and the second observation angle are obtained through a camera model in the virtual environment; after the second observation angle is determined, the camera model can be adjusted accordingly. Fig. 9 is a schematic flow chart for determining the second observation angle according to an exemplary embodiment of the present application. The method is described as applied to an electronic device, and comprises:
Step 901, a first length of a virtual prop in a second accessory state in a reference direction is obtained.
Optionally, this procedure is triggered when the virtual prop in the second accessory state, observed at the first observation angle, exceeds the display range of a preset area.
The preset area is an area preset in the picture by the terminal; in the environment picture, the virtual prop should always be located within the preset area. Optionally, the preset area is located at the lower right of the screen. In one example, the preset area is rectangular, its lower edge coincides with the lower edge of the picture, its right edge coincides with the right edge of the picture, and both its width and its height are one half of the picture width. In another example, the lower edge of the preset area coincides with the lower edge of the picture and its left edge coincides with the vertical center line of the picture; the length of the preset area is 3/8 of the picture length and its width is 1/2 of the picture width.
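As a concrete illustration, the two example layouts above can be expressed as rectangles in screen coordinates. This is a hedged sketch assuming an origin at the bottom-left corner of the picture; the function name and coordinate convention are assumptions:

```python
def preset_area(screen_w: float, screen_h: float, variant: int = 1):
    # Returns (x, y, width, height) of the preset area, with the
    # origin assumed at the bottom-left corner of the picture.
    if variant == 1:
        # Lower-right rectangle; per the text, both its width and its
        # height are one half of the picture width.
        w = h = screen_w / 2
        return (screen_w - w, 0.0, w, h)
    # Second example: lower edge on the bottom edge, left edge on the
    # vertical center line; length is 3/8 of the picture length and
    # width is 1/2 of the picture width (both read literally here).
    return (screen_w / 2, 0.0, screen_w * 3 / 8, screen_w / 2)
```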
Referring to fig. 10, virtual prop 1001 is a virtual firearm placed in a Cartesian space coordinate system, whose X axis 1011, Y axis 1012, and Z axis 1013 are mutually perpendicular. The length of virtual prop 1001 in the direction of each axis is the longest extent of the prop along that axis: its length along X axis 1011 is the longest extent of its projection in the X-axis direction, its length along Y axis 1012 is the longest extent of its projection in the Y-axis direction, and its length along Z axis 1013 is the longest extent of its projection in the Z-axis direction.
Optionally, after the Cartesian space coordinate system is established, one of the directions of X axis 1011, Y axis 1012, and Z axis 1013 is selected as the reference direction of the virtual prop. In this embodiment of the present application, the Z axis is selected as the reference direction, so the height of the virtual firearm is the first length of the virtual prop.
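The first length can then be computed directly from the prop's geometry. A minimal sketch under the assumption that the prop is given as a list of (x, y, z) vertices in the coordinate system of fig. 10; the function names are illustrative:

```python
def axis_extents(vertices):
    # Longest extent of the prop along each coordinate axis, i.e. the
    # sizes of its axis-aligned bounding box.
    xs, ys, zs = zip(*vertices)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

def first_length(vertices, reference_axis: int = 2) -> float:
    # First length: the extent along the reference direction; here the
    # Z axis (index 2), i.e. the height of the virtual firearm.
    return axis_extents(vertices)[reference_axis]
```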
Step 902, obtaining a second length of the preset area in the reference direction.
In this embodiment, since the virtual prop is implemented as a virtual firearm and the selected reference direction is the height direction of the firearm, the corresponding reference direction of the preset area is its height direction. When the preset area is a rectangular area, its reference direction corresponds to its left edge or its right edge. In one example, the second length of the rectangular area in the reference direction is the length of its left edge.
In step 903, in response to the first length being greater than the second length, an observation mode of the camera model for the virtual prop is determined according to the second accessory status, where the observation mode includes an observation target of the camera model.
Optionally, the observation mode indicates the observation target of the camera model. In one example, the observation mode is a virtual object observation mode, which instructs the camera model to observe the virtual object; in another example, the observation mode is a virtual prop observation mode, which instructs the camera model to observe the virtual prop.
In this embodiment of the present application, in order to ensure that the virtual prop corresponding to the second accessory state stays within the preset area, the displayed first length needs to be adjusted. In the application, the first length is matched to the second length by selecting the second observation angle for the virtual prop corresponding to the second accessory state; the basic principle includes the following two approaches:
(1) Referring to fig. 11, the first length 1102 corresponding to virtual prop 1101 is greater than the second length 1112 corresponding to preset area 1111; by clipping the prop so that only the portion inside preset area 1111 is displayed, virtual prop 1101 is kept within preset area 1111.
(2) Referring to fig. 12, the first length 1202 corresponding to virtual prop 1201 is greater than the second length 1212 corresponding to preset area 1211; by tilting the prop, the displayed first length 1202 becomes smaller than or equal to the second length 1212, so that virtual prop 1201 is located within preset area 1211 (a geometric sketch of this check follows below).
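For approach (2), the tilt needed to make the displayed first length fit can be estimated with a simple projection model. This is a sketch under the assumption that tilting the prop by an angle about a horizontal axis scales its displayed length by the cosine of that angle; the actual projection in the engine may differ:

```python
import math

def tilt_angle_to_fit(first_length: float, second_length: float) -> float:
    # Angle (in radians) by which the prop must be tilted so that its
    # projected length no longer exceeds the preset area; 0 if it
    # already fits. Assumes projected length = first_length * cos(angle).
    if first_length <= second_length:
        return 0.0
    return math.acos(second_length / first_length)
```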
Step 904, determining a first distance from the first environmental picture to the camera model in response to the observation mode being the observation of the virtual prop.
In the embodiment of the application, the virtual environment corresponds to a near section and a far section, and virtual objects located between the near section and the far section are presented in the environment picture. In one example, the picture is coplanar with the near section. Referring to fig. 13, a Cartesian three-dimensional coordinate system is established with the position 1301 of the first camera model as the origin, and the near section 1302 and the far section 1303 are planes parallel to the YOZ plane. The first camera model 1301 projects toward the far section 1303, and the content within this projection is the content that can be displayed in the first environment picture. The picture itself is presented on the near section 1302.
In the embodiment of the application, the first environment picture is coplanar with the near section, so the first distance from the first environment picture to the camera model is the distance from the first camera model to the near section.
Step 905 determines a second distance of the virtual prop from the camera model.
Referring to fig. 14, virtual prop 1401 in the first accessory state is mostly located between near section 1411 and far section 1412. In fig. 14, the reference direction is the direction along the Y axis, and the figure shows the projection of virtual prop 1401 in the first accessory state on the XOY plane. The origin O of the coordinate system is the first camera model 1400. When camera model 1400 observes virtual prop 1401 in the first accessory state, it is in the virtual prop observation mode. The farthest visible point of virtual prop 1401 in the first accessory state is point P, its nearest visible position is bounded by the near section, and point Q' is the point where the line of sight OP intersects the near section. From this, P'Q' can be determined as the distance from virtual prop 1401 in the first accessory state to the camera model.
Step 906, determining a second observation angle of the camera model to the virtual prop according to the ratio of the first distance and the second distance.
Referring to fig. 15, when virtual prop 1501 corresponding to the second accessory state is observed from the first camera model 1500 in the same manner, since virtual prop 1501 now includes the portion corresponding to accessory 1502, its total length in the reference direction is the sum of P'Q' and P'R'. The observation angle therefore needs to be adjusted so that the displayed length is again close to P'Q'.
Referring to fig. 16, let the first distance, i.e. the distance from the first camera model 1600 to the near section 1611, be w1; let the second distance, i.e. the distance from the first camera model 1600 to the farthest position of virtual prop 1601 projected on the X axis, be w2; and let the height of the accessory 1602 on the virtual prop be h1. Then, in order to keep the viewing experience of the second environment picture unchanged, the upward shift of the observation angle of the first camera model 1600 relative to the first observation angle is given by Equation 1:
Equation 1: h2 = h1 × w1 / w2
The second observation angle can then be determined from this upward shift h2.
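Equation 1 translates directly into code. A minimal sketch; the parameter names follow the symbols above:

```python
def camera_offset(h1: float, w1: float, w2: float) -> float:
    # Equation 1: h2 = h1 * w1 / w2.
    # h1: height of the accessory in the reference direction;
    # w1: first distance (camera model to near section);
    # w2: second distance (camera model to the farthest position of
    #     the prop, projected on the X axis).
    # By similar triangles, shifting the camera upward by h2 offsets
    # the extra height h1 that the accessory adds at depth w2.
    return h1 * w1 / w2
```

For example, with w1 = 0.1, w2 = 1.0 and an accessory height h1 = 0.2, the camera is shifted up by 0.02 world units.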
In summary, in the method provided in this embodiment, a coordinate system is established for the virtual prop, and the data needed to adjust the angle for the prop in the first accessory state and in the second accessory state are determined from that coordinate system, so that the adjustment from the first observation angle to the second observation angle can be made from the data. By setting the position of the camera model, the difference between the first observation angle and the second observation angle is determined, so that the virtual prop displayed in the second accessory state, like the prop displayed in the first accessory state, remains within the preset area, which further improves the efficiency of observing the virtual environment from the perspective of the virtual object.
Obtaining the second observation angle from the ratio between the first distance and the second distance further improves the accuracy with which the second observation angle is determined, and thus the efficiency of observing the virtual environment from the perspective of the virtual object.
Fig. 17 is a flowchart of a method for displaying a virtual prop according to an exemplary embodiment of the present application. The method is described as applied to an electronic device, and comprises the following steps:
Step 1701, displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates the accessories currently assembled on the virtual prop.
In this embodiment of the present application, the first environment picture is a picture in an application program, where the application program is implemented as a game program containing virtual matches, and the first environment picture is the interface in which a virtual match is played. The virtual object is implemented as a virtual character and the virtual prop is implemented as a virtual firearm. In a virtual match of the application, the user controls the virtual character to fight using the virtual firearm. The first accessory state indicates that the virtual firearm currently carries no accessory.
Step 1702, in response to receiving the accessory state adjustment operation, the virtual prop is adjusted from the first accessory state to the second accessory state.
In the embodiment of the application, the accessory is assembled on the virtual firearm according to the trigger operation. After the accessory is assembled, the virtual firearm has new visual features, so the virtual prop in the second accessory state has new features relative to the virtual prop in the first accessory state. When the virtual prop corresponding to the second accessory state is displayed on the environment screen, these new visual features need to be shown.
Because the area occupied by the virtual prop corresponding to the second accessory state differs in size from the area occupied by the prop corresponding to the first accessory state, the second observation angle for observing the virtual prop corresponding to the second accessory state needs to be adjusted.
Step 1703, a first length in a reference direction of the virtual prop in a second accessory state is acquired.
In the embodiment of the application, the first length of the virtual prop in the second accessory state in the reference direction is obtained by placing the virtual prop in a Cartesian coordinate system.
Step 1704, acquiring a second length of the preset area in the reference direction.
In this embodiment of the present application, the preset area is a rectangular preset area, and when the reference direction is parallel to any side of the rectangular preset area, the second length of the preset area in the reference direction is the side length of the rectangular preset area.
In step 1705, responsive to the first length being greater than the second length, an observation mode of the virtual prop by the camera model is determined from the second accessory status, the observation mode including an observation target of the camera model.
In this embodiment of the present application, the observation modes include a virtual object observation mode and a virtual prop observation mode. The specific type of the observation mode is not otherwise limited in this application, provided that the observation covers the virtual prop corresponding to the second accessory state.
In step 1706, a first distance from the first environmental image to the camera model is determined in response to the observation mode being an observation of the virtual prop.
In this embodiment of the present application, the first distance is a distance between the camera model and the near-cross section. At this time, the position of the camera model is the position corresponding to the first observation angle.
Step 1707, a second distance of the virtual prop to the camera model is determined.
Optionally, the distance from the virtual prop to the camera model is the furthest distance between the projection of the virtual prop on the coordinate axis and the camera model.
Step 1708, determining a second observation angle of the camera model to the virtual prop according to the ratio of the first distance and the second distance.
After the ratio of the first distance to the second distance is determined, in the embodiment of the present application the adjustment distance of the observation angle is determined by the principle of similar triangles, and the second observation angle is thus obtained.
Step 1709, fixing the relative position between the camera model and the virtual prop.
Step 1710, determining an observation target of the camera model according to the second observation mode.
Steps 1709 through 1710 represent adjusting to the second observation angle by determining the observation target of the camera model while the relative position between the camera model and the virtual prop is kept fixed.
Step 1711, fixing the observation target of the camera model.
Step 1712, determining a relative position between the camera model and the virtual prop according to the second observation mode.
Steps 1711-1712 represent determining the second observation angle by adjusting the relative position between the camera model and the virtual prop while the observation target of the camera model is kept fixed.
Referring to fig. 18, camera model 1811, camera model 1812, and camera model 1813 all correspond to virtual prop 1801, and the different camera models observe virtual prop 1801 at different angles of view.
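The two strategies of steps 1709 through 1712 can be contrasted in a short sketch. The Camera type and the vertical-shift convention are illustrative assumptions, not an API from the patent:

```python
from dataclasses import dataclass

@dataclass
class Camera:
    position: tuple  # (x, y, z) of the camera model
    target: tuple    # (x, y, z) of the observation target

def _shift_up(p: tuple, h2: float) -> tuple:
    return (p[0], p[1], p[2] + h2)

def adjust_by_target(cam: Camera, h2: float) -> Camera:
    # Steps 1709-1710: the relative position between the camera model
    # and the prop stays fixed; only the observation target moves.
    return Camera(cam.position, _shift_up(cam.target, h2))

def adjust_by_position(cam: Camera, h2: float) -> Camera:
    # Steps 1711-1712: the observation target stays fixed; the camera
    # model moves relative to the prop instead.
    return Camera(_shift_up(cam.position, h2), cam.target)
```

Either adjustment realizes the second observation angle; fig. 18 shows the resulting camera models observing the prop from different angles.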
In step 1713, a second environmental screen is displayed, where the second environmental screen includes virtual props viewed at a second viewing angle.
Referring to fig. 19, the second environment screen 1900 includes a virtual object 1901, a virtual prop 1902 corresponding to the second accessory state, an accessory switching control 1903, and a preset area 1904. The lower edge of preset area 1904 coincides with the lower edge of the picture, and its left edge coincides with the vertical center line of the picture. The length of preset area 1904 is 3/8 of the picture length, and its width is 1/2 of the picture width. The second observation angle corresponding to virtual prop 1902 in the second accessory state places virtual prop 1902 within preset area 1904.
Optionally, in the embodiment of the present application, a crosshair is displayed on the first environment screen and the second environment screen, and in both the first accessory state and the second accessory state the extension direction of the virtual prop points at the crosshair, which enhances realism.
In summary, in the method provided in this embodiment, the virtual prop in the first accessory state is displayed in the first environment picture at a first observation angle, and when the virtual prop changes from the first accessory state to the second accessory state, the virtual prop in the second accessory state is displayed in the second environment picture at a second observation angle. Because the observation angle at which the virtual prop is displayed changes with the accessory state, the virtual prop does not occupy a large amount of space in the view, which improves the efficiency of observing the virtual environment from the perspective of the virtual object.
By presenting the virtual prop at different angles of view, the features of the virtual prop are highlighted more clearly when its accessory state changes, which further improves the efficiency of observing the virtual environment from the perspective of the virtual object.
Fig. 20 is a schematic process diagram of a method for displaying a virtual prop according to an exemplary embodiment of the present application. The method is described as applied to an electronic device, and the process includes:
step 2001, beginning.
This is the process of displaying the first environment picture. The first environment picture is a picture in which the virtual object observes the virtual environment; the virtual object is realized as a virtual character, only part of which appears in the picture, the virtual prop is realized as a virtual firearm, and at the first observation angle the firearm is located within the preset area.
Step 2002, determining whether a sight is attached to the virtual prop.
The virtual prop is a virtual firearm, and the sight is one type of accessory. In the embodiment of the application, whether a scope is attached to the virtual firearm is judged by determining whether the accessory switching control superimposed on the picture has received a trigger operation.
Step 2003, the height value of the scope is obtained.
In this embodiment of the present application, the height value of the scope is the length of the scope in the reference direction.
In step 2004, a distance of the scope from the camera is obtained.
In this embodiment of the present application, the camera is an imaging point of the first viewing angle, and the distance between the sighting telescope and the camera is a second distance between the camera model and the virtual prop.
In step 2005, the distance from the near-cut surface to the camera is obtained.
In the embodiment of the application, the virtual environment has a corresponding near clipping plane, which can be treated directly as the environment picture. The distance between the near clipping plane and the camera is the first distance.
In step 2006, a camera offset is calculated.
In this embodiment of the present application, the offset of the camera model corresponding to the second observation angle may be determined according to the first distance and the second distance.
Step 2007, set the camera offset.
After the offset is determined according to the method of step 2006, the virtual prop corresponding to the second accessory state is displayed in the second environment picture with the camera shifted by this offset; the virtual prop corresponding to the second accessory state is then located within the preset area at the second observation angle.
Step 2008, set the camera offset to 0.
When no accessory is added to the virtual firearm, the offset of the camera is 0; that is, the virtual prop is already located within the preset area.
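The whole branch of steps 2002 through 2008 reduces to a few lines. A hedged sketch; the function and parameter names are assumptions:

```python
def sight_camera_offset(has_sight: bool,
                        sight_height: float,
                        sight_to_camera: float,
                        near_plane_to_camera: float) -> float:
    # Step 2002: no sight attached -> offset 0 (step 2008); the prop
    # is already inside the preset area.
    if not has_sight:
        return 0.0
    h1 = sight_height          # step 2003: height value of the scope
    w2 = sight_to_camera       # step 2004: scope-to-camera distance
    w1 = near_plane_to_camera  # step 2005: near plane-to-camera distance
    return h1 * w1 / w2        # steps 2006-2007: Equation 1
```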
In summary, in the method provided in this embodiment, the virtual prop in the first accessory state is displayed in the first environment picture at a first observation angle, and when the virtual prop changes from the first accessory state to the second accessory state, the virtual prop in the second accessory state is displayed in the second environment picture at a second observation angle. Because the observation angle at which the virtual prop is displayed changes with the accessory state, the virtual prop does not occupy a large amount of space in the view, which improves the efficiency of observing the virtual environment from the perspective of the virtual object.
Fig. 21 is a block diagram of a display device for a virtual prop according to an exemplary embodiment of the present application, where the device includes:
the display module 2101 is used for displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates the accessories currently assembled on the virtual prop;
An adjustment module 2102 for adjusting the virtual prop from a first accessory state to a second accessory state in response to receiving an accessory state adjustment operation;
the display module 2101 is further configured to display a second environment screen according to the second accessory state, where the second environment screen includes a virtual prop observed at a second observation angle.
In an optional embodiment, the second environment picture corresponds to a preset area, and the preset area is used for displaying the virtual prop;
the adjusting module 2102 is further configured to adjust the first observation angle to the second observation angle in response to the virtual prop in the second accessory state, observed at the first observation angle, exceeding the display range of the preset area.
In an alternative embodiment, referring to fig. 22, the apparatus further comprises an acquisition module 2103 for acquiring a first length of the virtual prop in the second accessory state in the reference direction;
acquiring a second length of the preset area in the reference direction;
an adjustment module 2102 for adjusting the first viewing angle to the second viewing angle in response to the first length being greater than the second length.
In an alternative embodiment, the virtual environment comprises a camera model, and the camera model is used for observing the virtual prop;
The device further comprises a determining module 2104, which is used for determining an observation mode of the camera model on the virtual prop according to the second accessory state, wherein the observation mode comprises an observation target of the camera model;
and determining a second observation angle corresponding to the virtual prop according to the observation mode.
In an alternative embodiment, the determining module 2104 is configured to determine a first distance from the first environment picture to the camera model in response to the observation mode being to observe the virtual prop;
determining a second distance from the virtual prop to the camera model;
and determining a second observation angle of the camera model to the virtual prop according to the ratio of the first distance to the second distance.
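One way such a ratio could be turned into an angular adjustment is sketched below; the overflow-based formulation and every name in it are assumptions, not the claimed method:

```python
import math

def second_view_angle(first_angle_deg: float, overflow: float,
                      near_dist: float, prop_dist: float) -> float:
    """Map the first-distance/second-distance geometry to a corrected angle.
    overflow: assumed on-screen excess of the prop beyond the preset area,
    measured on the near clipping plane (at near_dist from the camera)."""
    # Scale the on-screen overflow back to the prop's depth using the
    # ratio of the two distances, then rotate just enough to absorb it.
    world_shift = overflow * prop_dist / near_dist
    delta_deg = math.degrees(math.atan2(world_shift, prop_dist))
    return first_angle_deg + delta_deg
```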
In an alternative embodiment, the apparatus further comprises a fixing module 2105 for fixing the relative position between the camera model and the virtual prop;
the determining module 2104 is further configured to determine an observation target of the camera model according to the observation mode.
In an alternative embodiment, after the second observation angle of the camera model to the virtual prop is determined according to the ratio of the first distance to the second distance, the fixing module 2105 is further configured to fix the observation target of the camera model;
the determining module 2104 is further configured to determine the relative position between the camera model and the virtual prop according to the observation mode.
In an alternative embodiment, a center of gravity is displayed on both the first environment picture and the second environment picture;
in the first accessory state and the second accessory state, the extension direction of the virtual prop points to the center of gravity.
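As an illustrative sketch (coordinate conventions and all names are assumed), the yaw and pitch that keep the prop's extension direction on the center of gravity could be computed as:

```python
import math

def aim_at(anchor: tuple, prop_pos: tuple) -> tuple:
    """Yaw/pitch (degrees) that point the prop's extension direction from
    prop_pos toward the displayed center of gravity at anchor; a y-up,
    z-forward coordinate system is assumed."""
    dx, dy, dz = (a - p for a, p in zip(anchor, prop_pos))
    yaw = math.degrees(math.atan2(dx, dz))                    # around the vertical axis
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # elevation toward the anchor
    return yaw, pitch
```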
In summary, in the device provided in this embodiment, the virtual prop in the first accessory state is displayed at the first observation angle in the first environment picture, and when the virtual prop changes from the first accessory state to the second accessory state, the virtual prop in the second accessory state is displayed at the second observation angle in the second environment picture. By changing the observation angle at which the virtual prop is displayed in the environment picture, the virtual prop does not occupy a large amount of space in the field of view under different accessory states, which improves the efficiency with which the user observes the virtual environment from the perspective of the virtual object.
It should be noted that the display device of the virtual prop described above is illustrated only with the division of the functional modules given by way of example; in practical applications, these functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
Fig. 23 shows a block diagram of an electronic device 2300 provided by an exemplary embodiment of the present application. The electronic device 2300 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The electronic device 2300 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, the electronic device 2300 includes: a processor 2301 and a memory 2302.
The processor 2301 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 2301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 2301 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 2301 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 2301 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
Memory 2302 may include one or more computer-readable storage media, which may be non-transitory. Memory 2302 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 2302 is used to store at least one instruction, which is executed by processor 2301 to implement the method for displaying a virtual prop provided by the method embodiments of the present application.
In some embodiments, electronic device 2300 may further optionally include: a peripheral interface 2303 and at least one peripheral. The processor 2301, memory 2302 and peripheral interface 2303 may be connected by a bus or signal line. Individual peripheral devices may be connected to peripheral device interface 2303 by buses, signal lines or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 2304, a display 2305, a camera assembly 2306, an audio circuit 2307, a positioning assembly 2308, and a power supply 2309.
Peripheral interface 2303 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 2301 and the memory 2302. In some embodiments, the processor 2301, memory 2302, and peripheral interface 2303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 2301, the memory 2302, and the peripheral interface 2303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 2304 is configured to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 2304 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission and converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 2304 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 2304 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 2304 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display 2305 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display 2305 is a touch display, it can also collect touch signals on or above its surface; such a touch signal may be input to the processor 2301 as a control signal for processing. In this case, the display 2305 may also be used to provide virtual buttons and/or a virtual keyboard, also called soft buttons and/or a soft keyboard. In some embodiments, there may be one display 2305, disposed on the front panel of the electronic device 2300; in other embodiments, there may be at least two displays 2305, each disposed on a different surface of the electronic device 2300 or in a folded design; in still other embodiments, the display 2305 may be a flexible display disposed on a curved or folded surface of the electronic device 2300. The display 2305 may even be arranged in a non-rectangular irregular pattern, that is, an irregularly shaped screen. The display 2305 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 2306 is used to capture images or video. Optionally, the camera assembly 2306 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera on the back of the terminal. In some embodiments, there are at least two rear cameras, each being one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused for a background blurring function, or the main camera and the wide-angle camera fused for panoramic and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 2306 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 2307 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment and converts them into electrical signals that are input to the processor 2301 for processing, or to the radio frequency circuit 2304 for voice communication. For stereo acquisition or noise reduction, there may be multiple microphones disposed at different locations of the electronic device 2300; the microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker converts electrical signals from the processor 2301 or the radio frequency circuit 2304 into sound waves and may be a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, the audio circuit 2307 may also include a headphone jack.
The positioning component 2308 is used to locate the current geographic position of the electronic device 2300 to enable navigation or LBS (Location Based Service). The positioning component 2308 may be based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
The power supply 2309 is used to power the various components in the electronic device 2300. The power supply 2309 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 2309 includes a rechargeable battery, the battery may be a wired rechargeable battery charged through a wired line or a wireless rechargeable battery charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, electronic device 2300 further includes one or more sensors 2310. The one or more sensors 2310 include, but are not limited to: an acceleration sensor 2311, a gyro sensor 2312, a pressure sensor 2313, a fingerprint sensor 2314, an optical sensor 2315 and a proximity sensor 2316.
The acceleration sensor 2311 may detect the magnitudes of accelerations on three coordinate axes of a coordinate system established with the electronic device 2300. For example, the acceleration sensor 2311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 2301 may control the display screen 2305 to display a user interface in either a landscape view or a portrait view based on gravitational acceleration signals acquired by the acceleration sensor 2311. The acceleration sensor 2311 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 2312 may detect a body direction and a rotation angle of the electronic device 2300, and the gyro sensor 2312 may collect a 3D motion of the user on the electronic device 2300 in cooperation with the acceleration sensor 2311. The processor 2301 may perform the following functions based on the data collected by the gyro sensor 2312: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 2313 may be disposed on a side frame of the electronic device 2300 and/or beneath the display 2305. When disposed on the side frame, it can detect the user's grip signal on the electronic device 2300, and the processor 2301 may perform left/right hand recognition or quick operations according to the grip signal collected by the pressure sensor 2313. When disposed beneath the display 2305, the processor 2301 controls the operable controls on the UI according to the user's pressure operation on the display 2305. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 2314 is used to collect a user's fingerprint; either the processor 2301 identifies the user from the fingerprint collected by the fingerprint sensor 2314, or the fingerprint sensor 2314 itself identifies the user from the collected fingerprint. When the user's identity is recognized as trusted, the processor 2301 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 2314 may be disposed on the front, back, or side of the electronic device 2300; when a physical key or vendor logo is provided on the electronic device 2300, the fingerprint sensor 2314 may be integrated with the physical key or vendor logo.
The optical sensor 2315 is used to collect ambient light intensity. In one embodiment, the processor 2301 may control the display brightness of the display screen 2305 based on the ambient light intensity collected by the optical sensor 2315. Specifically, when the ambient light intensity is high, the display luminance of the display screen 2305 is turned up; when the ambient light intensity is low, the display luminance of the display screen 2305 is turned down. In another embodiment, the processor 2301 may also dynamically adjust the photographing parameters of the camera assembly 2306 based on the intensity of ambient light collected by the optical sensor 2315.
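As a toy illustration of this mapping (the lux thresholds and the linear ramp are assumptions, not values from the application):

```python
def backlight_level(ambient_lux: float, lo: float = 50.0, hi: float = 300.0) -> float:
    """Map ambient light intensity to a display brightness in [0, 1]:
    brighter surroundings raise the backlight, dimmer ones lower it."""
    if ambient_lux <= lo:
        return 0.2   # floor so the screen stays readable in the dark
    if ambient_lux >= hi:
        return 1.0
    return 0.2 + 0.8 * (ambient_lux - lo) / (hi - lo)
```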
The proximity sensor 2316, also referred to as a distance sensor, is typically provided on the front panel of the electronic device 2300 and is used to capture the distance between the user and the front of the device. In one embodiment, when the proximity sensor 2316 detects that this distance is gradually decreasing, the processor 2301 controls the display 2305 to switch from the screen-on state to the screen-off state; when it detects that the distance is gradually increasing, the processor 2301 controls the display 2305 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the structure shown in fig. 23 is not limiting of the electronic device 2300 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
The embodiment of the application also provides a computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for displaying a virtual prop described above.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the method for displaying a virtual prop according to any of the above embodiments.
Those of ordinary skill in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium included in the memory of the above embodiments, or a standalone computer-readable storage medium that is not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for displaying a virtual prop described above.
Alternatively, the computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a solid state drive (SSD), an optical disk, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM). The foregoing embodiment numbers of the present application are for description only and do not imply that one embodiment is better or worse than another.
It will be appreciated by those of ordinary skill in the art that all or part of the steps of implementing the above embodiments may be implemented by hardware, or may be implemented by a program to instruct related hardware, and the program may be stored in a computer readable storage medium, where the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description of the preferred embodiments is merely exemplary in nature and is not intended to limit the invention, but is intended to cover various modifications, substitutions, improvements, and alternatives falling within the spirit and principles of the invention.

Claims (11)

1. A method for displaying a virtual prop, the method comprising:
displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates the accessory currently mounted on the virtual prop;
in response to receiving an accessory state adjustment operation, adjusting the virtual prop from the first accessory state to a second accessory state;
acquiring a first length of the virtual prop in the second accessory state in a reference direction, wherein the reference direction is a height direction of the virtual prop;
acquiring a second length of a preset area in a second environment picture in the reference direction, wherein the preset area is used for displaying the virtual prop;
and in response to the first length being greater than the second length, adjusting the first observation angle to a second observation angle, and displaying the second environment picture, wherein the second environment picture comprises the virtual prop observed at the second observation angle.
2. The method of claim 1, wherein the virtual environment includes a camera model therein, the camera model being used to view the virtual prop;
Before the first observation angle is adjusted to the second observation angle to display the second environment picture, the method further comprises:
determining an observation mode of the camera model on the virtual prop according to the second accessory state, wherein the observation mode comprises an observation target of the camera model;
and determining a second observation angle corresponding to the virtual prop according to the observation mode.
3. The method of claim 2, wherein determining the second observation angle corresponding to the virtual prop according to the observation mode comprises:
determining a first distance from the first environment picture to the camera model in response to the observation mode being to observe the virtual prop;
determining a second distance from the virtual prop to the camera model;
and determining a second observation angle of the camera model to the virtual prop according to the ratio of the first distance to the second distance.
4. The method of claim 3, further comprising, after determining the second observation angle of the camera model to the virtual prop according to the ratio of the first distance to the second distance:
Fixing the relative position between the camera model and the virtual prop;
and determining an observation target of the camera model according to the observation mode.
5. The method of claim 3, further comprising, after determining the second observation angle of the camera model to the virtual prop according to the ratio of the first distance to the second distance:
fixing an observation target of the camera model;
and determining the relative position between the camera model and the virtual prop according to the observation mode.
6. The method of claim 1, wherein a center of gravity is displayed on the first environmental screen and on the second environmental screen;
in the first accessory state and the second accessory state, the extension direction of the virtual prop is directed toward the center of gravity.
7. A display device for a virtual prop, the device comprising:
the display module is used for displaying a first environment picture, wherein the first environment picture comprises a virtual environment, the virtual environment comprises a virtual object, the virtual object holds a virtual prop observed at a first observation angle, the virtual prop is in a first accessory state, and the first accessory state indicates the accessory currently mounted on the virtual prop;
The adjusting module is used for responding to the received accessory state adjusting operation and adjusting the virtual prop from the first accessory state to a second accessory state;
the acquisition module is used for acquiring a first length of the virtual prop in the second accessory state in a reference direction, wherein the reference direction is the height direction of the virtual prop; acquiring a second length of a preset area in a second environment picture in the reference direction, wherein the preset area is used for displaying the virtual prop;
the adjusting module is further configured to adjust the first observation angle to a second observation angle in response to the first length being greater than the second length;
the display module is further configured to display the second environment picture, wherein the second environment picture comprises the virtual prop observed at the second observation angle.
8. The apparatus of claim 7, wherein the virtual environment includes a camera model therein for viewing the virtual prop;
the device further comprises a determining module, wherein the determining module is used for determining an observation mode of the camera model on the virtual prop according to the second accessory state, and the observation mode comprises an observation target of the camera model;
And determining a second observation angle corresponding to the virtual prop according to the observation mode.
9. The apparatus of claim 8, wherein:
the determining module is further configured to determine a first distance from the first environmental picture to the camera model in response to the observation mode being to observe the virtual prop;
determining a second distance from the virtual prop to the camera model;
and determining a second observation angle of the camera model to the virtual prop according to the ratio of the first distance to the second distance.
10. A computer device comprising a processor and a memory, wherein the memory stores at least one program, the at least one program being loaded and executed by the processor to implement the method of displaying a virtual prop of any of claims 1 to 6.
11. A computer readable storage medium having stored therein at least one program loaded and executed by a processor to implement the method of displaying a virtual prop of any of claims 1 to 6.
CN202011222394.7A 2020-11-05 2020-11-05 Virtual prop display method, device, equipment and readable storage medium Active CN112330823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011222394.7A CN112330823B (en) 2020-11-05 2020-11-05 Virtual prop display method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112330823A CN112330823A (en) 2021-02-05
CN112330823B true CN112330823B (en) 2023-06-16

Family

ID=74315994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011222394.7A Active CN112330823B (en) 2020-11-05 2020-11-05 Virtual prop display method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112330823B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113797529B (en) * 2021-09-18 2023-11-21 珠海金山数字网络科技有限公司 Target display method and device, computing equipment and computer readable storage medium
CN116943214A (en) * 2022-04-14 2023-10-27 腾讯科技(深圳)有限公司 Virtual prop using method, device, equipment, medium and program product
CN117618919A (en) * 2022-08-12 2024-03-01 腾讯科技(成都)有限公司 Transformation processing method and device for virtual prop, electronic equipment and storage medium

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009259130A (en) * 2008-04-18 2009-11-05 Sony Computer Entertainment Inc Image display device, method for controlling the image display device, and program
EP2426910A1 (en) * 2010-09-06 2012-03-07 Guangzhou SAT Infrared Technology Co., Ltd. Infrared camera with rotatable core assembly
JP2018036720A (en) * 2016-08-29 2018-03-08 株式会社タカラトミー Virtual space observation system, method and program
CN108671540A (en) * 2018-05-09 2018-10-19 腾讯科技(深圳)有限公司 Accessory switching method, equipment and storage medium in virtual environment
CN108815851A (en) * 2018-06-05 2018-11-16 腾讯科技(深圳)有限公司 Interface display method, equipment and storage medium when being shot in virtual environment
CN109224439A (en) * 2018-10-22 2019-01-18 网易(杭州)网络有限公司 The method and device of game aiming, storage medium, electronic device
CN109529319A (en) * 2018-11-28 2019-03-29 腾讯科技(深圳)有限公司 Display methods, equipment and the storage medium of interface control
WO2019153750A1 (en) * 2018-02-09 2019-08-15 腾讯科技(深圳)有限公司 Method, apparatus and device for view switching of virtual environment, and storage medium
WO2019201047A1 (en) * 2018-04-16 2019-10-24 腾讯科技(深圳)有限公司 Method for adjusting viewing angle in virtual environment, device, and readable storage medium
CN110413171A (en) * 2019-08-08 2019-11-05 腾讯科技(深圳)有限公司 Control method, apparatus, equipment and medium that virtual objects carry out prompt operation
CN110448908A (en) * 2019-08-22 2019-11-15 腾讯科技(深圳)有限公司 The application method of gun sight, device, equipment and storage medium in virtual environment
CN110465083A (en) * 2019-08-16 2019-11-19 腾讯科技(深圳)有限公司 Map area control method, device, equipment and medium in virtual environment
CN110559662A (en) * 2019-09-12 2019-12-13 腾讯科技(深圳)有限公司 Visual angle switching method, device, terminal and medium in virtual environment
CN111202979A (en) * 2020-01-06 2020-05-29 腾讯科技(深圳)有限公司 Virtual item control method and device, electronic equipment and storage medium
CN111408133A (en) * 2020-03-17 2020-07-14 腾讯科技(深圳)有限公司 Interactive property display method, device, terminal and storage medium
CN111589141A (en) * 2020-05-14 2020-08-28 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and medium
CN111617472A (en) * 2020-05-27 2020-09-04 腾讯科技(深圳)有限公司 Method and related device for managing model in virtual scene
WO2020215792A1 (en) * 2019-04-22 2020-10-29 网易(杭州)网络有限公司 Game angle of view control method and device
CN111841014A (en) * 2020-07-22 2020-10-30 腾讯科技(深圳)有限公司 Virtual article display method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10688396B2 (en) * 2017-04-28 2020-06-23 Sony Interactive Entertainment Inc. Second screen virtual window into VR environment

Also Published As

Publication number Publication date
CN112330823A (en) 2021-02-05


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code: Ref country code: HK; Ref legal event code: DE; Ref document number: 40038701; Country of ref document: HK

GR01 Patent grant