WO2024032176A1 - Virtual prop processing method and apparatus, electronic device, storage medium, and program product - Google Patents

Virtual prop processing method and apparatus, electronic device, storage medium, and program product

Info

Publication number
WO2024032176A1
Authority
WO
WIPO (PCT)
Prior art keywords
processing
interface
sliding
virtual
processing interface
Prior art date
Application number
PCT/CN2023/102688
Other languages
English (en)
French (fr)
Inventor
孙单
刘焕然
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Publication of WO2024032176A1 publication Critical patent/WO2024032176A1/zh


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/63 Generating or modifying game content before or while executing the game program by the player, e.g. authoring using a level editor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets

Definitions

  • the present application relates to the field of computer human-computer interaction technology, and in particular to a virtual prop processing method, device, electronic equipment, storage medium and program product.
  • the human-computer interaction technology of virtual scenes based on graphics processing hardware can realize diversified interactions between virtual objects controlled by users or artificial intelligence according to actual application requirements, and has broad practical value.
  • virtual scenes such as games
  • the real battle process between virtual objects can be simulated.
  • Since firearms have multiple processable parts, when players process multiple parts of a gun they need to repeat the following steps: select the component to be processed (such as the muzzle) in the whole-gun interface and enter the muzzle processing interface; after completing the muzzle processing in the muzzle processing interface, return to the whole-gun interface; re-select the next component to be processed (such as the handguard) and enter the handguard processing interface; after completing the handguard processing, return to the whole-gun interface and select other components to be processed again.
  • Embodiments of the present application provide a method, device, electronic device, computer-readable storage medium, and computer program product for processing virtual props, which can improve the processing efficiency of virtual props.
  • An embodiment of the present application provides a method for processing virtual props, which is executed by an electronic device, including:
  • in response to a triggering operation for the processing portal, display a first processing interface, wherein the first processing interface at least includes processing controls;
  • An embodiment of the present application provides a processing device for virtual props, including:
  • a display module configured to display a processing entrance for virtual props in the virtual scene
  • the display module is further configured to display a first processing interface in response to a triggering operation for the processing portal, wherein the first processing interface at least includes processing controls;
  • the display module is further configured to, in response to a triggering operation on the processing control, display the processed virtual props instead of displaying the virtual props before processing;
  • a switching module configured to switch from displaying the first processing interface to displaying a second processing interface different from the first processing interface in response to an interface jump triggering operation.
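The behavior of the display module and switching module described above can be sketched as a minimal Python class. All names and the data model here are illustrative assumptions; the embodiment describes behavior, not an implementation:

```python
class PropProcessingDevice:
    """Minimal sketch of the described device (hypothetical API)."""

    def __init__(self, prop_components):
        # component name -> currently installed part
        self.components = dict(prop_components)
        self.current_interface = None  # e.g., "muzzle" or "handguard"

    def open_processing_entry(self, first_component):
        # Triggering the processing entrance displays the first processing
        # interface, which at least includes a processing control.
        self.current_interface = first_component
        return {"interface": first_component,
                "controls": [f"modify_{first_component}"]}

    def apply_processing(self, component, new_part):
        # The processed component replaces the pre-processing one in display.
        self.components[component] = new_part
        return self.components[component]

    def jump_interface(self, target_component):
        # Interface jump: switch directly from the first processing interface
        # to a different (second) processing interface, without returning to
        # a whole-gun interface in between.
        self.current_interface = target_component
        return {"interface": target_component,
                "controls": [f"modify_{target_component}"]}
```

The key point of the claimed flow is that `jump_interface` goes straight from one component's interface to another's, avoiding the round trip through the whole-gun interface described in the background section.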
  • An embodiment of the present application provides an electronic device, including:
  • a memory, used to store executable instructions;
  • the processor is configured to implement the virtual prop processing method provided by the embodiment of the present application when executing executable instructions stored in the memory.
  • Embodiments of the present application provide a computer-readable storage medium that stores computer-executable instructions for implementing the virtual prop processing method provided by embodiments of the present application when executed by a processor.
  • Embodiments of the present application provide a computer program product, which includes a computer program or computer executable instructions, used to implement the virtual prop processing method provided by embodiments of the present application when executed by a processor.
  • Figure 1A is a schematic diagram of the application mode of the virtual prop processing method provided by the embodiment of the present application.
  • Figure 1B is a schematic diagram of the application mode of the virtual prop processing method provided by the embodiment of the present application.
  • Figure 2 is a schematic structural diagram of an electronic device 500 provided by an embodiment of the present application.
  • Figure 3 is a schematic flowchart of a method for processing virtual props provided by an embodiment of the present application
  • FIGS. 4A to 4C are schematic diagrams of application scenarios of the virtual prop processing method provided by the embodiment of the present application.
  • Figure 5 is a schematic structural diagram of a virtual rifle provided by an embodiment of the present application.
  • Figure 6 is a schematic flowchart of a method for processing virtual props provided by an embodiment of the present application.
  • Figures 7A and 7B are schematic diagrams of application scenarios of the virtual prop processing method provided by the embodiment of the present application.
  • Figure 8 is a schematic flowchart of a method for processing virtual props provided by an embodiment of the present application.
  • Figure 9 is a schematic quadrant diagram provided by the embodiment of the present application.
  • In the embodiments of the present application, data related to user information (such as data related to virtual props owned by virtual objects controlled by the user) is involved; user permission or consent is required, and the collection, use, and processing of relevant data need to comply with relevant laws, regulations, and standards of the relevant countries and regions.
  • The terms "first/second/..." involved are only used to distinguish similar objects and do not represent a specific ordering of objects. It is understandable that, where permitted, the specific order or sequence of "first/second/..." may be interchanged, so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein.
  • Virtual scene: the scene displayed (or provided) when the application is running on the terminal device.
  • the virtual scene can be a simulation environment of the real world, a semi-simulation and semi-fictitious virtual environment, or a purely fictitious virtual environment.
  • the virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene.
  • the embodiments of this application do not limit the dimensions of the virtual scene.
  • the virtual scene can include the sky, land, ocean, etc.
  • the land can include environmental elements such as deserts and cities, and the user can control virtual objects to move in the virtual scene.
  • Virtual props: props that can be used by virtual objects in virtual scenes and that are structurally composed of multiple components. For example, a virtual prop can be a virtual shooting prop used to attack other virtual objects, such as a virtual firearm or a virtual bow and arrow; it can also be a virtual vehicle for virtual objects to drive in the virtual scene, such as a virtual car, virtual ship, virtual airplane, or virtual bike.
  • Virtual objects: images of various people and objects that can interact in the virtual scene, or movable objects in the virtual scene.
  • the movable object may be a virtual character, a virtual animal, an animation character, etc., such as a character, animal, etc. displayed in a virtual scene.
  • the virtual object may be a virtual avatar representing the user in the virtual scene.
  • the virtual scene may include multiple virtual objects. Each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
  • Scene data: the characteristic data of the virtual scene, for example the area of the construction zone in the virtual scene or the architectural style of the current virtual scene; it can also include the locations of virtual buildings in the virtual scene and the floor space they occupy, etc.
  • Client: application programs running on terminal devices to provide various services, such as video playback clients and game clients.
  • Virtual prop processing: changes to virtual props, including color updates, structural modifications, etc. Take the transformation of virtual props as an example: transformation is an operation that changes the structure of a virtual prop, including disassembly, installation, and replacement of its parts. For example, a new muzzle can replace the original muzzle of a virtual firearm, or a front grip, laser equipment, and other accessories can be installed on the handguard.
  • Embodiments of the present application provide a method, device, electronic device, computer-readable storage medium, and computer program product for processing virtual props, which can improve the processing efficiency of virtual props.
  • an exemplary implementation scenario of the virtual prop processing method provided by the embodiment of the present application is first described.
  • The virtual scenes in the virtual prop processing method provided by the embodiments of the present application can be output entirely by the terminal device, or output collaboratively by the terminal device and the server.
  • the virtual scene can be an environment for virtual objects (such as game characters) to interact.
  • For example, the virtual scene can be a place for game characters to compete; two parties can interact in the virtual scene, thus enabling users to relieve life stress during the game.
  • Figure 1A is a schematic diagram of an application mode of the virtual prop processing method provided by the embodiments of the present application. It is suitable for application modes, such as stand-alone/offline games, in which the calculation of the data of virtual scene 100 relies entirely on the graphics processing hardware computing power of the terminal device 400, and the output of the virtual scene is completed through various types of terminal devices 400 such as smartphones, tablets, and virtual reality/augmented reality devices.
  • graphics processing hardware examples include central processing units (CPU, Central Processing Unit) and graphics processing units (GPU, Graphics Processing Unit).
  • the terminal device 400 calculates the data required for display through the graphics computing hardware, completes the loading, parsing and rendering of the display data, and outputs video frames capable of forming the visual perception of the virtual scene through the graphics output hardware.
  • For example, two-dimensional video frames are presented on the display screen of a smartphone, or video frames achieving a three-dimensional display effect are projected on the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perception effect, the terminal device 400 can also use different hardware to form one or more of auditory perception, tactile perception, motion perception, and taste perception.
  • the terminal device 400 runs a client 410 (for example, a stand-alone version of a game application).
  • When the client 410 runs, the virtual scene may be an environment for game characters to interact, for example plains, streets, or valleys where game characters fight. Taking the first-person perspective display of the virtual scene 100 as an example, a virtual object 101 is displayed in the virtual scene 100, where the virtual object 101 can be controlled by the user.
  • The game character, that is, the virtual object 101, is controlled by a real user and moves in the virtual scene 100 in response to the real user's operations on a controller (such as a touch screen, voice-activated switch, keyboard, mouse, or joystick). For example, when the real user moves the joystick to the right, the virtual object 101 moves to the right in the virtual scene 100; the user can also keep it stationary, make it jump, and control the virtual object 101 to perform shooting operations.
  • a virtual object 101 and a virtual firearm 102 held by the virtual object 101 are displayed in the virtual scene 100 .
  • a processing portal 103 for the virtual firearm 102 is also displayed in the virtual scene 100 .
  • The virtual scene 100 displayed in the human-computer interaction interface is switched to the first processing interface 104 (for example, the muzzle processing interface), where the first processing interface 104 displays the first component 105 of the virtual firearm 102 (for example, the muzzle) and the processing control 106 of the first component 105 (for example, a modification control that can be used to replace the first component 105 with a new muzzle); then the client 410 may display the processed first component 105 (e.g., a new muzzle) in response to a triggering operation of the processing control 106, instead of displaying the pre-processing first component 105, thereby completing the processing of the muzzle.
  • The client 410 can directly switch from displaying the first processing interface 104 to displaying the second processing interface 107 (for example, the handguard processing interface) in response to the interface jump triggering operation based on the first processing interface 104, where the second processing interface 107 displays the second component 108 of the virtual firearm 102 (such as the handguard) and the processing control 109 of the second component 108 (such as a right rail that can be installed on the second component 108).
  • Figure 1B is a schematic diagram of an application mode of the virtual prop processing method provided by the embodiments of the present application. It is applied to the terminal device 400 and the server 200 and is suitable for the application mode in which virtual scene calculation is completed by relying on the computing power of the server 200 and the virtual scene is output on the terminal device 400.
  • The server 200 calculates the display data related to the virtual scene (such as scene data) and sends it to the terminal device 400 through the network 300; the terminal device 400 relies on graphics computing hardware to complete the loading, parsing, and rendering of the calculated display data, and relies on graphics output hardware to output the virtual scene to form visual perception.
  • For example, two-dimensional video frames can be presented on the display screen of a smartphone, or video frames achieving a three-dimensional display effect can be projected on the lenses of augmented reality/virtual reality glasses; for the perception of other forms of the virtual scene, the corresponding hardware output of the terminal device 400 can be used, such as using a microphone to form auditory perception, a vibrator to form tactile perception, and so on.
  • the terminal device 400 runs a client 410 (for example, a network version of a game application), and interacts with other users by connecting to the server 200 (for example, a game server).
  • The terminal device 400 outputs the virtual scene 100 of the client 410; for example, the virtual scene 100 is displayed from a first-person perspective.
  • a virtual object 101 is displayed in the virtual scene 100.
  • The virtual object 101 may be a game character controlled by the user; that is, the virtual object 101 is controlled by a real user and moves in the virtual scene 100 in response to the real user's operations on a controller (such as a touch screen, voice-activated switch, keyboard, mouse, or joystick). For example, when the real user moves the joystick to the right, the virtual object 101 moves to the right in the virtual scene 100; the user can also keep it stationary, make it jump, and control the virtual object 101 to perform shooting operations.
  • a virtual object 101 and a virtual firearm 102 held by the virtual object 101 are displayed in the virtual scene 100 .
  • a processing portal 103 for the virtual firearm 102 is also displayed in the virtual scene 100 .
  • The virtual scene 100 displayed in the human-computer interaction interface is switched to the first processing interface 104 (for example, the muzzle processing interface), where the first processing interface 104 displays the first component 105 of the virtual firearm 102 (such as the muzzle) and the processing control 106 of the first component 105 (such as a new muzzle for replacing the first component 105); the client 410 can respond to a triggering operation of the processing control 106 by displaying the processed first component 105 instead of the pre-processing first component 105, thereby completing the processing of the muzzle;
  • The client 410, in response to the interface jump triggering operation based on the first processing interface 104, directly switches from displaying the first processing interface 104 to displaying the second processing interface 107 (for example, the handguard processing interface), wherein the second processing interface 107 displays the second component 108 of the virtual firearm 102.
  • the terminal device 400 can also implement the virtual prop processing method provided by the embodiments of the present application by running a computer program.
  • The computer program can be a native program or software module in the operating system; it can be a native application (APP), that is, a program that needs to be installed in the operating system to run, such as a shooting game APP (i.e., the above-mentioned client 410); it can also be a mini program, that is, a program that only needs to be downloaded into the browser environment to run. In general, the computer program described above can be any form of application, module, or plug-in.
  • the terminal device 400 installs and runs an application program that supports virtual scenes.
  • the application can be any one of a first-person shooting game (FPS, First-Person Shooting game), a third-person shooting game, a virtual reality application, a three-dimensional map program, or a multiplayer gunfight survival game.
  • The user uses the terminal device 400 to operate virtual objects located in the virtual scene to perform activities, which include but are not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, throwing, and building virtual buildings.
  • Schematically, the virtual object may be a virtual character, such as a simulated character or an animation character.
  • Cloud Technology refers to the unification of a series of resources such as hardware, software, and networks within a wide area network or a local area network to realize data calculation and storage.
  • Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, and application technology based on the cloud computing business model. It can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support, since the background services of technical network systems require a large amount of computing and storage resources.
  • The server 200 in Figure 1B can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, content delivery network (CDN), big data, and artificial intelligence platforms.
  • the terminal device 400 can be a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc., but is not limited thereto.
  • the terminal device 400 and the server 200 can be connected directly or indirectly through wired or wireless communication methods, which are not limited in the embodiments of this application.
  • FIG. 2 is a schematic structural diagram of an electronic device 500 provided by an embodiment of the present application.
  • The electronic device 500 shown in Figure 2 includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530.
  • the various components in electronic device 500 are coupled together by bus system 540 .
  • bus system 540 is used to implement connection communication between these components.
  • the bus system 540 also includes a power bus, a control bus and a status signal bus.
  • the various buses are labeled bus system 540 in FIG. 2 .
  • The processor 510 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, etc., where the general-purpose processor can be a microprocessor or any conventional processor.
  • User interface 530 includes one or more output devices 531 that enable the presentation of media content, including one or more speakers and/or one or more visual displays.
  • User interface 530 also includes one or more input devices 532, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, and other input buttons and controls.
  • Memory 550 may be removable, non-removable, or a combination thereof.
  • Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, etc.
  • Memory 550 optionally includes one or more storage devices physically located remotely from processor 510 .
  • Memory 550 includes volatile memory or non-volatile memory, and may include both volatile and non-volatile memory.
  • Non-volatile memory can be read-only memory (ROM), and volatile memory can be random access memory (RAM).
  • the memory 550 described in the embodiments of this application is intended to include any suitable type of memory.
  • the memory 550 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplarily described below.
  • the operating system 551 includes system programs used to process various basic system services and perform hardware-related tasks, such as the framework layer, core library layer, driver layer, etc., which are used to implement various basic services and process hardware-based tasks;
  • Network communications module 552, for reaching other computing devices via one or more (wired or wireless) network interfaces 520; exemplary network interfaces 520 include Bluetooth, Wi-Fi, and Universal Serial Bus (USB), etc.;
  • Presentation module 553, for enabling the presentation of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 531 (e.g., display screens, speakers, etc.) associated with user interface 530;
  • An input processing module 554 for detecting one or more user inputs or interactions from one or more input devices 532 and translating the detected inputs or interactions.
  • the device provided by the embodiment of the present application can be implemented in a software manner.
  • Figure 2 shows a virtual prop processing device 555 stored in the memory 550, which can be software in the form of programs, plug-ins, etc., including the following software modules: display module 5551, switching module 5552, acquisition module 5553, determination module 5554, multiplication module 5555, control module 5556, detection module 5557, transfer module 5558, adjustment module 5559, shooting module 55510, loading module 55511, interpolation module 55512, and insertion module 55513. These modules are logical and can be arbitrarily combined or further split according to the functions implemented.
  • Figure 3 is a schematic flowchart of a method for processing virtual props provided by an embodiment of the present application, which will be described in conjunction with the steps shown in Figure 3.
  • The method shown in Figure 3 can be executed by various forms of computer programs running on the terminal device and is not limited to the client; it can also be the operating system, software modules, scripts, applets, etc. mentioned above. Therefore, the client examples below should not be regarded as limiting the embodiments of the present application. For ease of description, no distinction is made below between the terminal device and the client running on the terminal device.
  • step 301 a processing entry for virtual props in the virtual scene is displayed.
  • a client that supports virtual scenes is installed on the terminal device (for example, when the virtual scene is a game, the corresponding client can be a shooting game APP).
  • The user opens the client installed on the terminal device (for example, the user clicks the icon corresponding to the shooting game APP presented on the user interface of the terminal device). When the terminal device runs the client, it displays a virtual object (for example, virtual object A controlled by user 1) and the virtual props (such as virtual shooting props, virtual throwing props, etc.) held by virtual object A through a holding part (such as the hand).
  • a processing entrance for the virtual prop may also be displayed in the virtual scene.
  • the virtual prop is a virtual firearm
  • a processing entrance for the virtual firearm may be displayed in the virtual scene.
  • step 302 in response to a triggering operation for the processing portal, the first processing interface is displayed.
  • the first processing interface at least includes processing controls.
  • the first processing interface may also include the first component of the virtual prop.
  • the first component may be any component of the virtual prop that is to be modified;
  • components in the virtual props other than the first component may not be displayed, may be partially displayed, or may be displayed in full.
  • The number of displayed components other than the first component may depend on the zoom ratio of the first processing interface (i.e., the ratio between the size of the virtual prop and the size of the first processing interface): the larger the zoom ratio, the fewer components are displayed, making it easier to observe the details of each component; the smaller the zoom ratio, the more components are displayed, making it easier to observe the overall structure of the virtual prop.
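The inverse relationship between zoom ratio and displayed component count can be sketched as follows. The embodiment states only the direction of the relationship; the linear mapping and the clamping thresholds here are illustrative assumptions:

```python
def visible_component_count(total_components: int, zoom_ratio: float) -> int:
    """Return how many components to display for a given zoom ratio.

    A larger zoom ratio shows fewer components (to expose component
    detail); a smaller ratio shows more (to expose overall structure).
    The linear mapping below is an assumption for illustration only.
    """
    if zoom_ratio <= 0:
        raise ValueError("zoom ratio must be positive")
    # At ratio <= 1.0 show everything; scale the count down as the
    # ratio grows past 1.0.
    count = round(total_components / max(zoom_ratio, 1.0))
    return max(1, count)  # always show at least the focused component
```

For a prop with eight components, a ratio of 1.0 shows all eight, a ratio of 2.0 shows four, and a high ratio collapses to the single focused component.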
  • types of processing controls in the first processing interface may include color controls for changing colors and modification controls for modification.
  • the processing control can be dedicated to processing the first component, that is, different components each have a corresponding processing control; the processing control can also be universal, that is, used to batch process multiple components in the virtual props including the first component.
  • The terminal device may also perform the following processing: display a virtual prop viewing interface, where the virtual prop viewing interface includes multiple components of the virtual prop; in response to a selection operation on the first component, proceed to the processing of displaying the first processing interface.
  • When the terminal device receives the user's click operation on the processing entrance of the virtual firearm, it can first display a virtual firearm viewing interface (such as a whole-gun interface in which the virtual firearm is displayed). The terminal device then responds to the user's selection operation for the first component (such as the muzzle) in the virtual firearm viewing interface (for example, receiving the user's click on the interactive button for the muzzle) by displaying the first processing interface (i.e., the muzzle processing interface), in which the first component of the virtual firearm (e.g., the muzzle) and the muzzle's processing controls (such as a new muzzle to replace the original muzzle) can be displayed.
  • Figure 4A is a schematic diagram of an application scenario of the method for processing virtual props provided by an embodiment of the present application.
  • A virtual scene 400 is displayed in the human-computer interaction interface; displayed in the virtual scene 400 are a virtual object 401 (for example, game character A controlled by user 1) and a virtual firearm 402 held by the virtual object 401.
  • a processing entrance 403 for the virtual firearm 402 is also displayed in the virtual scene 400 .
  • The virtual scene 400 displayed in the human-computer interaction interface is switched to the virtual firearm viewing interface 404 (that is, the whole-gun interface).
  • The virtual firearm viewing interface 404 shows multiple components of the virtual firearm 402 that can be processed, including, for example, a barrel 405, a handguard 406, a magazine 407, and an optical sight 408.
  • The virtual firearm viewing interface 404 displayed in the human-computer interaction interface is switched to the handguard processing interface 409, in which a handguard 410 of the virtual firearm 402 and a processing control 411 of the handguard 410 are displayed.
  • In step 303, in response to a triggering operation on the processing control, the processed virtual prop is displayed instead of the pre-processed virtual prop.
  • In response to a triggering operation on the processing control, some or all components of the virtual prop are processed in batches, and the processed virtual prop is displayed instead of the pre-processed virtual prop: after some of the components are batch-processed, those components replace their pre-processed counterparts in the display; after all of the components are batch-processed, the entire pre-processed virtual prop is replaced in the display.
  • Step 303 may be implemented in the following manner: in response to a triggering operation on the processing control, the first component is processed, and the processed first component is displayed instead of the pre-processed first component. When the processed virtual prop is displayed in the first processing interface, in addition to the processed first component, components other than the first component may be hidden, partially displayed, or fully displayed, and the number of displayed components other than the first component may depend on the scaling ratio of the first processing interface (that is, the ratio between the size of the virtual prop and the size of the first processing interface): the larger the scaling ratio, the fewer components are displayed, making it easier to observe the details of a component; the smaller the scaling ratio, the more components are displayed, making it easier to observe the overall structure of the virtual prop.
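The zoom-dependent display rule above can be sketched as follows. The threshold values and component counts are purely illustrative assumptions; the embodiment only fixes the monotonic relationship (larger scaling ratio, fewer displayed components).

```python
def neighbors_to_display(total_components: int, zoom_ratio: float) -> int:
    """Return how many components besides the focused one to show.

    Hypothetical rule: the larger the zoom ratio (prop size relative
    to interface size), the fewer neighboring components are shown.
    """
    if zoom_ratio >= 2.0:            # close-up: only the focused component
        return 0
    if zoom_ratio >= 1.0:            # medium zoom: show some neighbors
        return total_components // 2
    return total_components - 1      # zoomed out: show everything else
```

At a zoom ratio of 2.5 only the first component is shown; at 0.5 the whole prop is visible.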
  • For example, when the terminal device receives the user's triggering operation on the processing control of the first component, the terminal device may display the processed first component instead of the pre-processed first component.
  • For example, when the terminal device receives the user's triggering operation on the processing control of the handguard (for example, a selection operation on the foregrip among multiple accessories that can be installed on the handguard), it can add the selected foregrip to the handguard of the virtual firearm to realize the processing of the handguard.
  • In step 304, in response to an interface jump triggering operation, displaying the first processing interface is switched to displaying a second processing interface different from the first processing interface.
  • The display manner of the second processing interface is similar to that of the first processing interface.
  • Processing controls can be displayed in the second processing interface; in some embodiments, the second component can also be displayed in the second processing interface, where the second component can be any component of the virtual prop to be modified other than the first component.
  • The processing controls in the second processing interface can be universal, that is, used to batch-process multiple components of the virtual prop including the second component; a processing control can also be dedicated to transforming the second component, that is, the processing control of the second component.
  • the first processing interface may also include a second component of the virtual prop, and the interface jump triggering operation may be a triggering operation for the second component.
  • The terminal device may implement step 304 in the following manner: in response to a triggering operation (such as a click operation or an operation of drawing a specific graphic) on the second component in the first processing interface, displaying the first processing interface is switched to displaying the second processing interface.
  • Figure 4B is a schematic diagram of an application scenario of the virtual prop processing method provided by the embodiment of the present application.
  • The rear grip processing interface 412 is displayed in the human-computer interaction interface.
  • the processing interface 412 also displays other components of the virtual firearm, including, for example, the magazine 414, the butt 415, and the sight 416.
  • In response to a triggering operation on the magazine 414 (that is, the second component), the rear grip processing interface 412 displayed in the human-computer interaction interface is directly switched to the magazine processing interface 417.
  • The magazine processing interface 417 displays the magazine 414 of the virtual firearm and a processing control 418 of the magazine 414 (for example, a new magazine used to expand the capacity of the magazine 414). In this way, by clicking on components of the firearm model, the user can quickly jump directly from the processing interface of one component to the processing interface of another component, which improves the processing efficiency of virtual props.
  • the first processing interface may also include browsing controls corresponding to at least one direction.
  • the interface jump triggering operation may be a triggering operation for the browsing control.
  • The terminal device may implement step 304 in the following manner: in response to a triggering operation on the browsing control for a first direction in the first processing interface, displaying the first processing interface is switched to displaying the second processing interface, where the second component is distributed, relative to the first component, in the direction opposite to the first direction, and is the component closest to the first component in that opposite direction.
  • Figure 4C is a schematic diagram of an application scenario of the virtual prop processing method provided by the embodiment of the present application.
  • In addition to displaying the muzzle 420 and the processing control 421 of the muzzle 420 (for example, a new muzzle used to replace the muzzle 420), the muzzle processing interface 419 also displays browsing controls 422 for the four directions up, down, left, and right.
  • When receiving the user's click operation on the left direction key of the browsing controls 422 (the handguard being the component of the virtual firearm located on the right side of the muzzle and closest to the muzzle), the terminal device directly switches the muzzle processing interface 419 displayed in the human-computer interaction interface to the handguard processing interface 409.
  • The handguard 410 and the processing control 411 of the handguard 410 (for example, left and right rails for installation on the handguard 410) are displayed in the handguard processing interface 409, so that by triggering the browsing controls the user can quickly jump directly from the processing interface of one component to the processing interface of another component, which improves the processing efficiency of virtual props.
  • In some embodiments, the interface jump triggering operation may also be a sliding operation.
  • The terminal device may implement step 304 in the following manner: in response to a sliding operation in the first processing interface whose sliding direction falls within the first direction interval of the first component (that is, a sub-interval of the 0-360 degree direction range, measured around the first component, in which the slide moves outward from the first component), displaying the first processing interface is switched to displaying the second processing interface.
  • For example, the direction perpendicular to the muzzle and pointing upward can be regarded as 0 degrees, so that the range from 0 degrees to 360 degrees is divided into different direction intervals.
  • the first direction interval can be 225 degrees to 315 degrees, that is, the interval centered directly to the left of the muzzle (i.e. 270 degrees).
  • The reverse interval of the first direction interval is the interval formed by the opposite directions of the two boundary directions of the first direction interval.
  • Here the reverse interval is 45 degrees to 135 degrees, that is, the interval centered directly to the right of the muzzle (i.e., 90 degrees). The second component of the virtual prop is distributed in this reverse interval, and the distance between the second component and the first component is proportional to the sliding distance of the sliding operation, that is, the greater the sliding distance, the greater the distance between the second component and the first component.
  • For example, taking the virtual prop as a virtual firearm, on the right side of the muzzle there are a handguard, a receiver, and a butt, distributed in order from near to far from the muzzle, so two different levels of distance thresholds can be preset, L1 (for example, 1 cm) and L2 (for example, 2 cm).
  • When the sliding distance of the sliding operation is less than or equal to L1 (for example, 0.7 cm), the terminal device directly switches the muzzle processing interface displayed in the human-computer interaction interface to the handguard processing interface; when the sliding distance is greater than L1 and less than or equal to L2 (for example, 1.4 cm), the terminal device directly switches the muzzle processing interface displayed in the human-computer interaction interface to the receiver processing interface; when the sliding distance is greater than L2 (for example, 2.3 cm), the terminal device directly switches the muzzle processing interface displayed in the human-computer interaction interface to the butt processing interface.
  • In this way, the user can flexibly jump from the muzzle processing interface to the processing interfaces of different components according to the sliding distance of the sliding operation: the user only needs to control the sliding distance to jump to the processing interface of the corresponding component, which greatly improves the processing efficiency of virtual firearms.
  • the interface jump triggering operation may be a sliding operation
  • The terminal device may implement step 304 in the following manner: in response to a sliding operation in the first processing interface (for example, the muzzle processing interface) whose sliding direction falls within the first direction interval of the first component (for example, 225 degrees to 315 degrees of the muzzle), displaying the first processing interface is directly switched to displaying the second processing interface, where the second component of the virtual prop is distributed in the reverse interval of the first direction interval (for example, 45 degrees to 135 degrees of the muzzle), and the second component is the component (for example, the handguard) closest to the first component in the reverse interval.
  • For example, in response to a sliding operation in the muzzle processing interface whose sliding direction falls within the first direction interval of the muzzle (for example, 225 degrees to 315 degrees of the muzzle, that is, the interval centered directly to the left of the muzzle (i.e., 270 degrees)), with the handguard distributed in the reverse interval of the first direction interval (for example, 45 degrees to 135 degrees of the muzzle, that is, the interval centered directly to the right of the muzzle (i.e., 90 degrees)), the terminal device can directly switch the muzzle processing interface displayed in the human-computer interaction interface to the handguard processing interface. In this way, through a sliding operation, the user can quickly jump directly from the processing interface of one component to the processing interface of another component, which improves the processing efficiency of virtual firearms and thereby improves the user's gaming experience.
  • In some embodiments, the processing interface corresponding to each component can be pre-configured with corresponding sliding parameters. The terminal device can then implement the above response to a sliding operation in the first processing interface, whose sliding direction falls within the first direction interval of the first component, switching from displaying the first processing interface to displaying the second processing interface, in the following manner: obtain the first sliding parameter configured for the first processing interface, where the first sliding parameter includes at least one direction interval of the first component, the at least one direction interval includes the first direction interval, and a component of the virtual prop is distributed in the reverse interval of each direction interval; in response to the sliding operation in the first processing interface, obtain the angle value of the sliding operation; in response to the angle value of the sliding operation falling within the first direction interval among the at least one direction interval, switch from displaying the first processing interface to displaying the second processing interface.
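A minimal sketch of checking the angle value against the configured direction intervals. The data layout (a mapping from target component to its interval) and the wrap-around handling are assumptions of this sketch.

```python
def find_jump_interval(angle, intervals):
    """Return the target component whose direction interval contains the
    sliding angle (degrees, 0-360, clockwise from straight up), or None.

    `intervals` maps a target component name to the (start, end) interval
    configured for the current interface; intervals may wrap past 360.
    """
    angle %= 360.0
    for target, (start, end) in intervals.items():
        if start <= end:
            if start <= angle <= end:
                return target
        elif angle >= start or angle <= end:  # wrap-around interval
            return target
    return None

# The muzzle interface from the example: one interval, 225-315 degrees,
# whose reverse interval holds the handguard.
MUZZLE_INTERVALS = {"handguard": (225.0, 315.0)}
```

A slide at 270 degrees triggers the jump to the handguard interface; a slide at 80 degrees matches no interval, so no jump occurs.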
  • Figure 5 is a schematic structural diagram of a virtual rifle provided by an embodiment of the present application.
  • The following components of a virtual rifle can be processed: muzzle, handguard, receiver, magazine, sight, rear grip, and butt.
  • the handguard is located on the right side of the muzzle
  • the receiver is located on the right side of the handguard
  • the sight is located on the upper right side of the handguard.
  • the magazine is located on the lower right side of the handguard
  • the rear grip is located under the receiver
  • the butt is located on the right side of the receiver.
  • The muzzle processing interface may be configured with only one direction interval of the muzzle.
  • The center point of the muzzle can be used as the starting point: set the angle corresponding to the direction perpendicular to the muzzle and pointing upward (here, upward is toward the top of the screen) to 0 degrees, and divide 0 degrees to 360 degrees clockwise into the different direction intervals of the muzzle.
  • For example, for a sliding operation received in the muzzle processing interface, when the sliding direction of the sliding operation falls within the first direction interval of the muzzle (for example, 225 degrees to 315 degrees of the muzzle, with the handguard distributed in the reverse interval of the first direction interval), the muzzle processing interface is switched to the handguard processing interface.
  • The handguard processing interface can be configured with four direction intervals: take the center point of the handguard as the starting point, set the direction perpendicular to the handguard and pointing upward to 0 degrees, and divide 0 degrees to 360 degrees clockwise into the different direction intervals of the handguard.
  • For example, for a sliding operation received in the handguard processing interface: when the sliding direction falls within the first direction interval of the handguard (for example, 45 degrees to 135 degrees of the handguard, with the muzzle distributed in the reverse interval of the first direction interval), the handguard processing interface is switched to the muzzle processing interface; when the sliding direction falls within the second direction interval of the handguard (for example, 180 degrees to 225 degrees of the handguard, with the sight distributed in the reverse interval of the second direction interval), the handguard processing interface is switched to the sight processing interface; when the sliding direction falls within the third direction interval of the handguard (for example, 225 degrees to 315 degrees of the handguard, with the receiver distributed in the reverse interval of the third direction interval), the handguard processing interface is switched to the receiver processing interface; when the sliding direction falls within the fourth direction interval of the handguard (for example, 315 degrees to 360 degrees of the handguard, with the magazine distributed in the reverse interval of the fourth direction interval), the handguard processing interface is switched to the magazine processing interface.
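The per-interface configuration just described (one interval for the muzzle, four for the handguard) could be tabulated as follows. The dictionary layout and the first-match tie-breaking at interval boundaries are assumptions of this sketch.

```python
# Hypothetical sliding parameters per processing interface: each
# direction interval (degrees, clockwise from straight up) is paired
# with the component distributed in its reverse interval.
SLIDING_PARAMS = {
    "muzzle": {(225.0, 315.0): "handguard"},
    "handguard": {
        (45.0, 135.0): "muzzle",
        (180.0, 225.0): "sight",
        (225.0, 315.0): "receiver",
        (315.0, 360.0): "magazine",
    },
}

def resolve_jump(interface, angle):
    """Return the interface to switch to for a slide at `angle`, or None
    if the angle matches no configured direction interval."""
    for (start, end), target in SLIDING_PARAMS.get(interface, {}).items():
        if start <= angle % 360.0 <= end:
            return target
    return None
```

For example, a 270-degree slide in the handguard interface resolves to the receiver interface, matching the third direction interval above.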
  • The terminal device can obtain the angle value of the sliding operation in the following manner: obtain the starting point (assumed to be point A(x1, y1)) and the end point (assumed to be point B(x2, y2)) of the sliding operation; based on the starting point and the end point, determine the sliding direction of the sliding operation, for example, the vector pointing from the starting point to the end point can be taken as the sliding direction; perform dot multiplication on the vector of the sliding direction and the vector of the reference direction, which is Base(0, 1).
  • Dot multiplication is an operation in which the corresponding components of two vectors are multiplied one by one and then summed; the result of the dot multiplication is a scalar, and the obtained dot multiplication result is used as the angle value of the sliding operation.
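One way to turn the dot product with the reference direction Base(0, 1) into a 0-360 degree clockwise angle is sketched below. The arccos step, the sign test on the x component, and the assumption that the y axis grows upward on screen are additions of this sketch, not part of the embodiment.

```python
import math

def sliding_angle(start, end):
    """Angle of a slide in degrees, 0-360 clockwise from straight up.

    The dot product of the normalised slide vector with the reference
    direction (0, 1) reduces to dy / length, which is cos(angle);
    the sign of dx then picks the clockwise side.
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0:
        raise ValueError("zero-length slide")
    angle = math.degrees(math.acos(dy / length))  # 0..180
    return angle if dx >= 0 else 360.0 - angle    # mirror for left slides
```

A slide straight up yields 0 degrees, to the right 90, straight down 180, and to the left 270, matching the clockwise convention used above.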
  • The terminal device can also perform the following processing: in response to the angle value of the sliding operation not falling within any of the at least one direction interval of the first component, control the virtual prop to rotate in the first processing interface, where the angle of rotation is a fixed value (for example, each time the user performs a sliding operation, the virtual prop rotates 30 degrees), or the angle of rotation is proportional to the sliding distance of the sliding operation.
  • For example, a proportionality coefficient can be configured in advance, and the product of the sliding distance and the proportionality coefficient is determined as the angle of rotation; in this way, the user can control the angle of rotation through the sliding distance of the sliding operation according to his or her own needs.
  • For example, the muzzle processing interface is configured with a corresponding first sliding parameter, where the first sliding parameter includes the first direction interval of the muzzle (such as 225 degrees to 315 degrees). When the sliding direction of the user's sliding operation does not fall within the first direction interval of the muzzle (for example, assuming the user slides the screen to the right and the angle value of the sliding operation is 80 degrees), it can be determined that the current sliding operation is not intended to trigger an interface jump, but simply to slide the screen back and forth to change the viewing angle and appreciate the appearance of the virtual rifle. Therefore, the virtual rifle can be controlled to rotate in the muzzle processing interface, and the angle of rotation of the virtual rifle can be proportional to the sliding distance of the sliding operation; for example, when the sliding distance is 1 cm, the corresponding rotation angle of the virtual rifle is 50 degrees, and when the sliding distance is 2 cm, the corresponding rotation angle is 100 degrees.
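The proportional-rotation example above (1 cm yields 50 degrees, 2 cm yields 100 degrees) corresponds to a pre-configured proportionality coefficient of 50 degrees per centimeter; a trivial sketch, with the coefficient value taken from the example:

```python
ROTATION_COEFF = 50.0  # degrees of rotation per cm of slide (from the example)

def view_rotation(sliding_distance_cm: float) -> float:
    """Rotation applied to the prop when a slide is a viewing gesture."""
    return ROTATION_COEFF * sliding_distance_cm
```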
  • The terminal device may also perform the following processing: detect the sliding operation based on the first sliding parameter to obtain a detection result; in response to the detection result indicating that the sliding operation is an interface jump triggering operation, proceed to the processing of obtaining the angle value of the sliding operation; in response to the detection result indicating that the sliding operation is a virtual prop viewing operation, control the virtual prop to rotate in the first processing interface, where the angle of rotation is a fixed value, or the angle of rotation is proportional to the sliding distance of the sliding operation.
  • The first sliding parameter may include at least one of the following parameters: a set sliding duration (for example, a minimum sliding duration or a maximum sliding duration), a set sliding distance (for example, a minimum sliding distance or a maximum sliding distance), a set pressure parameter (for example, a minimum pressure value or a maximum pressure value), and a set number of contacts (for example, a single contact may correspond to the interface jump triggering operation, or two contacts may correspond to the interface jump triggering operation).
  • The terminal device can implement the above detection of the sliding operation based on the first sliding parameter in the following manner: obtain the sliding duration of the sliding operation, and compare the sliding duration with the set sliding duration to obtain a comparison result; when the comparison result indicates that the sliding duration meets the duration condition, the sliding operation is determined to be an interface jump triggering operation; otherwise, the sliding operation is determined to be a virtual prop viewing operation.
  • For example, when the set sliding duration is a minimum sliding duration (for example, 1 second) and the sliding duration of the sliding operation is detected to be greater than or equal to the minimum sliding duration, it is determined that the duration condition is met; when the set sliding duration is a maximum sliding duration (for example, 2 seconds) and the sliding duration of the sliding operation is detected to be less than the maximum sliding duration, it is determined that the duration condition is met.
  • The terminal device can also implement the detection in the following manner: obtain the sliding distance of the sliding operation, and compare the sliding distance with the set sliding distance to obtain a comparison result; when the comparison result indicates that the sliding distance meets the distance condition, the sliding operation is determined to be an interface jump triggering operation; otherwise, the sliding operation is determined to be a virtual prop viewing operation.
  • For example, when the set sliding distance is a minimum sliding distance (for example, 1 cm) and the sliding distance of the sliding operation is detected to be greater than or equal to the minimum sliding distance, it is determined that the distance condition is met; when the set sliding distance is a maximum sliding distance (for example, 2 cm) and the sliding distance of the sliding operation is detected to be less than the maximum sliding distance, it is determined that the distance condition is met.
  • The terminal device can also implement the detection in the following manner: obtain the pressure parameter of the sliding operation, and compare the pressure parameter with the set pressure parameter to obtain a comparison result; when the comparison result indicates that the pressure parameter meets the pressure condition, the sliding operation is determined to be an interface jump triggering operation; otherwise, the sliding operation is determined to be a virtual prop viewing operation.
  • For example, when the set pressure parameter is a minimum pressure threshold and the pressure value of the sliding operation is detected to be greater than or equal to the minimum pressure threshold, it is determined that the pressure condition is met; when the set pressure parameter is a maximum pressure threshold and the pressure value of the sliding operation is detected to be less than or equal to the maximum pressure threshold, it is determined that the pressure condition is met.
  • The minimum pressure threshold and the maximum pressure threshold are configurable; different minimum or maximum pressure thresholds can be configured for different components of the virtual prop, which is not specifically limited in the embodiments of the present application.
  • The terminal device can also implement the detection in the following manner: obtain the number of contacts of the sliding operation, and compare the number of contacts with the set number of contacts to obtain a comparison result; when the comparison result indicates that the two are consistent, the sliding operation is determined to be an interface jump triggering operation; when the comparison result indicates that the two are inconsistent, the sliding operation is determined to be a virtual prop viewing operation.
  • For example, when the set number of contacts is a single contact, a single-contact sliding operation is determined to be an interface jump triggering operation, and a multi-contact sliding operation is determined to be a virtual prop viewing operation.
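The four detection criteria above (duration, distance, pressure, contact count) can be combined as in the following sketch. Treating them as a conjunction, and the specific threshold values, are assumptions of this sketch; the embodiment allows any single criterion to be used on its own.

```python
def classify_slide(duration_s, distance_cm, contacts,
                   min_duration_s=1.0, min_distance_cm=1.0,
                   jump_contacts=1):
    """Classify a slide as an interface-jump trigger or a prop-viewing
    gesture. Uses illustrative minimum-duration / minimum-distance /
    contact-count parameters; all threshold values are assumptions."""
    if (duration_s >= min_duration_s
            and distance_cm >= min_distance_cm
            and contacts == jump_contacts):
        return "interface_jump"
    return "prop_viewing"
```

A 1.5-second, 1.2 cm, single-contact slide would be treated as an interface jump; shortening the slide or adding a second contact demotes it to a viewing gesture.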
  • In some embodiments, the first processing interface and the second processing interface may be captured through a virtual camera, and each component of the virtual prop is configured with corresponding lens parameters of the virtual camera. Before switching from displaying the first processing interface to displaying the second processing interface, the terminal device can also perform the following processing: obtain the second lens parameters configured for the second component; adjust the posture of the virtual camera in the virtual scene based on the second lens parameters, and call the adjusted virtual camera to shoot the virtual prop; load the processing control of the second component into the captured picture to obtain the second processing interface.
  • The lens parameters may include at least one of the following parameters: the component corresponding to the lens (for example, when the lens parameters are those configured for the muzzle of the virtual rifle, the component corresponding to the lens is the muzzle), the rotation angle of the lens, the offset relative to the starting point of the component (for example, when the lens parameters are those configured for the muzzle of the virtual rifle, the offset refers to the offset value relative to the starting point of the muzzle), the distance between the lens and the focus, and the angle of view of the lens.
  • the terminal device can also perform the following processing: obtain the first lens parameter configured for the first component; perform interpolation processing based on the first lens parameter and the second lens parameter to obtain at least one intermediate lens parameters, where each intermediate lens parameter is used to adjust the posture of the virtual camera, and adjust Use the adjusted virtual camera to shoot the virtual props to obtain a corresponding intermediate interface; during the process of switching from displaying the first processing interface to displaying the second processing interface, insert at least one intermediate interface.
  • the number of intermediate interfaces inserted during the switching process may be fixed, or may be proportional to the frame rate at which the virtual scene is displayed; that is, the higher the frame rate, the greater the number of intermediate interfaces inserted. This enables smooth switching from the first processing interface to the second processing interface, thereby improving the user's visual experience.
  • the terminal device may implement the above interpolation based on the first lens parameter and the second lens parameter in the following manner: multiply the second lens parameter by t to obtain a first multiplication result, where t is the time elapsed after starting the switch, the value range of t satisfies 0 ≤ t ≤ T, T is the total time for switching from the first processing interface to the second processing interface, and T is a real number greater than 0; multiply the subtraction result of T and t by the first lens parameter to obtain a second multiplication result; determine the summation result of the first multiplication result and the second multiplication result as an intermediate lens parameter.
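The three-step computation above can be sketched as follows. Division of the weighted sum by the total time T is an assumption (it is implied by the requirement that the intermediate value equals the first lens parameter at t = 0 and the second at t = T), and the function name is illustrative:

```python
def intermediate_lens_param(a: float, b: float, t: float, total: float) -> float:
    """Interpolate one lens parameter: the second parameter (b) is weighted
    by the elapsed time t, the first (a) by the remaining time (total - t),
    and the sum is normalized by the total switch time so the result moves
    linearly from a (at t = 0) to b (at t = total)."""
    if not (0.0 <= t <= total) or total <= 0.0:
        raise ValueError("t must satisfy 0 <= t <= T with T > 0")
    return (b * t + a * (total - t)) / total
```

For example, halfway through a 2-second switch between a field-of-view of 10 and one of 20, the intermediate value is 15.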
  • FIG. 6 is a schematic flowchart of a method for processing virtual props provided by an embodiment of the present application. As shown in FIG. 6, step 305 is executed after step 304 shown in FIG. 3, and will be described in conjunction with the steps shown in FIG. 6.
  • Step 304 was explained above in conjunction with different types of interface jump triggering operations triggered based on the first processing interface. It should be noted that although the interface jump triggering operations described above are all triggered based on the first processing interface, this should not be regarded as a limitation that the interface jump triggering operation can only be implemented based on the first processing interface. Other forms of the interface jump triggering operation are described below.
  • the interface jump triggering operation may be a voice instruction, and the voice instruction may indicate the direction of switching from the first component.
  • the voice instruction may be "switch to the left"; the display then switches to the second processing interface including the second component, where the second component is the component located on the left side of the first component and closest to the first component.
  • the voice command can further indicate the number of jumps.
  • the voice instruction may be "switch two components to the left"; the display then switches to the second processing interface including the second component, where the second component is located on the left side of the first component and is separated from the first component by two components.
  • the interface jump triggering operation may be a somatosensory operation
  • the somatosensory operation may be an operation of shaking the terminal device in a certain direction.
  • the somatosensory operation may be shaking to the left; the display then switches to the second processing interface including the second component, where the second component is the component located on the left side of the first component and closest to the first component.
  • the somatosensory operation can further indicate the number of jumps.
  • the amplitude of the leftward shake may be positively correlated with the number of components switched to the left: the greater the amplitude, the greater the number of components switched to the left.
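The positive correlation between shake amplitude and the number of switched components might be realized with a simple threshold table; the threshold values and function name below are purely illustrative assumptions:

```python
def components_to_switch(shake_amplitude: float) -> int:
    """Map the amplitude of a leftward shake to the number of components
    switched to the left; larger amplitudes cross more thresholds and
    therefore switch more components."""
    thresholds = [0.2, 0.5, 0.8]  # hypothetical accelerometer magnitudes
    return sum(1 for th in thresholds if shake_amplitude >= th)
```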
  • Step 305: in response to the third component of the virtual prop satisfying the processing condition, switch from displaying the second processing interface to displaying a third processing interface different from the second processing interface.
  • the display mode of the third processing interface is similar to the display mode of the first processing interface.
  • the third processing interface at least includes processing controls; in other embodiments, the third processing interface may also include virtual props.
  • the third component may be any component to be transformed in the virtual prop except the first component and the second component.
  • the types of processing controls in the third processing interface may include color controls for changing colors and transformation controls for transformation.
  • the processing control can be dedicated to processing the third component, that is, the processing control of the third component; the processing control can also be universal, that is, used to batch process multiple components in the virtual props including the third component.
  • components in the virtual prop other than the third component may not be displayed, or may be partially or completely displayed.
  • the number of displayed components other than the third component may depend on the scaling ratio of the third processing interface (i.e., the ratio between the size of the virtual prop and the size of the third processing interface).
  • the larger the scaling ratio, the smaller the number of components displayed.
  • the smaller the scaling ratio, the greater the number of components displayed, in order to facilitate observation of the overall structure of the virtual prop.
  • the interface jump operation can also be implemented automatically.
  • when the terminal device detects that the third component of the virtual prop meets the processing condition, it can automatically jump from the second processing interface to the third processing interface, where the processing condition may include at least one of the following: the degree of wear of the third component is greater than or equal to a wear threshold (for example, 30%); or a new accessory usable by the third component is obtained.
  • for example, as the user uses the virtual prop for a longer time, the third component will slowly wear out; the third component will also be damaged when attacked by other players. When the degree of wear of the third component is greater than the wear threshold, it affects the normal use of the virtual prop.
  • taking the butt of a virtual gun as the third component as an example, as the user uses the virtual gun in the game for a longer time, the butt will slowly accumulate virtual wear, such as breakage or deformation, causing the recoil to become abnormal and affecting the user's shooting experience.
  • when the terminal device detects that the degree of wear of the butt of the virtual rifle is greater than the wear threshold (for example, it has affected the user's use of the virtual rifle in the game), it can automatically jump from the second processing interface (such as the processing interface of the handguard) to the processing interface of the butt, thereby making it convenient for the user to process the butt. In this way, the processing efficiency of the virtual prop is improved, and the user's gaming experience is also improved.
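The automatic-jump condition described above amounts to a simple predicate; a minimal sketch, assuming the 30% threshold given as an example in the text and with all names illustrative:

```python
WEAR_THRESHOLD = 0.30  # e.g. 30%, the example threshold from the text

def should_auto_jump(wear_degree: float, new_accessory_available: bool) -> bool:
    """The processing condition is met when the component's wear reaches the
    threshold, or when a new accessory usable by the component is obtained."""
    return wear_degree >= WEAR_THRESHOLD or new_accessory_available
```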
  • the virtual prop processing method provided by the embodiments of the present application can, upon receiving an interface jump triggering operation in the first processing interface corresponding to the first component, jump directly to the second processing interface corresponding to the second component, without first returning to the previous level to select the second component. In this way, the processing efficiency of virtual props is greatly improved, and the user's gaming experience is improved.
  • Embodiments of the present application provide a method for processing virtual props. According to the relative positions of the components on the virtual firearm, sliding operations in different directions, or directly clicking the corresponding component, allow a quick jump directly from the modification interface of one component to the modification interface of another component (for example, jumping directly from the modification interface of the muzzle to the modification interface of the handguard), without repeatedly returning to the previous level (such as the whole-gun interface) to select a new component to modify, which greatly improves the modification efficiency of virtual firearms.
  • the user can quickly jump to the modification interface of the corresponding component by clicking on the component in the virtual firearm that needs to be modified.
  • as shown in FIG. 4B, in addition to the rear grip 413, other components of the virtual firearm are also displayed in the rear grip modification interface 412, such as the magazine 414, the butt 415 and the sight 416.
  • Each firearm component has its own envelope box.
  • the modification interface 412 of the rear grip will jump directly to the modification interface 417 of the magazine.
  • the user can also jump between the modification interfaces of different components by sliding the screen.
  • the following parts of a virtual rifle can be modified: muzzle, handguard, receiver, sight, magazine, rear grip, butt.
  • Position distribution: for example, the handguard is on the right side of the muzzle, the receiver is on the right side of the handguard, the sight is on the upper right of the handguard, and the magazine is on the lower right of the handguard. Therefore, as shown in Figure 7A, assuming the human-computer interaction interface displays the muzzle modification interface 701, the user only needs to slide a finger to the left to jump from the muzzle modification interface 701 to the handguard modification interface.
  • in the handguard modification interface 702, when a sliding operation toward the upper left corner of the screen is received, the handguard modification interface 702 jumps to the magazine modification interface 703. Similarly, to jump from the handguard modification interface to the sight modification interface, slide the screen toward the lower left corner.
  • a direction interval can be set to determine whether a specific sliding direction can trigger an interface jump.
  • the set direction interval may be 225 degrees to 315 degrees; as long as the client detects a sliding operation within this direction interval, the interface jump can be triggered.
  • a duration limit may also be set for the sliding operation. For example, when the duration of the sliding operation exceeds a duration threshold (for example, 2 seconds), the interface jump is not triggered, and the operation is regarded as an operation to appreciate the appearance of the firearm.
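Combining the direction interval and the duration limit described above, the decision of whether a slide triggers an interface jump can be sketched as follows. This is a minimal illustration: the interval 225-315 degrees and the 2-second threshold are the example values from the text, and the function name is hypothetical:

```python
def triggers_jump(angle_deg: float, duration_s: float,
                  min_deg: float = 225.0, max_deg: float = 315.0,
                  max_duration_s: float = 2.0) -> bool:
    """A slide triggers an interface jump only when its direction falls
    inside the configured direction interval AND it finishes within the
    duration limit; otherwise it is treated as a firearm-viewing operation."""
    return min_deg <= angle_deg <= max_deg and duration_s <= max_duration_s
```

A slide at 270 degrees finished in one second jumps; the same slide held for three seconds is treated as appreciating the firearm instead.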
  • for each component's modification interface, a set of parameters can be configured separately to determine the jump sequence of the sliding directions and the angle interval parameters.
  • FIG. 7B is a schematic diagram of an application scenario of the method for modifying virtual props provided by an embodiment of the present application.
  • in the handguard modification interface 702, in addition to the wooden handguard 704 for replacing the original handguard of the firearm, sub-accessories that can be added to the handguard are also displayed, such as the left rail 705, the right rail 706, and the front grip 707. In this way, the handguard can be modified together with its sub-accessories in one modification interface, improving the efficiency of virtual firearm modification.
  • FIG. 8 is a schematic flowchart of a method for processing virtual props provided by an embodiment of the present application, which will be described in conjunction with the steps shown in FIG. 8 .
  • step 801 the client receives the player's screen sliding operation.
  • the client receives the sliding screen operation triggered by the player in the modification interface of the gun muzzle.
  • step 802 the client calculates the sliding angle of the sliding operation.
  • as shown in Figure 5, the client can take the center point of the muzzle as the starting point, set the direction perpendicular to the muzzle and pointing upward as 0 degrees, and divide 0 degrees to 360 degrees into eight quadrants as shown in Figure 9; the sixth and seventh quadrants (i.e., 225 degrees to 315 degrees) are then determined as the first direction interval of the muzzle. That is, if the angle value of the player's sliding direction P falls in the sixth or seventh quadrant (i.e., the first direction interval), a jump from the muzzle modification interface to the handguard modification interface is triggered.
  • the client can obtain the starting point A (x1, y1) and the end point B (x2, y2) of the player's sliding operation, calculate the sliding direction P from point A to point B (P is the vector from A to B), and then compute the dot product of the sliding direction P and the reference-direction vector Base(0, 1).
  • the angle value of the player's sliding direction P is derived from the dot product result, and the quadrant in which the sliding direction P falls can then be determined from this angle value.
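The angle computation above can be sketched as follows. Two assumptions are made explicit: the y-axis is taken as pointing up with the angle measured clockwise from Base(0, 1), and `atan2` is used in place of a bare dot product so that the full 0-360 degree range remains distinguishable (a dot product alone cannot tell left from right). All names are illustrative:

```python
import math

def slide_angle(start, end):
    """Angle (degrees, in [0, 360)) of the slide vector P = B - A, measured
    clockwise from the reference direction Base = (0, 1)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    # atan2(perpendicular component, parallel component) relative to (0, 1)
    return math.degrees(math.atan2(dx, dy)) % 360.0

def quadrant(angle_deg: float) -> int:
    """Index (1-8) of the 45-degree sector the angle falls in, matching the
    eight quadrants of Figure 9."""
    return int(angle_deg // 45.0) + 1
```

A straight leftward slide yields 270 degrees, which falls in the seventh sector; 230 degrees falls in the sixth, so both trigger the muzzle-to-handguard jump.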
  • step 803 the client reads the sliding screen parameters.
  • a set of sliding parameters (corresponding to the sliding parameters described above) can be pre-configured to control the player's slide-triggered jumps, where the sliding parameters may include: Min(float) and Max(float), representing the direction interval of the slide required to modify the component.
  • for example, Min(float) and Max(float) are 225 degrees and 315 degrees respectively (corresponding to the sixth and seventh quadrants in Figure 9; that is, if the angle value of the player's sliding direction P falls in the sixth or seventh quadrant, a jump from the muzzle modification interface to the handguard modification interface is triggered); MinDist(float) represents the minimum sliding distance required; MaxDuration(float) represents the maximum sliding time that triggers the interface jump; and PointType(enum) represents the modification component corresponding to the lens.
  • the sliding screen parameters corresponding to the modification interface of each component can be pre-configured.
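The per-component sliding-parameter configuration can be grouped into a simple record; this is a minimal illustrative sketch (field names are translated from the Min/Max/MinDist/MaxDuration/PointType list above, and the concrete values are the examples from the text, not an actual configuration):

```python
from dataclasses import dataclass
from enum import Enum

class PointType(Enum):
    MUZZLE = "muzzle"
    HANDGUARD = "handguard"

@dataclass
class SlideParams:
    min_deg: float         # Min(float): lower bound of the direction interval
    max_deg: float         # Max(float): upper bound of the direction interval
    min_dist: float        # MinDist(float): minimum sliding distance required
    max_duration: float    # MaxDuration(float): maximum sliding time for a jump
    point_type: PointType  # component whose interface the slide jumps to

# illustrative configuration for the muzzle modification interface:
MUZZLE_SLIDE_PARAMS = [
    SlideParams(225.0, 315.0, 50.0, 2.0, PointType.HANDGUARD),
]
```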
  • Step 804: the client determines the type of the sliding operation based on the sliding parameters. When the sliding operation is determined to be a virtual firearm viewing operation, step 805 is executed; when it is determined to be an interface jump triggering operation, steps 806 and 807 are executed.
  • step 805 the client rotates the virtual firearm in the modification interface of the current component.
  • when the client determines, based on the sliding parameters, that the type of the sliding operation is a virtual firearm viewing operation, the client can rotate the virtual firearm in the muzzle modification interface and display the rotated virtual firearm, to meet the player's need to appreciate the appearance of the firearm.
  • step 806 the client obtains target lens parameters.
  • a set of lens parameters can be pre-configured to describe the lens corresponding to each component, where the lens parameters may include: PointType(enum), the modification component corresponding to the lens; Rotation(Vector3), the rotation angle of the lens; Offset(Vector2), the offset of the lens relative to the initial point; CameraDis(float), the distance between the lens and the focus; FOV(float), the angle of view of the lens; and LerpSpeed(float), the speed of the interpolated transition between this lens and the next.
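The lens-parameter set above can likewise be grouped into a simple record; a minimal illustrative sketch (Python field names stand in for the PointType/Rotation/Offset/CameraDis/FOV/LerpSpeed names and are not taken from any actual implementation):

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LensParams:
    point_type: str                        # PointType: component the lens corresponds to
    rotation: Tuple[float, float, float]   # Rotation(Vector3): rotation angle of the lens
    offset: Tuple[float, float]            # Offset(Vector2): offset from the initial point
    camera_dis: float                      # CameraDis(float): distance from lens to focus
    fov: float                             # FOV(float): angle of view of the lens
    lerp_speed: float                      # LerpSpeed(float): transition speed to next lens
```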
  • the client first determines whether the sliding distance of the current sliding operation is greater than the minimum sliding distance (MinDist). If it is less than MinDist, the sliding operation is determined to be a virtual firearm viewing operation, and the client can rotate the virtual firearm in the muzzle modification interface and display the rotated virtual firearm, to meet the player's need to appreciate the appearance of the firearm. If it is greater than MinDist, the client continues to determine whether the sliding duration of the sliding operation is less than the maximum sliding time (MaxDuration).
  • if the sliding duration is greater than MaxDuration, the sliding operation is determined to be a virtual firearm viewing operation, and the client rotates the virtual firearm in the muzzle modification interface; if it is less than MaxDuration, the client obtains the corresponding target lens parameters according to the quadrant in which the sliding direction P falls. For example, if the angle value of the player's sliding direction P falls in the sixth or seventh quadrant, the lens parameters configured for the handguard (i.e., the target lens parameters) are obtained.
  • step 807 the client performs interpolation processing based on the initial lens parameters and the target lens parameters, and sequentially displays the pictures captured by each lens.
  • the client can perform linear interpolation on each parameter of the initial lens parameters (that is, the lens parameters corresponding to the current modification interface) and the target lens parameters, where:
  • DeltaTime represents the time elapsed since the switch started
  • A represents the lens parameters corresponding to the current modification interface
  • B represents the target lens parameters
  • P represents the intermediate lens parameters obtained through interpolation processing.
  • the posture of the virtual camera can be adjusted based on the initial lens parameters, the intermediate lens parameters, and the target lens parameters in sequence, and the adjusted virtual camera called to shoot the virtual firearm, obtaining the pictures captured by the virtual camera in its different postures; finally, the multiple captured pictures are displayed in sequence, achieving a smooth switch from the muzzle modification interface to the handguard modification interface.
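Put together, the per-frame transition might look like the following sketch. The dict-based parameter representation, the way LerpSpeed scales the interpolation factor, and all names are illustrative assumptions rather than the actual implementation:

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation from a to b, with t clamped to [0, 1]."""
    return a + (b - a) * min(max(t, 0.0), 1.0)

def transition_frames(a: dict, b: dict, lerp_speed: float,
                      frame_dt: float, total: float):
    """Yield an intermediate lens-parameter dict for each rendered frame of
    the switch; the interpolation factor grows with elapsed time scaled by
    LerpSpeed, and the final frame exactly matches the target parameters."""
    elapsed = 0.0
    while elapsed < total:
        t = min(elapsed * lerp_speed / total, 1.0)
        yield {k: lerp(a[k], b[k], t) for k in a}
        elapsed += frame_dt
    yield dict(b)  # land exactly on the target lens parameters
```

Each yielded dict would be used to pose the virtual camera and capture one picture, giving the sequence of intermediate interfaces displayed during the switch.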
  • the method for modifying virtual props provided by the embodiments of the present application quickly jumps, according to the relative positions of the components on the virtual firearm and by sliding the screen in different directions or directly clicking the model, from the modification interface of one component directly to the modification interface of another component, without repeatedly returning to the previous level (such as the whole-gun interface) to select new components to modify. This improves the efficiency of virtual firearm modification, thereby improving the user's gaming experience.
  • the software modules of the virtual prop processing device 555 stored in the memory 550 may include: a display module 5551 and a switching module 5552.
  • the display module 5551 is configured to display a processing portal for a virtual prop in the virtual scene; the display module 5551 is also configured to display a first processing interface in response to a triggering operation for the processing portal, where the first processing interface at least includes a processing control; the display module 5551 is also configured to display the processed virtual prop, instead of the pre-processing virtual prop, in response to a triggering operation for the processing control; the switching module 5552 is configured to switch from displaying the first processing interface to displaying a second processing interface different from the first processing interface in response to an interface jump triggering operation.
  • the first processing interface further includes a first component of the virtual prop; the display module 5551 is further configured to display the processed first component, instead of the pre-processing first component, in response to a triggering operation for the processing control, where components of the processed virtual prop other than the first component are not displayed or are at least partially displayed.
  • the first processing interface also includes a second component of the virtual prop, and the interface jump triggering operation is a triggering operation for the second component; the switching module 5552 is also configured to switch from displaying the first processing interface to displaying the second processing interface in response to the triggering operation for the second component in the first processing interface.
  • the first processing interface also includes a browsing control corresponding to at least one direction, and the interface jump triggering operation is a triggering operation for the browsing control; the switching module 5552 is also configured to switch from displaying the first processing interface to displaying the second processing interface in response to a triggering operation for the browsing control corresponding to a first direction in the first processing interface, where the distribution direction of the second component relative to the first component is the opposite of the first direction, and the second component is the component closest to the first component in that opposite direction.
  • the interface jump triggering operation is a sliding operation; the switching module 5552 is also configured to switch from displaying the first processing interface to displaying the second processing interface in response to a sliding operation in the first processing interface whose sliding direction is located in the first direction interval of the first component, where the second component is distributed in the reverse interval of the first direction interval, and the distance between the second component and the first component is proportional to the sliding distance of the sliding operation.
  • the interface jump triggering operation is a sliding operation; the switching module 5552 is also configured to switch from displaying the first processing interface to displaying the second processing interface in response to a sliding operation in the first processing interface whose sliding direction is located in the first direction interval of the first component, where the second component is distributed in the reverse interval of the first direction interval, and the second component is the component closest to the first component in the reverse interval.
  • the virtual prop processing device 555 also includes an acquisition module 5553, configured to acquire a first sliding parameter configured for the first processing interface, where the first sliding parameter includes at least one direction interval of the first component, the at least one direction interval includes the first direction interval, and a component of the virtual prop is distributed in the reverse interval of each direction interval; and configured to obtain the angle value of the sliding operation in response to the sliding operation in the first processing interface;
  • the switching module 5552 is further configured to switch from displaying the first processing interface to displaying the second processing interface in response to the angle value of the sliding operation being located in a first direction interval in at least one direction interval.
  • the acquisition module 5553 is also configured to acquire the starting point and the end point of the sliding operation; the virtual prop processing device 555 also includes a determination module 5554 and a dot product module 5555, where the determination module 5554 is configured to determine the sliding direction of the sliding operation based on the starting point and the end point, and the dot product module 5555 is configured to perform dot product processing on the sliding direction and the reference direction to obtain the angle value of the sliding operation.
  • the virtual prop processing device 555 also includes a control module 5556, configured to control the virtual prop to rotate in the first processing interface in response to the angle value of the sliding operation not being located in any of the at least one direction interval, where the angle of rotation is a fixed value, or the angle of rotation is proportional to the sliding distance of the sliding operation.
  • the virtual prop processing device 555 also includes a detection module 5557 and a transfer module 5558, where the detection module 5557 is configured to detect the sliding operation based on the first sliding parameter, before the acquisition module 5553 acquires the angle value of the sliding operation, to obtain a detection result; the transfer module 5558 is configured to transfer to the processing of obtaining the angle value of the sliding operation in response to the detection result indicating that the sliding operation is an interface jump triggering operation; the control module 5556 is also configured to control the virtual prop to rotate in the first processing interface in response to the detection result indicating that the sliding operation is a virtual prop viewing operation, where the angle of rotation is a fixed value, or the angle of rotation is proportional to the sliding distance of the sliding operation.
  • the first sliding parameter also includes at least one of the following parameters: a set sliding duration, a set sliding distance, a set pressure parameter, and a set number of contact points. The detection module 5557 is also configured to perform at least one of the following: obtain the sliding duration of the sliding operation and compare it with the set sliding duration, where when the comparison result indicates that the sliding duration satisfies the duration condition, the sliding operation is determined to be an interface jump triggering operation, and when it does not, the sliding operation is determined to be a virtual prop viewing operation; obtain the sliding distance of the sliding operation and compare it with the set sliding distance, where when the comparison result indicates that the sliding distance satisfies the distance condition, the sliding operation is determined to be an interface jump triggering operation, and when it does not, the sliding operation is determined to be a virtual prop viewing operation; obtain the pressure parameter of the sliding operation and compare it with the set pressure parameter, where when the comparison result indicates that the pressure parameter satisfies the pressure condition, the sliding operation is determined to be an interface jump triggering operation, and when it does not, the sliding operation is determined to be a virtual prop viewing operation; obtain the number of contact points of the sliding operation and compare it with the set number of contact points, where when the comparison result indicates that the number of contact points satisfies the contact condition, the sliding operation is determined to be an interface jump triggering operation, and when it does not, the sliding operation is determined to be a virtual prop viewing operation.
  • the first processing interface and the second processing interface are captured by a virtual camera, and each component of the virtual prop is configured with lens parameters corresponding to the virtual camera;
  • the acquisition module 5553 is also configured to obtain the second lens parameters configured for the second component before the switching module 5552 switches from displaying the first processing interface to displaying the second processing interface;
  • the virtual prop processing device 555 also includes an adjustment module 5559, a shooting module 55510, and a loading module 55511, where the adjustment module 5559 is configured to adjust the posture of the virtual camera in the virtual scene based on the second lens parameters;
  • the shooting module 55510 is configured to call the adjusted virtual camera to shoot the virtual props;
  • the loading module 55511 is configured to load the processing control of the second component in the captured picture to obtain the second processing interface.
  • the lens parameters include at least one of the following parameters: the component corresponding to the lens, the rotation angle of the lens, the offset of the lens relative to the starting point of the component, the distance between the lens and the focus, and the angle of view of the lens.
  • the acquisition module 5553 is also configured to acquire the first lens parameters configured for the first component;
  • the virtual prop processing device 555 also includes an interpolation module 55512 and an insertion module 55513, where the interpolation module 55512 is configured to perform interpolation based on the first lens parameter and the second lens parameter to obtain at least one intermediate lens parameter, where each intermediate lens parameter is used to adjust the posture of the virtual camera and call the adjusted virtual camera to shoot the virtual prop, to obtain a corresponding intermediate interface;
  • the insertion module 55513 is configured to insert at least one intermediate interface during the process in which the switching module 5552 switches from displaying the first processing interface to displaying the second processing interface.
  • the interpolation module 55512 is also configured to multiply the second lens parameter by t to obtain a first multiplication result, where t is the time elapsed after starting the switch, the value range of t satisfies 0 ≤ t ≤ T, T is the total time for switching from the first processing interface to the second processing interface, and T is a real number greater than 0; multiply the subtraction result of T and t by the first lens parameter to obtain a second multiplication result; and determine the summation result of the first multiplication result and the second multiplication result as an intermediate lens parameter.
  • the display module 5551 is also configured to display a virtual prop viewing interface before displaying the first processing interface, where the virtual prop viewing interface includes multiple components of the virtual prop; the transfer module 5558 is also configured to transfer to the processing of displaying the first processing interface in response to a selection operation for the first component in the virtual prop viewing interface.
  • the switching module 5552 is further configured to switch from displaying the second processing interface to displaying a third processing interface in response to the third component of the virtual prop satisfying the processing condition, where the third processing interface includes the third component and a processing control for the third component.
  • the processing conditions include at least one of the following: the wear degree of the third component is greater than or equal to the wear degree threshold; and new accessories that can be used by the third component are obtained.
  • Embodiments of the present application provide a computer program product.
  • the computer program product includes a computer program or computer-executable instructions.
  • the computer program or computer-executable instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer-executable instructions from the computer-readable storage medium and executes them, so that the computer device performs the virtual prop processing method described above in the embodiments of the present application.
  • Embodiments of the present application provide a computer-readable storage medium storing computer-executable instructions.
  • When executed by a processor, the computer-executable instructions cause the processor to perform the virtual prop processing method provided by the embodiments of the present application, for example, the method shown in FIG. 3 or FIG. 6.
  • the computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; it may also be any device that includes one of, or any combination of, the above memories.
  • executable instructions may take the form of a program, software, a software module, a script, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • executable instructions may be deployed to execute on one electronic device, on multiple electronic devices located at one site, or on multiple electronic devices distributed across multiple sites and interconnected by a communications network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

提供了一种虚拟道具的处理方法、装置、电子设备及存储介质;方法包括:显示针对虚拟场景中的虚拟道具的处理入口;响应于针对处理入口的触发操作,显示第一处理界面,其中,第一处理界面至少包括处理控件;响应于针对处理控件的触发操作,显示处理后的虚拟道具,以替代显示处理前的虚拟道具;响应于界面跳转触发操作,从显示第一处理界面切换到显示不同于第一处理界面的第二处理界面。

Description

虚拟道具的处理方法、装置、电子设备、存储介质及程序产品
相关申请的交叉引用
本申请基于申请号为2022109711982、申请日为2022年8月12日的中国专利申请提出,并要求该中国专利申请的优先权,该中国专利申请的全部内容在此引入本申请作为参考。
技术领域
本申请涉及计算机人机交互技术领域,尤其涉及一种虚拟道具的处理方法、装置、电子设备、存储介质及程序产品。
背景技术
基于图形处理硬件的虚拟场景的人机交互技术,能够根据实际应用需求实现受控于用户或人工智能的虚拟对象之间的多样化的交互,具有广泛的实用价值。例如在游戏等的虚拟场景中,能够模拟虚拟对象之间的真实的对战过程。
以射击游戏为例,在大部分射击游戏中,都有枪械处理系统,以处理系统为例,由于枪械有多个可以处理的部件,当玩家对一把枪进行多个部件的处理时,需要反复执行以下处理:在整枪界面中选择需要处理的部件(例如枪口)进入枪口的处理界面,在枪口的处理界面对枪口处理完成后,返回整枪界面,在整枪界面中重新选择需要处理的部件(例如护木)进入护木的处理界面,在护木的处理界面对护木处理完成后,返回整枪界面,并再次选择需要处理的其他部件。
可以看出,相关技术在对枪械进行处理的过程中,需要频繁地在整枪界面和对应部件的处理界面来回跳转,导致虚拟道具的处理效率较低。
发明内容
本申请实施例提供一种虚拟道具的处理方法、装置、电子设备、计算机可读存储介质及计算机程序产品,能够提高虚拟道具的处理效率。
本申请实施例的技术方案是这样实现的:
本申请实施例提供一种虚拟道具的处理方法,由电子设备执行,包括:
显示针对虚拟场景中的虚拟道具的处理入口;
响应于针对所述处理入口的触发操作,显示第一处理界面,其中,所述第一处理界面至少包括处理控件;
响应于针对所述处理控件的触发操作,显示处理后的所述虚拟道具,以替代显示处理前的所述虚拟道具;
响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面。
本申请实施例提供一种虚拟道具的处理装置,包括:
显示模块,配置为显示针对虚拟场景中的虚拟道具的处理入口;
所述显示模块,还配置为响应于针对所述处理入口的触发操作,显示第一处理界面, 其中,所述第一处理界面至少包括处理控件;
所述显示模块,还配置为响应于针对所述处理控件的触发操作,显示处理后的所述虚拟道具,以替代显示处理前的所述虚拟道具;
切换模块,配置为响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面。
本申请实施例提供一种电子设备,包括:
存储器,用于存储可执行指令;
处理器,用于执行所述存储器中存储的可执行指令时,实现本申请实施例提供的虚拟道具的处理方法。
本申请实施例提供一种计算机可读存储介质,存储有计算机可执行指令,用于被处理器执行时,实现本申请实施例提供的虚拟道具的处理方法。
本申请实施例提供一种计算机程序产品,包括计算机程序或计算机可执行指令,用于被处理器执行时,实现本申请实施例提供的虚拟道具的处理方法。
本申请实施例具有以下有益效果:
在第一部件对应的第一处理界面中接收到界面跳转触发操作时,可以从第一处理界面跳转至第二部件对应的第二处理界面,与先返回到上一层级中去选择第二部件并进入相应的处理界面相比,在一个部件处理完成后能够快速进行另一个部件的处理,节约了操作时间,进而提高了虚拟道具的处理效率。
附图说明
图1A是本申请实施例提供的虚拟道具的处理方法的应用模式示意图;
图1B是本申请实施例提供的虚拟道具的处理方法的应用模式示意图;
图2是本申请实施例提供的电子设备500的结构示意图;
图3是本申请实施例提供的虚拟道具的处理方法的流程示意图;
图4A至图4C是本申请实施例提供的虚拟道具的处理方法的应用场景示意图;
图5是本申请实施例提供的虚拟步枪的结构示意图;
图6是本申请实施例提供的虚拟道具的处理方法的流程示意图;
图7A和图7B是本申请实施例提供的虚拟道具的处理方法的应用场景示意图;
图8是本申请实施例提供的虚拟道具的处理方法的流程示意图;
图9是本申请实施例提供的象限示意图。
具体实施方式
为了使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请作进一步地详细描述,所描述的实施例不应视为对本申请的限制,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其它实施例,都属于本申请保护的范围。
在以下的描述中,涉及到“一些实施例”,其描述了所有可能实施例的子集,但是可以理解,“一些实施例”可以是所有可能实施例的相同子集或不同子集,并且可以在不冲突的情况下相互结合。
可以理解的是,在本申请实施例中,涉及到用户信息等相关的数据(例如用户控制的虚拟对象所拥有的虚拟道具的相关数据),当本申请实施例运用到具体产品或技术中时,需要获得用户许可或者同意,且相关数据的收集、使用和处理需要遵守相关国家和地区的相关法律法规和标准。
在以下的描述中,所涉及的术语“第一\第二\...”仅仅是区别类似的对象,不代表针对对象的特定排序,可以理解地,“第一\第二\...”在允许的情况下可以互换特定的顺序或先后次序,以使这里描述的本申请实施例能够以除了在这里图示或描述的以外的顺序实施。
除非另有定义,本文所使用的所有的技术和科学术语与属于本申请的技术领域的技术人员通常理解的含义相同。本文中所使用的术语只是为了描述本申请实施例的目的,不是旨在限制本申请。
对本申请实施例进行进一步详细说明之前,对本申请实施例中涉及的名词和术语进行说明,本申请实施例中涉及的名词和术语适用于如下的解释。
1)响应于:用于表示所执行的操作所依赖的条件或者状态,当满足所依赖的条件或状态时,所执行的一个或多个操作可以是实时的,也可以具有设定的延迟;在没有特别说明的情况下,所执行的多个操作不存在执行先后顺序的限制。
2)虚拟场景:是应用程序在终端设备上运行时显示(或提供)的场景。该虚拟场景可以是对真实世界的仿真环境,也可以是半仿真半虚构的虚拟环境,还可以是纯虚构的虚拟环境。虚拟场景可以是二维虚拟场景、2.5维虚拟场景或者三维虚拟场景中的任意一种,本申请实施例对虚拟场景的维度不加以限定。例如,虚拟场景可以包括天空、陆地、海洋等,该陆地可以包括沙漠、城市等环境元素,用户可以控制虚拟对象在该虚拟场景中进行移动。
3)虚拟道具:虚拟场景中能够被虚拟对象使用的道具,在结构上由多个部件构成。例如可以是用于攻击其他虚拟对象的虚拟射击道具,例如虚拟枪械、虚拟弓箭等;也可以是虚拟场景中用于供虚拟对象进行驾驶的虚拟载具,例如虚拟车辆、虚拟轮船、虚拟飞机、虚拟自行车等。
4)虚拟对象:虚拟场景中可以进行交互的各种人和物的形象,或在虚拟场景中的可活动对象。该可活动对象可以是虚拟人物、虚拟动物、动漫人物等,例如在虚拟场景中显示的人物、动物等。该虚拟对象可以是虚拟场景中的一个虚拟的用于代表用户的虚拟形象。虚拟场景中可以包括多个虚拟对象,每个虚拟对象在虚拟场景中具有自身的形状和体积,占据虚拟场景中的一部分空间。
5)场景数据:表示虚拟场景的特征数据,例如可以是虚拟场景中建造区域的面积、虚拟场景当前所处的建筑风格等;也可以包括虚拟建筑在虚拟场景中所处的位置、以及虚拟建筑的占地面积等。
6)客户端:终端设备中运行的用于提供各种服务的应用程序,例如视频播放客户端、游戏客户端等。
7)虚拟道具处理,即对虚拟道具进行的变化,包括颜色的更新、结构的改造等。以虚拟道具的改造为例,改造是对虚拟道具的结构进行变化的操作,包括对虚拟道具的部件进行拆卸、安装、替换等的操作,例如可以是使用一个新枪口来替换虚拟枪械原有的枪口,或者是在护木上安装前握把、激光设备等配件。
本申请实施例提供一种虚拟道具的处理方法、装置、电子设备、计算机可读存储介质及计算机程序产品,能够提高虚拟道具的处理效率。为便于更容易理解本申请实施例提供的虚拟道具的处理方法,首先说明本申请实施例提供的虚拟道具的处理方法的示例性实施场景,本申请实施例提供的虚拟道具的处理方法中的虚拟场景可以完全基于终端设备输出,或者基于终端设备和服务器协同输出。
在一些实施例中,虚拟场景可以是供虚拟对象(例如游戏角色)进行交互的环境,例如可以是供游戏角色在虚拟场景中进行对战,通过控制游戏角色的行动可以在虚拟场景中进行双方互动,从而使用户能够在游戏的过程中舒缓生活压力。
在一个实施场景中,参见图1A,图1A是本申请实施例提供的虚拟道具的处理方法的应用模式示意图,适用于一些完全依赖于终端设备400的图形处理硬件计算能力即可完成虚拟场景100的相关数据计算的应用模式,例如单机版/离线模式的游戏,通过智能手机、平板电脑和虚拟现实/增强现实设备等各种不同类型的终端设备400完成虚拟场景的输出。
作为示例,图形处理硬件的类型包括中央处理器(CPU,Central Processing Unit)和图形处理器(GPU,Graphics Processing Unit)。
当形成虚拟场景100的视觉感知时,终端设备400通过图形计算硬件计算显示所需要的数据,并完成显示数据的加载、解析和渲染,在图形输出硬件输出能够对虚拟场景形成视觉感知的视频帧,例如,在智能手机的显示屏幕呈现二维的视频帧,或者,在增强现实/虚拟现实眼镜的镜片上投射实现三维显示效果的视频帧;此外,为了丰富感知效果,终端设备400还可以借助不同的硬件来形成听觉感知、触觉感知、运动感知和味觉感知的一种或多种。
作为示例,终端设备400上运行有客户端410(例如单机版的游戏应用),在客户端410的运行过程中输出包括有角色扮演的虚拟场景,虚拟场景可以是供游戏角色交互的环境,例如可以是用于供游戏角色进行对战的平原、街道、山谷等等;以第一人称视角显示虚拟场景100为例,在虚拟场景100中显示有虚拟对象101,其中,虚拟对象101可以是受用户控制的游戏角色,即虚拟对象101受控于真实用户,将响应于真实用户针对控制器(例如触控屏、声控开关、键盘、鼠标和摇杆等)的操作而在虚拟场景100中运动,例如当真实用户向右移动摇杆时,虚拟对象101将在虚拟场景100中向右部移动,还可以保持原地静止、跳跃以及控制虚拟对象101进行射击操作等。
举例来说,以虚拟道具为虚拟枪械为例,在虚拟场景100中显示有虚拟对象101、以及虚拟对象101握持的虚拟枪械102。此外,在虚拟场景100中还显示有针对虚拟枪械102的处理入口103。当客户端410接收到用户针对处理入口103的触发操作时,将在人机交互界面中显示的虚拟场景100,切换为第一处理界面104(例如枪口的处理界面),其中,第一处理界面104中显示有虚拟枪械102的第一部件105(例如枪口)、以及第一部件105的处理控件106(例如可以用于替换第一部件105的新枪口的改造控件);接着客户端410可以响应于针对第一部件105的处理控件106的触发操作,显示处理后的第一部件105(例如新枪口),以替代显示处理前的第一部件105,从而完成针对枪口的处理;随后客户端410可以响应于基于第一处理界面104的界面跳转触发操作,从显示第一处理界面104直接切换到显示第二处理界面107(例如护木的处理界面),其中,第二处理界面107中显示有虚拟枪械102的第二部件108(例如护木)、以及第二部件108的处理控件109(例如可以在第二部件108上安装的右导轨),如此,在处理完一个部件之后,可以直接从一个部件的处理界面快捷跳转到另一个部件的处理界面,而不需要反复返回到上一层级(例如整枪界面)去选择新的处理部件,节约了操作时间,进而提高了虚拟道具的处理效率。
在另一个实施场景中,参见图1B,图1B是本申请实施例提供的虚拟道具的处理方法的应用模式示意图,应用于终端设备400和服务器200,适用于依赖于服务器200的计算能力完成虚拟场景计算、并在终端设备400输出虚拟场景的应用模式。
以形成虚拟场景100的视觉感知为例,服务器200进行虚拟场景相关显示数据(例如场景数据)的计算并通过网络300发送到终端设备400,终端设备400依赖于图形计算硬件完成计算显示数据的加载、解析和渲染,依赖于图形输出硬件输出虚拟场景以形成视觉感知,例如可以在智能手机的显示屏幕呈现二维的视频帧,或者,在增强现实/虚拟现实眼镜的镜片上投射实现三维显示效果的视频帧;对于虚拟场景的形式的感知而言,可以理解,可以借助于终端设备400的相应硬件输出,例如使用麦克风形成听觉感知,使用振动器形成触觉感知等等。
作为示例,终端设备400上运行有客户端410(例如网络版的游戏应用),通过连接服务器200(例如游戏服务器)与其他用户进行游戏互动,终端设备400输出客户端410的虚拟场景100,以第一人称视角显示虚拟场景100为例,在虚拟场景100中显示有虚拟对象101,其中,虚拟对象101可以是受用户控制的游戏角色,即虚拟对象101受控于真实用户,将响应于真实用户针对控制器(例如触控屏、声控开关、键盘、鼠标和摇杆等)的操作而在虚拟场景100中运动,例如当真实用户向右移动摇杆时,虚拟对象101将在虚拟场景100中向右部移动,还可以保持原地静止、跳跃以及控制虚拟对象101进行射击操作等。
举例来说,以虚拟道具为虚拟枪械为例,在虚拟场景100中显示有虚拟对象101、以及虚拟对象101握持的虚拟枪械102。此外,在虚拟场景100中还显示有针对虚拟枪械102的处理入口103。当客户端410接收到用户针对处理入口103的触发操作时,将在人机交互界面中显示的虚拟场景100,切换为第一处理界面104(例如枪口的处理界面),其中,第一处理界面104中显示有虚拟枪械102的第一部件105(例如枪口)、以及第一部件105的处理控件106(例如用于替换第一部件105的新枪口);接着客户端410可以响应于针对第一部件105的处理控件106的触发操作,显示处理后的第一部件105(例如新枪口),以替代显示处理前的第一部件105,从而完成针对枪口的处理;随后客户端410响应于基于第一处理界面104的界面跳转触发操作,从显示第一处理界面104直接切换到显示第二处理界面107(例如护木的处理界面),其中,第二处理界面107中显示有虚拟枪械102的第二部件108(例如护木)、以及第二部件108的处理控件109(例如可以在第二部件108上安装的右导轨),如此,在处理完一个部件之后,可以直接从一个部件的处理界面快捷跳转到另一个部件的处理界面,而不需要反复返回到上一层级(例如整枪界面)去选择新的处理部件,节约了操作时间,进而提高了虚拟道具的处理效率。
在一些实施例中,终端设备400还可以通过运行计算机程序来实现本申请实施例提供的虚拟道具的处理方法,例如,计算机程序可以是操作系统中的原生程序或软件模块;可以是本地(Native)应用程序(APP,APPlication),即需要在操作系统中安装才能运行的程序,例如射击类游戏APP(即上述的客户端410);也可以是小程序,即只需要下载到浏览器环境中就可以运行的程序。总而言之,上述计算机程序可以是任意形式的应用程序、模块或插件。
以计算机程序为应用程序为例,在实际实施时,终端设备400安装和运行有支持虚拟场景的应用程序。该应用程序可以是第一人称射击游戏(FPS,First-Person Shooting game)、第三人称射击游戏、虚拟现实应用程序、三维地图程序或者多人枪战类生存游戏中的任意一种。用户使用终端设备400操作位于虚拟场景中的虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷、建造虚拟建筑中的至少一种。示意性的,该虚拟角色可以是虚拟人物,比如仿真人物角色或动漫人物角色等。
在另一些实施例中,本申请实施例还可以借助于云技术(Cloud Technology)实现,云技术是指在广域网或局域网内将硬件、软件、网络等系列资源统一起来,实现数据的计算、储存、处理和共享的一种托管技术。
云技术是基于云计算商业模式应用的网络技术、信息技术、整合技术、管理平台技术、以及应用技术等的总称,可以组成资源池,按需所用,灵活便利。云计算技术将变成重要支撑。技术网络系统的后台服务需要大量的计算、存储资源。
示例的,图1B中的服务器200可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络(CDN,Content Delivery Network)、以及大数据和人工智能平台等基础云计算服务的云服务器。终端设备400可以是智能手机、平板电脑、笔记本电脑、台式计算机、智能音箱、智能手表等,但并不局限于此。终端设备400以及服务器200可以通过有线或无线通信方式进行直接或间接地连接,本申请实施例中不做限制。
下面继续对本申请实施例提供的电子设备的结构进行说明。以电子设备为终端设备为例,参见图2,图2是本申请实施例提供的电子设备500的结构示意图,图2所示的电子设备500包括:至少一个处理器510、存储器550、至少一个网络接口520和用户接口530。电子设备500中的各个组件通过总线系统540耦合在一起。可理解,总线系统540用于实现这些组件之间的连接通信。总线系统540除包括数据总线之外,还包括电源总线、控制总线和状态信号总线。但是为了清楚说明起见,在图2中将各种总线都标为总线系统540。
处理器510可以是一种集成电路芯片,具有信号的处理能力,例如通用处理器、数字信号处理器(DSP,Digital Signal Processor),或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等,其中,通用处理器可以是微处理器或者任何常规的处理器等。
用户接口530包括使得能够呈现媒体内容的一个或多个输出装置531,包括一个或多个扬声器和/或一个或多个视觉显示屏。用户接口530还包括一个或多个输入装置532,包括有助于用户输入的用户接口部件,比如键盘、鼠标、麦克风、触屏显示屏、摄像头、其他输入按钮和控件。
存储器550可以是可移除的,不可移除的或其组合。示例性的硬件设备包括固态存储器,硬盘驱动器,光盘驱动器等。存储器550可选地包括在物理位置上远离处理器510的一个或多个存储设备。
存储器550包括易失性存储器或非易失性存储器,也可包括易失性和非易失性存储器两者。非易失性存储器可以是只读存储器(ROM,Read Only Memory),易失性存储器可以是随机存取存储器(RAM,Random Access Memory)。本申请实施例描述的存储器550旨在包括任意适合类型的存储器。
在一些实施例中,存储器550能够存储数据以支持各种操作,这些数据的示例包括程序、模块和数据结构或者其子集或超集,下面示例性说明。
操作系统551,包括用于处理各种基本系统服务和执行硬件相关任务的系统程序,例如框架层、核心库层、驱动层等,用于实现各种基础业务以及处理基于硬件的任务;
网络通信模块552,用于经由一个或多个(有线或无线)网络接口520到达其他计算设备,示例性的网络接口520包括:蓝牙、无线相容性认证(WiFi)、和通用串行总线(USB,Universal Serial Bus)等;
呈现模块553,用于经由一个或多个与用户接口530相关联的输出装置531(例如,显示屏、扬声器等)使得能够呈现信息(例如,用于操作外围设备和显示内容和信息的用户接口);
输入处理模块554,用于对一个或多个来自一个或多个输入装置532之一的一个或多个用户输入或互动进行检测以及翻译所检测的输入或互动。
在一些实施例中,本申请实施例提供的装置可以采用软件方式实现,图2示出了存储在存储器550中的虚拟道具的处理装置555,其可以是程序和插件等形式的软件,包括以下软件模块:显示模块5551、切换模块5552、获取模块5553、确定模块5554、点乘模块5555、控制模块5556、检测模块5557、转入模块5558、调整模块5559、拍摄模块55510、加载模块55511、插值模块55512和插入模块55513,这些模块是逻辑上的,因此根据所实现的功能可以进行任意的组合或进一步拆分。需要指出的是,在图2中为了方便表达,一次性示出了上述所有模块,但不应视为排除了虚拟道具的处理装置555仅包括显示模块5551和切换模块5552的实施方式,将在下文中说明各个模块的功能。
下面将结合本申请实施例提供的终端设备的示例性应用和实施,对本申请实施例提供的虚拟道具的处理方法进行具体说明。
参见图3,图3是本申请实施例提供的虚拟道具的处理方法的流程示意图,将结合图3示出的步骤进行说明。
需要说明的是,图3示出的方法可以由终端设备运行的各种形式的计算机程序执行,并不局限于客户端,例如还可以是上文所述的操作系统、软件模块、脚本和小程序等,因此下文中以客户端的示例不应视为对本申请实施例的限定。此外,为了表述方便,下文中不对终端设备和终端设备运行的客户端进行具体区分。
在步骤301中,显示针对虚拟场景中的虚拟道具的处理入口。
在一些实施例中,在终端设备上安装有支持虚拟场景的客户端(例如当虚拟场景为游戏时,对应的客户端可以是射击类游戏APP),当用户打开终端设备上安装的客户端(例如用户点击在终端设备的用户界面呈现的射击类游戏APP对应的图标),且终端设备运行该客户端时,可以在客户端的人机交互界面呈现的虚拟场景中显示虚拟对象(例如受控于用户1的虚拟对象A)以及虚拟对象A通过握持部位(例如手部)握持的虚拟道具(例如虚拟射击道具、虚拟投掷道具等)。此外,在虚拟场景中还可以显示有针对虚拟道具的处理入口,例如当虚拟道具为虚拟枪械时,可以在虚拟场景中显示针对虚拟枪械的处理入口。
在步骤302中,响应于针对处理入口的触发操作,显示第一处理界面。
在一些实施例中,第一处理界面至少包括处理控件,在另一些实施例中,第一处理界面还可以包括虚拟道具的第一部件,第一部件可以是虚拟道具中待改造的任意一个部件;在第一处理界面中,虚拟道具中除第一部件之外的部件可以不显示、显示部分或全部显示。除第一部件之外的部件的显示数量可以取决于第一处理界面的缩放比例(即虚拟道具的尺寸与第一处理界面的尺寸之间的比例),缩放比例越大,则显示数量越小,以方便观察部件的细节,缩放比例越小,则显示数量越多,以方便观察虚拟道具的整体结构。
作为示例,第一处理界面中的处理控件的类型可以包括用于改变颜色的颜色控件和用于改造的改造控件。处理控件可以专用于处理第一部件,即不同的部件各自对应有一个处理控件;处理控件也可以是通用性的,即用于批量处理虚拟道具中包括第一部件在内的多个部件。
在一些实施例中,终端设备在显示第一处理界面之前,还可以执行以下处理:显示虚拟道具查看界面,其中,虚拟道具查看界面包括虚拟道具的多个部件;响应于针对虚拟道具查看界面中的第一部件的选择操作,转入执行显示第一处理界面的处理。
示例的,以虚拟道具为虚拟枪械为例,终端设备在接收到用户针对虚拟枪械的处理入口的点击操作时,可以首先显示虚拟枪械查看界面(例如整枪界面,在整枪界面中展示有虚拟枪械的枪体全貌,并且还显示有全部可以处理的部件的交互按钮),接着,终端设备响应于用户针对虚拟枪械查看界面中的第一部件(例如枪口)的选择操作(例如接收到用户针对枪口的交互按钮的点击操作),显示第一处理界面(即枪口的处理界面),其中,在第一处理界面中可以显示有虚拟枪械的第一部件(例如枪口)、以及枪口的处 理控件(例如用于替换原先枪口的新枪口)。
示例的,参见图4A,图4A是本申请实施例提供的虚拟道具的处理方法的应用场景示意图,如图4A所示,在人机交互界面中显示有虚拟场景400,在虚拟场景400中显示有虚拟对象401(例如受控于用户1的游戏角色A)、以及虚拟对象401握持的虚拟枪械402。此外,在虚拟场景400中还显示有虚拟枪械402的处理入口403。当接收到针对虚拟枪械402的处理入口403的点击操作时,将在人机交互界面中显示的虚拟场景400,切换为虚拟枪械查看界面404(即整枪界面),在虚拟枪械查看界面404中显示有虚拟枪械402能够被处理的多个部件,例如包括枪管405、护木406、弹匣407、以及光学瞄具408。当在虚拟枪械查看界面404中接收到针对护木406的点击操作时,将在人机交互界面中显示的虚拟枪械查看界面404,切换为护木的处理界面409,在护木的处理界面409中显示有虚拟枪械402的护木410、以及护木410的处理控件411(例如包括可以安装在护木410上的左导轨和右导轨)。
在步骤303中,响应于针对处理控件的触发操作,显示处理后的虚拟道具,以替代显示处理前的虚拟道具。
在一些实施例中,以处理控件是通用性的控件为例,响应于针对处理控件的触发操作,对虚拟道具中的部分或全部部件进行批量化的处理,显示处理后的虚拟道具,以替代显示处理前的虚拟道具,其中,对虚拟道具中的部分部件进行批量化的处理后,对处理前的虚拟道具进行部分替代显示;对虚拟道具中的全部部件进行批量化的处理后,对处理前的虚拟道具进行全部的替代显示。
在一些实施例中,以第一处理界面还包括虚拟道具的第一部件和专用于第一部件的处理控件为例,步骤303可以通过以下方式实现:响应于针对处理控件的触发操作,对第一部件进行处理,显示处理后的第一部件,以替代显示处理前的第一部件;其中,在第一处理界面显示处理后的虚拟道具时,除了显示处理后的第一部件,虚拟道具中除第一部件之外的部件可以不显示、显示部分或全部显示,除第一部件之外的部件的显示数量可以取决于第一处理界面的缩放比例(即虚拟道具的尺寸与第一处理界面的尺寸之间的比例),缩放比例越大,则显示数量越小,以方便观察部件的细节,缩放比例越小,则显示数量越多,以方便观察虚拟道具的整体结构。
在一些实施例中,终端设备在接收到用户针对第一部件的处理控件的触发操作时,可以显示处理后的第一部件,以替代显示处理前的第一部件。例如以第一部件为虚拟枪械的枪口为例,当终端设备接收到用户针对枪口的处理控件的触发操作时(例如针对新枪口的选中操作),可以在虚拟枪械的枪口的位置显示新枪口,以替代显示之前的枪口,从而完成针对枪口的处理。再例如以第一部件为虚拟枪械的护木为例,当终端设备接收到用户针对护木的处理控件的触发操作(例如针对多个可以加装在护木上的配件中的前握把的选中操作)时,可以在虚拟枪械的护木上加装被选中的前握把,从而实现对护木的处理。
在步骤304中,响应于界面跳转触发操作,从显示第一处理界面切换到显示不同于第一处理界面的第二处理界面。
需要说明的是,第二处理界面的显示方式与第一处理界面的显示方式是类似的,例如在第二处理界面中可以显示处理控件;在一些实施例中,还可以在第二处理界面中显示第二部件,第二部件可以是虚拟道具中除第一部件之外的待改造的任意一个部件;第二处理界面中的处理控件可以是通用性的,即用于批量处理虚拟道具中包括第二部件在内的多个部件;处理控件也可以是专用于改造第二部件的控件,即第二部件的处理控件。
在一些实施例中,第一处理界面还可以包括虚拟道具的第二部件,界面跳转触发操作可以是针对第二部件的触发操作,则终端设备可以通过以下方式实现步骤304:响应于针对第一处理界面中的第二部件的触发操作(例如点击操作或者绘制特定图形的操作),从显示第一处理界面切换到显示第二处理界面。
示例的,以第一部件为虚拟枪械的后握把为例,参见图4B,图4B是本申请实施例提供的虚拟道具的处理方法的应用场景示意图,如图4B所示,在后握把的处理界面412中除了显示有虚拟枪械的后握把413之外,还显示有虚拟枪械的其他部件,例如包括弹匣414、枪托415和瞄具416。当接收到玩家针对后握把的处理界面412中显示的弹匣414(即第二部件)的点击操作时,将在人机交互界面中显示的后握把的处理界面412,直接切换为弹匣的处理界面417,在弹匣的处理界面417中显示有虚拟枪械的弹匣414、以及弹匣414的处理控件418(例如用于对弹匣414进行扩容的新弹匣),如此,通过点击枪械模型的部件的方式,可以直接从一个部件的处理界面快捷跳转至另一个部件的处理界面,提高了针对虚拟道具的处理效率。
在另一些实施例中,第一处理界面还可以包括与至少一个方向分别对应的浏览控件,界面跳转触发操作可以是针对浏览控件的触发操作,则终端设备可以通过以下方式实现上述的步骤304:响应于在第一处理界面中针对第一方向的浏览控件的触发操作,从显示第一处理界面切换到显示第二处理界面,其中,第二部件相对于第一部件的分布方向为第一方向的反方向,且是在反方向上与第一部件距离最近的部件。
示例的,以第一部件为虚拟枪械的枪口为例,参见图4C,图4C是本申请实施例提供的虚拟道具的处理方法的应用场景示意图,如图4C所示,在枪口的处理界面419中除了显示有枪口420、枪口420的处理控件421(例如用于替代枪口420的新枪口)之外,还显示有一个包括上下左右四个方向的浏览控件422。当接收到用户针对浏览控件422的左方向键的点击操作时(由于护木是虚拟枪械中位于枪口右边,且与枪口的距离最近的部件),终端设备将在人机交互界面中显示的枪口的处理界面419,直接切换为护木的处理界面409,在护木的处理界面409中显示有护木410、以及护木410的处理控件411(例如用于安装在护木410上的左、右导轨),如此通过针对浏览控件的触发操作,可以直接从一个部件的处理界面快捷跳转至另一个部件的处理界面,提高了虚拟道具的处理效率。
在一些实施例中,界面跳转触发操作还可以是滑动操作,则终端设备可以通过以下方式实现上述的步骤304:响应于在第一处理界面中的滑动操作,且滑动操作的滑动方向位于第一部件的第一方向区间(即以第一部件为起点向外滑动的0度至360度方向区间的子区间,例如可以将垂直于枪口且向上的方向作为0度,以将0度至360度划分成不同的方向区间,其中,第一方向区间可以是225度至315度,即以枪口的正左方(即270度)为中心的区间),从显示第一处理界面直接切换到显示第二处理界面。其中,第一方向区间的反向区间是将第一方向区间的两个边界方向分别取反方向后所形成的区间,例如,假设第一方向区间为225度至315度,则反向区间为45度至135度,即以枪口的正右方(即90度)为中心的区间;反向区间中分布有虚拟道具的第二部件,且第二部件与第一部件的距离与滑动操作的滑动距离成正比例,即滑动操作的滑动距离越大,第二部件与第一部件的距离也越大。
示例的,以虚拟道具为虚拟枪械为例,在枪口的右侧,按照与枪口的距离从近到远的顺序依次分布有护木、机匣和枪托,因此可以预先设定2个不同等级的距离阈值,分别为L1(例如1厘米)和L2(例如2厘米),当滑动操作的滑动距离小于或者等于L1(例如0.7厘米)时,终端设备将在人机交互界面中显示的枪口的处理界面,直接切换至护木的处理界面;当滑动操作的滑动距离大于L1、且小于或等于L2时(例如1.4厘米),终端设备将在人机交互界面中显示的枪口的处理界面,直接切换至机匣的处理界面;当滑动操作的滑动距离大于L2时(例如2.3厘米),终端设备将在人机交互界面中显示的枪口的处理界面,直接切换至枪托的处理界面,如此,可以根据滑动操作的滑动距离,灵活地从枪口的处理界面跳转至不同部件的处理界面,即用户可以根据自己的需求来控制滑动操作的滑动距离,以跳转至相应部件的处理界面,大大提高了虚拟枪械的处理效率。
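上述按滑动距离分级跳转的逻辑,可以用如下Python草图示意(阈值L1、L2与部件顺序取自上文示例,函数名为说明而设,并非原文实现):

```python
# 示意性草图:根据滑动距离选择跳转目标部件。
# 阈值 L1、L2 与部件顺序(护木、机匣、枪托)取自上文示例,可按需配置。

def target_part_by_distance(slide_dist_cm: float) -> str:
    """按滑动距离从近到远选择位于枪口右侧的目标部件。"""
    L1, L2 = 1.0, 2.0  # 两个等级的距离阈值(厘米)
    if slide_dist_cm <= L1:
        return "护木"   # 与枪口距离最近的部件
    if slide_dist_cm <= L2:
        return "机匣"
    return "枪托"       # 与枪口距离最远的部件
```

例如滑动0.7厘米跳转至护木的处理界面,滑动2.3厘米则跳转至枪托的处理界面。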
在一些实施例中,界面跳转触发操作可以是滑动操作,则终端设备可以通过以下方式实现上述的步骤304:响应于在第一处理界面(例如枪口的处理界面)中的滑动操作,且滑动操作的滑动方向位于第一部件的第一方向区间(例如可以是枪口的225度至315度),从显示第一处理界面直接切换到显示第二处理界面,其中,第一方向区间的反向区间(例如枪口的45度至135度)中分布有虚拟道具的第二部件,且第二部件是在反向区间中与第一部件距离最近的部件(例如护木)。
示例的,以虚拟道具为虚拟枪械为例,在枪口的右侧,按照与枪口的距离从近到远的顺序依次分布有护木、机匣和枪托(即护木是与枪口的距离最近的部件),响应于在枪口的处理界面中的滑动操作,且滑动操作的滑动方向位于枪口的第一方向区间(例如枪口的225度至315度,即位于以枪口的正左方(即270度)为中心的区间,并且在第一方向区间的反向区间,例如枪口的45度至135度,即以枪口的正右方(即90度)为中心的区间中分布有与枪口距离最近的护木),则终端设备可以将在人机交互界面中显示的枪口的处理界面,直接切换为护木的处理界面,如此,通过滑动操作,可以直接从一个部件的处理界面快捷跳转至另一个部件的处理界面,提高了虚拟枪械的处理效率,进而提升了用户的游戏体验。
在另一些实施例中,针对每个部件对应的处理界面还可以预先配置有对应的滑动参数,则终端设备可以通过以下方式实现上述的响应于在第一处理界面中的滑动操作,且滑动操作的滑动方向位于第一部件的第一方向区间,从显示第一处理界面切换到显示第二处理界面:获取针对第一处理界面配置的第一滑动参数,其中,第一滑动参数包括第一部件的至少一个方向区间,至少一个方向区间包括第一方向区间,且每个方向区间的反向区间中分布有虚拟道具的一个部件;响应于在第一处理界面中的滑动操作,获取滑动操作的角度值;响应于滑动操作的角度值位于至少一个方向区间中的第一方向区间,从显示第一处理界面切换到显示第二处理界面。
示例的,以虚拟道具为虚拟步枪为例,参见图5,图5是本申请实施例提供的虚拟步枪的结构示意图,如图5所示,一把虚拟步枪中的以下部件可以被处理:枪口、护木、机匣、弹匣、瞄具、后握把、枪托,其中,护木位于枪口的右侧,机匣位于护木的右侧,瞄具位于护木的右上方,弹匣位于护木的右下方,后握把位于机匣的下方,枪托位于机匣的右侧。以枪口为例,由于枪口的右侧仅有护木这一部件,因此,针对枪口的处理界面,可以仅配置枪口的一个方向区间,例如可以以枪口的中心点为起点,将垂直于枪口且向上(这里的向上是屏幕的上方)的方向对应的角度设定为0度,并按照顺时针方向将0度至360度划分成枪口的不同方向区间,例如针对在枪口的处理界面中接收到的滑动操作,当滑动操作的滑动方向位于枪口的第一方向区间(例如枪口的225度至315度,枪口的第一方向区间的反向区间中分布有护木)时,将从枪口的处理界面切换至护木的处理界面。
再以护木为例,由于虚拟步枪中与护木相邻的部件有4个(包括枪口、机匣、瞄具、弹匣),因此,针对护木的处理界面,可以配置护木的四个方向区间,例如可以以护木的中心点为起点,将垂直于护木,且向上的方向设定为0度,并按照顺时针方向将0度至360度划分成护木的不同方向区间,例如针对在护木的处理界面中接收到的滑动操作,当滑动操作的滑动方向位于护木的第一方向区间(例如护木的45度至135度,在护木的第一方向区间的反向区间中分布有枪口),将从护木的处理界面切换至枪口的处理界面;当滑动操作的滑动方向位于护木的第二方向区间(例如护木的180度至225度,在护木的第二方向区间的反向区间中分布有瞄具),将从护木的处理界面切换至瞄具的处理界面;当滑动操作的滑动方向位于护木的第三方向区间(例如护木的225度至315度,在护木的第三方向区间的反向区间中分布有机匣)时,将从护木的处理界面切换至机匣的处理界面;当滑动操作的滑动方向位于护木的第四方向区间(例如护木的315度至360度,在护木的第四方向区间的反向区间中分布有弹匣),将从护木的处理界面切换至弹匣的处理界面。
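上述为护木的处理界面配置方向区间并据此决定跳转目标的方式,可以示意如下(区间数值取自上文示例;角度落在未配置的区间时视为未命中,此时不触发跳转;命名均为说明而设的假设):

```python
# 示意性草图:护木处理界面的方向区间配置,把滑动角度映射到跳转目标部件。
# 角度约定:垂直向上为0度,按顺时针方向增大到360度。

HANDGUARD_INTERVALS = [
    (45.0, 135.0, "枪口"),   # 第一方向区间,反向区间中分布有枪口
    (180.0, 225.0, "瞄具"),  # 第二方向区间
    (225.0, 315.0, "机匣"),  # 第三方向区间
    (315.0, 360.0, "弹匣"),  # 第四方向区间
]

def jump_target(angle: float, intervals=HANDGUARD_INTERVALS):
    """返回滑动角度命中的目标部件;未命中任何区间时返回 None。"""
    for lo, hi, part in intervals:
        if lo <= angle < hi:
            return part
    return None
```

例如角度值为90度(向右滑动)时命中第一方向区间,从护木的处理界面切换至枪口的处理界面。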
需要说明的是,针对虚拟步枪的其他部件的处理界面配置滑动参数的方式与针对枪口、护木的处理界面配置滑动参数的方式类似,本申请实施例在此不再赘述。
在另一些实施例中,终端设备可以通过以下方式获取滑动操作的角度值:获取滑动操作的起始点(假设为点A(x1,y1))和结束点(假设为点B(x2,y2));基于起始点和结束点,确定滑动操作的滑动方向,例如可以将起始点指向结束点的方向作为滑动操作的滑动方向;将滑动方向的向量与基准方向的向量即Base(0,1)进行点乘处理,点乘处理是对两个向量对应位一一相乘之后求和的操作,点乘的结果是一个标量,将得到的点乘结果作为滑动操作的角度值。
例如,假设滑动方向的向量为(3,4),与上述的Base(0,1)进行点乘,具体是将3与0相乘得到0,将4与1相乘得到4,从而点乘结果为4(即0与4的加和)。
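上述由起始点、结束点和基准方向Base(0,1)计算滑动方向角度值的过程,可以示意如下(其中先对方向向量归一化、再经反余弦把点乘结果换算为0至360度角度、并按横向分量区分左右两侧的步骤,是为使示例可运行而补充的假设,并非原文逐字的实现):

```python
from math import acos, degrees, hypot

# 示意性草图:由滑动起始点A(ax, ay)和结束点B(bx, by)求滑动方向相对
# 基准方向Base(0,1)的角度值,0度为垂直向上,顺时针增大到360度。
# 假设 A 与 B 不重合。

def slide_angle(ax: float, ay: float, bx: float, by: float) -> float:
    dx, dy = bx - ax, by - ay          # 滑动方向向量 P = B - A
    norm = hypot(dx, dy)               # 向量长度,用于归一化
    dot = dx * 0 + dy * 1              # 与 Base(0,1) 的点乘结果
    cos_v = max(-1.0, min(1.0, dot / norm))
    angle = degrees(acos(cos_v))       # 与基准方向的夹角(0-180度)
    # 横向分量为负时位于基准方向左侧,换算到180-360度区间
    return 360.0 - angle if dx < 0 else angle
```

例如向右滑动(B在A正右方)得到90度,向左滑动得到270度,恰好分别落入图9的不同象限。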
在一些实施例中,承接上述示例,终端设备还可以执行以下处理:响应于滑动操作的角度值未位于第一部件的至少一个角度区间中的任意一个角度区间,控制虚拟道具在第一处理界面中旋转;其中,旋转的角度为固定值,例如用户每次在进行滑动操作时,虚拟道具都旋转30度;或者,旋转的角度与滑动操作的滑动距离成正比例。例如可以预先配置一个比例系数,将滑动距离与比例系数的相乘结果确定为旋转的角度,如此,用户可以根据自己的需求,来控制滑动操作的滑动距离。
示例的,以第一部件为虚拟步枪的枪口为例,针对枪口的处理界面配置有对应的第一滑动参数,其中,第一滑动参数包括枪口的第一方向区间(例如枪口的225度至315度),当用户的滑动操作的滑动方向未位于枪口的第一方向区间时(例如假设用户向右滑动屏幕,滑动操作的角度值为80度),则可以确定当前的滑动操作并不是用于触发界面跳转的,而是单纯来回滑屏转换角度来欣赏虚拟步枪的外观的,因此,可以控制虚拟步枪在枪口的处理界面中旋转,其中,虚拟步枪旋转的角度可以与滑动操作的滑动距离成正比例,例如当滑动操作的滑动距离为1厘米时,虚拟步枪对应的旋转角度为50度,当滑动操作的滑动距离为2厘米时,虚拟步枪对应的旋转角度为100度。
在另一些实施例中,终端设备在获取滑动操作的角度值之前,还可以执行以下处理:基于第一滑动参数对滑动操作进行检测,得到检测结果;响应于检测结果表征滑动操作为界面跳转触发操作,转入执行获取滑动操作的角度值的处理;响应于检测结果表征滑动操作为虚拟道具查看操作,控制虚拟道具在第一处理界面中旋转,其中,旋转的角度为固定值,或者,旋转的角度与滑动操作的滑动距离成正比例。
示例的,第一滑动参数可以包括以下参数至少之一:设定的滑动时长(例如可以是最小滑动时长或者最大滑动时长)、设定的滑动距离(例如可以是最小滑动距离或者最大滑动距离)、设定的压力参数(例如可以是最小压力值或者最大压力值)、设定的触点数量(例如可以是单点对应界面跳转触发操作,或者双点对应界面跳转触发操作)。
以第一滑动参数为设定的滑动时长为例,终端设备可以通过以下方式实现上述的基于第一滑动参数对滑动操作进行检测,得到检测结果:获取滑动操作的滑动时长,将滑动时长与设定的滑动时长进行比较,得到比较结果,其中,当比较结果表征滑动时长满足时长条件时,确定滑动操作为界面跳转触发操作,当比较结果表征滑动时长不满足时长条件时,确定滑动操作为虚拟道具查看操作。例如当设定的滑动时长为最小滑动时长(例如1秒),且检测到滑动操作的滑动时长大于或等于最小滑动时长时,确定满足时长条件;当设定的滑动时长为最大滑动时长(例如2秒),且检测到滑动操作的滑动时长小于最大滑动时长时,确定满足时长条件。
以第一滑动参数为设定的滑动距离为例,终端设备可以通过以下方式实现上述的基于第一滑动参数对滑动操作进行检测,得到检测结果:获取滑动操作的滑动距离,将滑动距离与设定的滑动距离进行比较,得到比较结果,其中,当比较结果表征滑动距离满足距离条件时,确定滑动操作为界面跳转触发操作,当比较结果表征滑动距离不满足距离条件时,确定滑动操作为虚拟道具查看操作。例如当设定的滑动距离为最小滑动距离(例如1厘米),且检测到滑动操作的滑动距离大于或等于最小滑动距离时,确定满足距离条件;当设定的滑动距离为最大滑动距离(例如2厘米),且检测到滑动操作的滑动距离小于最大滑动距离时,确定满足距离条件。
以第一滑动参数为设定的压力参数为例,终端设备可以通过以下方式实现上述的基于第一滑动参数对滑动操作进行检测,得到检测结果:获取滑动操作的压力参数,将压力参数与设定的压力参数进行比较,得到比较结果,其中,当比较结果表征压力参数满足压力条件时,确定滑动操作为界面跳转触发操作,当比较结果表征压力参数不满足压力条件时,确定滑动操作为虚拟道具查看操作。例如当设定的压力参数为最小压力阈值,且检测到滑动操作的压力值大于或等于最小压力阈值时,确定满足压力条件;当设定的压力参数为最大压力阈值,且检测到滑动操作的压力值小于或等于最大压力阈值时,确定满足压力条件。
需要说明的是,上述的最小压力阈值和最大压力阈值是可以被配置的,例如针对虚拟道具的不同部件,可以配置不同的最小压力阈值或最大压力阈值,本申请实施例对此不作具体限定。
以第一滑动参数为设定的触点数量为例,终端设备可以通过以下方式实现上述的基于第一滑动参数对滑动操作进行检测,得到检测结果:获取滑动操作的触点数量,将触点数量与设定的触点数量进行比较,得到比较结果,其中,当比较结果表征两者一致时,确定滑动操作为界面跳转触发操作,当比较结果表征两者不一致时,确定滑动操作为虚拟道具查看操作。例如假设设定的触点数量为单点,当滑动操作的触点数量也是单点时,确定滑动操作为界面跳转触发操作;当滑动操作的触点数量为多点时,确定滑动操作为虚拟道具查看操作。
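上述基于滑动时长、滑动距离、压力参数与触点数量,区分界面跳转触发操作与虚拟道具查看操作的检测过程,可以合并示意如下(各阈值取值与函数名均为示例假设,实际以配置的第一滑动参数为准):

```python
# 示意性草图:按第一滑动参数检测一次滑动操作的类型。
# 四项条件全部满足时判定为界面跳转触发操作,否则为虚拟道具查看操作。

def classify_slide(duration_s: float, dist_cm: float,
                   pressure: float, touches: int,
                   max_duration: float = 2.0,   # 设定的滑动时长(最大值)
                   min_dist: float = 1.0,       # 设定的滑动距离(最小值)
                   min_pressure: float = 0.2,   # 设定的压力参数(最小值)
                   expected_touches: int = 1) -> str:
    is_jump = (duration_s < max_duration and     # 滑动时长满足时长条件
               dist_cm >= min_dist and           # 滑动距离满足距离条件
               pressure >= min_pressure and      # 压力参数满足压力条件
               touches == expected_touches)      # 触点数量与设定一致
    return "界面跳转触发操作" if is_jump else "虚拟道具查看操作"
```

被判定为虚拟道具查看操作的滑动,只用于在当前处理界面中旋转虚拟道具,而不触发界面跳转。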
在一些实施例中,第一处理界面和第二处理界面可以是通过虚拟摄像机拍摄得到的,且虚拟道具的每个部件均配置有与虚拟摄像机对应的镜头参数,则终端设备在从显示第一处理界面切换到显示第二处理界面之前,还可以执行以下处理:获取针对第二部件配置的第二镜头参数;基于第二镜头参数调整虚拟场景中的虚拟摄像机的姿态,并调用调整后的虚拟摄像机对虚拟道具进行拍摄;在拍摄得到的画面中加载第二部件的处理控件,得到第二处理界面。
示例的,镜头参数可以包括以下参数至少之一:镜头对应的部件(例如当镜头参数为针对虚拟步枪的枪口配置的镜头参数时,镜头对应的部件为枪口)、镜头的旋转角度、镜头相对于部件的起点的偏移量(例如当镜头参数为针对虚拟步枪的枪口配置的镜头参数时,这里偏移量是指相对于枪口的起点的偏移值)、镜头与焦点之间的距离、镜头的视角。
在另一些实施例中,承接上述示例,终端设备还可以执行以下处理:获取针对第一部件配置的第一镜头参数;基于第一镜头参数和第二镜头参数进行插值处理,得到至少一个中间镜头参数,其中,每个中间镜头参数用于对虚拟摄像机的姿态进行调整,并调用调整后的虚拟摄像机对虚拟道具进行拍摄,得到对应的一个中间界面;在从显示第一处理界面切换到显示第二处理界面的过程中,插入至少一个中间界面。
需要说明的是,在切换过程中插入的中间界面的数量可以是固定的,也可以是与在显示虚拟场景时的帧率成正比例的,即帧率越高,插入的中间界面的数量也越大,从而可以实现平滑地从第一处理界面切换到第二处理界面,提高了用户的视觉体验。
此外,还需要说明的是,无论是从第一处理界面直接切换至第二处理界面,还是在切换的过程中增加过渡动画(即插入中间界面),其本质都是从显示一个部件的处理界面跳转至另一个部件的处理界面,中途没有插入第三方功能的界面(例如整枪界面)。
示例的,终端设备可以通过以下方式实现上述的基于第一镜头参数和第二镜头参数进行插值处理,得到至少一个中间镜头参数:对第二镜头参数与t进行相乘处理,得到第一相乘结果,其中,t为开始切换后经过的时长,且t的取值范围满足:0≤t≤T,T为从第一处理界面切换到第二处理界面的总时长,T为大于0的实数;对T与t的相减结果与第一镜头参数进行相乘处理,得到第二相乘结果;将第一相乘结果与第二相乘结果的求和结果,确定为至少一个中间镜头参数。
在另一些实施例中,参见图6,图6是本申请实施例提供的虚拟道具的处理方法的流程示意图,如图6所示,在执行完图3示出的步骤304之后,还可以执行图6示出的步骤305,将结合图6示出的步骤进行说明。
上文结合基于第一处理界面触发不同类型的界面跳转触发操作来说明步骤304,需要说明的是,上文所述的界面跳转触发操作虽然都是基于第一处理界面触发的,但是不应视为界面跳转触发操作只能基于第一处理界面来实施的限定,下面继续说明界面跳转触发操作的其他情形。
在一些实施例中,界面跳转触发操作可以是语音指令,语音指令可以指示从第一部件切换的方向,例如语音指令可以是向左切换,则将切换显示包括第二部件的第二处理界面,其中,第二部件是位于第一部件的左侧且与第一部件距离最近的部件。语音指令还可以进一步指示跳转的数量,例如语音指令可以是向左切换两个部件,则将切换显示包括第二部件的第二处理界面,其中,第二部件是位于第一部件的左侧且与第一部件之间间隔两个部件。
在另一些实施例中,界面跳转触发操作可以是体感操作,体感操作可以是向某个方向晃动终端设备的操作,例如体感操作可以是向左晃动,则将切换显示包括第二部件的第二处理界面,其中,第二部件是位于第一部件的左侧且与第一部件距离最近的部件。体感操作还可以进一步指示跳转的数量,例如,向左晃动的幅度可以与向左切换部件的数量正相关,幅度越大,则向左切换的部件的数量越大。
在步骤305中,响应于虚拟道具的第三部件满足处理条件,从显示第二处理界面切换到显示不同于第二处理界面的第三处理界面。
第三处理界面的显示方式与第一处理界面的显示方式是类似的,在一些实施例中,第三处理界面至少包括处理控件;在另一些实施例中,第三处理界面还可以包括虚拟道具的第三部件,第三部件可以是虚拟道具中除第一部件和第二部件之外的待改造的任意一个部件。
作为示例,第三处理界面中的处理控件的类型可以包括用于改变颜色的颜色控件和用于改造的改造控件。处理控件可以专用于处理第三部件,即第三部件的处理控件;处理控件也可以是通用性的,即用于批量处理虚拟道具中包括第三部件在内的多个部件。
在第三处理界面中,虚拟道具中除第三部件之外的部件可以不显示、显示部分或全部显示。除第三部件之外的部件的显示数量可以取决于第三处理界面的缩放比例(即虚拟道具的尺寸与第三处理界面的尺寸之间的比例),缩放比例越大,则显示数量越小,以方便观察部件的细节,缩放比例越小,则显示数量越多,以方便观察虚拟道具的整体结构。
在一些实施例中,界面跳转操作也可以是自动实现的,例如当终端设备检测到虚拟道具的第三部件满足处理条件时,可以自动从第二处理界面跳转至第三处理界面,其中,处理条件可以包括以下至少之一:第三部件的损耗程度大于或等于损耗程度阈值(例如30%);获取到第三部件能够使用的新配件。例如,随着用户使用虚拟道具的时间变长,虚拟道具的第三部件会慢慢产生损耗;或者当用户的虚拟道具的第三部件遭受其他玩家的攻击时,也会造成第三部件产生损耗,当第三部件的损耗程度大于损耗程度阈值时,将影响虚拟道具的正常使用。
以第三部件为虚拟枪械的枪托为例,随着用户在游戏中使用虚拟枪械的时间变长,枪托会慢慢产生虚拟损耗,例如破损或变形,从而使得后坐力变得异常,影响用户的射击体验。当终端设备检测到虚拟步枪的枪托的损耗程度大于损耗程度阈值时(例如已经影响到用户在游戏中使用虚拟步枪),可以自动从第二处理界面(例如护木的处理界面)跳转至枪托的处理界面,从而方便用户对枪托进行处理,如此,提高了虚拟道具的处理效率,进而也提升了用户的游戏体验。
本申请实施例提供的虚拟道具的处理方法,在第一部件对应的第一处理界面中接收到界面跳转触发操作时,可以从第一处理界面直接跳转至第二部件对应的第二处理界面,而不需要首先返回到上一层级中去选择第二部件,如此,可以大大提高虚拟道具的处理效率,提升了用户的游戏体验。
下面,以虚拟道具为虚拟枪械、处理为改造处理为例说明示例性应用。
本申请实施例提供一种虚拟道具的处理方法,根据虚拟枪械上各个部件之间的相对位置,通过不同方向的滑动操作或直接点击相应部件的方式,直接从一个部件的改造界面快捷跳转至另一个部件的改造界面(例如直接从枪口的改造界面跳转至护木的改造界面),而不需要反复返回到上一层级(例如整枪界面)去选择新的改造部件,大大提高了虚拟枪械的改造效率。
下面对本申请实施例提供的虚拟道具的处理方法进行具体说明。
在一些实施例中,用户可以通过点击虚拟枪械中需要改造的部件的方式,来快捷跳转至对应部件的改造界面。
示例的,如图4B所示,在后握把的改造界面412中除了显示有后握把413之外,还显示有虚拟枪械的其他部件,例如弹匣414、枪托415和瞄具416,每个枪械部件都有自己的包络盒,当接收到用户针对弹匣414的包络盒的点击操作时,将直接从后握把的改造界面412跳转至弹匣的改造界面417。
在另一些实施例中,用户也可以通过滑动屏幕来实现不同部件的改造界面之间的跳转。
示例的,如图5所示,一把虚拟步枪中的以下部件可以被改造:枪口、护木、机匣、瞄具、弹匣、后握把、枪托,这些部件之间有相对的位置分布,例如护木在枪口的右边,机匣在护木的右边,瞄具在护木的右上方,弹匣在护木的右下方。因此,如图7A所示,假设人机交互界面中显示的是枪口的改造界面701,那么用户只需要手指往左滑,就会从枪口的改造界面701跳转至护木的改造界面702,这是因为视觉上的认知是护木在枪口的右边,那么用户往左滑动屏幕时,就应该从枪口位置转移到护木位置,也即应该从枪口的改造界面701跳转到护木的改造界面702。同理,在接收到用户往左上角滑动屏幕的滑动操作时,将从护木的改造界面702跳转至弹匣的改造界面703。类似的,从护木的改造界面到瞄具的改造界面的跳转是往左下角滑动屏幕。
在另一些实施例中,可以设置一个方向区间来判定具体的滑动方向能否触发界面跳转,继续参见图5,以枪口为例,例如从枪口到护木,设置的方向区间可以是225度至315度,那么客户端只要检测到有这个方向区间内的滑屏操作,即可触发界面跳转。
此外,为了区分用户是想触发界面跳转还是只是单纯来回滑屏转换角度的欣赏枪械外观,可以设定一个滑动操作的时长限制,例如当滑动操作的时长超过时长阈值(例如2秒),则不触发界面跳转操作,视为欣赏枪械外观的操作。
在一些实施例中,针对不同类型的枪,由于可以改造的部件不同,因此可以单独配置一套参数来决定跳转顺序和滑屏方向的角度区间参数。
需要说明的是,类似于护木、瞄具这样的部件,可以在其上继续执行加装子部件(或称子配件)的操作,例如可以在护木上加装一个前握把、一个手电筒、或者激光设备,那么护木及护木上的所有子配件可以都统一放在护木的改造界面,在这个改造界面可以对护木及其子配件一起进行改造,跳转时,将护木及其子配件视为一体。
示例的,参见图7B,图7B是本申请实施例提供的虚拟道具的改造处理方法的应用场景示意图,如图7B所示,在护木的改造界面702中除了显示有用于替换枪械原有护木的护木704之外,还显示有能够在护木上继续加装的子配件,例如包括左导轨705、右导轨706、以及前握把707,如此,在一个改造界面中可以对护木及其子配件一起进行改造,提高了虚拟枪械的改造效率。
下面将结合图8对本申请实施例提供的虚拟道具的处理方法进行具体说明。
示例的,参见图8,图8是本申请实施例提供的虚拟道具的处理方法的流程示意图,将结合图8示出的步骤进行说明。
在步骤801中,客户端接收玩家的滑屏操作。
在一些实施例中,以当前显示的改造界面为枪口的改造界面为例,客户端接收玩家在枪口的改造界面中触发的滑屏操作。
在步骤802中,客户端计算滑屏操作的滑屏角度。
在一些实施例中,以枪口为例,客户端在计算滑屏角度之前,可以如图5所示,以枪口的中心点为起点,将垂直于枪口,且向上的角度设定为0度,并将0度至360度分割成如图9所示的八个象限,随后将其中的第六象限和第七象限(即225度至315度)确定为枪口的第一方向区间,也即如果玩家的滑屏方向P的角度值落在第六象限或第七象限(即第一方向区间),则会触发从枪口的改造界面至护木的改造界面的跳转。
接着,客户端可以获取玩家滑屏操作的起始点A(x1,y1)和结束点B(x2,y2),计算出A点到B点的滑屏方向P(P为从A指向B的向量),随后将滑屏方向P与基准方向的向量Base(0,1)进行点乘处理,将得到的点乘结果作为玩家的滑屏方向P的角度值,通过角度值即可确定出玩家的滑屏方向P落在哪个象限内。
在步骤803中,客户端读取滑屏参数。
在一些实施例中,针对虚拟枪械的每个部件对应的改造界面,可以预先配置一组滑屏参数(对应于上述的滑动参数)来控制玩家的滑屏跳转,其中,滑屏参数可以包括:Min(float)和Max(float)表示改造部件需要的滑屏区间(即方向区间),例如对于从枪口跳转到护木来说,对应的Min(float)和Max(float)分别为225度和315度(对应于图9的第六象限和第七象限,即如果玩家的滑屏方向P的角度值落在第六象限或第七象限,则会触发从枪口的改造界面至护木的改造界面的跳转)、MinDist(float)表示需要的最小滑动距离、MaxDuration(float)表示触发界面跳转的最大滑动时间、PointType(enum)表示镜头对应的改造部件。可以预先配置每个部件的改造界面分别对应的滑屏参数。
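上文的滑屏参数可以用如下数据结构示意(字段名沿用原文的Min、Max、MinDist、MaxDuration、PointType,具体数值仅为针对枪口改造界面的示例假设):

```python
from dataclasses import dataclass

# 示意性草图:每个部件的改造界面预先配置的一组滑屏参数。

@dataclass
class SlideParam:
    Min: float          # 方向区间下界(度)
    Max: float          # 方向区间上界(度)
    MinDist: float      # 触发跳转需要的最小滑动距离
    MaxDuration: float  # 触发界面跳转的最大滑动时间(秒)
    PointType: str      # 镜头对应的改造部件(跳转目标)

# 枪口的改造界面:滑屏角度落在225度至315度时跳转到护木
MUZZLE_PARAMS = [SlideParam(Min=225.0, Max=315.0,
                            MinDist=1.0, MaxDuration=2.0,
                            PointType="护木")]
```

每个改造界面可配置多条这样的记录,客户端读取后即可据此判定滑屏跳转的目标。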
在步骤804中,客户端基于滑屏参数确定滑屏操作的类型,当滑屏操作为虚拟枪械查看操作时,执行步骤805,当滑屏操作为界面跳转触发操作时,执行步骤806至步骤807。
在步骤805中,客户端在当前部件的改造界面中对虚拟枪械进行旋转。
在一些实施例中,以当前显示的改造界面为枪口的改造界面为例,当客户端基于滑屏参数确定出滑屏操作的类型为虚拟枪械查看操作时,可以在枪口的改造界面中对虚拟枪械进行旋转,并显示旋转后的虚拟枪械,以满足玩家欣赏枪械外观的需求。
在步骤806中,客户端获取目标镜头参数。
在一些实施例中,针对虚拟枪械的每个部件,可以预先配置一组镜头参数来描述部件对应的镜头,其中,镜头参数可以包括:PointType(enum)表示镜头对应的改造部件,Rotation(Vector3)表示镜头的旋转角度、Offset(Vector2)表示镜头相对于初始点的偏移量、CameraDis(float)表示镜头距离焦点的距离、FOV(float)表示镜头视角、LerpSpeed(float)表示与下一个镜头内插过渡的速度。
示例的,以当前显示的改造界面为枪口的改造界面为例,客户端在读取针对枪口的改造界面配置的滑屏参数之后,首先判断当前玩家的滑屏操作的滑动距离是否大于最小滑动距离(MinDist),如果小于MinDist,则判定此次滑屏操作为虚拟枪械查看操作,客户端可以在枪口的改造界面中对虚拟枪械进行旋转,并显示旋转后的虚拟枪械,以满足玩家欣赏枪械外观的需求;如果大于MinDist,则继续判断滑屏操作的滑动时长是否小于最大滑动时间(MaxDuration),如果大于MaxDuration,则判定此次滑屏操作为虚拟枪械查看操作,客户端在枪口的改造界面中对虚拟枪械进行旋转,显示旋转后的虚拟枪械,以满足玩家欣赏枪械外观的需求;如果小于MaxDuration,则判定此次滑屏操作为界面跳转触发操作,并根据玩家的滑屏方向P落在的象限,获取对应的目标镜头参数,例如如果玩家的滑屏方向P的角度值落在第六象限或第七象限,则获取针对护木配置的镜头参数(即目标镜头参数)。
在步骤807中,客户端基于初始镜头参数和目标镜头参数进行插值处理,并顺序显示各个镜头拍摄得到的画面。
在一些实施例中,在获取到目标镜头参数之后,客户端还可以对初始镜头参数(即当前的改造界面对应的镜头参数)中的每一个参数与目标镜头参数进行线性插值处理,其中,进行线性插值处理的插值公式可以是:
P=(1-DeltaTime)*A+DeltaTime*B
其中,DeltaTime表示开始切换后经过的时长,A表示当前改造界面对应的镜头参数,B表示目标镜头参数,P表示经过插值处理得到的中间镜头参数。如此,可以依次基于初始镜头参数、中间镜头参数、目标镜头参数对虚拟摄像机的姿态进行调整,并调用调整后的虚拟摄像机对虚拟枪械进行拍摄,得到不同姿态的虚拟摄像机拍摄得到的画面(即虚拟摄像机处于不同姿态时镜头拍摄得到的画面),最后顺序显示拍摄得到的多个画面,从而实现平滑地从枪口的改造界面切换至护木的改造界面。
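上述逐参数线性插值生成中间镜头参数的过程可以示意如下(假设DeltaTime已由经过时长t除以总时长T归一化到[0,1],镜头参数以数值元组表示;函数名为说明而设):

```python
# 示意性草图:按插值公式 P = (1-DeltaTime)*A + DeltaTime*B,
# 对两组镜头参数(如 Rotation、Offset、CameraDis、FOV 的各分量)逐项线性插值。

def lerp_lens(a, b, delta_time):
    """对镜头参数A、B逐分量插值,delta_time取值范围为[0,1]。"""
    return tuple((1 - delta_time) * x + delta_time * y for x, y in zip(a, b))

def intermediate_frames(a, b, steps):
    """在A、B之间等间隔生成 steps 个中间镜头参数,用于平滑过渡画面。"""
    return [lerp_lens(a, b, (i + 1) / (steps + 1)) for i in range(steps)]
```

中间镜头参数的数量(steps)可以是固定值,也可以与显示虚拟场景时的帧率成正比例,以保证切换动画足够平滑。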
本申请实施例提供的虚拟道具的改造处理方法,根据虚拟枪械上各个部件之间的相对位置,通过不同方向的滑屏操作或者直接点击模型的方式,直接从一个部件的改造界面快捷跳转至另一个部件的改造界面,而不需要反复返回到上一层级(例如整枪界面)去选择新的改造部件,提高了虚拟枪械的改造效率,进而提升了用户的游戏体验。
下面继续说明本申请实施例提供的虚拟道具的改造处理装置555的实施为软件模块的示例性结构,在一些实施例中,如图2所示,存储在存储器550的虚拟道具的改造处理装置555中的软件模块可以包括:显示模块5551和切换模块5552。
显示模块5551,配置为显示针对虚拟场景中的虚拟道具的处理入口;显示模块5551,还配置为响应于针对处理入口的触发操作,显示第一处理界面,其中,第一处理界面至少包括处理控件;显示模块5551,还配置为响应于针对处理控件的触发操作,显示处理后的虚拟道具,以替代显示处理前的虚拟道具;切换模块5552,配置为响应于界面跳转触发操作,从显示第一处理界面切换到显示不同于第一处理界面的第二处理界面。
在一些实施例中,第一处理界面还包括虚拟道具的第一部件;显示模块5551,还配置为响应于针对处理控件的触发操作,显示处理后的第一部件,以替代显示处理前的第一部件,其中,处理后的虚拟道具中除第一部件之外的部件未显示或至少部分显示。
在一些实施例中,第一处理界面还包括虚拟道具的第二部件,界面跳转触发操作是针对第二部件的触发操作;切换模块5552,还配置为响应于针对第一处理界面中的第二部件的触发操作,从显示第一处理界面切换到显示第二处理界面。
在一些实施例中,第一处理界面还包括与至少一个方向分别对应的浏览控件,界面跳转触发操作是针对浏览控件的触发操作;切换模块5552,还配置为响应于在第一处理界面中针对第一方向的浏览控件的触发操作,从显示第一处理界面切换到显示第二处理界面,其中,第二部件相对于第一部件的分布方向为第一方向的反方向,且是在反方向上与第一部件距离最近的部件。
在一些实施例中,界面跳转触发操作是滑动操作;切换模块5552,还配置为响应于在第一处理界面中的滑动操作,且滑动操作的滑动方向位于第一部件的第一方向区间,从显示第一处理界面切换到显示第二处理界面,其中,第一方向区间的反向区间中分布有第二部件,且第二部件与第一部件的距离,与滑动操作的滑动距离成正比例。
在一些实施例中,界面跳转触发操作是滑动操作;切换模块5552,还配置为响应于在第一处理界面中的滑动操作,且滑动操作的滑动方向位于第一部件的第一方向区间,从显示第一处理界面切换到显示第二处理界面,其中,第一方向区间的反向区间中分布有第二部件,且第二部件是在反向区间中与第一部件距离最近的部件。
在一些实施例中,虚拟道具的处理装置555还包括获取模块5553,配置为获取针对第一处理界面配置的第一滑动参数,其中,第一滑动参数包括第一部件的至少一个方向区间,至少一个方向区间包括第一方向区间,且每个方向区间的反向区间中分布有虚拟道具的一个部件;以及配置为响应于在第一处理界面中的滑动操作,获取滑动操作的角度值;切换模块5552,还配置为响应于滑动操作的角度值位于至少一个方向区间中的第一方向区间,从显示第一处理界面切换到显示第二处理界面。
在一些实施例中,获取模块5553,还配置为获取滑动操作的起始点和结束点;虚拟道具的处理装置555还包括确定模块5554和点乘模块5555,其中,确定模块5554,配置为基于起始点和结束点,确定滑动操作的滑动方向;点乘模块5555,配置为将滑动方向和基准方向进行点乘处理,得到滑动操作的角度值。
在一些实施例中,虚拟道具的处理装置555还包括控制模块5556,配置为响应于滑动操作的角度值未位于至少一个角度区间中的任意一个角度区间,控制虚拟道具在第一处理界面中旋转,其中,旋转的角度为固定值,或者,旋转的角度与滑动操作的滑动距离成正比例。
在一些实施例中,虚拟道具的处理装置555还包括检测模块5557和转入模块5558,其中,检测模块5557,配置为获取模块5553在获取滑动操作的角度值之前,基于第一滑动参数对滑动操作进行检测,得到检测结果;转入模块5558,配置为响应于检测结果表征滑动操作为界面跳转触发操作,转入执行获取滑动操作的角度值的处理;控制模块5556,还配置为响应于检测结果表征滑动操作为虚拟道具查看操作,控制虚拟道具在第一处理界面中进行旋转,且旋转的角度为固定值,或者,旋转的角度与滑动操作的滑动距离成正比例。
在一些实施例中,第一滑动参数还包括以下参数至少之一:设定的滑动时长、设定的滑动距离、设定的压力参数、设定的触点数量;检测模块5557,还配置为执行以下处理至少之一:获取滑动操作的滑动时长,将滑动时长与设定的滑动时长进行比较,得到比较结果,其中,当比较结果表征滑动时长满足时长条件时,确定滑动操作为界面跳转触发操作,当比较结果表征滑动时长不满足时长条件时,确定滑动操作为虚拟道具查看操作;获取滑动操作的滑动距离,将滑动距离与设定的滑动距离进行比较,得到比较结果,其中,当比较结果表征滑动距离满足距离条件时,确定滑动操作为界面跳转触发操作,当比较结果表征滑动距离不满足距离条件时,确定滑动操作为虚拟道具查看操作;获取滑动操作的压力参数,将压力参数与设定的压力参数进行比较,得到比较结果,其中,当比较结果表征压力参数满足压力条件时,确定滑动操作为界面跳转触发操作,当比较结果表征压力参数不满足压力条件时,确定滑动操作为虚拟道具查看操作;获取滑动操作的触点数量,将触点数量与设定的触点数量进行比较,得到比较结果,其中,当比较结果表征两者一致时,确定滑动操作为界面跳转触发操作,当比较结果表征两者不一致时,确定滑动操作为虚拟道具查看操作。
在一些实施例中,第一处理界面和第二处理界面是通过虚拟摄像机拍摄得到的,虚拟道具的每个部件均配置有与虚拟摄像机对应的镜头参数;获取模块5553,还配置为切换模块5552在从显示第一处理界面切换到显示第二处理界面之前,获取针对第二部件配置的第二镜头参数;虚拟道具的处理装置555还包括调整模块5559、拍摄模块55510和加载模块55511,其中,调整模块5559,配置为基于第二镜头参数调整虚拟场景中的虚拟摄像机的姿态;拍摄模块55510,配置为调用调整后的虚拟摄像机对虚拟道具进行拍摄;加载模块55511,配置为在拍摄得到的画面中加载第二部件的处理控件,得到第二处理界面。
在一些实施例中,镜头参数包括以下参数至少之一:镜头对应的部件、镜头的旋转角度、镜头相对于部件的起点的偏移量、镜头与焦点之间的距离、镜头的视角。
在一些实施例中,获取模块5553,还配置为获取针对第一部件配置的第一镜头参数;虚拟道具的处理装置555还包括插值模块55512和插入模块55513,其中,插值模块55512,配置为基于第一镜头参数和第二镜头参数进行插值处理,得到至少一个中间镜头参数,其中,每个中间镜头参数用于对虚拟摄像机的姿态进行调整,并调用调整后的虚拟摄像机对虚拟道具进行拍摄,得到对应的一个中间界面;插入模块55513,配置为切换模块5552在从显示第一处理界面切换到显示第二处理界面的过程中,插入至少一个中间界面。
在一些实施例中,插值模块55512,还配置为对第二镜头参数与t进行相乘处理,得到第一相乘结果,其中,t为开始切换后经过的时长,且t的取值范围满足:0≤t≤T,T为从第一处理界面切换到第二处理界面的总时长,T为大于0的实数;对T与t的相减结果与第一镜头参数进行相乘处理,得到第二相乘结果;将第一相乘结果与第二相乘结果的求和结果,确定为至少一个中间镜头参数。
在一些实施例中,显示模块5551,还配置为在显示第一处理界面之前,显示虚拟道具查看界面,其中,虚拟道具查看界面包括虚拟道具的多个部件;转入模块5558,还配置为响应于针对虚拟道具查看界面中的第一部件的选择操作,转入执行显示第一处理界面的处理。
在一些实施例中,切换模块5552,还配置为响应于虚拟道具的第三部件满足处理条件,从显示第二处理界面切换到显示第三处理界面,其中,第三处理界面包括第三部件、以及第三部件的处理控件。
在一些实施例中,处理条件包括以下至少之一:第三部件的损耗程度大于或等于损耗程度阈值;获取到第三部件能够使用的新配件。
需要说明的是,本申请实施例装置的描述,与上述方法实施例的描述是类似的,具有同方法实施例相似的有益效果,因此不做赘述。对于本申请实施例提供的虚拟道具的改造处理装置中未尽的技术细节,可以根据图3或图6任一附图的说明而理解。
本申请实施例提供了一种计算机程序产品,该计算机程序产品包括计算机程序或计算机可执行指令,该计算机程序或计算机可执行指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机可执行指令,处理器执行该计算机可执行指令,使得该计算机设备执行本申请实施例上述的虚拟道具的改造处理方法。
本申请实施例提供一种存储有计算机可执行指令的计算机可读存储介质,当计算机可执行指令被处理器执行时,将引起处理器执行本申请实施例提供的虚拟道具的改造处理方法,例如,如图3或图6示出的虚拟道具的改造处理方法。
在一些实施例中,计算机可读存储介质可以是FRAM、ROM、PROM、EPROM、EEPROM、闪存、磁表面存储器、光盘、或CD-ROM等存储器;也可以是包括上述存储器之一或任意组合的各种设备。
在一些实施例中,可执行指令可以采用程序、软件、软件模块、脚本或代码的形式,按任意形式的编程语言(包括编译或解释语言,或者声明性或过程性语言)来编写,并且其可按任意形式部署,包括被部署为独立的程序或者被部署为模块、组件、子例程或者适合在计算环境中使用的其它单元。
作为示例,可执行指令可被部署为在一个电子设备上执行,或者在位于一个地点的多个电子设备上执行,又或者,在分布在多个地点且通过通信网络互连的多个电子设备上执行。
以上所述,仅为本申请的实施例而已,并非用于限定本申请的保护范围。凡在本申请的精神和范围之内所作的任何修改、等同替换和改进等,均包含在本申请的保护范围之内。

Claims (23)

  1. 一种虚拟道具的处理方法,由电子设备执行,所述方法包括:
    显示针对虚拟场景中的虚拟道具的处理入口;
    响应于针对所述处理入口的触发操作,显示第一处理界面,其中,所述第一处理界面至少包括处理控件;
    响应于针对所述处理控件的触发操作,显示处理后的所述虚拟道具,以替代显示处理前的所述虚拟道具;
    响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面。
  2. 根据权利要求1所述的方法,其中,
    所述第一处理界面还包括所述虚拟道具的第一部件;
    所述响应于针对所述处理控件的触发操作,显示处理后的所述虚拟道具,以替代显示处理前的所述虚拟道具,包括:
    响应于针对所述处理控件的触发操作,显示处理后的所述第一部件,以替代显示处理前的所述第一部件,其中,处理后的所述虚拟道具中除所述第一部件之外的部件未显示或至少部分显示。
  3. 根据权利要求1或2所述的方法,其中,
    所述第一处理界面还包括所述虚拟道具的第二部件,所述界面跳转触发操作是针对所述第二部件的触发操作;
    所述响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,包括:
    响应于针对所述第一处理界面中的所述第二部件的触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面。
  4. 根据权利要求1或2所述的方法,其中,
    所述第一处理界面还包括与至少一个方向分别对应的浏览控件,所述界面跳转触发操作是针对所述浏览控件的触发操作;
    所述响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,包括:
    响应于在所述第一处理界面中针对第一方向的所述浏览控件的触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,其中,所述第一处理界面包括第二部件,所述第二部件相对于所述第一部件的分布方向为所述第一方向的反方向,且是在所述反方向上与所述第一部件距离最近的部件。
  5. 根据权利要求1或2所述的方法,其中,
    所述界面跳转触发操作是滑动操作;
    所述响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,包括:
    响应于在所述第一处理界面中的所述滑动操作,且所述滑动操作的滑动方向位于所述第一部件的第一方向区间,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,其中,所述第一方向区间的反向区间中分布有第二部件,且所述第二部件与所述第一部件的距离,与所述滑动操作的滑动距离成正比例。
  6. 根据权利要求1至3任一项所述的方法,其中,
    所述界面跳转触发操作是滑动操作;
    所述响应于界面跳转触发操作,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,包括:
    响应于在所述第一处理界面中的所述滑动操作,且所述滑动操作的滑动方向位于所述第一部件的第一方向区间,从显示所述第一处理界面切换到显示不同于所述第一处理界面的第二处理界面,其中,所述第一方向区间的反向区间中分布有第二部件,且所述第二部件是在所述反向区间中与所述第一部件距离最近的部件。
  7. The method according to any one of claims 3 to 6, wherein
    the second processing interface comprises the second part and a processing control of the second part.
  8. The method according to claim 6, wherein the switching, in response to the sliding operation in the first processing interface with the sliding direction of the sliding operation falling within the first direction interval of the first part, from displaying the first processing interface to displaying a second processing interface different from the first processing interface comprises:
    obtaining a first sliding parameter configured for the first processing interface, wherein the first sliding parameter comprises at least one direction interval of the first part, the at least one direction interval comprises the first direction interval, and one part of the virtual prop is distributed in the interval opposite to each direction interval;
    in response to the sliding operation in the first processing interface, obtaining an angle value of the sliding operation; and
    in response to the angle value of the sliding operation falling within the first direction interval among the at least one direction interval, switching from displaying the first processing interface to displaying a second processing interface different from the first processing interface.
  9. The method according to claim 8, wherein the obtaining an angle value of the sliding operation comprises:
    obtaining a start point and an end point of the sliding operation;
    determining a sliding direction of the sliding operation based on the start point and the end point; and
    performing dot-product processing on the sliding direction and a reference direction to obtain the angle value of the sliding operation.
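Claims 8 and 9 describe computing an angle value from the dot product of the sliding direction and a reference direction, then testing it against configured direction intervals. As an illustrative sketch only (not part of the claimed method; the function names, degree units, and the (1, 0) reference direction are assumptions), this could look like:

```python
import math

def sliding_angle(start, end, reference=(1.0, 0.0)):
    """Angle (degrees) between the sliding direction and a unit reference
    direction, obtained via the dot product of normalized vectors."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    if length == 0:
        raise ValueError("start and end points coincide")
    ux, uy = dx / length, dy / length          # normalized sliding direction
    cos_angle = ux * reference[0] + uy * reference[1]
    cos_angle = max(-1.0, min(1.0, cos_angle))  # guard against rounding drift
    return math.degrees(math.acos(cos_angle))

def in_direction_interval(angle, interval):
    """Check whether an angle value falls within a (low, high) interval
    expressed in degrees."""
    low, high = interval
    return low <= angle <= high
```

For a horizontal swipe from (0, 0) to (1, 0) the angle is 0°, and a vertical swipe yields 90°, which can then be matched against each configured direction interval of the first part.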
  10. The method according to claim 8 or 9, wherein the method further comprises:
    in response to the angle value of the sliding operation not falling within any one of the at least one direction interval, controlling the virtual prop to rotate in the first processing interface, wherein the angle of the rotation is a fixed value, or the angle of the rotation is directly proportional to the sliding distance of the sliding operation.
  11. The method according to any one of claims 8 to 10, wherein before the obtaining an angle value of the sliding operation, the method further comprises:
    detecting the sliding operation based on the first sliding parameter to obtain a detection result;
    in response to the detection result indicating that the sliding operation is the interface jump trigger operation, proceeding to the processing of obtaining the angle value of the sliding operation; and
    in response to the detection result indicating that the sliding operation is a virtual prop viewing operation, controlling the virtual prop to rotate in the first processing interface, wherein the angle of the rotation is a fixed value, or the angle of the rotation is directly proportional to the sliding distance of the sliding operation.
  12. The method according to claim 11, wherein
    the first sliding parameter further comprises at least one of the following parameters: a set sliding duration, a set sliding distance, a set pressure parameter, and a set number of touch points; and
    the detecting the sliding operation based on the first sliding parameter to obtain a detection result comprises:
    performing at least one of the following processing:
    obtaining a sliding duration of the sliding operation, and comparing the sliding duration with the set sliding duration to obtain a comparison result, wherein when the comparison result indicates that the sliding duration satisfies a duration condition, the sliding operation is determined to be an interface jump trigger operation, and when the comparison result indicates that the sliding duration does not satisfy the duration condition, the sliding operation is determined to be a virtual prop viewing operation;
    obtaining a sliding distance of the sliding operation, and comparing the sliding distance with the set sliding distance to obtain a comparison result, wherein when the comparison result indicates that the sliding distance satisfies a distance condition, the sliding operation is determined to be an interface jump trigger operation, and when the comparison result indicates that the sliding distance does not satisfy the distance condition, the sliding operation is determined to be a virtual prop viewing operation;
    obtaining a pressure parameter of the sliding operation, and comparing the pressure parameter with the set pressure parameter to obtain a comparison result, wherein when the comparison result indicates that the pressure parameter satisfies a pressure condition, the sliding operation is determined to be an interface jump trigger operation, and when the comparison result indicates that the pressure parameter does not satisfy the pressure condition, the sliding operation is determined to be a virtual prop viewing operation; and
    obtaining a number of touch points of the sliding operation, and comparing the number of touch points with the set number of touch points to obtain a comparison result, wherein when the comparison result indicates that the two are consistent, the sliding operation is determined to be an interface jump trigger operation, and when the comparison result indicates that the two are inconsistent, the sliding operation is determined to be a virtual prop viewing operation.
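Claims 11 and 12 distinguish an interface jump trigger operation from a virtual prop viewing operation by comparing the sliding duration, distance, pressure, and touch-point count against set values. A minimal sketch, under the assumption that "satisfies the condition" means meeting or exceeding the set value (the dictionary keys and threshold semantics are invented for illustration):

```python
def classify_sliding_operation(op, params):
    """Classify a sliding operation as an interface-jump trigger ("jump")
    or a virtual-prop viewing operation ("view") by checking it against
    configured sliding parameters. Each threshold is optional; a missing
    key in `params` skips that check."""
    # Duration condition: the press must last at least the set duration.
    if "duration" in params and op["duration"] < params["duration"]:
        return "view"
    # Distance condition: the swipe must travel at least the set distance.
    if "distance" in params and op["distance"] < params["distance"]:
        return "view"
    # Pressure condition: the touch must be at least as firm as the set value.
    if "pressure" in params and op["pressure"] < params["pressure"]:
        return "view"
    # Touch-point condition: the number of contacts must match exactly.
    if "touches" in params and op["touches"] != params["touches"]:
        return "view"
    return "jump"
```

An operation that fails any configured check falls back to the viewing behavior (rotating the prop), matching the two-way branch described in claim 11.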
  13. The method according to any one of claims 1 to 12, wherein
    the first processing interface and the second processing interface are obtained by shooting with a virtual camera, and each part of the virtual prop is configured with a lens parameter corresponding to the virtual camera; and
    before the switching from displaying the first processing interface to displaying a second processing interface different from the first processing interface, the method further comprises:
    obtaining a second lens parameter configured for the second part;
    adjusting a pose of the virtual camera in the virtual scene based on the second lens parameter, and invoking the adjusted virtual camera to shoot the virtual prop; and
    loading the processing control of the second part into the captured picture to obtain the second processing interface.
  14. The method according to claim 13, wherein
    the lens parameter comprises at least one of the following parameters: the part corresponding to the lens, the rotation angle of the lens, the offset of the lens relative to the origin of the part, the distance between the lens and the focal point, and the field of view of the lens.
  15. The method according to claim 13 or 14, wherein the method further comprises:
    obtaining a first lens parameter configured for the first part;
    performing interpolation processing based on the first lens parameter and the second lens parameter to obtain at least one intermediate lens parameter, wherein each intermediate lens parameter is used to adjust the pose of the virtual camera, and the adjusted virtual camera is invoked to shoot the virtual prop to obtain a corresponding intermediate interface; and
    inserting at least one of the intermediate interfaces during the switch from displaying the first processing interface to displaying the second processing interface.
  16. The method according to claim 15, wherein the performing interpolation processing based on the first lens parameter and the second lens parameter to obtain at least one intermediate lens parameter comprises:
    multiplying the second lens parameter by t to obtain a first multiplication result, wherein t is the time elapsed after the switch starts, the value of t satisfies 0 ≤ t ≤ T, T is the total duration of switching from the first processing interface to the second processing interface, and T is a real number greater than 0;
    multiplying the difference between T and t by the first lens parameter to obtain a second multiplication result; and
    determining the sum of the first multiplication result and the second multiplication result as the at least one intermediate lens parameter.
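Claim 16 forms each intermediate lens parameter as the sum t·P2 + (T − t)·P1; dividing that sum by T gives the familiar normalized linear interpolation, so that t = 0 reproduces the first lens parameter and t = T the second. A sketch treating lens parameters as numeric tuples (the per-component treatment and the normalization by T are assumptions layered on the literal claim wording):

```python
def intermediate_lens_params(p1, p2, T, steps):
    """Generate intermediate lens parameters between a first lens
    parameter p1 and a second lens parameter p2 over a switch of total
    duration T, sampled at `steps` evenly spaced times 0 < t < T.

    Each frame is (t * p2 + (T - t) * p1) / T, i.e. claim 16's sum of
    multiplication results, normalized so the endpoints are p1 and p2.
    """
    if T <= 0:
        raise ValueError("T must be a real number greater than 0")
    frames = []
    for i in range(1, steps + 1):
        t = T * i / (steps + 1)  # intermediate times, excluding endpoints
        frames.append(tuple((t * b + (T - t) * a) / T
                            for a, b in zip(p1, p2)))
    return frames
```

Each returned tuple would drive one adjustment of the virtual camera pose, producing one intermediate interface inserted during the switch.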
  17. The method according to any one of claims 1 to 16, wherein before the displaying a first processing interface, the method further comprises:
    displaying a virtual prop viewing interface, wherein the virtual prop viewing interface comprises multiple parts of the virtual prop; and
    in response to a selection operation on the first part in the virtual prop viewing interface, proceeding to the processing of displaying the first processing interface.
  18. The method according to any one of claims 1 to 16, wherein the method further comprises:
    in response to a third part of the virtual prop satisfying a processing condition, switching from displaying the second processing interface to displaying a third processing interface different from the second processing interface, wherein the third processing interface comprises the third part and a processing control of the third part.
  19. The method according to claim 18, wherein
    the processing condition comprises at least one of the following:
    the degree of wear of the third part is greater than or equal to a wear degree threshold; and
    a new accessory usable by the third part is obtained.
  20. An apparatus for processing a virtual prop, the apparatus comprising:
    a display module, configured to display a processing entry for a virtual prop in a virtual scene;
    the display module being further configured to display, in response to a trigger operation on the processing entry, a first processing interface, wherein the first processing interface comprises at least a processing control;
    the display module being further configured to display, in response to a trigger operation on the processing control, the processed virtual prop; and
    a switching module, configured to switch, in response to an interface jump trigger operation, from displaying the first processing interface to a second processing interface different from the first processing interface.
  21. An electronic device, comprising:
    a memory, configured to store executable instructions; and
    a processor, configured to implement the method for processing a virtual prop according to any one of claims 1 to 19 when executing the executable instructions stored in the memory.
  22. A computer-readable storage medium storing computer-executable instructions, wherein the computer-executable instructions, when executed by a processor, implement the method for processing a virtual prop according to any one of claims 1 to 19.
  23. A computer program product, comprising a computer program or computer-executable instructions, wherein the computer program or computer-executable instructions, when executed by a processor, implement the method for processing a virtual prop according to any one of claims 1 to 19.
PCT/CN2023/102688 2022-08-12 2023-06-27 Virtual prop processing method and apparatus, electronic device, storage medium, and program product WO2024032176A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210971198.2 2022-08-12
CN202210971198.2A CN117618919A (zh) 2022-08-12 2022-08-12 Modification processing method and apparatus for a virtual prop, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
WO2024032176A1 true WO2024032176A1 (zh) 2024-02-15

Family

ID=89850612

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/102688 WO2024032176A1 (zh) 2022-08-12 2023-06-27 Virtual prop processing method and apparatus, electronic device, storage medium, and program product

Country Status (2)

Country Link
CN (1) CN117618919A (zh)
WO (1) WO2024032176A1 (zh)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108459811A (zh) * 2018-01-09 2018-08-28 网易(杭州)网络有限公司 Virtual prop processing method and apparatus, electronic device, and storage medium
CN110075522A (zh) * 2019-06-04 2019-08-02 网易(杭州)网络有限公司 Method, apparatus, and terminal for controlling a virtual weapon in a shooting game
CN112156458A (zh) * 2020-09-22 2021-01-01 网易(杭州)网络有限公司 Display control method and apparatus, storage medium, and electronic device
CN112330823A (zh) * 2020-11-05 2021-02-05 腾讯科技(深圳)有限公司 Virtual prop display method, apparatus, device, and readable storage medium
CA3133467A1 (en) * 2020-04-23 2021-10-23 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, device, and storage medium
CN113546404A (zh) * 2021-07-30 2021-10-26 网易(杭州)网络有限公司 Method and apparatus for controlling a virtual prop in a game, and electronic terminal
CN114504817A (zh) * 2022-01-04 2022-05-17 腾讯科技(深圳)有限公司 Virtual shooting prop configuration method and apparatus, storage medium, and electronic device
WO2022100712A1 (zh) * 2020-11-16 2022-05-19 Oppo广东移动通信有限公司 Method and system for displaying a virtual prop in a real-environment picture, and storage medium
US20220155953A1 (en) * 2020-11-19 2022-05-19 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, device, storage medium, and computer program product


Also Published As

Publication number Publication date
CN117618919A (zh) 2024-03-01

Similar Documents

Publication Publication Date Title
CN112691377B (zh) Virtual character control method and apparatus, electronic device, and storage medium
TWI818343B (zh) Adaptive display method and apparatus for a virtual scene, electronic device, storage medium, and computer program product
CN112402960B (zh) State switching method, apparatus, device, and storage medium in a virtual scene
JP7447296B2 (ja) Interactive processing method and apparatus for a virtual prop, electronic device, and computer program
WO2022267512A1 (zh) Information sending method, information sending apparatus, computer-readable medium, and device
CN112569599B (zh) Method and apparatus for controlling a virtual object in a virtual scene, and electronic device
CN112306351B (zh) Virtual key position adjustment method, apparatus, device, and storage medium
CN112416196B (zh) Virtual object control method, apparatus, device, and computer-readable storage medium
US20230321543A1 Control method and apparatus of virtual skill, device, storage medium and program product
CN112057860B (zh) Method, apparatus, device, and storage medium for activating an operation control in a virtual scene
US20230078440A1 Virtual object control method and apparatus, device, storage medium, and program product
WO2023109288A1 (zh) Control method, apparatus, device, storage medium, and program product for an opening operation in a virtual scene
CN112138385B (zh) Aiming method and apparatus for a virtual shooting prop, electronic device, and storage medium
CN114344896A (zh) Virtual-scene-based co-shooting processing method, apparatus, device, and storage medium
US20230330525A1 Motion processing method and apparatus in virtual scene, device, storage medium, and program product
WO2024032176A1 (zh) Virtual prop processing method and apparatus, electronic device, storage medium, and program product
WO2022156629A1 (zh) Virtual object control method and apparatus, electronic device, storage medium, and computer program product
CN113018862B (zh) Virtual object control method and apparatus, electronic device, and storage medium
CN114146414A (zh) Virtual skill control method, apparatus, device, storage medium, and program product
CN113599829B (zh) Virtual object selection method, apparatus, terminal, and storage medium
US11995311B2 Adaptive display method and apparatus for virtual scene, electronic device, storage medium, and computer program product
WO2024060924A1 (zh) Interactive processing method and apparatus for a virtual scene, electronic device, and storage medium
WO2024021792A1 (zh) Information processing method, apparatus, device, storage medium, and program product for a virtual scene
CN114191817A (zh) Shooting control method and apparatus for a virtual character, electronic device, and storage medium
CN117398691A (zh) Virtual object control method and apparatus, storage medium, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23851403

Country of ref document: EP

Kind code of ref document: A1