WO2024021750A1 - Interaction method and apparatus in a virtual scene, electronic device, computer-readable storage medium, and computer program product - Google Patents
Interaction method and apparatus in a virtual scene, electronic device, computer-readable storage medium, and computer program product
- Publication number: WO2024021750A1 (PCT/CN2023/092695)
- Authority: WO (WIPO, PCT)
- Prior art keywords: virtual, natural, negative impact, scene, target
- Prior art date
Classifications
- A—HUMAN NECESSITIES; A63—SPORTS; GAMES; AMUSEMENTS; A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/69—Generating or modifying game content before or while executing the game program by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
- A63F13/5378—Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for displaying an additional top view, e.g. radar screens or maps
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/58—Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
Definitions
- the present application relates to the technical fields of virtualization and human-computer interaction, and in particular to an interaction method, device, electronic device, computer-readable storage medium and computer program product in a virtual scene.
- Embodiments of the present application provide an interaction method, device, electronic device, computer-readable storage medium, and computer program product in a virtual scene, which can improve the efficiency of human-computer interaction and the utilization of hardware processing resources.
- Embodiments of the present application provide an interaction method in a virtual scene, which is executed by an electronic device and includes:
- displaying, in the virtual scene, a virtual object and virtual natural elements belonging to a virtual natural phenomenon, wherein the virtual natural elements are used to cause a negative impact on the environment where the virtual natural elements are located;
- when the virtual object is within the sensing area of a virtual natural element, controlling the virtual natural element to transform into an interactive target object, wherein the negative impact area of the virtual natural element includes the sensing area;
- when an interaction instruction for the target object is received, controlling the virtual object and the target object to interact in the virtual scene.
- An embodiment of the present application provides an interaction device in a virtual scene, including:
- a display module configured to display a virtual object and virtual natural elements belonging to a virtual natural phenomenon in a virtual scene, wherein the virtual natural elements are used to cause a negative impact on the environment in which the virtual natural elements are located;
- a first control module configured to control a virtual natural element to transform into an interactive target object when the virtual object is within the sensing area of the virtual natural element, wherein the negative impact area of the virtual natural element includes the sensing area;
- a second control module configured to control the virtual object and the target object to interact in the virtual scene when an interaction instruction for the target object is received.
- An embodiment of the present application provides an electronic device, including:
- a memory configured to store computer-executable instructions; and
- a processor configured to execute the computer-executable instructions stored in the memory to implement the interaction method in a virtual scene provided by the embodiments of the present application.
- Embodiments of the present application provide a computer-readable storage medium that stores computer-executable instructions.
- When the computer-executable instructions are executed by a processor, the interaction method in the virtual scene provided by the embodiments of the present application is implemented.
- Embodiments of the present application provide a computer program product or computer program.
- the computer program product or computer program includes computer-executable instructions, and the computer-executable instructions are stored in a computer-readable storage medium.
- the processor of the electronic device reads the computer-executable instructions from the computer-readable storage medium, and the processor executes the computer-executable instructions, so that the electronic device executes the interaction method in the virtual scene provided by the embodiment of the present application.
- In the embodiments of the present application, the virtual scene contains a virtual object and virtual natural elements that have a negative impact on the virtual environment of the virtual scene.
- When the virtual object enters the negative impact area of a virtual natural element, the virtual natural element is converted into a target object with which the virtual object can interact, allowing the virtual object to interact with the target object.
- In this way, when the virtual object is close to the virtual natural element, the virtual natural element is converted into an interactive object and the interaction process is carried out, which enriches the interactive objects available during free exploration and shortens exploration time; that is, the time required to complete the interaction process and the duration of human-computer interaction operations are reduced, thereby improving the efficiency of human-computer interaction and the utilization of the hardware resources of the electronic device.
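- The following Python sketch, which is not part of the original filing, illustrates the flow summarized above under assumed names: an element transforms into an interactive target object once the virtual object enters its sensing area, and the interaction is carried out when an interaction instruction is received.
```python
import math
from dataclasses import dataclass

# Illustrative sketch of the claimed flow; all names and fields are hypothetical.
@dataclass
class NaturalElement:
    x: float
    y: float
    sensing_radius: float          # sensing area (circle around the element)
    negative_radius: float         # negative impact area; contains the sensing area
    is_target_object: bool = False # True once transformed into an interactive monster

def in_sensing_area(element: NaturalElement, px: float, py: float) -> bool:
    # The virtual object is inside the sensing area when its distance to the
    # element is no greater than the sensing radius.
    return math.hypot(px - element.x, py - element.y) <= element.sensing_radius

def update(element: NaturalElement, px: float, py: float, interaction_instruction: bool) -> str:
    # Step 102: transform the element into an interactive target object
    # once the virtual object enters the sensing area.
    if not element.is_target_object and in_sensing_area(element, px, py):
        element.is_target_object = True
        return "transformed into target object"
    # Step 103: on an interaction instruction, the virtual object and the
    # target object interact in the virtual scene.
    if element.is_target_object and interaction_instruction:
        return "virtual object interacts with target object"
    return "no change"
```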
- Figure 1 is a schematic architectural diagram of an interactive system 100 in a virtual scene provided by an embodiment of the present application
- Figure 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- Figure 3 is a schematic flowchart of an interaction method in a virtual scene provided by an embodiment of the present application
- Figure 4 is a schematic diagram of the relative positional relationship between virtual objects and virtual natural elements provided by the embodiment of the present application.
- Figure 5 is a schematic diagram of the relative positional relationship between virtual objects and virtual natural elements provided by the embodiment of the present application.
- Figure 6 is a schematic diagram of a virtual object traveling to a target virtual natural element provided by an embodiment of the present application
- Figure 7 is a schematic diagram of the negative impact of virtual natural elements on virtual objects provided by the embodiment of the present application.
- Figure 8 is a superimposed schematic diagram of the negative impact areas of virtual natural elements provided by the embodiment of the present application.
- Figure 9 is a schematic diagram of the relationship between the second negative impact value and the duration provided by the embodiment of the present application.
- Figure 10 is a schematic diagram of advanced prompt information of global influence sources provided by an embodiment of the present application.
- Figure 11 is a schematic diagram of a virtual natural element provided by an embodiment of the present application.
- Figure 12 is a schematic diagram of a target object provided by an embodiment of the present application.
- Figure 13 is a schematic diagram of the first object provided by the embodiment of the present application.
- Figure 14 is a schematic diagram of the process of converting target objects into virtual natural elements provided by the embodiment of the present application.
- Figure 15 is a schematic diagram of advanced prompt information of virtual natural elements provided by an embodiment of the present application.
- Figure 16 is a flow chart of the switching logic between pollution sources and key monsters provided by the embodiment of the present application.
- The terms "first", "second", and "third" are only used to distinguish similar objects and do not represent a specific ordering of objects. It is understandable that, where appropriate, "first", "second", and "third" may be interchanged in a specific order or sequence, so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein.
- Open world refers to a game virtual scene where the battle scenes in the game are completely free and open. In the open world, players can freely explore in any direction, and the distance between the boundaries in each direction is very large.
- "In response to" is used to indicate the condition or state on which a performed operation depends.
- When the condition or state on which it depends is met, the one or more operations performed may be executed in real time or with a set delay; unless otherwise specified, there is no restriction on the order in which multiple operations are performed.
- Virtual scene is a virtual scene displayed (or provided) when the application runs on the terminal.
- the virtual scene can be a simulation environment of the real world, a semi-simulation and semi-fictitious virtual environment, or a purely fictitious virtual environment.
- the virtual scene can be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene.
- the virtual scene can include the sky, land, ocean, etc.
- the land can include environmental elements such as deserts and cities.
- the user can control virtual objects to perform activities in the virtual scene.
- the activities include, but are not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
- the virtual scene can be displayed from a first-person perspective (for example, the user plays the virtual object in the game from his or her own point of view), from a third-person perspective (for example, the user plays the game by following the virtual object in the game), or from a bird's-eye view; the above perspectives can be switched at will.
- Virtual objects: images of various people and objects that can interact in the virtual scene, or movable objects in the virtual scene.
- the movable object may be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil barrels, walls, stones, etc. displayed in the virtual scene.
- the virtual object may be a virtual avatar representing the user in the virtual scene.
- the virtual scene may include multiple virtual objects. Each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
- the virtual object can be a user character controlled through operations on the client, an artificial intelligence (AI) configured in the virtual scene battle through training, or a non-player character (NPC) configured in the virtual scene interaction.
- the number of virtual objects participating in the interaction in the virtual scene can be set in advance, or can be dynamically determined based on the number of clients participating in the interaction.
- embodiments of the present application provide an interaction method, device, electronic device, computer-readable storage medium, and computer program product in a virtual scene.
- By placing several player-selectable virtual natural elements in the map, players who feel confused about their target are guided to challenge the virtual natural element closest to them on the large map.
- When a player gets close, the virtual natural element transforms into a monster that fights the player.
- Players receive rich rewards as long as the challenge succeeds, which enriches the in-game experience while avoiding player churn.
- Figure 1 is a schematic architectural diagram of an interactive system 100 in a virtual scene provided by an embodiment of the present application.
- the interaction in the virtual scene may take place in an application scenario based on a game application (APP).
- As shown in FIG. 1, when the player runs the game APP, virtual natural elements such as tornadoes are displayed in the virtual scene; when the player approaches a tornado, the tornado transforms into an interactive monster, allowing the player to use projectile props or skills to kill the monster. The terminal (terminal 400 is shown as an example) is provided with a client 401 of the interaction in the virtual scene (i.e., the game APP), and the terminal 400 is connected to the server 200 through the network 300.
- the network 300 may be a wide area network or a local area network, or a combination of the two, using wireless or wired links to implement data transmission.
- the terminal 400 is configured to send a display request of the virtual scene to the server 200 in response to a triggering operation for a virtual scene containing virtual objects and virtual natural elements belonging to virtual natural phenomena;
- the server 200 is configured to, based on the received virtual scene display request, send a virtual scene including virtual objects and virtual natural elements belonging to virtual natural phenomena to the terminal 400;
- the terminal 400 is further configured to: receive the virtual scene including the virtual object and the virtual natural elements belonging to the virtual natural phenomenon; present the virtual scene, and display the virtual object and the virtual natural elements belonging to the virtual natural phenomenon in the virtual scene, wherein the virtual natural elements are used to cause a negative impact on the environment where the virtual natural elements are located; when the virtual object is within the sensing area of a virtual natural element, control the virtual natural element to transform into an interactive target object, wherein the negative impact area of the virtual natural element includes the sensing area; and, when an interaction instruction for the target object is received, control the virtual object to interact with the target object in the virtual scene.
- the server 200 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, Content Delivery Network (CDN), big data, and artificial intelligence platforms.
- the terminal 400 may be, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer, a set-top box, an intelligent voice interaction device, a smart home appliance, a virtual reality device, a vehicle-mounted terminal, an aircraft, or a mobile device (for example, a mobile phone, a portable music player, a personal digital assistant, a dedicated messaging device, a portable gaming device, a smart speaker, or a smart watch).
- the terminal device and the server can be connected directly or indirectly through wired or wireless communication methods, which are not limited in the embodiments of this application.
- FIG. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- the electronic device shown in Figure 2 includes: at least one processor 410, a memory 450, at least one network interface 420, and a user interface 430.
- the various components in terminal 400 are coupled together by bus system 440.
- bus system 440 is used to implement connection communication between these components.
- In addition to a data bus, the bus system 440 also includes a power bus, a control bus, and a status signal bus.
- However, for clarity of illustration, the various buses are labeled as bus system 440 in FIG. 2.
- the processor 410 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a digital signal processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
- User interface 430 includes one or more output devices 431 that enable the display of media content, including one or more speakers and/or one or more visual displays.
- User interface 430 also includes one or more input devices 432, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, and other input buttons and controls.
- Memory 450 may be removable, non-removable, or a combination thereof.
- Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, etc.
- Memory 450 optionally includes one or more storage devices physically located remotely from processor 410 .
- Memory 450 includes volatile memory or non-volatile memory, and may include both volatile and non-volatile memory.
- the non-volatile memory may be a read-only memory (ROM), and the volatile memory may be a random-access memory (RAM).
- the memory 450 described in the embodiments of this application is intended to include any suitable type of memory.
- the memory 450 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplarily described below.
- the operating system 451 includes system programs configured to process various basic system services and perform hardware-related tasks, such as framework layer, core library layer, driver layer, etc., configured to implement various basic services and process hardware-based tasks;
- a network communication module 452 configured to reach other electronic devices via one or more (wired or wireless) network interfaces 420.
- Exemplary network interfaces 420 include: Bluetooth, Wireless Fidelity (Wi-Fi), Universal Serial Bus (USB), etc.;
- Presentation module 453, configured to enable the display of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 431 (e.g., display screens, speakers, etc.) associated with the user interface 430;
- Input processing module 454 is configured to detect one or more user inputs or interactions from input device 432 and translate the detected inputs or interactions.
- the device provided by the embodiment of the present application can be implemented in software.
- Figure 2 shows the interaction device 455 in the virtual scene stored in the memory 450, which can be software in the form of programs, plug-ins, etc., and includes the following software modules: a display module 4551, a first control module 4552, and a second control module 4553. These modules are logical, so they can be combined or further split arbitrarily according to the functions implemented. The functions of each module are explained below.
- the device provided by the embodiment of the present application can be implemented in hardware.
- As an example, the interaction device in the virtual scene provided by the embodiments of the present application can be a processor in the form of a hardware decoding processor, which is programmed to execute the interaction method in the virtual scene provided by the embodiments of the present application.
- For example, the processor in the form of a hardware decoding processor may use one or more Application Specific Integrated Circuits (ASIC), DSPs, Programmable Logic Devices (PLD), Complex Programmable Logic Devices (CPLD), Field-Programmable Gate Arrays (FPGA), or other electronic components.
- a terminal or server can implement the interaction method in the virtual scene provided by the embodiments of this application by running a computer program.
- the computer program can be a native program or a software module in the operating system; it can be a native application (APP), that is, a program that needs to be installed in the operating system to run, such as an instant messaging APP or a web browser APP; it can also be a mini program, that is, a program that only needs to be downloaded into the browser environment to run; or it can be a mini program that can be embedded in any APP.
- the computer program described above can be any form of application, module or plug-in.
- the interaction method in the virtual scene provided by the embodiments of the present application can be implemented by the terminal or the server alone, or by the terminal and the server collaboratively. The following description takes the terminal 400 in Figure 1 executing the interaction method in the virtual scene provided by the embodiments of the present application alone as an example.
- Figure 3 is a schematic flowchart of an interaction method in a virtual scene provided by an embodiment of the present application, which will be described in conjunction with the steps shown in Figure 3.
- Step 101 The terminal displays virtual objects and virtual natural elements belonging to virtual natural phenomena in the virtual scene; where the virtual natural elements are used to cause a negative impact on the environment where the virtual natural elements are located.
- applications supporting virtual scenes are installed on the terminal.
- the application can be any one of a first-person shooting game, a third-person shooting game, a multiplayer online tactical competitive game, a virtual reality application, a three-dimensional map program, or a multiplayer gun battle survival game. Users can use the terminal to operate virtual objects located in the virtual scene to perform activities.
- the terminal displays a picture of the virtual scene.
- the picture of the virtual scene is observed from the first-person perspective of the virtual object, or from a third-person perspective of the virtual scene.
- the virtual scene pictures include virtual objects and virtual natural elements belonging to virtual natural phenomena.
- the virtual object can be a player character controlled by the current player, or a player character controlled by other players (teammates) who belong to the same group as the current player.
- the virtual natural phenomena to which the virtual natural elements belong can be such as tornadoes, volcanoes and other natural phenomena that have a negative impact on the environment where they are located; and areas where virtual natural elements have a negative impact on the environment where they are located are negative impact areas.
- When the virtual natural phenomenon is a virtual tornado, virtual natural elements belonging to the virtual tornado are displayed in the virtual scene; the negative impact area of the virtual natural element is the area affected by the virtual tornado, the light intensity in the negative impact area is lower than that in the non-negatively-affected area of the virtual scene, and the element is destructive to virtual objects in the negative impact area.
- When the virtual natural phenomenon is a virtual volcano, virtual natural elements belonging to the virtual volcano are displayed in the virtual scene; the negative impact area of the virtual natural element is the area affected by the virtual volcano, the degree of ground cracking and the ambient temperature in the negative impact area are higher than those in the non-negatively-affected area of the virtual scene, and the element is destructive to virtual objects in the negative impact area.
- multiple virtual natural elements belong to at least two virtual natural phenomena, such as volcanoes, tornadoes, etc.
- the process of displaying the virtual natural elements belonging to virtual natural phenomena includes: displaying, in the virtual scene, multiple virtual natural elements belonging to at least two virtual natural phenomena; virtual natural elements of different virtual natural phenomena have different negative impacts on the environment where they are located. For example, a tornado is used to reduce the light intensity of the environment where the virtual natural element is located and to destroy virtual objects in that environment, while a volcano is used to crack the ground of the environment where the virtual natural element is located, increase the temperature of that environment, and destroy virtual objects in that environment.
- In this way, when the multiple virtual natural elements belong to at least two virtual natural phenomena, the multiple virtual natural elements belonging to the at least two virtual natural phenomena are displayed, which increases the diversity of the interaction process in the virtual scene and improves the user's immersion and interactive experience, thereby improving the efficiency of human-computer interaction and the utilization of hardware resources of electronic devices.
- In some embodiments, a map of the virtual scene is also displayed in the virtual scene, and the relative positional relationship between the virtual object and the virtual natural elements is dynamically displayed in the map.
- Exemplarily, refer to Figure 4, which is a schematic diagram of the relative positional relationship between virtual objects and virtual natural elements provided by an embodiment of the present application. In Figure 4, the virtual object is in the dotted box 401, and the hatched circle in the dotted box 402 is a virtual natural element. As the virtual object moves, the relative positional relationship between the virtual object and the virtual natural element is dynamically displayed on the map.
- In this way, a map of the virtual scene is displayed and the relative positional relationship between the virtual object and the virtual natural elements is dynamically displayed in the map, so that users can see more clearly the distance between themselves and the virtual natural elements. This makes it easier for users to find the corresponding virtual natural elements, improves the user experience, and reduces the time users spend searching for virtual natural elements.
- In some embodiments, the relative positional relationship between the virtual object and the virtual natural element can also be dynamically displayed in the map by dynamically displaying the traveling path between the virtual object and the virtual natural element, marking the virtual object at one end of the traveling path and marking the virtual natural element at the other end; the traveling path is used to guide the virtual object to move along it toward the sensing area of the virtual natural element.
- Figure 5 is a schematic diagram of the relative positional relationship between virtual objects and virtual natural elements provided by an embodiment of the present application. Based on Figure 5, the traveling path of the virtual object and the virtual natural element in the dotted box 502 is shown.
- the dotted line box 501 is a supply location set in the virtual scene, such as a virtual store, etc.
- the supply location here is used to increase the status value of the virtual object, such as blood volume, travel speed, etc.
- the traveling path can be a path that passes through a supply location, or it can be the shortest path. In this way, different routes are presented for the virtual object to choose from, so that the virtual object can decide, according to its own status, whether it needs supplies or should directly take the shortest route, which improves travel efficiency and avoids wasting travel time.
- When the virtual object travels to the virtual natural element based on the traveling path, it can directly follow the traveling path on the map, or the path under the virtual object or under its vehicle can be highlighted in the virtual scene for the virtual object to travel to the virtual natural element.
- the target virtual natural element needs to be determined from multiple virtual natural elements.
- the target virtual natural element can be determined directly by the terminal or by the user.
- the determination method of the target virtual natural element includes but is not limited to the following two methods, and the embodiment of the present application is not limited to this.
- the distance between the virtual object and each virtual natural element in the virtual scene is obtained, and the virtual natural element with the smallest distance to the virtual object is selected as the target virtual natural element, so that the relative positional relationship between the virtual object and the target virtual natural element is dynamically displayed on the map.
- When the number of virtual natural elements is multiple, options corresponding to each virtual natural element are presented; in response to a selection operation for the target option corresponding to the target virtual natural element, the virtual natural element corresponding to the target option is used as the target virtual natural element, so that the relative positional relationship between the virtual object and the target virtual natural element is dynamically displayed on the map.
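- As a rough illustration of the two determination methods above, the following Python sketch (with hypothetical element records and function names, not taken from the embodiment) picks either the element nearest to the virtual object or the element bound to the option the user selected.
```python
import math

# Hypothetical element records: (element_id, x, y).
def pick_nearest_element(player_xy, elements):
    # Method 1: the terminal picks the element with the smallest distance
    # to the virtual object as the target virtual natural element.
    px, py = player_xy
    return min(elements, key=lambda e: math.hypot(e[1] - px, e[2] - py))

def pick_selected_element(elements, selected_id):
    # Method 2: the user selects an option; the element bound to the chosen
    # option becomes the target virtual natural element.
    return next(e for e in elements if e[0] == selected_id)

elements = [("tornado_a", 120.0, 40.0), ("volcano_b", -35.0, 210.0)]
print(pick_nearest_element((0.0, 0.0), elements))   # -> ("tornado_a", ...)
print(pick_selected_element(elements, "volcano_b"))
```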
- In some embodiments, the traveling path between the virtual object and the target virtual natural element is presented, and a direction movement control corresponding to the virtual object is displayed.
- the direction movement control can be a joystick control; in response to a movement control instruction received through the direction movement control, the virtual object is controlled to move toward the target virtual natural element in the direction indicated by the movement control instruction (including forward, backward, left, and right).
- Exemplarily, refer to Figure 6, which is a schematic diagram of a virtual object traveling to a target virtual natural element provided by an embodiment of the present application.
- the dotted box 601 is a direction movement control.
- When a movement control instruction (such as a left movement control instruction) is received through the direction movement control, the virtual object is controlled to move to the left in response to the movement control instruction.
- virtual natural elements not only have a negative impact on the environment in which they are located, but also have a negative impact on virtual objects.
- When the virtual object is within the negative impact area of the virtual natural element, the first negative impact value caused by the virtual natural element on the virtual object is displayed, where the first negative impact value is used to indicate the degree of obstruction caused by the virtual natural element to the movement of the virtual object in the negative impact area, that is, the magnitude of the obstruction.
- the first negative impact value on the virtual object here is used to indicate the negative impact on the status value of the virtual object, for example, reducing the traveling speed of the virtual object, reducing the blood volume of the virtual object, etc.
- Exemplarily, refer to Figure 7, which is a schematic diagram of the negative impact of virtual natural elements on virtual objects provided by an embodiment of the present application. Based on Figure 7, when the virtual object is within the negative impact area of the virtual natural element, the first negative impact value caused by the virtual natural element on the virtual object is displayed below the map, as shown in the dotted box 701.
- In this way, the user can determine in real time the negative impact caused by the virtual natural element on the virtual object and adopt an appropriate solution to eliminate it; for example, if the first negative impact value is high, quickly find supplies to restore the status, and if the first negative impact value is low, continue to move toward the virtual natural element.
- Here, whether the first negative impact value is high or low can be determined by comparing it with a preset negative impact threshold.
- In this way, by displaying the first negative impact value indicating the degree of obstruction caused by the virtual natural element to the movement of the virtual object in the negative impact area, the user can perceive and determine in real time the negative impact of the virtual natural element on the virtual object, and implement an appropriate solution to eliminate the negative impact in a timely manner, thereby improving the efficiency of human-computer interaction and the utilization of hardware resources of electronic devices.
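- The following sketch shows one possible way a first negative impact value could degrade the virtual object's status values and be classified against a preset threshold; the attribute names, coefficients, and threshold are assumptions rather than values from the embodiment.
```python
# Sketch of applying and classifying a first negative impact value; hypothetical names.
NEGATIVE_IMPACT_THRESHOLD = 30.0   # assumed preset threshold used to label the value "high" or "low"

def apply_first_negative_impact(state: dict, impact_value: float) -> dict:
    # The first negative impact value degrades the virtual object's status
    # values, e.g. traveling speed and health (blood volume).
    state = dict(state)
    state["speed"] = max(0.0, state["speed"] - 0.1 * impact_value)
    state["health"] = max(0.0, state["health"] - impact_value)
    state["impact_level"] = "high" if impact_value >= NEGATIVE_IMPACT_THRESHOLD else "low"
    return state

print(apply_first_negative_impact({"speed": 6.0, "health": 100.0}, 45.0))
```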
- In some embodiments, when the number of virtual natural elements is multiple, the first negative impact values caused by the multiple virtual natural elements on the virtual object can be superimposed.
- When the negative impact areas of at least two of the multiple virtual natural elements overlap, the corresponding overlapping area is determined; when the virtual object is in the overlapping area, the first negative impact values caused by each of the at least two virtual natural elements forming the overlapping area are superimposed to obtain the target first negative impact value for the virtual object, and the target first negative impact value caused by the at least two virtual natural elements on the virtual object is displayed.
- In this way, users in the virtual scene can better perceive their own position, which improves the scene effect of the virtual scene and the user's sense of immersion.
- Exemplarily, refer to Figure 8, which is a schematic diagram of the superposition of the negative impact areas of virtual natural elements provided by an embodiment of the present application. Based on Figure 8, the non-overlapping area is negatively affected by only one virtual natural element, while the overlapping area is affected by the superimposed negative impact of two virtual natural elements; when the virtual object is in the overlapping area, what is displayed is the sum of the first negative impact values corresponding to the two virtual natural elements.
- When the virtual natural phenomenon corresponding to the two virtual natural elements in an overlapping area is a virtual tornado, the light intensity in the overlapping area is lower than in the non-overlapping parts of the negative impact areas, and the destructive effect on the virtual object is higher.
- When the virtual natural phenomena corresponding to the two virtual natural elements in the overlapping area are a virtual tornado and a virtual volcano, the overlapping area contains the virtual natural phenomena corresponding to both the virtual tornado and the virtual volcano, and the damage to the virtual object is higher than in the non-overlapping parts of the negative impact areas.
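- A minimal sketch of the superposition behavior described above, assuming circular negative impact areas and a per-element first negative impact value; the record layout and values are hypothetical.
```python
import math

# Each element is (x, y, negative_radius, first_impact_value); names are assumed.
def total_first_impact(player_xy, elements):
    px, py = player_xy
    total = 0.0
    for x, y, radius, impact in elements:
        # A player standing inside several negative impact areas receives the
        # sum of the individual first negative impact values (the target first
        # negative impact value for the overlapping area).
        if math.hypot(px - x, py - y) <= radius:
            total += impact
    return total

elements = [(0.0, 0.0, 7.0, 10.0), (5.0, 0.0, 7.0, 8.0)]   # two overlapping areas
print(total_first_impact((3.0, 0.0), elements))  # inside both -> 18.0
print(total_first_impact((9.0, 0.0), elements))  # inside one  -> 8.0
```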
- In addition to the negative impact caused by virtual natural elements, the virtual object also bears a negative impact value (a second negative impact value) even when it is not within the negative impact area of the virtual natural elements.
- the second negative impact value on the virtual object here is used to indicate the negative impact on the status value of the virtual object, for example, reducing the traveling speed of the virtual object, reducing the blood volume of the virtual object, etc.
- Exemplarily, refer to Figure 9, which is a schematic diagram of the relationship between the second negative impact value and the duration provided by an embodiment of the present application. Based on Figure 9, the longer the virtual object stays in the virtual scene, the greater the second negative impact value and the faster its growth rate.
- the second negative impact value is the value of the negative impact caused by the global impact source of the virtual scene on the virtual object
- the global impact source has hidden attributes and advanced attributes.
- the advanced attributes give the global influence source at least two stages, a first stage and a second stage; the first stage corresponds to a first correlation coefficient between the second negative impact value and the duration, the second stage corresponds to a second correlation coefficient between the second negative impact value and the duration, and the value of the second correlation coefficient is greater than that of the first correlation coefficient. When the global influence source is determined to be in the first stage, the process of dynamically displaying the second negative impact value of the virtual object includes obtaining the initial value of the negative impact caused by the global influence source on the virtual object.
- In this way, the growth rate of the negative impact value changes with the duration of the virtual object in the virtual scene.
- In this way, the diversity of the interaction process in the virtual scene is improved, and the user's enthusiasm for eliminating the negative impact is increased, thereby improving the efficiency of human-computer interaction and the utilization of hardware resources of electronic devices.
- In some embodiments, advanced prompt information corresponding to the global influence source is also displayed: the duration of the virtual object in the virtual scene is obtained; when the global influence source is determined, based on the duration, to advance from the first stage to the second stage, the advanced prompt information corresponding to the global influence source is displayed, where the advanced prompt information is used to remind the user that the global influence source has advanced.
- Exemplarily, refer to Figure 10, which is a schematic diagram of the advanced prompt information of the global influence source provided by an embodiment of the present application. Based on Figure 10, when the global influence source is determined, based on the duration, to advance from the first stage to the second stage, the advanced prompt information shown in box 1001 is displayed.
- Here, a target duration threshold corresponding to the second stage is preset, so that the duration can be compared with the target duration threshold to determine whether the global influence source advances from the first stage to the second stage.
- The obtained duration is compared with the target duration threshold; when the comparison result indicates that the duration is greater than the target duration threshold, it is determined that the global influence source advances from the first stage to the second stage.
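- The following sketch illustrates one way a two-stage global influence source could be modeled: the second negative impact value grows with the duration using a stage-specific correlation coefficient, and the stage advances once the duration exceeds the target duration threshold. The coefficients, threshold, and initial value are illustrative assumptions, not values from the filing.
```python
# Sketch of a global influence source with two stages; all constants are assumed.
FIRST_STAGE_COEFFICIENT = 0.5     # growth rate of the second negative impact value in stage 1
SECOND_STAGE_COEFFICIENT = 2.0    # larger growth rate after advancement
TARGET_DURATION_THRESHOLD = 300.0 # seconds after which the source advances to stage 2
INITIAL_IMPACT = 5.0              # initial value of the negative impact on the virtual object

def current_stage(duration: float) -> int:
    # The global influence source advances from the first stage to the second
    # stage once the duration exceeds the target duration threshold.
    return 2 if duration > TARGET_DURATION_THRESHOLD else 1

def second_negative_impact(duration: float) -> float:
    # The second negative impact value is positively correlated with the time
    # spent in the virtual scene; the correlation coefficient is larger in the
    # second stage, so the value grows faster after advancement.
    if current_stage(duration) == 1:
        return INITIAL_IMPACT + FIRST_STAGE_COEFFICIENT * duration
    stage1_part = INITIAL_IMPACT + FIRST_STAGE_COEFFICIENT * TARGET_DURATION_THRESHOLD
    return stage1_part + SECOND_STAGE_COEFFICIENT * (duration - TARGET_DURATION_THRESHOLD)

for t in (100.0, 300.0, 400.0):
    print(t, current_stage(t), second_negative_impact(t))
```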
- the first negative impact value and the second negative impact value of the virtual object can also be superimposed.
- the first negative impact value caused by the virtual natural element on the virtual object and the second negative impact value caused by the global influence source on the virtual object are determined; the first negative impact value and the second negative impact value are superimposed to obtain the total negative impact value of the virtual object, and the total negative impact value of the virtual object is dynamically displayed.
- Here, the total negative impact value of each virtual object also has an upper limit, so as to prevent the superimposed negative impact effects from overflowing; at the same time, the total negative impact value of a virtual object can also represent the intensity of the virtual natural phenomena at the virtual object's position in the virtual scene, which can be used to render the environment and enhance the user's sense of immersion.
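- A brief sketch of the capped superposition described above; the upper limit used here is an assumed value.
```python
# Sketch of the total negative impact value: the first and second values are
# superimposed and clamped to an upper limit so the effect cannot overflow.
TOTAL_IMPACT_CAP = 100.0   # assumed upper limit per virtual object

def total_negative_impact(first_value: float, second_value: float) -> float:
    return min(first_value + second_value, TOTAL_IMPACT_CAP)

print(total_negative_impact(40.0, 35.0))   # 75.0
print(total_negative_impact(80.0, 55.0))   # clamped to 100.0
```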
- the number of virtual natural elements can also be determined before displaying virtual objects and virtual natural elements belonging to virtual natural phenomena in a virtual scene.
- the process of determining the number of virtual natural elements will be explained using two determination methods as examples.
- level options corresponding to at least two difficulty levels are displayed. Different difficulty levels correspond to different numbers of virtual natural elements.
- the at least two difficulty levels include a target difficulty level, and the target difficulty level corresponds to a target number of virtual natural elements.
- In response to the selection operation of the level option for the target difficulty level, the target number of virtual natural elements is determined, so that the virtual object and the target number of virtual natural elements belonging to the virtual natural phenomenon are displayed in the virtual scene.
- Alternatively, the target number of virtual natural elements in the virtual scene is directly obtained, so that the virtual object and the target number of virtual natural elements belonging to the virtual natural phenomenon are displayed in the virtual scene.
- Here, the relevant developers determine the alternative positions of at least two virtual natural elements in the virtual scene; then, based on the determined target number of virtual natural elements, alternative positions corresponding to the target number are selected from the alternative positions of the at least two virtual natural elements as the positions at which to display the virtual natural elements, so that the virtual object is displayed in the virtual scene and the virtual natural elements belonging to the virtual natural phenomenon are displayed at the selected positions.
- the positions can be selected randomly.
- the factors to be considered in determining the alternative positions of virtual natural elements in the virtual scene include, but are not limited to, the distance between each alternative position and the birth point of the virtual object, the distance between each alternative position and NPCs in the virtual scene, and so on. At the same time, for different virtual scenes, or for virtual scenes of different difficulty levels, the number of alternative positions of virtual natural elements can be different.
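- The following sketch illustrates one way the two determination steps above could be combined: a difficulty level maps to a target number of elements, and spawn positions are picked randomly from developer-defined alternative positions. The difficulty table and candidate coordinates are hypothetical.
```python
import random

# Sketch of determining how many elements to spawn and where; assumed values.
ELEMENT_COUNT_BY_DIFFICULTY = {"easy": 2, "normal": 4, "hard": 6}

def choose_spawn_positions(difficulty: str, candidate_positions, rng=random):
    # The difficulty level selected by the player determines the target number
    # of virtual natural elements; positions are then picked (here randomly)
    # from the developer-defined alternative positions.
    target_count = ELEMENT_COUNT_BY_DIFFICULTY[difficulty]
    return rng.sample(candidate_positions, k=min(target_count, len(candidate_positions)))

candidates = [(10, 20), (55, 80), (120, 40), (200, 160), (75, 300), (310, 90)]
print(choose_spawn_positions("normal", candidates))
```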
- Step 102 When the virtual object is within the sensing area of the virtual natural element, control the virtual natural element to transform into an interactive target object; wherein the negative impact area of the virtual natural element includes the sensing area.
- the sensing area of a virtual natural element can be a circular area centered on the position of the virtual natural element with the target distance as the radius.
- the target distance here is preset, such as 5 meters; and the negative impact area is the area where the virtual natural element has a negative impact on the environment.
- the sensing area and the negative impact area can be the same or different; for example, the sensing area can lie within the negative impact area.
- Exemplarily, the sensing area can be a circular area centered on the position of the virtual natural element with a target distance of 5 meters as the radius, and the negative impact area can be a circular area centered on the position of the virtual natural element with a target distance of 7 meters as the radius.
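- A small sketch of the two concentric areas in the example above, reusing the 5-meter and 7-meter radii; the function and constant names are assumptions.
```python
import math

SENSING_RADIUS = 5.0          # sensing area radius (example value from the text)
NEGATIVE_IMPACT_RADIUS = 7.0  # negative impact area radius; contains the sensing area

def classify_position(element_xy, player_xy):
    distance = math.dist(element_xy, player_xy)
    if distance <= SENSING_RADIUS:
        return "in sensing area (element transforms into target object)"
    if distance <= NEGATIVE_IMPACT_RADIUS:
        return "in negative impact area only (first negative impact applies)"
    return "outside both areas"

print(classify_position((0.0, 0.0), (4.0, 0.0)))
print(classify_position((0.0, 0.0), (6.0, 0.0)))
print(classify_position((0.0, 0.0), (9.0, 0.0)))
```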
- Figure 11 is a schematic diagram of a virtual natural element provided by an embodiment of the present application.
- Figure 12 is a schematic diagram of a target object provided by an embodiment of the present application. Based on Figures 11 and 12, after the virtual object enters the sensing area of the virtual natural element, the virtual natural element is transformed into the target object shown in Figure 12.
- In some embodiments, after the virtual natural element is converted into an interactive target object, the target object pursues the virtual object in the sensing area, and a search screen showing the target object searching for the virtual object in the virtual scene is displayed; when the target object does not find the virtual object in the sensing area within the target duration, the target object is controlled to transform back into a virtual natural element. The situation in which the target object finds the virtual object is described in step 103.
- Here, since the target object chases the virtual object, the position of the virtual natural element also moves as the target object moves, thereby keeping the position of the virtual natural element consistent with that of the target object.
- In this way, the target object chases the virtual object in the sensing area, and when the target object does not find the virtual object, it is converted back into a virtual natural element. This adds diversity to the interaction process in the virtual scene and improves the user's interactive experience, thereby improving the efficiency of human-computer interaction and the utilization of hardware resources of electronic devices.
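- The element/target-object switching described above can be pictured as a small state machine; the sketch below uses an assumed 10-second search window and hypothetical state names.
```python
# Sketch of the switching logic; the duration and names are assumptions.
TARGET_SEARCH_DURATION = 10.0   # assumed target duration before reverting to an element

class ElementStateMachine:
    def __init__(self):
        self.state = "element"          # "element" or "target_object"
        self.time_without_player = 0.0

    def tick(self, dt: float, player_in_sensing_area: bool):
        if self.state == "element":
            if player_in_sensing_area:
                self.state = "target_object"        # transform into a monster
                self.time_without_player = 0.0
        else:
            if player_in_sensing_area:
                self.time_without_player = 0.0      # keep chasing the player
            else:
                self.time_without_player += dt
                if self.time_without_player >= TARGET_SEARCH_DURATION:
                    self.state = "element"          # revert to a natural element
        return self.state

machine = ElementStateMachine()
print(machine.tick(1.0, True))    # element -> target_object
print(machine.tick(11.0, False))  # player gone long enough -> element
```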
- In some embodiments, at least one interactive first object in an idle state is also generated. See Figure 13, which is a schematic diagram of the first object provided by an embodiment of the present application. Based on Figure 11 and Figure 13, after the virtual object enters the sensing area of the virtual natural element, the virtual natural element is transformed into an interactive target object, and at least one interactive first object in an idle state, as shown in Figure 13, is also generated at the position of the virtual natural element in the virtual scene.
- the idle state here is used to indicate a non-interactive state such as a non-combat state, that is, the first object will not actively pursue the virtual object but is used to interfere with the virtual object, while the interactive state is used to indicate that the first object will actively pursue the virtual object.
- Based on this, the virtual object can choose whether to interact with the first object. When the virtual object chooses to interact with the first object, the first object will interact with the virtual object: in response to the interaction instruction, the virtual object is controlled to interact with the first object in the virtual scene, and the interaction is used to transform the first object from the idle state into the interactive state.
- In this way, at least one interactive first object in an idle state is generated, and the virtual object can choose whether to interact with the first object; the first object interacts with the virtual object only when the virtual object chooses to interact with it. This increases the diversity of interactive objects and interaction methods in the virtual scene and improves the user's interactive experience.
- the object type of the first object corresponds to the level of the virtual object.
- the generation process of the first object includes: when the virtual object is within the sensing area of the virtual natural element, obtaining the level of the virtual object, determining the object type corresponding to the level, and generating, based on the object type, at least one interactive first object in an idle state.
- the object type of the first object here includes, but is not limited to, the level, form, etc. of the first object. For example, when the level of the virtual object is high, a high-level first object is generated, and when the level of the virtual object is low, a low-level first object is generated; or, when the level of the virtual object is high, a first object with the ability to fly is generated, and when the level of the virtual object is low, a first object without the ability to fly is generated. Based on this, the process of determining the object type corresponding to the level may be to obtain the correspondence between levels and object types, and determine the object type corresponding to the level based on the level and the correspondence.
- In this way, the object type corresponding to the level is determined, so that at least one interactive first object in an idle state corresponding to the object type is generated. This improves the diversity of the interaction process in the virtual scene and increases the user's enthusiasm for upgrading their level, thereby improving the efficiency of human-computer interaction and the utilization of hardware resources of electronic devices.
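- One possible shape for the level-to-object-type correspondence is a lookup table, as in the sketch below; the level bands and type names are hypothetical.
```python
# Sketch of generating idle first objects whose type matches the player's level;
# the bands and type names are assumed for illustration.
LEVEL_TO_OBJECT_TYPE = [
    (30, "flying_elite"),   # high-level players get flying, high-grade first objects
    (10, "ground_regular"),
    (0, "ground_weak"),     # low-level players get weaker, non-flying first objects
]

def object_type_for_level(level: int) -> str:
    # Look up the level -> object type correspondence.
    for min_level, object_type in LEVEL_TO_OBJECT_TYPE:
        if level >= min_level:
            return object_type
    return LEVEL_TO_OBJECT_TYPE[-1][1]

def spawn_first_objects(level: int, count: int = 2):
    object_type = object_type_for_level(level)
    # Each first object starts in the idle (non-combat) state.
    return [{"type": object_type, "state": "idle"} for _ in range(count)]

print(spawn_first_objects(35))
print(spawn_first_objects(3))
```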
- In actual implementation, after the virtual object kills all of the first objects, virtual resources that can be used in the virtual scene can be displayed in the virtual scene as rewards. For example, the virtual resources may be supplies used to reduce the negative impact value of the virtual object, props used to perform interactive operations on the target object, or experience points that raise the level of the virtual object.
- In some embodiments, after the virtual natural element is transformed into the target object, the virtual object can also leave the target object's sensing area so that the target object can no longer perceive it, upon which the target object is transformed back into a virtual natural element: in response to an area-leaving operation for the virtual object, the virtual object is controlled to leave the sensing area; when the duration for which the virtual object has left the sensing area reaches a target duration, the target object is controlled to transform into a virtual natural element.
- It should be noted that after the target object is transformed back into a virtual natural element, if the virtual object has not interacted with the first objects or has not killed all of them, the first objects in the virtual scene will also disappear.
- Step 103: When an interaction instruction for the target object is received, control the virtual object and the target object to interact in the virtual scene.
- In actual implementation, for the case where the target object finds the virtual object, the target object will chase the virtual object. Based on this, the process of controlling the virtual object and the target object to interact in the virtual scene when an interaction instruction for the target object is received includes: displaying a search screen in which the target object searches for the virtual object in the virtual scene; when the target object has found the virtual object and performed an interactive operation on it, and an interaction instruction for the target object is received, controlling the virtual object and the target object to interact in the virtual scene.
- In practical applications, after the virtual natural element is transformed into an interactive target object, the target object chases the virtual object within the sensing area and, once it finds the virtual object, actively interacts with it. In this way, the diversity of the interaction process in the virtual scene and the user's enthusiasm for interacting with the target object are improved.
- In some embodiments, after the target object is controlled to perform an interactive operation on the virtual object, or the virtual object is controlled to interact with the target object, the virtual object can also hide behind a virtual building in the virtual scene so that the target object can no longer perceive it, upon which the target object is transformed back into a virtual natural element: in response to a hiding instruction for the virtual object, the virtual object is controlled to change from the interactive state to a hidden state, where the hidden state makes the target object unable to perceive the virtual object; a screen of the target object searching for the virtual object in the virtual scene is displayed; and when the target object does not find the hidden virtual object within the target duration, the target object is controlled to transform into a virtual natural element.
- It should be noted that, since the virtual object changed from the interactive state to the hidden state, the target object will move to and search the position where the virtual object last appeared. The process of displaying the target object searching for the virtual object in the virtual scene may therefore include displaying the target object moving toward the position where the virtual object changed state, as well as the screen of the target object searching for the virtual object in the virtual scene.
- In actual implementation, the target object will move at an alert-state speed to search the place where the virtual object last disappeared; after the preset target duration, if it cannot find the virtual object and does not discover a new virtual object, it will actively transform back into the virtual natural element form.
- In practical applications, after the target object is controlled to perform an interactive operation on the virtual object, or the virtual object is controlled to interact with the target object, the virtual object can hide behind a virtual building in the virtual scene so that the target object cannot perceive it, upon which the target object is transformed back into a virtual natural element. This increases the diversity of interaction methods in the virtual scene and improves the user's interactive experience, thereby improving the efficiency of human-computer interaction and the utilization of the hardware resources of the electronic device.
- In some embodiments, after the virtual object is controlled to interact with the target object, the virtual object can also leave the target object's sensing area so that the target object cannot perceive it, upon which the target object is transformed back into a virtual natural element: in response to an area-leaving operation for the virtual object, the virtual object is controlled to leave the sensing area; when the duration for which the virtual object has left the sensing area reaches the target duration, the target object is controlled to transform into a virtual natural element.
- In actual implementation, controlling the transformation of the target object into a virtual natural element is a process of gradual transformation over time: the target object gradually becomes transparent over time from its first end to its second end, while the virtual natural element is gradually displayed over time from its second end to its first end, and the rate of display and of becoming transparent is positively correlated with time.
- For example, see Figure 14, which is a schematic diagram of the process of transforming a target object into a virtual natural element according to an embodiment of the present application. Based on Figure 14, when the target object is transformed into a virtual natural element, the model material of the target object becomes increasingly transparent over time. The gradient of material transparency can match the special effect from bottom to top or from top to bottom; that is, the target object is gradually made transparent from bottom to top or from top to bottom, and at the same time the virtual natural element is correspondingly displayed gradually from top to bottom or from bottom to top.
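- The time-controlled transparency hand-off described above can be illustrated with a simple normalized curve; the linear curve, the duration value, and the `bottom_to_top` flag in the following sketch are assumptions for illustration only.

```python
# Sketch of the gradual target-object -> virtual-natural-element transition.
# At normalized time t in [0, 1], the target object's material fades out while
# the virtual natural element fades in; a linear curve is used for simplicity.

def fade_progress(elapsed: float, duration: float) -> float:
    """Display/fade progress is positively correlated with elapsed time."""
    return max(0.0, min(1.0, elapsed / duration))

def apply_transition(elapsed: float, duration: float, bottom_to_top: bool = True):
    t = fade_progress(elapsed, duration)
    return {
        "monster_alpha": 1.0 - t,   # target object gradually becomes transparent
        "tornado_alpha": t,         # virtual natural element is gradually revealed
        "mask_direction": "bottom->top" if bottom_to_top else "top->bottom",
    }

for step in range(5):
    print(apply_transition(step * 0.5, duration=2.0))
```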
- It should be noted that after the target object is transformed into a virtual natural element, if a virtual object enters the sensing area again, the virtual natural element will be converted into the target object once more, and the first objects can also be regenerated; the target objects that appear, as well as the type and number of the first objects, are refreshed.
- In actual implementation, the process of transforming the target object into a virtual natural element is protected by a target duration, which starts from the moment the target object begins to transform. During this period, the target object's perception is turned off: even if a virtual object approaches the target object or the virtual natural element again, the transformation of the target object into a virtual natural element is not interrupted immediately, nor is the newly transformed virtual natural element converted back into the target object. In this way, in the extreme situation where the virtual object repeatedly jumps at the edge of the sensing area, the abnormal behavior of repeatedly switching between the virtual natural element and the target object is avoided.
- In actual implementation, the virtual object can also kill the target object. When the virtual object kills the target object, virtual resources used as rewards are displayed in the virtual scene, where the virtual resources can be used in the virtual scene. For example, the virtual resources may be supplies used to reduce the negative impact value of the virtual object, props used to perform interactive operations on target objects, or experience points that raise the level of the virtual object.
- In some embodiments, when the virtual object kills the target object, the negative impact caused by the virtual natural element in the negative impact area is eliminated at a preset rate, with the position of the target object as the center. For example, when the virtual natural phenomenon corresponding to the virtual natural element is a virtual tornado and the virtual object kills the target object, the negative impact area is purified outward from the feet of the target object, in the form of water ripples or light, with the position of the target object as the center.
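- The near-to-far purification at a preset rate can be modeled as a front whose radius grows over time from the killed target object's position; the helper below is a hypothetical sketch, and the rate and coordinate values are invented for the example.

```python
import math

def purified(point, center, elapsed: float, rate: float = 5.0) -> bool:
    """A point is purified once the purification front, expanding outward from
    the target object's position at `rate` units per second, has reached it."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    return math.hypot(dx, dy) <= rate * elapsed

# Two seconds after the kill, anything within 10 units of the kill position
# has had its negative impact removed.
print(purified((6.0, 8.0), (0.0, 0.0), elapsed=2.0))   # True  (distance 10)
print(purified((20.0, 0.0), (0.0, 0.0), elapsed=2.0))  # False (distance 20)
```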
- In actual implementation, virtual natural elements also have an advancement attribute, and the state attributes of the interactive target object obtained by transforming an advanced virtual natural element differ from those obtained by transforming a non-advanced virtual natural element. When the advancement condition of a virtual natural element is met, advancement prompt information corresponding to the virtual natural element is displayed; the advancement prompt information is used to indicate that the virtual natural element has advanced, and the advanced object obtained by transforming the advanced virtual natural element satisfies at least one of the following conditions: its health value is higher than the health value of the target object; the damage it causes to the virtual object by performing an interactive operation is higher than the damage caused to the virtual object by the target object performing an interactive operation.
- For example, see Figure 15, which is a schematic diagram of the advancement prompt information of a virtual natural element provided by an embodiment of the present application. Based on Figure 15, when the advancement condition of the virtual natural element is met, the advancement prompt information shown in box 1501 is displayed.
- It should be noted that the advancement conditions for virtual natural elements include, but are not limited to, the duration of the virtual object in the virtual scene or the number of virtual natural elements. For example, when the duration of the virtual object in the virtual scene reaches a preset duration, the advancement condition of the virtual natural element is determined to be met; or, since the number of virtual natural elements decreases after the virtual object kills a target object, when the number of virtual natural elements is less than a target number, the advancement condition of the virtual natural element is met.
- It should be noted that the first objects may also advance as the virtual natural elements advance, and an advanced first object has the same characteristics as the advanced target object described above. However, target objects and first objects that have already been transformed and generated do not advance; that is, the health values of already-generated target objects and first objects, and the damage their interactive operations cause to virtual objects, do not change.
- In some embodiments, an interactive second object may also be presented in the virtual scene. In response to a movement instruction for the virtual object, the virtual object is controlled to move toward the virtual natural element while moving in the virtual scene. The second object is an interactive object in the virtual scene that is unrelated to virtual natural elements; it will also chase the virtual object, and as the duration increases, the health value of the second object and the damage its interactive operations cause to virtual objects also increase.
- In practical applications, an interactive second object unrelated to virtual natural elements is presented. This increases the diversity of interactive objects and interaction methods in the virtual scene and improves the user's interactive experience, thereby improving the efficiency of human-computer interaction and the utilization rate of the hardware resources of the electronic device.
- In actual implementation, when the virtual object has not entered the negative impact area, a screen of the second object searching for the virtual object in the virtual scene is displayed; when the second object finds the virtual object and performs an interactive operation on it, the virtual object and the second object are controlled to interact in the virtual scene in response to the interaction instruction for the virtual object. When the virtual object enters the negative impact area, the virtual object is marked and displayed carrying the mark; the mark makes it impossible for the second object to find the virtual object, thereby ensuring that a virtual object that has entered the negative impact area is not interfered with by the second object. In this way, the diversity of the interaction process in the virtual scene and the user's interactive experience are improved.
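- One way to realize the mark described above is as a flag that the second object's target-search filter respects; this is only a sketch, and the field names `marked` and `pos` are assumptions for illustration.

```python
# Sketch of the marking mechanism: players inside the negative impact area
# carry a mark, and unrelated (second) objects skip marked players when
# searching for a target.

def update_mark(player: dict, in_negative_impact_area: bool) -> None:
    player["marked"] = in_negative_impact_area

def pick_target(second_object_pos: float, players: list):
    candidates = [p for p in players if not p["marked"]]
    if not candidates:
        return None
    return min(candidates, key=lambda p: abs(p["pos"] - second_object_pos))

players = [{"name": "A", "pos": 3.0, "marked": False},
           {"name": "B", "pos": 1.0, "marked": False}]
update_mark(players[1], in_negative_impact_area=True)  # B enters the area
print(pick_target(0.0, players))                       # -> player A is chosen
```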
- In actual implementation, virtual resources that can be used in the virtual scene can also be displayed in the virtual scene as rewards. For example, the virtual resources may be supplies used to reduce the negative impact value of the virtual object, props used to perform interactive operations on target objects, or experience points that raise the level of the virtual object.
- In the above embodiments of the present application, the virtual scene displays the virtual object and virtual natural elements that have a negative impact on the virtual environment in the virtual scene. When the virtual object enters the negative impact area of a virtual natural element, the virtual natural element is converted into a target object with which the virtual object can interact, so that the virtual object interacts with the target object. In this way, when the virtual object approaches the virtual natural element, the virtual natural element is converted into an interactive object to realize the interaction process, which enriches the interactive objects during free exploration, reduces the exploration time, and improves the diversity of interactive objects in the virtual scene, while also improving the efficiency of human-computer interaction and the utilization rate of the hardware resources of the electronic device.
- In the related art, players lack directional guidance for finding the various random tasks in an open game world, which easily leads to repeated, meaningless traversal of the map and may even cause task objectives to be missed. At the same time, due to the lack of sufficient filler content, when players quickly find and kill the in-game targets, the session ends prematurely, causing game content to be consumed faster than expected. Moreover, the actual types of in-game tasks are generally monotonous experiences such as dialogue and collection, lacking packaging and the immersive atmosphere created by open-world generation.
- Based on this, embodiments of the present application provide an interaction method in a virtual scene that creates, in the map, several sub-events selectable by players (virtual objects), namely undertide pollution sources (virtual natural elements), to guide players to interact in the large map.
- In actual implementation, the interaction method in the virtual scene focuses on optimizing the following experiences: 1. A certain number of undertide pollution sources are randomly generated in the open world; 2. The system uses a small radar (map) to guide players to the nearest pollution source at any time; 3. By default, an undertide pollution source is a dangerous area resembling a tornado; when a player approaches, it transforms into a configured, specified type of monster and fights the player; 4. The player wins by defeating the specified key monster, purifying the pollution source and obtaining generous rewards; 5. When the player chooses to move away from the battle with the key monster, the key monster returns to the undertide pollution source state at its current location; 6. As time passes within a single round, the pollution sources in the entire map slowly upgrade and increase in intensity.
- The system design includes the following points:
- 1. Undertide pollution sources of different intensities (discrete levels) are generated in the virtual scene;
- 2. An undertide pollution source radiates an undertide value (first negative impact value) to players within a certain range around it (negative impact area), and a curve can be used to define the multiplier of the undertide value received at different distances from the pollution source;
- 3. When a player approaches the pollution source within a certain range (sensing area), the pollution source materializes into a key monster (target object), which has a larger perception range and chases the player;
- 4. In the key monster state, the center of the pollution source moves with the key monster to keep their positions consistent;
- 5. Every time the player clears a pollution source by killing the key monster, an area affected by that pollution source is purified, and rewards (virtual resources) corresponding to the pollution source are obtained, dropped when the key monster dies;
- 6. When the undertide concentration is cleared, there is a purification effect from near to far;
- 7. The intensity of the same undertide pollution source can be defined by a curve so that it changes over time, changing the undertide value radiated to surrounding players;
- 8. The level also has a huge global undertide pollution source (global influence source), which radiates undertide values to players across the entire map and varies with single-round time according to a curve; this undertide pollution source has no center, no entity, and no warning;
- 9. An undertide value (target negative impact value) is maintained for each player; it can be the sum of the radiation values of the various pollution sources at the player's current location (for example, a single pollution source plus the global pollution source); a computational sketch of this value follows this list;
- 10. Each player's undertide value has an upper limit (such as 100), thereby avoiding effect overflow when multiple pollution sources overlap;
- 11. The player's real-time undertide value represents the intensity of the undertide concentration at the player's current location and is reflected in the client's dark-zone effect (virtual natural phenomenon), rendering the worldview atmosphere and enhancing the sense of immersion in the open-world game;
- 12. The player's undertide value is displayed in real time under the radar in the upper-right corner, as shown in Figure 7;
- 13. The undertide pollution source sends a mark to all players within a certain range, ensuring that players within the range are not interfered with by irrelevant monsters (second objects);
- 14. Players in areas not irradiated by any pollution source are still affected by the dynamic monster-spawning system, which generates a certain number of monsters (second objects) to fight the player.
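- The per-player undertide value described in points 2, 9, and 10 can be sketched as follows; the linear falloff curve and the cap of 100 are taken from the example values in the text, while the function names and source data are assumptions.

```python
import math

def distance_multiplier(distance: float, radius: float) -> float:
    """Curve giving the undertide multiplier at a given distance from a
    pollution source (linear falloff here, purely for illustration)."""
    if distance >= radius:
        return 0.0
    return 1.0 - distance / radius

def player_undertide(player_pos, sources, global_value: float, cap: float = 100.0) -> float:
    """Sum of the contributions of each pollution source at the player's
    position plus the global influence source, capped at an upper limit."""
    total = global_value
    for src in sources:
        d = math.dist(player_pos, src["pos"])
        total += src["base_value"] * distance_multiplier(d, src["radius"])
    return min(total, cap)

sources = [{"pos": (0.0, 0.0), "base_value": 60.0, "radius": 50.0},
           {"pos": (30.0, 0.0), "base_value": 80.0, "radius": 50.0}]
# Overlapping sources plus the global source exceed 100, so the cap applies.
print(player_undertide((10.0, 0.0), sources, global_value=20.0))  # 100.0
```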
- The level flow is as follows:
- 1. Level planning can pre-place several generation points (target positions) for undertide pollution sources in the level;
- 2. When the level is initialized, [a, b] (target number) locations (a ≤ b) are randomly selected from the preset points (backup positions) as the generation points of undertide pollution sources; when planning the configuration, ensure that the number n of preset points satisfies n ≥ b, and if n < a occurs, the relevant operations are performed with all n positions activated (a sketch of this selection follows this list);
- 3. Each different level can support a different [a, b] interval configuration;
- 4. Subsequent iterations will add rules for selecting the preset points; factors to be considered include, but are not limited to, the distance between each preset point and the birth point of the character (virtual object), and the distance between each preset point and the level objective;
- 5. As single-game time advances, every interval of t seconds (the target duration) the global undertide pollution source triggers an evolution; the evolution causes all monsters generated at the next trigger to gain a certain multiplier of blood volume (health value) and attack power (the damage caused to virtual objects by performing interactive operations);
- 6. The total number of times the undertide pollution source can evolve, and the corresponding blood volume and attack power multipliers after each evolution, can be pre-configured;
- 7. If an undertide pollution source has already generated a batch of monsters (target objects and/or first objects), those monsters are not affected by the evolution, which is postponed to the next wave;
- 8. Every time a global evolution is triggered, all players are given an interface prompt, as shown in Figure 10;
- 9. Under normal conditions, the undertide pollution source is merely a tornado special effect at the presentation level, with no collision;
- 10. When a player approaches within R meters of the pollution source (sensing area), the pollution source generates a specified type and number of monsters (first objects) within a certain range of itself, and the original central tornado special effect is converted into a key monster (target object);
- 11. Killing the key monster causes the pollution source to explode, purifies the nearby area, and yields drop rewards (only one key monster is defined here to prevent the situation where, with multiple key monsters, the player kills some of them, triggers the out-of-combat logic, and then has to face the remaining key monsters after they are revived, which would be more difficult);
- 12. Once the key monster is generated, the location of the pollution source always stays consistent with the key monster;
- 13. The other monsters in the monster group (first objects) only have an interference effect; players can choose whether to kill them, and this does not affect the clearing of the pollution source;
- 14. Once the exit logic is triggered, all monsters generated by the pollution source (the target object and the first objects) enter the idle state, the key monster returns to the pollution source state (non-entity), and the pollution source is centered on its current location;
- 15. After returning to the pollution source state, if another player re-enters the R-meter range around the pollution source, generation is triggered again, and the specified types and numbers of generated monsters are all refreshed;
- 16. The tornado state (non-entity) at the center of the pollution source and the key monster state can be regarded as switching between each other;
- 17. As long as players kill the key monster, they can purify the undertide pollution source and obtain the corresponding generous rewards;
- 18. Players can freely choose whether to kill the interference monsters (first objects and/or second objects), and this does not affect the outcome determination.
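- The generation-point selection in point 2 above can be sketched as a small helper; the preset coordinates and the [a, b] values used in the example are made up for illustration.

```python
import random

def choose_spawn_points(preset_points: list, a: int, b: int) -> list:
    """Randomly pick between a and b generation points for undertide pollution
    sources from the n pre-placed preset points; if n < a, fall back to
    activating all n positions, as described above."""
    n = len(preset_points)
    if n < a:
        return list(preset_points)
    count = random.randint(a, min(b, n))
    return random.sample(preset_points, count)

presets = [(0, 0), (10, 5), (25, 30), (40, 12), (55, 60)]
print(choose_spawn_points(presets, a=2, b=4))  # 2 to 4 randomly chosen points
```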
- See Figure 16, which is a flow chart of the switching logic between pollution sources and key monsters provided by an embodiment of the present application. Based on Figure 16, the interaction method in the virtual scene provided by the embodiment of the present application is implemented through steps 1601 to 1606.
- For the pollution source tornado and the key monster to switch into each other (a simplified sketch of this switching logic is given after this list):
- 1. It must always be detected that there is no player within radius R centered on the current position;
- 2. The key monster must be in a non-combat state, that is, its target is lost or dead and there is no other player nearby that can serve as a combat target;
- 3. If the key monster suddenly loses sight of its target (the virtual object) during battle (for example, the target hides behind a wall), the key monster uses the animation and movement speed of the alert state to search the place where the target last disappeared; this state lasts for a preset t seconds, and after t seconds, if the old target cannot be found and no new target is discovered, the key monster returns to the normal idle state and actively transforms into the undertide pollution source form;
- 4. When switching to the tornado state, the model material of the key monster needs to be hidden slowly over time, controlled by a function curve, and the tornado is regenerated around it, as shown in Figure 14;
- 5. The gradient of the material transparency can match the special effect: the mask gradually becomes transparent from bottom to top, or gradually becomes visible from transparent;
- 6. When switching from the tornado to the key monster, a function curve likewise controls the monster model material from hidden to displayed;
- 7. After a key monster switches back to a pollution source following a battle, there is a protection period of duration T, starting from the moment the monster begins switching; during this T period, the monster's perception is turned off, so even if a player approaches the monster or the pollution source again, the transformation of the monster into a pollution source is not interrupted immediately, nor is the undertide center that has just been converted into a pollution source turned back into a monster; this avoids the abnormal behavior of the undertide repeatedly switching with the monster in the extreme situation where players repeatedly jump at the edge of the undertide.
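- As an illustrative aid only, the switching logic above can be condensed into a tiny state machine; the radius, search time, and protection time defaults are hypothetical, and the real conditions (combat targets, animation curves) are far richer than this sketch.

```python
import time

class PollutionSource:
    """Highly simplified sketch of the tornado <-> key monster switching logic
    of Figure 16. Radii and durations are illustrative placeholders."""

    def __init__(self, r_meters=15.0, search_seconds=5.0, protect_seconds=3.0):
        self.state = "tornado"           # "tornado" | "monster" | "searching"
        self.r = r_meters
        self.search_t = search_seconds
        self.protect_t = protect_seconds
        self._switch_time = 0.0
        self._search_start = 0.0

    def _protected(self, now: float) -> bool:
        # Perception stays off for protect_t seconds after switching back,
        # to avoid flip-flopping when players jump at the edge of the area.
        return self.state == "tornado" and now - self._switch_time < self.protect_t

    def update(self, now: float, nearest_player_distance: float, in_combat: bool) -> str:
        if self.state == "tornado":
            if not self._protected(now) and nearest_player_distance <= self.r:
                self.state = "monster"
        elif self.state == "monster":
            if not in_combat and nearest_player_distance > self.r:
                self.state = "searching"          # alert-state search begins
                self._search_start = now
        elif self.state == "searching":
            if in_combat or nearest_player_distance <= self.r:
                self.state = "monster"            # target re-acquired
            elif now - self._search_start >= self.search_t:
                self.state = "tornado"            # revert to pollution source
                self._switch_time = now
        return self.state

src = PollutionSource()
t0 = time.time()
print(src.update(t0, nearest_player_distance=10.0, in_combat=False))      # monster
print(src.update(t0 + 1, nearest_player_distance=30.0, in_combat=False))  # searching
print(src.update(t0 + 7, nearest_player_distance=30.0, in_combat=False))  # tornado
```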
- In the above embodiments of the present application, the virtual scene displays the virtual object and virtual natural elements that have a negative impact on the virtual environment in the virtual scene. When the virtual object enters the negative impact area of a virtual natural element, the virtual natural element is converted into a target object with which the virtual object can interact, so that the virtual object interacts with the target object. In this way, when the virtual object approaches the virtual natural element, the virtual natural element is converted into an interactive object to realize the interaction process, which enriches the interactive objects during free exploration and reduces the exploration time, that is, it reduces the time spent on the human-computer interaction operations performed to realize the interaction process, thereby improving the efficiency of human-computer interaction and the utilization rate of the hardware resources of the electronic device.
- The following continues the description of an exemplary structure in which the interaction device 455 in the virtual scene provided by the embodiments of the present application is implemented as software modules. In some embodiments, the software modules of the interaction device 455 in the virtual scene stored in the memory 450 may include:
- the display module 4551 is configured to display virtual objects and virtual natural elements belonging to virtual natural phenomena in a virtual scene; wherein the virtual natural elements are used to cause a negative impact on the environment in which the virtual natural elements are located;
- the first control module 4552 is configured to control the virtual natural element to transform into an interactive target object when the virtual object is within the sensing area of the virtual natural element; wherein the negative impact area of the virtual natural element includes the sensing area;
- the second control module 4553 is configured to control the virtual object and the target object to interact in the virtual scene when receiving an interaction instruction for the target object.
- In some embodiments, the device further includes a first display module configured to display a map of the virtual scene and to dynamically display, in the map, the relative positional relationship between the virtual object and the virtual natural elements.
- In some embodiments, the first display module is further configured to dynamically display, in the map, a travel path between the virtual object and the virtual natural element, to identify the virtual object at one end of the travel path, and to identify the virtual natural element at the other end of the travel path; wherein the travel path is used to guide the virtual object to move along the travel path into the sensing area of the virtual natural element.
- In some embodiments, the device further includes a target virtual natural element selection module configured to, when the number of virtual natural elements is multiple, obtain the distance between the virtual object and each virtual natural element in the virtual scene, and select the virtual natural element with the smallest distance from the virtual object as the target virtual natural element; the first display module is further configured to dynamically display, in the map, the relative positional relationship between the virtual object and the target virtual natural element.
- In some embodiments, the device further includes a second display module configured to display, when the virtual object is within the negative impact area of the virtual natural element, the first negative impact value caused by the virtual natural element to the virtual object; wherein the first negative impact value is used to indicate the degree of obstruction caused to the movement of the virtual object within the negative impact area.
- In some embodiments, the device further includes an overlapping module configured to: when the number of virtual natural elements is multiple and the negative impact areas of at least two of the virtual natural elements overlap, determine the corresponding overlapping area; and, when the virtual object is in the overlapping area, superimpose, for the at least two virtual natural elements that form the overlapping area, the first negative impact values caused by each of the virtual natural elements to obtain a target first negative impact value for the virtual object; the second display module is further configured to display the target first negative impact value caused by the at least two virtual natural elements to the virtual object.
- In some embodiments, the device further includes a third display module configured to obtain the duration of the virtual object in the virtual scene and to dynamically display a second negative impact value of the virtual object, the second negative impact value being positively correlated with the duration; wherein the second negative impact value is used to indicate the degree of obstruction caused to the movement of the virtual object in the virtual scene.
- In some embodiments, the second negative impact value is the value of the negative impact caused by a global influence source of the virtual scene on the virtual object, and the global influence source has a hidden attribute and an advancement attribute; the advancement attribute causes the global influence source to have at least two stages including a first stage and a second stage, the first stage corresponding to a first correlation coefficient between the second negative impact value and the duration, the second stage corresponding to a second correlation coefficient between the second negative impact value and the duration, and the value of the second correlation coefficient being greater than the value of the first correlation coefficient; the third display module is further configured to obtain an initial value of the negative impact caused by the global influence source on the virtual object, multiply the initial value by the value of the first correlation coefficient to obtain the second negative impact value, and dynamically display the second negative impact value; and, when it is determined that the global influence source advances from the first stage to the second stage, multiply the initial value by the value of the second correlation coefficient to obtain a target negative impact value, and adjust the displayed second negative impact value to the target negative impact value.
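- The staged computation performed by the third display module can be pictured as follows; the coefficient values, the stage threshold, and the example durations are hypothetical and only illustrate the "initial value times current-stage coefficient" idea.

```python
def second_negative_impact(initial_value: float,
                           duration: float,
                           stage_threshold: float,
                           first_coeff: float = 1.0,
                           second_coeff: float = 2.5) -> float:
    """Second negative impact value of the global influence source: the initial
    value multiplied by the correlation coefficient of the current stage, where
    the second-stage coefficient is larger than the first-stage one."""
    coeff = first_coeff if duration < stage_threshold else second_coeff
    return initial_value * coeff

print(second_negative_impact(10.0, duration=120.0, stage_threshold=300.0))  # 10.0
print(second_negative_impact(10.0, duration=450.0, stage_threshold=300.0))  # 25.0
```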
- In some embodiments, the virtual natural element has an advancement attribute, and the device further includes an advancement module configured to display, when an advancement condition of the virtual natural element is met, advancement prompt information corresponding to the virtual natural element; wherein the advancement prompt information is used to indicate that the virtual natural element has advanced, and the advanced object obtained by transforming the advanced virtual natural element satisfies at least one of the following conditions: its health value is higher than the health value of the target object; the damage it causes to the virtual object by performing an interactive operation is higher than the damage caused to the virtual object by the target object performing an interactive operation.
- In some embodiments, the device further includes an enhancement module configured to: when a new virtual natural element is generated in the virtual scene and the negative impact area of the newly generated virtual natural element overlaps with the negative impact area of an existing virtual natural element, display the new virtual natural element and enhance the effect of the virtual natural phenomenon in the overlapping area.
- In some embodiments, the device further includes a generating module configured to generate at least one interactive first object in an idle state when the virtual object is within the sensing area of the virtual natural element, and, in response to an interaction instruction for the virtual object, control the virtual object to interact with the first object in the virtual scene, the interaction being used to transition the first object from the idle state to the interactive state.
- In some embodiments, the generating module is further configured to obtain the level of the virtual object when the virtual object is within the sensing area of the virtual natural element, determine the object type corresponding to the level based on the level, and, based on the object type, generate at least one interactive first object in an idle state.
- In some embodiments, the second control module 4553 is further configured to display a search screen in which the target object searches for the virtual object in the virtual scene, and, when the target object has found the virtual object and performed an interactive operation on the virtual object, control the virtual object and the target object to interact in the virtual scene upon receiving an interaction instruction for the target object.
- In some embodiments, the device further includes a third control module configured to control the target object to transform into the virtual natural element when the target object does not find the virtual object within the sensing area within a target duration.
- In some embodiments, the device further includes a state transition module configured to control the virtual object to transition from the interactive state to a hidden state in response to a hiding instruction for the virtual object, the hidden state making it impossible for the target object to perceive the virtual object; to display a screen of the target object searching for the virtual object in the virtual scene; and, when the target object does not find the hidden virtual object within the target duration, to control the target object to transform into the virtual natural element.
- In some embodiments, the device further includes a fourth control module configured to control the virtual object to leave the sensing area in response to an area-leaving operation for the virtual object, and, when the duration for which the virtual object has left the sensing area reaches a target duration, control the target object to transform into the virtual natural element.
- In some embodiments, the device further includes a killing module configured to display, when the virtual object kills the target object, virtual resources used as rewards in the virtual scene; wherein the virtual resources are used for application in the virtual scene.
- In some embodiments, the device further includes an elimination module configured to eliminate, when the virtual object kills the target object, the negative impact caused by the virtual natural element within the negative impact area, at a preset rate and with the position of the target object as the center.
- In some embodiments, the device further includes a selection module configured to display level options corresponding to at least two difficulty levels, where different difficulty levels correspond to different numbers of virtual natural elements, the at least two difficulty levels include a target difficulty level, and the target difficulty level corresponds to a target number of virtual natural elements; and, in response to a selection operation on the level option of the target difficulty level, display in the virtual scene the virtual object and the target number of virtual natural elements belonging to the virtual natural phenomenon.
- In some embodiments, the display module 4551 is further configured to display, in the virtual scene, multiple virtual natural elements belonging to at least two virtual natural phenomena; wherein the virtual natural elements of different virtual natural phenomena have different negative impacts on the environments in which they are located.
- In some embodiments, the device further includes a movement module configured to: in response to a movement instruction for the virtual object, control the virtual object to move toward the virtual natural element and display an interactive second object in the virtual scene while the virtual object moves; when the virtual object has not entered the negative impact area, display a screen of the second object searching for the virtual object in the virtual scene; and, when the second object has found the virtual object and performed an interactive operation on the virtual object, control the virtual object and the second object to interact in the virtual scene in response to the interaction instruction for the virtual object.
- In some embodiments, the device further includes a marking module configured to mark the virtual object when the virtual object enters the negative impact area and to display the virtual object carrying the mark; wherein the mark is used to prevent the second object from finding the virtual object.
- In some embodiments, the display module 4551 is further configured to display, when the virtual natural phenomenon is a virtual tornado, virtual natural elements belonging to the virtual tornado in the virtual scene; wherein the light intensity in the negative impact area of the virtual natural element is lower than that in the non-negative-impact area of the virtual scene, and the virtual natural element is destructive to virtual objects in the negative impact area.
- An embodiment of the present application also provides an electronic device, where the electronic device includes: a memory configured to store computer-executable instructions; and a processor configured to implement, when executing the computer-executable instructions stored in the memory, the interaction method in the virtual scene provided by the embodiments of the present application.
- Embodiments of the present application provide a computer program product or computer program.
- the computer program product or computer program includes computer-executable instructions, and the computer-executable instructions are stored in a computer-readable storage medium.
- the processor of the electronic device reads the computer-executable instructions from the computer-readable storage medium, and the processor executes the computer-executable instructions, so that the electronic device executes the interaction method in the virtual scene described above in the embodiment of the present application.
- Embodiments of the present application provide a computer-readable storage medium in which computer-executable instructions are stored. When executed by a processor, the computer-executable instructions cause the processor to execute the interaction method in the virtual scene provided by the embodiments of the present application, for example, the interaction method in the virtual scene shown in Figure 3.
- In some embodiments, the computer-readable storage medium may be a memory such as an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM; it may also be any device including one of the above memories or any combination thereof.
- In some embodiments, the computer-executable instructions may take the form of a program, software, software module, script, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- As an example, the computer-executable instructions may, but need not, correspond to files in a file system, and may be stored as part of a file holding other programs or data, for example in one or more scripts in a Hyper Text Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple collaborative files (for example, files storing one or more modules, subroutines, or portions of code).
- As an example, the computer-executable instructions may be deployed to be executed on one electronic device, on multiple electronic devices located at one site, or on multiple electronic devices distributed across multiple sites and interconnected by a communication network.
- In summary, in the embodiments of the present application, when the virtual object approaches the virtual natural element, the virtual natural element is converted into an interactive object to realize the interaction process, which enriches the interactive objects during free exploration and reduces the exploration time, that is, it reduces the time spent on the human-computer interaction operations performed to realize the interaction process, thereby improving the efficiency of human-computer interaction and the utilization rate of the hardware resources of the electronic device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Radar, Positioning & Navigation (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An interaction method in a virtual scene, including: displaying, in a virtual scene, a virtual object and virtual natural elements belonging to a virtual natural phenomenon, wherein the virtual natural elements are used to cause a negative impact on the environment in which they are located; when the virtual object is within the sensing area of a virtual natural element, controlling the virtual natural element to transform into an interactive target object, wherein the negative impact area of the virtual natural element includes the sensing area; and, when an interaction instruction for the target object is received, controlling the virtual object and the target object to interact in the virtual scene. An interaction device in a virtual scene, an electronic device, a computer-readable storage medium, and a computer program product are also provided. The interaction method enriches the interactive objects during free exploration and reduces the exploration time, that is, it reduces the time spent on the human-computer interaction operations performed to realize the interaction process, thereby improving the efficiency of human-computer interaction and the utilization rate of the hardware resources of the electronic device.
Description
相关申请的交叉引用
本申请实施例基于申请号为202210876584.3、申请日为2022年07月25日的中国专利申请提出,并要求该中国专利申请的优先权,该中国专利申请的全部内容在此引入本申请实施例作为参考。
本申请涉及虚拟化和人机交互技术领域,尤其涉及一种虚拟场景中的交互方法、装置、电子设备、计算机可读存储介质以及计算机程序产品。
相关技术中的大部分3D(Three Dimensions)开放世界游戏中,玩家在地图里是可以完全自由探索的。在自由探索期间,玩家需要主动去寻找任务目标或者怪物进行交互,但是因为缺乏引导或交互对象过于单一等原因,实际体验中玩家容易感到迷茫、缺失游戏目标,从而导致玩家流失;同时,由于大部分时间都花在寻找任务目标或者怪物上,玩家不能在游戏中与任务目标或者怪物进行充分地交互,因此导致人机交互效率过低,从而造成了硬件处理资源的浪费。
发明内容
本申请实施例提供一种虚拟场景中的交互方法、装置、电子设备、计算机可读存储介质以及计算机程序产品,能够提高人机交互效率以及硬件处理资源的利用率。
本申请实施例的技术方案是这样实现的:
本申请实施例提供一种虚拟场景中的交互方法,该方法由电子设备执行,包括:
在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;
其中,所述虚拟自然元素,用于对所述虚拟自然元素所处的环境造成负面影响;
当所述虚拟对象处于所述虚拟自然元素的感应区域内时,控制所述虚拟自然元素转化为可交互的目标对象;
其中,所述虚拟自然元素的负面影响区域包括所述感应区域;
当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
本申请实施例提供一种虚拟场景中的交互装置,包括:。
展示模块,配置为在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;其中,所述虚拟自然元素,用于对所述虚拟自然元素所处的环境造成负面影响;
第一控制模块,配置为当所述虚拟对象处于所述虚拟自然元素的感应区域内时,控制所述虚拟自然元素转化为可交互的目标对象;其中,所述虚拟自然元素的负面影响区域包括所述感应区域;
第二控制模块,配置为当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
本申请实施例提供一种电子设备,包括:
存储器,配置为存储计算机可执行指令;
处理器,配置为执行所述存储器中存储的计算机可执行指令时,实现本申请实施例提供
的虚拟场景中的交互方法。
本申请实施例提供一种计算机可读存储介质,存储有计算机可执行指令,所述计算机可执行指令被处理器执行时,实现本申请实施例提供的虚拟场景中的交互方法。
本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机可执行指令,该计算机可执行指令存储在计算机可读存储介质中。电子设备的处理器从计算机可读存储介质读取该计算机可执行指令,处理器执行该计算机可执行指令,使得该电子设备执行本申请实施例提供的虚拟场景中的交互方法。
本申请实施例具有以下有益效果:
在本申请上述实施例中,在虚拟场景中具有虚拟对象、以及对虚拟场景中虚拟环境造成负面影响的虚拟自然元素,当虚拟对象进入虚拟自然元素的负面影响区域时,将虚拟自然元素转化为可供虚拟对象进行交互的目标对象,从而使得虚拟对象与目标对象进行交互。如此,在虚拟对象靠近虚拟自然元素时,将虚拟自然元素转化为可交互对象以实现交互过程,丰富了自由探索期间的交互对象,减少了探索时间,也即减少了用于实现交互过程所执行的人机交互操作的时间,从而提高了人机交互效率以及电子设备的硬件资源利用率。
Figure 1 is a schematic architectural diagram of the interaction system 100 in a virtual scene provided by an embodiment of the present application;
Figure 2 is a schematic structural diagram of the electronic device provided by an embodiment of the present application;
Figure 3 is a schematic flow chart of the interaction method in a virtual scene provided by an embodiment of the present application;
Figure 4 is a schematic diagram of the relative positional relationship between a virtual object and a virtual natural element provided by an embodiment of the present application;
Figure 5 is a schematic diagram of the relative positional relationship between a virtual object and a virtual natural element provided by an embodiment of the present application;
Figure 6 is a schematic diagram of a virtual object traveling toward a target virtual natural element provided by an embodiment of the present application;
Figure 7 is a schematic diagram of a virtual natural element causing a negative impact on a virtual object provided by an embodiment of the present application;
Figure 8 is a schematic diagram of the superposition of the negative impact areas of virtual natural elements provided by an embodiment of the present application;
Figure 9 is a schematic diagram of the relationship between the second negative impact value and the duration provided by an embodiment of the present application;
Figure 10 is a schematic diagram of the advancement prompt information of the global influence source provided by an embodiment of the present application;
Figure 11 is a schematic diagram of a virtual natural element provided by an embodiment of the present application;
Figure 12 is a schematic diagram of a target object provided by an embodiment of the present application;
Figure 13 is a schematic diagram of a first object provided by an embodiment of the present application;
Figure 14 is a schematic diagram of the process of transforming a target object into a virtual natural element provided by an embodiment of the present application;
Figure 15 is a schematic diagram of the advancement prompt information of a virtual natural element provided by an embodiment of the present application;
Figure 16 is a flow chart of the switching logic between pollution sources and key monsters provided by an embodiment of the present application.
为了使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施例作详细描述,所描述的实施例不应视为对本申请实施例的限制,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其它实施例,都属于本申请保护的范围。
在以下的描述中,涉及到“一些实施例”,其描述了所有可能实施例的子集,但是可以理解,“一些实施例”可以是所有可能实施例的相同子集或不同子集,并且可以在不冲突的情况下相互结合。
在以下的描述中,所涉及的术语“第一\第二\第三”仅仅是区别类似的对象,不代表针对对象的特定排序,可以理解地,“第一\第二\第三”在允许的情况下可以互换特定的顺序或先后次序,以使这里描述的本申请实施例能够以除了在这里图示或描述的以外的顺序实施。
除非另有定义,本文所使用的所有的技术和科学术语与属于本申请的技术领域的技术人
员通常理解的含义相同。本文中所使用的术语只是为了描述本申请实施例的目的,不是旨在限制本申请。
对本申请实施例进行进一步详细说明之前,对本申请实施例中涉及的名词和术语进行说明,本申请实施例中涉及的名词和术语适用于如下的解释。
1)射击游戏,包含第一人称射击游戏、第三人称射击游戏等包含但不仅限于此的所有使用热兵器类进行远程攻击的游戏。
2)第三人称视角,游戏内摄像机在玩家角色后方一定距离的位置,画面中可以看到角色以及周围一定环境内的所有战斗要素的视角。
3)开放世界,指游戏中的战斗场景完全自由开放的游戏虚拟场景,在开放世界中,玩家可以朝任何方向自由前进探索,各个方位的边界之间距离非常大。
4)响应于,用于表示所执行的操作所依赖的条件或者状态,当满足所依赖的条件或状态时,所执行的一个或多个操作可以是实时的,也可以具有设定的延迟;在没有特别说明的情况下,所执行的多个操作不存在执行先后顺序的限制。
5)虚拟场景,是应用程序在终端上运行时显示(或提供)的虚拟场景。该虚拟场景可以是对真实世界的仿真环境,也可以是半仿真半虚构的虚拟环境,还可以是纯虚构的虚拟环境。虚拟场景可以是二维虚拟场景、2.5维虚拟场景或者三维虚拟场景中的任意一种。
例如,虚拟场景可以包括天空、陆地、海洋等,该陆地可以包括沙漠、城市等环境元素,用户可以控制虚拟对象在该虚拟场景中进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。虚拟场景可以是以第一人称视角显示虚拟场景(例如以用户自己的视角来扮演游戏中的虚拟对象);也可以是以第三人称视角显示虚拟场景(例如用户追着游戏中的虚拟对象来进行游戏);还可以是以鸟瞰大视角显示虚拟场景,上述的视角之间可以任意切换。
6)虚拟对象,虚拟场景中可以进行交互的各种人和物的形象,或在虚拟场景中的可活动对象。该可活动对象可以是虚拟人物、虚拟动物、动漫人物等,比如:在虚拟场景中显示的人物、动物、植物、油桶、墙壁、石块等。该虚拟对象可以是该虚拟场景中的一个虚拟的用于代表用户的虚拟形象。虚拟场景中可以包括多个虚拟对象,每个虚拟对象在虚拟场景中具有自身的形状和体积,占据虚拟场景中的一部分空间。
例如,该虚拟对象可以是通过客户端上的操作进行控制的用户角色,也可以是通过训练设置在虚拟场景对战中的人工智能(Artificial Intelligence,AI),还可以是设置在虚拟场景互动中的非用户角色(Non-Player Character,NPC)。其中,虚拟场景中参与互动的虚拟对象的数量可以是预先设置的,也可以是根据加入互动的客户端的数量动态确定的。
相关技术的开放世界射击游戏中,因为要让玩家可以自由探索,所以一般不会有线性或强提示的目标牵引,即基本是在玩家靠近战斗目标或自己找到关键NPC接到任务了,才会有一定的提示或引导高速玩家应该往哪个方向前进,实际体验也大多是找到若干个指定怪物击杀或者受击物品、NPC对话等等,游戏体验较为单调,缺乏一定的包装以及开放世界代入感的氛围营造,这样,则会导致玩家可能会在大地图中感到目标缺失,进而流失;同时由于缺乏足够的填充内容,当玩家迅速找到局内目标并击杀时,游戏时长也会过早结束,使游戏内容消耗快于预期,进一步导致玩家的流失。
基于此,本申请实施例提供一种虚拟场景中的交互方法、装置、电子设备、计算机可读存储介质以及计算机程序产品,通过在地图中创建若干个玩家可选的虚拟自然元素,当感到目标迷茫时引导玩家在大地图中挑战距离自己最近的虚拟自然元素,虚拟自然元素在玩家靠近时会转化成怪物与玩家战斗,玩家只要挑战成功就可以获得丰厚奖励,这样在丰富局内体验的同时避免玩家流失。
参见图1,图1是本申请实施例提供的虚拟场景中的交互系统100的架构示意图,为实现虚拟场景中的交互的应用场景(例如,虚拟场景中的交互的应用场景可以是基于游戏APP
(Application,APP)中的虚拟场景来进行交互的应用场景,比如玩家在玩游戏APP时,在虚拟场景中展现虚拟自然元素如龙卷风,当玩家靠近龙卷风时,龙卷风转换成可交互的怪物,从而使得玩家采用投射道具或者技能对怪物进行击杀),终端(示例性示出了终端400)上设置有虚拟场景中的交互客户端401(即游戏APP),终端400通过网络300连接服务器200,其中,网络300可以是广域网或者局域网,又或者是二者的组合,使用无线或有线链路实现数据传输。
其中,终端400配置为,响应于针对包含虚拟对象以及归属于虚拟自然现象的虚拟自然元素的虚拟场景的触发操作,发送虚拟场景的展示请求至服务器200;
服务器200配置为,基于接收到的虚拟场景的展示请求,发送包含虚拟对象以及归属于虚拟自然现象的虚拟自然元素的虚拟场景至终端400;
终端400还配置为,接收包含虚拟对象以及归属于虚拟自然现象的虚拟自然元素的虚拟场景;呈现虚拟场景,并在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;其中,虚拟自然元素,用于对虚拟自然元素所处的环境造成负面影响;当虚拟对象处于虚拟自然元素的感应区域内时,控制虚拟自然元素转化为可交互的目标对象;其中,虚拟自然元素的负面影响区域包括感应区域;当接收到针对目标对象的交互指令时,控制虚拟对象与目标对象在虚拟场景中进行交互。
一些实施例中,服务器200可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络(Content Deliver Network,CDN)、以及大数据和人工智能平台等基础云计算服务的云服务器。终端400可以是智能手机、平板电脑、笔记本电脑、台式计算机、机顶盒、智能语音交互设备、智能家电、虚拟现实设备、车载终端、飞行器、以及移动设备(例如,移动电话,便携式音乐播放器,个人数字助理,专用消息设备,便携式游戏设备,智能音箱及智能手表)等,但并不局限于此。终端设备以及服务器可以通过有线或无线通信方式进行直接或间接地连接,本申请实施例中不做限制。
接下来对实施本申请实施例提供的虚拟场景中的交互方法的电子设备进行说明。参见图2,图2是本申请实施例提供的电子设备的结构示意图,以电子设备为图1中所示的终端为例,图2所示的电子设备包括:至少一个处理器410、存储器450、至少一个网络接口420和用户接口430。终端400中的各个组件通过总线系统440耦合在一起。可理解,总线系统440用于实现这些组件之间的连接通信。总线系统440除包括数据总线之外,还包括电源总线、控制总线和状态信号总线。但是为了清楚说明起见,在图2中将各种总线都标为总线系统440。
处理器410可以是一种集成电路芯片,具有信号的处理能力,例如通用处理器、数字信号处理器(Digital Signal Processor,DSP),或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等,其中,通用处理器可以是微处理器或者任何常规的处理器等。
用户接口430包括使得能够显示媒体内容的一个或多个输出装置431,包括一个或多个扬声器和/或一个或多个视觉显示屏。用户接口430还包括一个或多个输入装置432,包括有助于用户输入的用户接口部件,比如键盘、鼠标、麦克风、触屏显示屏、摄像头、其他输入按钮和控件。
存储器450可以是可移除的,不可移除的或其组合。示例性的硬件设备包括固态存储器,硬盘驱动器,光盘驱动器等。存储器450可选地包括在物理位置上远离处理器410的一个或多个存储设备。
存储器450包括易失性存储器或非易失性存储器,也可包括易失性和非易失性存储器两者。非易失性存储器可以是只读存储器(ROM,Read Only Memory),易失性存储器可以是随机存取存储器(RAM,Random Access Memory)。本申请实施例描述的存储器450旨在包
括任意适合类型的存储器。
在一些实施例中,存储器450能够存储数据以支持各种操作,这些数据的示例包括程序、模块和数据结构或者其子集或超集,下面示例性说明。
操作系统451,包括配置为处理各种基本系统服务和执行硬件相关任务的系统程序,例如框架层、核心库层、驱动层等,配置为实现各种基础业务以及处理基于硬件的任务;
网络通信模块452,配置为经由一个或多个(有线或无线)网络接口420到达其他电子设备,示例性的网络接口420包括:蓝牙、无线相容性认证(WiFi)、和通用串行总线(USB,Universal Serial Bus)等;
呈现模块453,配置为经由一个或多个与用户接口430相关联的输出装置431(例如,显示屏、扬声器等)使得能够显示信息(例如,用于操作外围设备和显示内容和信息的用户接口);
输入处理模块454,配置为对一个或多个来自输入装置432的用户输入或互动进行检测以及翻译所检测的输入或互动。
在一些实施例中,本申请实施例提供的装置可以采用软件方式实现,图2示出了存储在存储器450的虚拟场景中的交互装置455,其可以是程序和插件等形式的软件,包括以下软件模块:展示模块4551、第一控制模块4552以及第二控制模块4553,这些模块是逻辑上的,因此根据所实现的功能可以进行任意的组合或进一步拆分。将在下文中说明各个模块的功能。
在另一些实施例中,本申请实施例提供的装置可以采用硬件方式实现,作为示例,本申请实施例提供的虚拟场景中的交互装置可以是采用硬件译码处理器形式的处理器,其被编程以执行本申请实施例提供的虚拟场景中的交互方法,例如,硬件译码处理器形式的处理器可以采用一个或多个应用专用集成电路(Application Specific Integrated Circuit,ASIC)、DSP、可编程逻辑器件(Programmable Logic Device,PLD)、复杂可编程逻辑器件(Complex Programmable Logic Device,CPLD)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或其他电子元件。
在一些实施例中,终端或服务器可以通过运行计算机程序来实现本申请实施例提供的虚拟场景中的交互方法。举例来说,计算机程序可以是操作系统中的原生程序或软件模块;可以是本地(Native)应用程序(Application,APP),即需要在操作系统中安装才能运行的程序,如即时通信APP、网页浏览器APP;也可以是小程序,即只需要下载到浏览器环境中就可以运行的程序;还可以是能够嵌入至任意APP中的小程序。总而言之,上述计算机程序可以是任意形式的应用程序、模块或插件。
基于上述对本申请实施例提供的虚拟场景中的交互系统及电子设备的说明,下面说明本申请实施例提供的虚拟场景中的交互方法。在实际实施时,本申请实施例提供的虚拟场景中的交互方法可以由终端或服务器单独实现,或者由终端及服务器协同实现,以由图1中的终端400单独执行本申请实施例提供的虚拟场景中的交互方法为例进行说明。参见图3,图3是本申请实施例提供的虚拟场景中的交互方法的流程示意图,将结合图3示出的步骤进行说明。
步骤101,终端在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;其中,虚拟自然元素用于对虚拟自然元素所处的环境造成负面影响。
在实际实施时,终端上安装有支持虚拟场景的应用程序。该应用程序可以是第一人称射击游戏、第三人称射击游戏、多人在线战术竞技游戏、虚拟现实应用程序、三维地图程序、或者多人枪战类生存游戏中的任意一种。用户可以使用终端操作位于虚拟场景中的虚拟对象进行活动。
当用户打开终端上的应用程序,且终端运行该应用程序时,终端呈现虚拟场景的画面,这里,虚拟场景的画面是以第一人称对象视角对虚拟场景观察得到,或是以第三人称视角对虚拟场景观察得到,虚拟场景的画面中包括虚拟对象以及归属于虚拟自然现象的虚拟自然元
素,虚拟对象可以是由当前玩家控制的玩家角色,也可以是与当前玩家属于相同群组的其他玩家(队友)控制的玩家角色,虚拟自然元素所归属的虚拟自然现象可以是如龙卷风、火山等对所处的环境造成负面影响的自然现象;而虚拟自然元素对所处的环境所造成负面影响的区域为负面影响区域。
作为示例,当虚拟自然现象为虚拟龙卷风时,在虚拟场景中,展示归属于虚拟龙卷风的虚拟自然元素;其中,虚拟自然元素的负面影响区域为虚拟龙卷风所影响的区域,虚拟自然元素的负面影响区域内的光照强度低于虚拟场景中的非负面影响区域、且对负面影响区域内的虚拟物体有破坏性。
作为示例,当虚拟自然现象为虚拟火山时,在虚拟场景中,展示归属于虚拟火山的虚拟自然元素;其中,虚拟自然元素的负面影响区域为虚拟火山所影响的区域,虚拟自然元素的负面影响区域内的地裂程度以及环境温度高于虚拟场景中的非负面影响区域、且对负面影响区域内的虚拟物体有破坏性。
需要说明的是,在虚拟自然元素的负面影响范围内,距离该虚拟自然元素距离越近,虚拟自然元素对相应环境所造成的负面影响越强。
在实际实施时,当虚拟自然元素的数量是多个时,多个虚拟自然元素至少归属于两种虚拟自然现象,例如火山、龙卷风等,在虚拟场景中,展示归属于虚拟自然现象的虚拟自然元素的过程包括,在虚拟场景中,展示归属于至少两个虚拟自然现象的多个虚拟自然元素;其中,不同虚拟自然现象的虚拟自然元素对所处环境造成的负面影响不同,例如龙卷风用于降低虚拟自然元素所处环境的光照强度以及破坏虚拟自然元素所处环境中的虚拟物体;而火山用于干裂虚拟自然元素所处环境的地面、升高虚拟自然元素所处环境的温度以及破坏虚拟自然元素所处环境中的虚拟物体。
应用上述实施例,当虚拟自然元素的数量是多个时,多个虚拟自然元素至少归属于两种虚拟自然现象,在虚拟场景中,展示归属于至少两个虚拟自然现象的多个虚拟自然元素,如此,增加了虚拟场景中交互过程的多样性,提高了用户的沉浸感和交互体验,从而提高了人机交互效率以及电子设备的硬件资源利用率。
在一些实施例中,在展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素的同时,在虚拟场景中,还会显示虚拟场景的地图,并在地图中动态显示虚拟对象与虚拟自然元素的相对位置关系。示例性地,参见图4,图4是本申请实施例提供的虚拟对象与虚拟自然元素的相对位置关系的示意图,基于图4,虚线框401中为虚拟对象,如虚线框402中的阴影圆为虚拟自然元素,随着虚拟对象的移动,在地图中动态显示虚拟对象与虚拟自然元素的相对位置关系。
在实际应用中,在展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素的同时,显示虚拟场景的地图,从而在地图中动态显示虚拟对象与虚拟自然元素的相对位置关系,使得用户更清楚自身与虚拟自然元素间的距离,方便用户找到相应虚拟自然元素,提高了用户体验,同时减少了用户寻找虚拟自然元素时间。
在实际实施时,在地图中动态显示虚拟对象与虚拟自然元素的相对位置关系的方式还可以是在地图中动态显示虚拟对象与虚拟自然元素间的行进路径,并在行进路径的一端标识虚拟对象,在行进路径的另一端标识虚拟自然元素;其中,行进路径,用于引导虚拟对象沿行进路径移动至虚拟自然元素的感应区域内。示例性地,参见图5,图5是本申请实施例提供的虚拟对象与虚拟自然元素的相对位置关系的示意图,基于图5,显示虚拟对象与虚线框502中虚拟自然元素的行进路径,这里,虚线框501中为设置于虚拟场景中的补给位置如虚拟商店等,这里的补给位置用于提高虚拟对象的状态值如血量、行进速度等。这里,行进路径可以是包含补给位置的路径,也可以是距离最短的路径,如此,呈现不同路线供虚拟对象选择,以使虚拟对象依据自身状态,确定是否需要补给,还是直接选择距离最短的路线,加快行进效率,从而避免浪费路途时间。
需要说明的是,虚拟对象在基于行进路径向虚拟自然元素行进时,可以是直接依据地图上的行进路径,还可以是在虚拟场景中高亮虚拟对象脚下或者载具下的路径,以供虚拟对象向虚拟自然元素行进。
应用上述实施例,通过在地图中动态显示虚拟对象与虚拟自然元素间的行进路径,使得用户更清楚自身与虚拟自然元素间的相对位置,方便用户找到相应虚拟自然元素,提高了用户体验,同时减少了用户寻找虚拟自然元素时间,提高了人机交互效率以及硬件处理资源的利用率。
在实际实施时,当虚拟自然元素的数量为多个时,还需从多个虚拟自然元素中确定目标虚拟自然元素,这里,目标虚拟自然元素可以直接由终端确定,也可以由用户来确定,接下来,通过以下两个示例来说明目标虚拟自然元素的确定过程,需要说明的是,目标虚拟自然元素的确定方式包括但不限于以下两种方式,本申请实施例对此不做限制。
在一些实施例中,当虚拟自然元素的数量为多个时,获取虚拟对象与各虚拟自然元素在虚拟场景中的距离;选取与虚拟对象的距离最小的虚拟自然元素为目标虚拟自然元素,从而在地图中动态显示虚拟对象与目标虚拟自然元素的相对位置关系。
在实际应用中,当虚拟自然元素的数量为多个时,只显示虚拟对象与距虚拟对象最近的虚拟自然元素的相对位置关系,方便用户找到离自己最近的虚拟自然元素,减少了用户寻找虚拟自然元素时间。
在另一些实施例中,当虚拟自然元素的数量为多个时,呈现对应各虚拟自然元素的选项;响应于针对目标虚拟自然元素对应目标选项的选择操作,将目标选项对应的虚拟自然元素作为目标虚拟自然元素,从而在地图中动态显示虚拟对象与目标虚拟自然元素的相对位置关系。
需要说明的是,当确定了目标虚拟自然元素之后,即呈现虚拟对象与目标虚拟自然元素间的行进路径,同时显示虚拟对象对应的方向移动控件,该方向移动控件可以是一个摇杆控件,然后响应于基于方向移动控件接收的移动控制指令,控制虚拟对象按照移动控制指令所指示的方向(包括前进、后退、左移和右移)向目标虚拟自然元素进行移动。示例性地,参见图6,图6是本申请实施例提供的虚拟对象向目标虚拟自然元素行进的示意图,基于图6,虚线框601中是方向移动控件,当基于该方向移动控件接收到针对虚拟对象的移动控制指令(如左移控制指令)时,响应于移动控制指令,控制虚拟对象向左移动。
在实际实施时,虚拟自然元素除了对自身所处环境有负面影响,还会对虚拟对象产生负面影响,当虚拟对象处于虚拟自然元素的负面影响区域内时,显示虚拟自然元素对虚拟对象所造成的第一负面影响值;其中,第一负面影响值,用于指示虚拟自然元素对虚拟对象在负面影响区域内的移动所造成的阻碍程度,也即阻碍大小。需要说明的是,这里的对虚拟对象产生的第一负面影响值用于指示对虚拟对象的状态值造成的负面影响,例如,降低虚拟对象的行进速度、降低虚拟对象的血量等。
示例性地,参见图7,图7是本申请实施例提供的虚拟自然元素对虚拟对象造成负面影响的示意图,基于图7,当虚拟对象处于虚拟自然元素的负面影响区域内时,会在地图下方显示如虚线框701中所示的虚拟自然元素对虚拟对象所造成的第一负面影响值,如此,使得虚拟对象可以实时确定虚拟自然元素对自身所造成的负面影响,从而执行合适的用于消除负面影响的解决方法,例如,若第一负面影响值较高则快速找补给进行状态补充;若第一负面影响值较低则继续向虚拟自然元素行进。这里,第一负面影响值的高低可以与预先设定的负面影响阈值进行比较,当比较结果表征当前第一负面影响值大于该负面影响阈值时,确定当前第一负面影响值较高;当比较结果表征当前第一负面影响值小于该负面影响阈值时,确定当前第一负面影响值较低。
应用上述实施例,通过显示用于指示虚拟自然元素对虚拟对象在负面影响区域内的移动所造成的阻碍程度的第一负面影响值,使得虚拟对象可以实时感知并确定虚拟自然元素对自身所造成的负面影响,从而执行合适的解决方式,以及时消除该负面影响,从而提高了人机
交互效率以及电子设备的硬件资源利用率。
在一些实施例中,当虚拟自然元素存在多个时,多个虚拟自然元素对虚拟对象所造成的第一负面影响值可以进行叠加,当虚拟自然元素的数量为多个、且多个虚拟自然元素中至少两个虚拟自然元素的负面影响区域存在区域重叠时,确定相应的重叠区域;当虚拟对象处于重叠区域时,针对组成重叠区域的至少两个虚拟自然元素,将各虚拟自然元素所造成的第一负面影响值进行叠加,得到针对虚拟对象的目标第一负面影响值;从而显示至少两个虚拟自然元素对虚拟对象所造成的目标第一负面影响值。
在实际应用中,虚拟对象处于至少两个虚拟自然元素的负面影响区域的重叠区域时,将各虚拟自然元素所造成的第一负面影响值进行叠加,以得到针对虚拟对象的目标第一负面影响值,如此,使得加入虚拟场景的用户更好感知到自身所处位置,提高虚拟场景的场景效果以及用户的沉浸感。
需要说明的是,除了对虚拟对象所造成的第一负面影响值可以进行叠加,多个虚拟自然元素对所处环境造成的负面影响也可以进行叠加,当虚拟场景中有新虚拟自然元素生成、且生成的新虚拟自然元素的负面影响区域与虚拟自然元素的负面影响区域存在重叠区域时,展示新虚拟自然元素,并增强重叠区域内的虚拟自然现象的效果。参见图8,图8是本申请实施例提供的虚拟自然元素的负面影响区域的叠加示意图,基于图8,非叠加区域只受一个虚拟自然元素的负面影响,而叠加区域则是受两个虚拟自然元素的叠加的负面影响,且当虚拟对象处于叠加区域时,显示的是两个虚拟自然元素所对应的第一负面影响值之和。
示例性地,当存在重叠区域的两个虚拟自然元素对应的虚拟自然现象为虚拟龙卷风时,则在该重叠区域内光照强度低于负面影响区域中非重叠区域、且对虚拟物体的破坏性高于负面影响区域中非重叠区域;又或者,当存在重叠区域的两个虚拟自然元素对应的虚拟自然现象为虚拟龙卷风以及虚拟火山时,则在该重叠区域既存在对应虚拟龙卷风的虚拟自然现象,也会存在对应虚拟火山的虚拟自然现象、且对虚拟物体的破坏性高于负面影响区域中非重叠区域。
应用上述实施例,虚拟对象处于至少两个虚拟自然元素的负面影响区域的重叠区域时,将各虚拟自然元素所处环境造成的负面影响进行叠加,如此,使得加入虚拟场景的用户更好感知到自身所处位置,提高虚拟场景的场景效果以及用户的沉浸感,从而提高了人机交互效率以及电子设备的硬件资源利用率。
在一些实施例中,除了虚拟自然元素会对虚拟对象的产生负面影响,当虚拟对象未处于虚拟自然元素的负面影响范围内时,自身也会存在负面影响值,获取虚拟对象在虚拟场景中的时长;动态显示虚拟对象的第二负面影响值,第二负面影响值与时长呈正相关关系;其中,第二负面影响值,用于指示对虚拟对象在虚拟场景内的移动所造成的阻碍程度,也即阻碍大小。需要说明的是,这里的对虚拟对象产生的第二负面影响值用于指示对虚拟对象的状态值造成的负面影响,例如,降低虚拟对象的行进速度、降低虚拟对象的血量等。示例性地,参见图9,图9是本申请实施例提供的第二负面影响值与时长的关系的示意图,基于图9,虚拟对象处于虚拟场景中的时间越长,第二负面影响值的增长速率越快。
在实际应用中,除了虚拟自然元素会对虚拟对象的产生负面影响,当虚拟对象未处于虚拟自然元素的负面影响范围内时,也会存在与虚拟对象在虚拟场景中的时长相关联的负面影响值,如此,提高用户的消除负面影响的积极性以及用户的沉浸感。
在实际实施时,第二负面影响值为虚拟场景的全局影响源对虚拟对象所造成的负面影响的值,且全局影响源具有隐藏属性及进阶属性,进阶属性使得全局影响源具有包括第一阶段及第二阶段的至少两个阶段,第一阶段对应第二负面影响值与时长的第一相关系数,第二阶段对应第二负面影响值与时长的第二相关系数,且第二相关系数的值大于第一相关系数的值;当确定全局影响源处于第一阶段时,动态显示虚拟对象的第二负面影响值的过程包括,获取全局影响源对虚拟对象所造成的负面影响的初始值,并将初始值与第一相关系数的值进行相
乘,得到第二负面影响值;动态显示第二负面影响值;而当确定全局影响源由第一阶段进阶至第二阶段时,将初始值与第二相关系数的值进行相乘,得到目标负面影响值;调整显示的第二负面影响值至目标负面影响值。
应用上述实施例,通过虚拟对象在虚拟场景中的时长改变负面影响值的增长速率,如此,提高在虚拟场景中交互过程的多样性、以及用户的消除负面影响的积极性,从而提高了人机交互效率以及电子设备的硬件资源利用率。
需要说明的是,当确定全局影响源由第一阶段进阶至第二阶段时,还会显示对应全局影响源的进阶提示信息,获取虚拟对象在虚拟场景中的时长,当基于时长确定全局影响源由第一阶段进阶至第二阶段时,显示对应全局影响源的进阶提示信息,其中,该进阶提示信息,用于提示全局影响源已实现进阶。示例性地,参见图10,图10是本申请实施例提供的全局影响源的进阶提示信息的示意图,基于图10,当基于时长确定全局影响源由第一阶段进阶至第二阶段时,显示如框1001中所示的进阶提示信息。
需要说明的是,对于基于时长确定全局影响源由第一阶段进阶至第二阶段的过程,这里预先设定第二阶段对应的目标时长阈值,从而将时长与目标时长阈值进行比较,进而确定全局影响源由第一阶段进阶至第二阶段,将获取的时长与目标时长阈值进行比对,当比对结果表征时长大于目标时长阈值时,确定全局影响源由第一阶段进阶至第二阶段。
在实际应用中,通过显示用于提示虚拟自然元素已实现进阶的进阶提示信息,及时提醒用户负面影响值的增长速率得到增加,提高用户的交互体验。
在实际实施时,虚拟对象的第一负面影响值与第二负面影响值也可以进行叠加,当虚拟对象处于虚拟自然元素的负面影响区域内时,确定虚拟自然元素对虚拟对象所造成的第一负面影响值、以及全局影响源对虚拟对象所造成的第二负面影响值;将第一负面影响值与第二负面影响值进行叠加,得到虚拟对象的总负面影响值,并动态显示虚拟对象的总负面影响值。
需要说明的是,各虚拟对象的总负面影响值也是存在上限的,从而避免叠加后的负面影响效果溢出;同时,虚拟对象的总负面影响值也可以代表该虚拟对象在虚拟场景中所处位置的虚拟自然现象的强弱,从而渲染环境氛围,提升用户的代入感。
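上述叠加与上限限制可以用如下示意代码表示(上限取 100 仅为示例):

```python
IMPACT_CAP = 100.0  # 每个虚拟对象的总负面影响值上限,示意取 100

def total_impact_value(first_value, second_value, cap=IMPACT_CAP):
    """叠加第一负面影响值与第二负面影响值,并限制在上限内,避免叠加后效果溢出。"""
    return min(first_value + second_value, cap)
```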
在实际实施时,在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素之前,还可以确定虚拟自然元素的数量,这里,虚拟自然元素的数量的确定方式存在多种,接下来,以两种确定方式为例,对虚拟自然元素的数量的确定过程进行说明。
在一些实施例中,显示至少两个难度等级所分别对应的等级选项,不同难度等级所对应的虚拟自然元素的数量不同,至少两个难度等级包括目标难度等级,目标难度等级对应目标数量的虚拟自然元素;响应于针对目标难度等级的等级选项的选择操作,确定虚拟自然元素的目标数量,从而在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的目标数量的虚拟自然元素。
在另一些实施例中,当虚拟自然元素的数量为预先设定时,直接获取虚拟场景中虚拟自然元素的目标数量,从而在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的目标数量的虚拟自然元素。
在实际实施时,在虚拟场景初始化之前,相关开发人员会确定虚拟场景中至少两个虚拟自然元素的备选位置,然后基于确定的虚拟自然元素的目标数量,从至少两个虚拟自然元素的备选位置中,选取对应虚拟自然元素的数量的备选位置,作为展示虚拟自然元素的位置,从而在虚拟场景中,展示虚拟对象,并在选取的位置上,展示归属于虚拟自然现象的虚拟自然元素。这里,为了避免进入相同难度等级的虚拟场景时虚拟自然元素的位置相同,从至少两个虚拟自然元素的备选位置中选取对应数量的备选位置的过程可以采用随机选取的方式。
需要说明的是,从虚拟场景中确定虚拟自然元素的备选位置的考虑要素包括但不限于每个备选位置与虚拟对象出生点的距离,以及每个备选位置与虚拟场景中NPC的距离等,同
时针对不同虚拟场景或者不同难度等级的虚拟场景,虚拟自然元素的备选位置的数量可以不同。
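作为上述随机选取过程的一个示意性草图(难度等级与数量的对应关系为假设配置),可参考:

```python
import random

# 难度等级与虚拟自然元素目标数量的对应关系,数值为便于说明而假设
DIFFICULTY_TO_COUNT = {"简单": 3, "普通": 5, "困难": 8}

def choose_spawn_positions(candidate_positions, difficulty):
    """从预先确定的备选位置中随机选取目标数量的位置,用于展示虚拟自然元素。"""
    # 备选位置数量不足时全部激活(假设的兜底处理)
    count = min(DIFFICULTY_TO_COUNT[difficulty], len(candidate_positions))
    return random.sample(candidate_positions, count)
```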
步骤102,当虚拟对象处于虚拟自然元素的感应区域内时,控制虚拟自然元素转化为可交互的目标对象;其中,虚拟自然元素的负面影响区域包括感应区域。
需要说明的是,虚拟自然元素的感应区域可以是以虚拟自然元素所处位置为圆心,目标距离为半径的圆形区域,这里的目标距离为预先设定的如5米等;而负面影响区域为虚拟自然元素对所处的环境所造成负面影响的区域,感应区域与负面影响区域可以相同或不同,例如,感应区域可以在负面影响区域的区域范围内,如感应区域可以是以虚拟自然元素所处位置为圆心,目标距离5米为半径的圆形区域,而负面影响区域可以是以虚拟自然元素所处位置为圆心,目标距离7米为半径的圆形区域。
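基于上述半径设定,判断虚拟对象所处区域的逻辑可以用如下示意代码表示(半径取值沿用前文示例):

```python
import math

SENSE_RADIUS = 5.0   # 感应区域半径,对应前文示例中的 5 米
IMPACT_RADIUS = 7.0  # 负面影响区域半径,对应前文示例中的 7 米

def region_state(player_pos, element_pos):
    """判断虚拟对象处于感应区域、负面影响区域还是区域之外。"""
    d = math.hypot(player_pos[0] - element_pos[0], player_pos[1] - element_pos[1])
    if d <= SENSE_RADIUS:
        return "感应区域"      # 此时控制虚拟自然元素转化为可交互的目标对象
    if d <= IMPACT_RADIUS:
        return "负面影响区域"  # 此时显示第一负面影响值
    return "区域外"
```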
示例性地,参见图11以及图12,图11是本申请实施例提供的虚拟自然元素的示意图,图12是本申请实施例提供的目标对象的示意图,基于图11以及图12,虚拟对象在进入虚拟自然元素的感应区域后,虚拟自然元素即转化为如图12中所示的目标对象。
在一些实施例中,当虚拟自然元素转化为可交互的目标对象之后,目标对象在感应区域内追击虚拟对象,展示目标对象在虚拟场景中搜索虚拟对象的搜索画面;当在目标时长内目标对象未在感应区域内搜索到虚拟对象时,控制目标对象转化为虚拟自然元素;而对于当目标对象搜索到虚拟对象时的情况,将在步骤103中进行说明。
需要说明的是,由于目标对象会追击虚拟对象,因此,随着目标对象的移动,虚拟自然元素的位置也会随着移动,从而保证虚拟自然元素与目标对象位置的一致。
应用上述实施例,虚拟自然元素转化为可交互的目标对象之后,目标对象在感应区域内追击虚拟对象,当目标对象未搜索到虚拟对象时还会转化为虚拟自然元素,如此,增加了虚拟场景中交互过程的多样性,提高了用户的交互体验,从而提高了人机交互效率以及电子设备的硬件资源利用率。
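上述“搜索超时则转化回虚拟自然元素”的逻辑可以用如下状态更新草图示意(目标时长取 10 秒仅为假设):

```python
import time

class TargetObject:
    """目标对象在感应区域内搜索虚拟对象,超过目标时长仍未搜索到时转化回虚拟自然元素。"""

    def __init__(self, search_timeout=10.0):
        self.search_timeout = search_timeout       # 目标时长,假设为 10 秒
        self.last_found_at = time.monotonic()
        self.state = "目标对象"

    def update(self, found_player):
        if found_player:
            self.last_found_at = time.monotonic()  # 搜索到虚拟对象则重新计时
        elif time.monotonic() - self.last_found_at >= self.search_timeout:
            self.state = "虚拟自然元素"            # 超时未搜索到,转化回虚拟自然元素
        return self.state
```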
在一些实施例中,当虚拟对象处于虚拟自然元素的感应区域内时,还会生成至少一个处于空闲状态的可交互的第一对象,参见图13,图13是本申请实施例提供的第一对象的示意图,基于图11以及图13,虚拟对象在进入虚拟自然元素的感应区域后,虚拟自然元素在转化为可交互的目标对象的同时,虚拟场景中虚拟自然元素的位置还会生成如图13中所示的第一对象。
需要说明的是,这里的空闲状态用于指示非交互状态如非战斗状态,即第一对象不会主动追击虚拟对象而是用于干扰虚拟对象,而交互状态用于指示第一对象会主动追击虚拟对象,基于此,虚拟对象可选择是否与第一对象进行交互,当虚拟对象选择与第一对象进行交互时,第一对象才会与虚拟对象进行交互,响应于针对虚拟对象的交互指令,控制虚拟对象与第一对象在所述虚拟场景中进行交互,该交互用于将第一对象从空闲状态转变为交互状态。
在实际应用中,当虚拟对象处于虚拟自然元素的感应区域内时,会生成至少一个处于空闲状态的可交互的第一对象,虚拟对象可选择是否与第一对象进行交互,当虚拟对象选择与第一对象进行交互时,第一对象才会与虚拟对象进行交互,如此,增加了虚拟场景中交互对象以及交互方式的多样性,提高了用户的交互体验。
在实际实施时,第一对象的对象类型与虚拟对象的等级对应,基于虚拟对象的等级,第一对象的生成过程包括,当虚拟对象处于虚拟自然元素的感应区域内时,获取虚拟对象的等级,并基于等级,确定与等级对应的对象类型;基于对象类型,生成至少一个处于空闲状态的可交互的第一对象。
需要说明的是,对于基于等级,确定与等级对应的对象类型的过程,这里会预先设定等级与对象类型的对应关系,这里的第一对象的对象类型包括但不限于第一对象的等级、形态等,例如,当虚拟对象的等级为高等级时,则生成高等级的第一对象,当虚拟对象的等级为低等级时,则生成低等级的第一对象;又或者,当虚拟对象的等级为高等级时,则生成具有
飞行能力的第一对象,当虚拟对象的等级为低等级时,则生成无飞行能力的第一对象。基于此,基于等级,确定与等级对应的对象类型的过程,可以是,获取等级与对象类型的对应关系,基于等级以及对应关系,确定与等级对应的对象类型。
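作为上述按等级生成第一对象的一个示意性实现(等级与对象类型的对应关系、生成数量均为假设),可参考:

```python
# 等级与第一对象的对象类型的对应关系,仅为便于说明而假设
LEVEL_TO_OBJECT_TYPE = {
    "低等级": "无飞行能力的第一对象",
    "高等级": "具有飞行能力的第一对象",
}

def spawn_first_objects(player_level, count=3):
    """基于虚拟对象的等级确定对象类型,生成若干处于空闲状态的可交互的第一对象。"""
    object_type = LEVEL_TO_OBJECT_TYPE[player_level]
    return [{"对象类型": object_type, "状态": "空闲"} for _ in range(count)]
```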
应用上述实施例,基于等级,确定与等级对应的对象类型,从而生成对应该对象类型的至少一个处于空闲状态的可交互的第一对象。如此,提高在虚拟场景中交互过程的多样性、以及用户提升等级的积极性,从而提高了人机交互效率以及电子设备的硬件资源利用率。
在实际实施时,当虚拟对象击杀全部第一对象后,可以在虚拟场景中展示用作奖励的能够运用于虚拟场景的虚拟资源。示例性地,虚拟资源可以是用于降低虚拟对象负面影响值的补给、或者是用于执行针对目标对象的交互操作的道具、又或者是提高虚拟对象等级的经验值等。
在一些实施例中,在虚拟自然元素转化为目标对象之后,虚拟对象还可以脱离目标对象的感知区域,从而使得目标对象无法感知到自身,进而再转化为虚拟自然元素,响应于针对虚拟对象的区域离开操作,控制虚拟对象离开所述感应区域;当虚拟对象离开感应区域的时长达到目标时长时,控制目标对象转化为虚拟自然元素。
需要说明的是,在目标对象转化为虚拟自然元素之后,当虚拟对象没有与第一对象进行交互或者没有击杀全部第一对象时,在虚拟场景中的第一对象也会随之消失。
步骤103,当接收到针对目标对象的交互指令时,控制虚拟对象与目标对象在虚拟场景中进行交互。
在实际实施时,对于目标对象搜索到虚拟对象的情况,由于目标对象会追击虚拟对象,基于此,当接收到针对目标对象的交互指令时,控制虚拟对象与目标对象在虚拟场景中进行交互的过程包括,展示目标对象在虚拟场景中搜索虚拟对象的搜索画面;在目标对象搜索到虚拟对象、且执行了针对虚拟对象的交互操作的情况下,当接收到针对目标对象的交互指令时,控制虚拟对象与目标对象在虚拟场景中进行交互。
在实际应用中,虚拟自然元素转化为可交互的目标对象之后,目标对象在感应区域内追击虚拟对象,当目标对象搜索到虚拟对象时会主动与虚拟对象进行交互,如此,提高了在虚拟场景中交互过程的多样性、以及用户与目标对象交互的积极性。
在一些实施例中,在控制目标对象执行针对虚拟对象的交互操作或者控制虚拟对象与目标对象进行交互之后,虚拟对象还可以基于虚拟场景中的虚拟建筑进行隐藏,从而使得目标对象无法感知到自身,进而再转化为虚拟自然元素,响应于针对虚拟对象的隐藏指令,控制虚拟对象从交互状态转变为隐藏状态,其中,隐藏状态使得目标对象无法感知到虚拟对象;展示目标对象在虚拟场景中搜索虚拟对象的画面;当目标对象在目标时长内未搜索到处于隐藏状态的虚拟对象时,控制目标对象转化为虚拟自然元素。需要说明的是,由于虚拟对象是从交互状态转变为隐藏状态的,因此,目标对象会向虚拟对象最后出现的位置进行移动并搜索,则目标对象在虚拟场景中搜索虚拟对象的画面的过程可以是,展示目标对象向虚拟对象转变状态时的位置移动、以及目标对象在虚拟场景中搜索虚拟对象的画面。
示例性地,如果目标对象在与虚拟对象的交互过程中突然无法感知到虚拟对象(例如虚拟对象躲到墙后,目标对象失去虚拟对象的视线),目标对象会以警戒状态的移动速度,朝虚拟对象最后消失的地方去搜寻过去,然后在预设的目标时长之后,还不能搜索到该虚拟对象或发现新的虚拟对象,就会主动转化为虚拟自然元素的形态。
应用上述实施例,在控制目标对象执行针对虚拟对象的交互操作或者控制虚拟对象与目标对象进行交互之后,虚拟对象还可以基于虚拟场景中的虚拟建筑进行隐藏,从而使得目标对象无法感知到自身,进而使目标对象再转化为虚拟自然元素,如此,增加了虚拟场景中交互方式的多样性,提高了用户的交互体验,从而提高了人机交互效率以及电子设备的硬件资源利用率。
在一些实施例中,在控制虚拟对象与目标对象进行交互之后,虚拟对象同样也可以脱离目标对象的感知区域,从而使得目标对象无法感知到自身,进而再转化为虚拟自然元素,响
应于针对虚拟对象的区域离开操作,控制虚拟对象离开感应区域;当虚拟对象离开感应区域的时长达到目标时长时,控制目标对象转化为虚拟自然元素。
在实际实施时,控制目标对象转化为虚拟自然元素为一个随时间慢慢转化的过程,将目标对象从第一端到第二端随时间逐渐变透明,并将虚拟自然元素从第二端到第一端随时间逐渐显示,其中,显示与变透明的速率与时间均成正相关关系。示例性地,参见图14,图14是本申请实施例提供的目标对象转化为虚拟自然元素的过程示意图,基于图14,当目标对象转化为虚拟自然元素时,目标对象的模型材质随时间的增加而变透明,这里,材质透明度的渐变可以配合特效由下到上或由上往下的效果,即通过从下方到上方或者从上方到下方逐步将目标对象透明化,同时将虚拟自然元素相应地从上往下或者从下往上从透明状态逐步显示出来。需要说明的是,对于从虚拟自然元素转化为目标对象的过程,同样存在控制目标对象的模型材质从隐藏到显示的过程,与上述过程为互逆过程,即从下方到上方或者从上方到下方逐步将虚拟自然元素透明化,同时将目标对象相应地从上往下或者从下往上从透明状态逐步显示出来。
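上述透明度随转化进度渐变的过程可以用如下示意函数表示(具体的渐变曲线与遮罩方向由特效配置决定,此处仅为线性插值的草图):

```python
def transition_alpha(progress):
    """目标对象与虚拟自然元素互相转化时的透明度渐变。

    progress 为 0 到 1 之间的转化进度;目标对象随进度逐渐变透明,
    虚拟自然元素随进度逐渐显示,可配合由下到上(或由上到下)的遮罩特效使用。
    """
    progress = max(0.0, min(1.0, progress))
    target_object_alpha = 1.0 - progress
    natural_element_alpha = progress
    return target_object_alpha, natural_element_alpha
```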
需要说明的是,在目标对象转化为虚拟自然元素之后,一旦虚拟自然元素周围有虚拟对象重新进入感知范围,则会重新将虚拟自然元素转化为目标对象,同时也可以生成第一对象,这里,出现的目标对象以及第一对象的类型和数量也会被刷新。这里,将目标对象转化为虚拟自然元素的过程存在一个目标时长的保护,从目标对象转化时刻开始计时,在这个目标时间内,会关闭目标对象的感知,即便有虚拟对象重新接近目标对象或者虚拟自然元素,也不会立刻打断目标对象到虚拟自然元素的转化过程,同时也不会让刚刚转化的虚拟自然元素重新转化为目标对象。如此,避免当虚拟对象在感知区域边缘反复横跳的极端情况出现时,出现虚拟自然元素与目标对象反复切换的非正常表现。
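上述保护时长内关闭感知的逻辑可以用如下草图示意(保护时长取 5 秒仅为假设):

```python
import time

class TransformGuard:
    """目标对象转化为虚拟自然元素时的保护时长:在目标时长内关闭感知,避免反复切换。"""

    def __init__(self, protect_seconds=5.0):
        self.protect_seconds = protect_seconds  # 保护时长,假设为 5 秒
        self.started_at = time.monotonic()

    def perception_enabled(self):
        # 保护时长内即便有虚拟对象重新接近,也不打断转化,也不重新转化为目标对象
        return time.monotonic() - self.started_at >= self.protect_seconds
```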
在一些实施例中,当虚拟对象与目标对象进行交互后,虚拟对象还可击杀目标对象,当虚拟对象击杀目标对象时,在虚拟场景中展示用作奖励的虚拟资源;其中,虚拟资源,用于运用于虚拟场景。示例性地,虚拟资源可以是用于降低虚拟对象负面影响值的补给、或者是用于执行针对目标对象的交互操作的道具、又或者是提高虚拟对象等级的经验值等。
在实际应用中,当虚拟对象击杀目标对象时,在虚拟场景中展示用作奖励的虚拟资源,如此,增加了虚拟场景中交互过程的多样性,提高用户击杀目标对象的积极性。
在实际实施时,当虚拟对象击杀目标对象时,以目标对象所处位置为中心,向周围以预设速率消除虚拟自然元素在负面影响区域内所造成的负面影响。示例性地,当虚拟自然元素对应的虚拟自然现象为虚拟龙卷风时,在虚拟对象击杀目标对象之后,以目标对象所处位置为中心,以水波纹或者光线的形式,从目标对象的脚下由近及远地增强负面影响区域内的光照强度、并修复虚拟龙卷风对负面影响区域内的虚拟物体所造成的破坏。
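上述由近及远消除负面影响的过程可以用如下示意函数表示(消除速率与最大半径均为假设值):

```python
def purified_radius(elapsed_seconds, spread_rate=3.0, max_radius=7.0):
    """虚拟对象击杀目标对象后,以其位置为中心、按预设速率向外消除负面影响。

    返回当前已被净化的半径,可用于驱动由近及远的水波纹或光线效果。
    """
    return min(elapsed_seconds * spread_rate, max_radius)
```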
应用上述实施例,当虚拟对象击杀目标对象时,以目标对象所处位置为中心,向周围以预设速率消除虚拟自然元素在负面影响区域内所造成的负面影响,如此,增加了虚拟场景中交互过程的多样性,提高了用户的沉浸感和交互体验。
在一些实施例中,虚拟自然元素同样具有进阶属性,而进阶后的虚拟自然元素与未进阶的虚拟自然元素所转化的可交互目标对象的状态属性存在差异,当虚拟自然元素的进阶条件得到满足时,显示对应虚拟自然元素的进阶提示信息;其中,进阶提示信息,用于提示虚拟自然元素已实现进阶,进阶后的虚拟自然元素转化得到的进阶对象满足以下条件至少之一:生命值高于目标对象的生命值;执行交互操作对虚拟对象所造成的伤害,高于目标对象执行交互操作对虚拟对象所造成的伤害。示例性地,参见图15,图15是本申请实施例提供的虚拟自然元素的进阶提示信息的示意图,基于图15,当虚拟自然元素的进阶条件得到满足时,显示如框1501中所示的进阶提示信息。
需要说明的是,虚拟自然元素的进阶条件包括但不限于虚拟对象处于虚拟场景中的时长或者虚拟自然元素的数量,当虚拟对象处于虚拟场景中的时长满足目标时长时,确定虚拟自
然元素的进阶条件得到满足;又或者,由于虚拟对象在击杀目标对象后,导致相应虚拟自然元素数量的减少,当虚拟自然元素的数量少于目标数量时,虚拟自然元素的进阶条件得到满足。这里,第一对象也可以随着虚拟自然元素的进阶而进阶,同时进阶后的第一对象与上述进阶后的目标对象存在相同特性。
需要说明的是,虚拟对象处于虚拟场景中的时长越长或者虚拟自然元素的数量越少,虚拟自然元素的进阶速度可以越快。这里,可以预先设定虚拟自然元素的进阶次数,以及每次进阶后,目标对象的生命值以及执行交互操作对虚拟对象所造成的伤害的加成倍率。需要说明的是,当虚拟自然元素的进阶条件得到满足时,已经转化的目标对象以及第一对象不会进阶,也即已经转化的目标对象以及第一对象的生命值以及执行交互操作对虚拟对象所造成的伤害不会改变。
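上述进阶加成可以用如下示意函数表示(加成倍率为假设值,实际可按每次进阶分别配置):

```python
def advanced_stats(base_hp, base_damage, advance_times, hp_rate=1.2, damage_rate=1.15):
    """虚拟自然元素每次进阶后,其转化得到的进阶对象按预设倍率提升生命值与伤害。"""
    return base_hp * (hp_rate ** advance_times), base_damage * (damage_rate ** advance_times)
```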
在一些实施例中,在虚拟对象向虚拟自然元素的行进路程中,还可以呈现可交互的第二对象,响应于针对虚拟对象的移动指令,控制虚拟对象向虚拟自然元素移动、并在虚拟场景中展示可交互的第二对象。这里,第二对象为与虚拟自然元素无关的虚拟场景中的可交互对象,同时,该第二对象同样会追击虚拟对象,且随着时长的增加,第二对象的生命值与执行交互操作对虚拟对象所造成的伤害同样会增加。
在实际应用中,在虚拟对象向虚拟自然元素的行进路程中,呈现可交互的第二对象,如此,增加了虚拟场景中交互对象以及交互方式的多样性,提高了用户的交互体验,从而提高了人机交互效率以及电子设备的硬件资源利用率。
在实际实施时,当虚拟对象未进入负面影响区域时,展示第二对象在虚拟场景中搜索虚拟对象的画面;当第二对象搜索到虚拟对象、且执行了针对虚拟对象的交互操作时,响应于针对虚拟对象的交互指令,控制虚拟对象与第二对象在虚拟场景中进行交互;而当虚拟对象进入负面影响区域时,对虚拟对象进行标记,并展示携带标记的虚拟对象;其中,标记用于使得第二对象无法搜索到虚拟对象,从而保证进入负面影响区域的虚拟对象不受第二对象的干扰。
应用上述实施例,通过对虚拟对象进行标记,使得第二对象无法搜索到虚拟对象,从而保证进入负面影响区域的虚拟对象不受第二对象的干扰,如此,提高了虚拟场景中交互过程的多样性以及用户的交互体验。
需要说明的是,当虚拟对象击杀了第二对象时,同样可以在虚拟场景中展示用作奖励的能够运用于虚拟场景的虚拟资源。示例性地,虚拟资源可以是用于降低虚拟对象负面影响值的补给、或者是用于执行针对目标对象的交互操作的道具、又或者是提高虚拟对象等级的经验值等。
可以理解的是,在本申请实施例中,涉及到用户的触发操作、用户信息等相关的数据,当本申请实施例运用到实际产品或技术中时,需要获得用户许可或者同意,且相关数据的收集、使用和处理需要遵守相关国家和地区的相关法律法规和标准。
应用本申请上述实施例,在虚拟场景中展示虚拟对象以及对虚拟场景中虚拟环境造成负面影响的虚拟自然元素,当虚拟对象进入虚拟自然元素的负面影响区域时,将虚拟自然元素转化为可供虚拟对象进行交互的目标对象,从而使得虚拟对象与目标对象进行交互。如此,在虚拟对象靠近虚拟自然元素时,将虚拟自然元素转化为可交互对象以实现交互过程,丰富了自由探索期间的交互对象,减少了探索时间,提高了虚拟场景中交互对象的多样性的同时,也提高了人机交互效率以及电子设备的硬件资源利用率。
下面,将说明本申请实施例在一个实际的应用场景中的示例性应用。
相关技术中,玩家在开放游戏世界中,缺乏找到各种随机任务的方向引导,容易导致重复无意义的跑图,甚至于任务目标擦肩而过;同时,因为缺乏足够的填充内容,当玩家迅速找到局内目标并击杀时,游戏时长也会过早结束,使游戏内容消耗快于预期;而且,实际的局内任务类型也一般以对话,收集等较为单调的体验为主,缺乏一定的包装以及开放世界代
入感的氛围营造。
基于此,本申请实施例提供一种虚拟场景中的交互方法,通过在地图中创建若干个玩家(虚拟对象)可选的子事件即暗潮污染源(虚拟自然元素),引导玩家在大地图中当感到目标迷茫时,也可以挑战距离自己最近的暗潮污染源事件,暗潮污染源在玩家靠近时会转化成怪物(目标对象)与玩家战斗,玩家只要挑战成功就可以获得丰厚奖励,这样在丰富局内体验的同时避免玩家流失。
在实际实施时,本申请实施例提供的虚拟场景中的交互方法将着重优化下列体验:
1、在开放世界中随机生成一定数量的暗潮污染源;
2、系统会通过小雷达(地图),任何时刻都能引导玩家找到距离最近的污染源;
3、暗潮污染源在默认时刻是一个像龙卷风一样的危险区域,当有玩家靠近时,会转化为配置好的指定类型怪物,与玩家战斗;
4、玩家只要战胜指定类型的关键怪,就可以获得胜利,净化污染源并取得丰厚奖励;
5、当玩家选择脱离与关键怪的战斗时,关键怪会在当前位置回到暗潮污染源状态;
6、随着单局时间推移,全图的污染源会缓慢升级,提升强度。
接下来,从产品侧说明本申请实施例提供的虚拟场景中的交互方法。
首先,进行基础流程简介:
1、全关卡地图中,存在着多个不同强度(离散档位)的暗潮污染源;
2、暗潮污染源会向周围一定范围(负面影响区域)内的玩家辐射暗潮值(第一负面影响值),通过曲线可以定义在与污染源不同距离位置所受到的暗潮值倍率;
3、当有玩家靠近污染源一定范围(感应区域),污染源就会实体化为一只关键怪(目标对象),会有较大范围感知,追击玩家;
4、当转化成关键怪状态,污染源的中心会随关键怪移动,保证位置一致;
5、玩家每通过击杀关键怪清理一个污染源,就会净化该污染源影响的一片区域,并且获得污染源对应的奖励(虚拟资源),该奖励在关键怪死亡时掉落;
6、暗潮浓度被清理时,会有一个由近及远的净化效果;
7、同一个暗潮污染源,可以通过曲线定义随着时间推移,暗潮污染源的强度变化,并改变辐射给周围玩家的暗潮值;
8、关卡中也有一个巨大的全局暗潮污染源(全局影响源),会向全图玩家辐射暗潮值,且通过曲线随单局时间而变化,这个暗潮污染源没有中心,没有实体及预警等;
9、单局中,会为每个玩家维护一个暗潮值(目标负面影响值),这个暗潮值可以是由当前玩家所处位置受到的各个污染源辐射值相加的结果(如单个污染源加全局污染源);
10、每个玩家的暗潮值是有上限的(如100),从而避免当多个污染源重叠时效果溢出;
11、玩家的实时暗潮值,代表了玩家当前所处位置的暗潮浓度强弱,并且反映到客户端的暗区效果(虚拟自然现象),从而作为世界观氛围渲染,目的是提升大世界游戏的代入感;
12、玩家的暗潮值会实时显示在右上角雷达下方,如图7所示;
13、暗潮污染源会向周围一定范围内的所有玩家发送一个标记,保证进入范围的玩家周围不会受不相干怪物(第二对象)的干扰;
14、处于非污染源辐射区域(非负面影响区域)的玩家依然会受动态刷怪系统的影响,随着单局时间推移,会生成一定数量的怪物(第二对象)与玩家战斗。
其次,对暗潮污染源的交互玩法进行说明:
1、关卡策划可以在关卡中预先摆放若干个暗潮污染源的生成点(目标位置);
2、关卡初始化时,会在预设点(备用位置)中随机选取[a,b]个(目标数量)位置(a≤b),作为暗潮污染源的生成点,策划在配置时要保证所有预设点数量n≥b,这里,假如出现n<a的情况,则以n个位置全部激活的逻辑来执行相关操作;
3、各个不同关卡,可以支持不同的[a,b]区间配置;
4、后续迭代会对预设点的选取追加规则,考虑要素包括且不限于每个预设点与角色(虚拟对象)出生点的距离,以及每个预设点与关卡目标的距离等;
5、随着单局时间推进,每过一定间隔t秒(目标时长),全局的暗潮污染源都会触发一次进化,进化会导致下一次触发生成时,所有怪物获得一定倍率的血量(生命值)和攻击力(执行交互操作对虚拟对象所造成的伤害)加成;
6、可以预先配置暗潮污染源总共可以进化多少次,以及每次进化后对应的血量和攻击力加成倍率;
7、如果触发进化时,暗潮污染源已经生成了一批怪物(目标对象和/或第一对象),那么这批怪物不受进化影响,递延到下一波生成时才会受影响;
8、每次触发全局进化,需要给所有玩家界面提示,如图10所示;
9、暗潮污染源在一般状态下只是表现层面的一团龙卷风特效,没有碰撞;
10、当有玩家接近污染源R米(感应区域),污染源会在自身一定单位内生成一个指定类型与数量的怪物群(第一对象),并且原有的中心龙卷风特效会转化为一个关键怪(目标对象);
11、击杀关键怪就可以导致污染源爆炸,净化附近区域并且获得掉落奖励(这里只定义唯一一只关键怪,防止多只情况下,玩家打死了一部分关键怪,然后触发脱战逻辑,这些关键怪再次复活的话会比较难处理);
12、按照前文提及,当关键怪生成后,污染源的位置会时刻保证跟关键怪一致;
13、怪物群中其他怪(第一对象)只是干扰作用,玩家可以选择击杀或不击杀,不会影响对污染源的清理;
14、污染源的所有生成怪(目标对象和第一对象),一旦触发脱战逻辑,会进入空闲状态,在空闲状态中如果目标范围内没有任何玩家,关键怪则会回到污染源状态(非实体),且污染源以当前位置为中心;
15、在回到污染源状态后,一旦污染源周围有其他玩家重新进入R米范围,会重新触发生成,生成出来的所有指定类型和数量的怪也会被刷新;
16、可以认为污染源中心的龙卷风(非实体)状态与关键怪状态是相互切换的关系;
17、玩家只要击杀关键怪,就可以净化暗潮污染源,获得对应的丰厚奖励;
18、干扰怪(第一对象和/或第二对象),玩家可以自由选择击杀与否,不影响胜负判断。
接下来从技术侧说明本申请实施例提供的虚拟场景中的交互方法。首先,参见图16,图16是本申请实施例提供的污染源与关键怪切换逻辑的流程图,基于图16,本申请实施例提供的虚拟场景中的交互方法由步骤1601至步骤1606来实现。当污染源龙卷风与关键怪互相切换时:
1、需要时刻检测以当前位置为中心的R半径内没有任何玩家;
2、关键怪必须处于非战斗状态即目标丢失或死亡,且周围没有其他可以作为战斗对象的目标;
3、如果关键怪在战斗中突然丢失目标(虚拟对象)视线(例如目标躲到墙后面),关键怪会以警戒状态的动画和移动速度,朝目标最后消失的地方去搜寻,这个状态会持续预设的t秒钟,t秒之后如果还不能搜索到旧的或发现新的目标,就会回到正常的空闲状态,这时会主动转化为暗潮污染源形态;
4、切换到龙卷风状态时,关键怪的模型材质需要有一个随时间慢慢隐藏的过程,通过一个函数曲线控制,配合龙卷风在周围重新生成,如图14所示;
5、材质透明度的渐变,配合特效由下到上的效果,通过遮罩从下方到上方逐步地透明化或从透明状态逐步显示;
6、龙卷风到关键怪的切换,同样有一个函数控制怪模型材质从隐藏到显示的曲线;
7、脱战后关键怪切换成污染源的过程,会有一个T时长的保护,从怪切换时刻开始计时,在这个T时间内,会关闭怪的感知,即便有玩家重新接近怪或者污染源,也不会立刻打断怪到污染源的转化过程,也不会让刚刚转化为污染源的暗潮中心重新变成怪,如此可以避免玩家在暗潮边缘反复横跳的极端情况出现时,暗潮与怪反复切换的非正常表现。
应用本申请上述实施例,在虚拟场景中展示虚拟对象以及对虚拟场景中虚拟环境造成负面影响的虚拟自然元素,当虚拟对象进入虚拟自然元素的负面影响区域时,将虚拟自然元素转化为可供虚拟对象进行交互的目标对象,从而使得虚拟对象与目标对象进行交互。如此,在虚拟对象靠近虚拟自然元素时,将虚拟自然元素转化为可交互对象以实现交互过程,丰富了自由探索期间的交互对象,减少了探索时间,也即减少了用于实现交互过程所执行的人机交互操作的时间,从而提高了人机交互效率以及电子设备的硬件资源利用率。
下面继续说明本申请实施例提供的虚拟场景中的交互装置455的实施为软件模块的示例性结构,在一些实施例中,如图2所示,存储在存储器440的虚拟场景中的交互装置455中的软件模块可以包括:
展示模块4551,配置为在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;其中,所述虚拟自然元素,用于对所述虚拟自然元素所处的环境造成负面影响;
第一控制模块4552,配置为当所述虚拟对象处于所述虚拟自然元素的感应区域内时,控制所述虚拟自然元素转化为可交互的目标对象;其中,所述虚拟自然元素的负面影响区域包括所述感应区域;
第二控制模块4553,配置为当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
在一些实施例中,所述装置还包括第一显示模块,所述第一显示模块,配置为显示所述虚拟场景的地图,并在所述地图中动态显示所述虚拟对象与所述虚拟自然元素的相对位置关系。
在一些实施例中,所述第一显示模块,还配置为在所述地图中动态显示所述虚拟对象与所述虚拟自然元素间的行进路径,并在所述行进路径的一端标识所述虚拟对象,在所述行进路径的另一端标识所述虚拟自然元素;其中,所述行进路径,用于引导所述虚拟对象沿所述行进路径移动至所述虚拟自然元素的感应区域内。
在一些实施例中,所述装置还包括目标虚拟自然元素选取模块,所述目标虚拟自然元素选取模块,配置为当所述虚拟自然元素的数量为多个时,获取所述虚拟对象与各所述虚拟自然元素在所述虚拟场景中的距离;选取与所述虚拟对象的距离最小的虚拟自然元素为目标虚拟自然元素;所述第一显示模块,还配置为在所述地图中动态显示所述虚拟对象与所述目标虚拟自然元素的相对位置关系。
在一些实施例中,所述装置还包括第二显示模块,所述第二显示模块,配置为当所述虚拟对象处于虚拟自然元素的负面影响区域内时,显示所述虚拟自然元素对所述虚拟对象所造成的第一负面影响值;其中,所述第一负面影响值,用于指示所述虚拟自然元素对所述虚拟对象在所述负面影响区域内的移动所造成的阻碍程度。
在一些实施例中,所述装置还包括重叠模块,所述重叠模块,配置为当所述虚拟自然元素的数量为多个、且多个所述虚拟自然元素中至少两个虚拟自然元素的负面影响区域存在区域重叠时,确定相应的重叠区域;当所述虚拟对象处于所述重叠区域时,针对组成所述重叠区域的所述至少两个虚拟自然元素,将各所述虚拟自然元素所造成的第一负面影响值进行叠加,得到针对所述虚拟对象的目标第一负面影响值;所述第二显示模块,还配置为显示所述至少两个虚拟自然元素对所述虚拟对象所造成的目标第一负面影响值。
在一些实施例中,所述装置还包括第三显示模块,所述第三显示模块,配置为获取所述虚拟对象在所述虚拟场景中的时长;动态显示所述虚拟对象的第二负面影响值,所述第二负面影响值与所述时长呈正相关关系;其中,所述第二负面影响值,用于指示对所述虚拟对象在所述虚拟场景内的移动所造成的阻碍程度。
在一些实施例中,所述第二负面影响值,为所述虚拟场景的全局影响源对所述虚拟对象所造成的负面影响的值,且所述全局影响源具有隐藏属性及进阶属性,所述进阶属性使得所述全局影响源具有包括第一阶段及第二阶段的至少两个阶段,所述第一阶段对应所述第二负面影响值与所述时长的第一相关系数,所述第二阶段对应所述第二负面影响值与所述时长的第二相关系数,且所述第二相关系数的值大于所述第一相关系数的值;所述第三显示模块,还配置为获取全局影响源对所述虚拟对象所造成的负面影响的初始值,并将所述初始值与所述第一相关系数的值进行相乘,得到第二负面影响值;动态显示所述第二负面影响值;当确定所述全局影响源由所述第一阶段进阶至所述第二阶段时,将所述初始值与所述第二相关系数的值进行相乘,得到目标负面影响值;调整显示的所述第二负面影响值至目标负面影响值。
在一些实施例中,所述虚拟自然元素具有进阶属性,所述装置还包括进阶模块,所述进阶模块,配置为当所述虚拟自然元素的进阶条件得到满足时,显示对应所述虚拟自然元素的进阶提示信息;其中,所述进阶提示信息,用于提示所述虚拟自然元素已实现进阶,进阶后的所述虚拟自然元素转化得到的进阶对象满足以下条件至少之一:生命值高于所述目标对象的生命值;执行交互操作对所述虚拟对象所造成的伤害,高于所述目标对象执行所述交互操作对所述虚拟对象所造成的伤害。
在一些实施例中,所述装置还包括增强模块,所述增强模块,配置为当所述虚拟场景中有新虚拟自然元素生成、且生成的新虚拟自然元素的负面影响区域与所述虚拟自然元素的负
面影响区域存在重叠区域时,展示所述新虚拟自然元素,并增强所述重叠区域内的虚拟自然现象的效果。
在一些实施例中,所述装置还包括生成模块,所述生成模块,配置为当所述虚拟对象处于所述虚拟自然元素的感应区域内时,生成至少一个处于空闲状态的可交互的第一对象;响应于针对所述虚拟对象的交互指令,控制所述虚拟对象与所述第一对象在所述虚拟场景中进行交互,所述交互用于将所述第一对象从所述空闲状态转变为交互状态。
在一些实施例中,所述生成模块,还配置为当所述虚拟对象处于所述虚拟自然元素的感应区域内时,获取所述虚拟对象的等级,并基于所述等级,确定与所述等级对应的对象类型;基于所述对象类型,生成至少一个处于空闲状态的可交互的第一对象。
在一些实施例中,所述第二控制模块4552,还配置为展示所述目标对象在所述虚拟场景中搜索所述虚拟对象的搜索画面;在所述目标对象搜索到所述虚拟对象、且执行了针对所述虚拟对象的交互操作的情况下,当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
在一些实施例中,所述装置还包括第三控制模块,所述第三控制模块,配置为当在目标时长内所述目标对象未在所述感应区域内搜索到所述虚拟对象时,控制所述目标对象转化为所述虚拟自然元素。
在一些实施例中,所述装置还包括状态转变模块,所述状态转变模块,配置为响应于针对所述虚拟对象的隐藏指令,控制所述虚拟对象从交互状态转变为隐藏状态,所述隐藏状态使得所述目标对象无法感知到所述虚拟对象;展示所述目标对象在所述虚拟场景中搜索所述虚拟对象的画面;当所述目标对象在目标时长内未搜索到处于隐藏状态的所述虚拟对象时,控制所述目标对象转化为所述虚拟自然元素。
在一些实施例中,所述装置还包括第四控制模块,所述第四控制模块,配置为响应于针对所述虚拟对象的区域离开操作,控制所述虚拟对象离开所述感应区域;当所述虚拟对象离开所述感应区域的时长达到目标时长时,控制所述目标对象转化为所述虚拟自然元素。
在一些实施例中,所述装置还包括击杀模块,所述击杀模块,配置为当所述虚拟对象击杀所述目标对象时,在所述虚拟场景中展示用作奖励的虚拟资源;其中,所述虚拟资源,用于运用于所述虚拟场景。
在一些实施例中,所述装置还包括消除模块,所述消除模块,还配置为当所述虚拟对象击杀所述目标对象时,以所述目标对象所处位置为中心,向周围以预设速率消除所述虚拟自然元素在所述负面影响区域内所造成的所述负面影响。
在一些实施例中,所述装置还包括选择模块,所述选择模块,配置为显示至少两个难度等级所分别对应的等级选项,不同难度等级所对应的所述虚拟自然元素的数量不同,所述至少两个难度等级包括目标难度等级,所述目标难度等级对应目标数量的所述虚拟自然元素;响应于针对目标难度等级的等级选项的选择操作,在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的目标数量的所述虚拟自然元素。
在一些实施例中,当所述虚拟自然元素的数量是多个时,多个所述虚拟自然元素至少归属于两种虚拟自然现象,所述展示模块4551,还配置为在所述虚拟场景中,展示归属于至少两个虚拟自然现象的多个虚拟自然元素;其中,不同所述虚拟自然现象的所述虚拟自然元素对所处环境造成的负面影响不同。
在一些实施例中,所述装置还包括移动模块,所述移动模块,配置为响应于针对所述虚拟对象的移动指令,控制所述虚拟对象向所述虚拟自然元素移动、并在所述虚拟场景中展示可交互的第二对象;当所述虚拟对象未进入所述负面影响区域时,展示所述第二对象在所述虚拟场景中搜索所述虚拟对象的画面;当所述第二对象搜索到所述虚拟对象、且执行了针对所述虚拟对象的交互操作时,响应于针对所述虚拟对象的交互指令,控制所述虚拟对象与所述第二对象在所述虚拟场景中进行交互。
在一些实施例中,所述装置还包括标记模块,所述标记模块,配置为当所述虚拟对象进入所述负面影响区域时,对所述虚拟对象进行标记,并展示携带标记的所述虚拟对象;其中,所述标记用于使得所述第二对象无法搜索到所述虚拟对象。
在一些实施例中,所述展示模块4551,还配置为当所述虚拟自然现象为虚拟龙卷风时,在虚拟场景中,展示归属于虚拟龙卷风的虚拟自然元素;其中,所述虚拟自然元素的负面影响区域内的光照强度低于所述虚拟场景中的非负面影响区域、且对所述负面影响区域内的虚拟物体有破坏性。
本申请实施例还提供一种电子设备,所述电子设备包括:
存储器,配置为存储计算机可执行指令;
处理器,配置为执行所述存储器中存储的计算机可执行指令时,实现本申请实施例提供的虚拟场景中的交互方法。
本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机可执行指令,该计算机可执行指令存储在计算机可读存储介质中。电子设备的处理器从计算机可读存储介质读取该计算机可执行指令,处理器执行该计算机可执行指令,使得该电子设备执行本申请实施例上述的虚拟场景中的交互方法。
本申请实施例提供一种计算机可读存储介质,其中存储有计算机可执行指令,当计算机可执行指令被处理器执行时,将引起处理器执行本申请实施例提供的虚拟场景中的交互方法,例如,如图3示出的虚拟场景中的交互方法。
在一些实施例中,计算机可读存储介质可以是FRAM、ROM、PROM、EPROM、EEPROM、闪存、磁表面存储器、光盘、或CD-ROM等存储器;也可以是包括上述存储器之一或任意组合的各种设备。
在一些实施例中,计算机可执行指令可以采用程序、软件、软件模块、脚本或代码的形式,按任意形式的编程语言(包括编译或解释语言,或者声明性或过程性语言)来编写,并且其可按任意形式部署,包括被部署为独立的程序或者被部署为模块、组件、子例程或者适合在计算环境中使用的其它单元。
作为示例,计算机可执行指令可以但不一定对应于文件系统中的文件,可以被存储在保存其它程序或数据的文件的一部分中,例如,存储在超文本标记语言(HTML,Hyper Text Markup Language)文档中的一个或多个脚本中,存储在专用于所讨论的程序的单个文件中,或者,存储在多个协同文件(例如,存储一个或多个模块、子程序或代码部分的文件)中。
作为示例,计算机可执行指令可被部署为在一个电子设备上执行,或者在位于一个地点的多个电子设备上执行,又或者,在分布在多个地点且通过通信网络互连的多个电子设备上执行。
综上所述,通过本申请实施例具有以下有益效果:
(1)在虚拟对象靠近虚拟自然元素时,将虚拟自然元素转化为可交互对象以实现交互过程,丰富了自由探索期间的交互对象,减少了探索时间,也即减少了用于实现交互过程所执行的人机交互操作的时间,从而提高了人机交互效率以及电子设备的硬件资源利用率。
(2)从目标对象转化时刻开始计时,在这个目标时间内,会关闭目标对象的感知,即便有虚拟对象重新接近目标对象或者虚拟自然元素,也不会立刻打断目标对象到虚拟自然元素的转化过程,同时也不会让刚刚转化的虚拟自然元素重新转化为目标对象。如此,避免当虚拟对象在感知区域边缘反复横跳的极端情况出现时,出现虚拟自然元素与目标对象反复切换的非正常表现。
以上所述,仅为本申请的实施例而已,并非用于限定本申请的保护范围。凡在本申请的精神和范围之内所作的任何修改、等同替换和改进等,均包含在本申请的保护范围之内。
Claims (25)
- 一种虚拟场景中的交互方法,所述方法由电子设备执行,所述方法包括:在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;其中,所述虚拟自然元素,用于对所述虚拟自然元素所处的环境造成负面影响;当所述虚拟对象处于所述虚拟自然元素的感应区域内时,控制所述虚拟自然元素转化为可交互的目标对象;其中,所述虚拟自然元素的负面影响区域包括所述感应区域;当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
- 如权利要求1所述的方法,其中,所述当所述虚拟对象处于所述虚拟自然元素的感应区域内时,控制所述虚拟自然元素转化为可交互的目标对象之前,所述方法还包括:显示所述虚拟场景的地图,并在所述地图中动态显示所述虚拟对象与所述虚拟自然元素的相对位置关系。
- 如权利要求2所述的方法,其中,所述在所述地图中动态显示所述虚拟对象与所述虚拟自然元素的相对位置关系,包括:在所述地图中动态显示所述虚拟对象与所述虚拟自然元素间的行进路径,并在所述行进路径的一端标识所述虚拟对象,在所述行进路径的另一端标识所述虚拟自然元素;其中,所述行进路径,用于引导所述虚拟对象沿所述行进路径移动至所述虚拟自然元素的感应区域内。
- 如权利要求2所述的方法,其中,所述显示所述虚拟场景的地图之前,所述方法还包括:当所述虚拟自然元素的数量为多个时,获取所述虚拟对象与各所述虚拟自然元素在所述虚拟场景中的距离;选取与所述虚拟对象的距离最小的虚拟自然元素为目标虚拟自然元素;所述在所述地图中动态显示所述虚拟对象与所述虚拟自然元素的相对位置关系,包括:在所述地图中动态显示所述虚拟对象与所述目标虚拟自然元素的相对位置关系。
- 如权利要求1至4中任一项所述的方法,其中,所述在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素之后,所述方法还包括:当所述虚拟对象处于虚拟自然元素的负面影响区域内时,显示所述虚拟自然元素对所述虚拟对象所造成的第一负面影响值;其中,所述第一负面影响值,用于指示所述虚拟自然元素对所述虚拟对象在所述负面影响区域内的移动所造成的阻碍程度。
- 如权利要求5所述的方法,其中,所述方法还包括:当所述虚拟自然元素的数量为多个、且多个所述虚拟自然元素中至少两个虚拟自然元素的负面影响区域存在区域重叠时,确定相应的重叠区域;当所述虚拟对象处于所述重叠区域时,针对组成所述重叠区域的所述至少两个虚拟自然元素,将各所述虚拟自然元素所造成的第一负面影响值进行叠加,得到针对所述虚拟对象的目标第一负面影响值;所述显示所述虚拟自然元素对所述虚拟对象所造成的第一负面影响值,包括:显示所述至少两个虚拟自然元素对所述虚拟对象所造成的目标第一负面影响值。
- 如权利要求1至6中任一项所述的方法,其中,所述在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素之后,所述方法还包括:获取所述虚拟对象在所述虚拟场景中的时长;动态显示所述虚拟对象的第二负面影响值,所述第二负面影响值与所述时长呈正相关关系;其中,所述第二负面影响值,用于指示对所述虚拟对象在所述虚拟场景内的移动所造成的阻碍程度。
- 如权利要求7所述的方法,其中,所述第二负面影响值,为所述虚拟场景的全局影响源对所述虚拟对象所造成的负面影响的值,且所述全局影响源具有隐藏属性及进阶属性,所述进阶属性使得所述全局影响源具有包括第一阶段及第二阶段的至少两个阶段,所述第一阶段对应所述第二负面影响值与所述时长的第一相关系数,所述第二阶段对应所述第二负面影响值与所述时长的第二相关系数,且所述第二相关系数的值大于所述第一相关系数的值;当确定所述全局影响源处于所述第一阶段时,所述动态显示所述虚拟对象的第二负面影响值,包括:获取全局影响源对所述虚拟对象所造成的负面影响的初始值,并将所述初始值与所述第一相关系数的值进行相乘,得到第二负面影响值;动态显示所述第二负面影响值;所述方法还包括:当确定所述全局影响源由所述第一阶段进阶至所述第二阶段时,将所述初始值与所述第二相关系数的值进行相乘,得到目标负面影响值;调整显示的所述第二负面影响值至目标负面影响值。
- 如权利要求1至8中任一项所述的方法,其中,所述虚拟自然元素具有进阶属性,所述控制所述虚拟自然元素转化为可交互的目标对象之前,所述方法还包括:当所述虚拟自然元素的进阶条件得到满足时,显示对应所述虚拟自然元素的进阶提示信息;其中,所述进阶提示信息,用于提示所述虚拟自然元素已实现进阶,进阶后的所述虚拟自然元素转化得到的进阶对象满足以下条件至少之一:生命值高于所述目标对象的生命值;执行交互操作对所述虚拟对象所造成的伤害,高于所述目标对象执行所述交互操作对所述虚拟对象所造成的伤害。
- 如权利要求1至9中任一项所述的方法,其中,所述展示归属于虚拟自然现象的虚拟自然元素之后,所述方法还包括:当所述虚拟场景中有新虚拟自然元素生成、且生成的新虚拟自然元素的负面影响区域与所述虚拟自然元素的负面影响区域存在重叠区域时,展示所述新虚拟自然元素,并增强所述重叠区域内的虚拟自然现象的效果。
- 如权利要求1至10中任一项所述的方法,其中,所述方法还包括:当所述虚拟对象处于所述虚拟自然元素的感应区域内时,生成至少一个处于空闲状态的可交互的第一对象;响应于针对所述虚拟对象的交互指令,控制所述虚拟对象与所述第一对象在所述虚拟场景中进行交互,所述交互用于将所述第一对象从所述空闲状态转变为交互状态。
- 如权利要求11所述的方法,其中,所述当所述虚拟对象处于所述虚拟自然元素的感应区域内时,生成至少一个处于空闲状态的可交互的第一对象,包括:当所述虚拟对象处于所述虚拟自然元素的感应区域内时,获取所述虚拟对象的等级,并基于所述等级,确定与所述等级对应的对象类型;基于所述对象类型,生成至少一个处于空闲状态的可交互的第一对象。
- 如权利要求1至12中任一项所述的方法,其中,所述当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互,包括:展示所述目标对象在所述虚拟场景中搜索所述虚拟对象的搜索画面;在所述目标对象搜索到所述虚拟对象、且执行了针对所述虚拟对象的交互操作的情况下,当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
- 如权利要求13所述的方法,其中,所述方法还包括:当在目标时长内所述目标对象未在所述感应区域内搜索到所述虚拟对象时,控制所述目标对象转化为所述虚拟自然元素。
- 如权利要求1至14中任一项所述的方法,其中,所述控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互之后,所述方法还包括:响应于针对所述虚拟对象的隐藏指令,控制所述虚拟对象从交互状态转变为隐藏状态,所述隐藏状态使得所述目标对象无法感知到所述虚拟对象;展示所述目标对象在所述虚拟场景中搜索所述虚拟对象的画面;当所述目标对象在目标时长内未搜索到处于隐藏状态的所述虚拟对象时,控制所述目标对象转化为所述虚拟自然元素。
- 如权利要求1至15中任一项所述的方法,其中,所述当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互之后,所述方法还包括:当所述虚拟对象击杀所述目标对象时,在所述虚拟场景中展示用作奖励的虚拟资源;其中,所述虚拟资源,用于运用于所述虚拟场景。
- 如权利要求1至16中任一项所述的方法,其中,所述当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互之后,所述方法还包括:当所述虚拟对象击杀所述目标对象时,以所述目标对象所处位置为中心,向周围以预设速率消除所述虚拟自然元素在所述负面影响区域内所造成的所述负面影响。
- 如权利要求1至17中任一项所述的方法,其中,当所述虚拟自然元素的数量是多个时,多个所述虚拟自然元素至少归属于两种虚拟自然现象,所述在虚拟场景中,展示归属于虚拟自然现象的虚拟自然元素,包括:在所述虚拟场景中,展示归属于至少两个虚拟自然现象的多个虚拟自然元素;其中,不同所述虚拟自然现象的所述虚拟自然元素对所处环境造成的负面影响不同。
- 如权利要求1至18中任一项所述的方法,其中,所述在虚拟场景中,展示归属于虚拟自然现象的虚拟自然元素之后,所述方法还包括:响应于针对所述虚拟对象的移动指令,控制所述虚拟对象向所述虚拟自然元素移动、并在所述虚拟场景中展示可交互的第二对象;当所述虚拟对象未进入所述负面影响区域时,展示所述第二对象在所述虚拟场景中搜索所述虚拟对象的画面;当所述第二对象搜索到所述虚拟对象、且执行了针对所述虚拟对象的交互操作时,响应于针对所述虚拟对象的交互指令,控制所述虚拟对象与所述第二对象在所述虚拟场景中进行交互。
- 如权利要求19所述的方法,其中,所述在所述虚拟场景中展示可交互的第二对象之后,所述方法还包括:当所述虚拟对象进入所述负面影响区域时,对所述虚拟对象进行标记,并展示携带标记的所述虚拟对象;其中,所述标记用于使得所述第二对象无法搜索到所述虚拟对象。
- 如权利要求1至20中任一项所述的方法,其中,所述在虚拟场景中,展示归属于虚拟自然现象的虚拟自然元素,包括:当所述虚拟自然现象为虚拟龙卷风时,在虚拟场景中,展示归属于虚拟龙卷风的虚拟自然元素;其中,所述虚拟自然元素的负面影响区域内的光照强度低于所述虚拟场景中的非负面影响区域、且对所述负面影响区域内的虚拟物体有破坏性。
- 一种虚拟场景中的交互装置,所述装置包括:展示模块,配置为在虚拟场景中,展示虚拟对象、以及归属于虚拟自然现象的虚拟自然元素;其中,所述虚拟自然元素,用于对所述虚拟自然元素所处的环境造成负面影响;第一控制模块,配置为当所述虚拟对象处于所述虚拟自然元素的感应区域内时,控制所述虚拟自然元素转化为可交互的目标对象;其中,所述虚拟自然元素的负面影响区域包括所述感应区域;第二控制模块,配置为当接收到针对所述目标对象的交互指令时,控制所述虚拟对象与所述目标对象在所述虚拟场景中进行交互。
- 一种电子设备,包括:存储器,配置为存储计算机可执行指令;处理器,配置为执行所述存储器中存储的计算机可执行指令时,实现权利要求1至21任一项所述的虚拟场景中的交互方法。
- 一种计算机可读存储介质,存储有计算机可执行指令,所述计算机可执行指令被处理器执行时,实现权利要求1至21任一项所述的虚拟场景中的交互方法。
- 一种计算机程序产品,包括计算机程序或计算机可执行指令,所述计算机程序或计算机可执行指令被处理器执行时,实现权利要求1至21任一项所述的虚拟场景中的交互方法。