WO2022042435A1 - Method, apparatus, device and storage medium for displaying a virtual environment screen - Google Patents
- Publication number
- WO2022042435A1 (PCT/CN2021/113710, CN2021113710W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control
- action
- type
- virtual object
- virtual
- Prior art date
Classifications
- A—HUMAN NECESSITIES; A63—SPORTS; GAMES; AMUSEMENTS; A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR; A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/422—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game, using indicators, e.g. showing the condition of a game character on screen
- A63F13/22—Setup operations, e.g. calibration, key configuration or button assignment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/837—Shooting of targets
Definitions
- the present application relates to the field of human-computer interaction, and in particular, to a method, apparatus, device and storage medium for displaying a virtual environment screen.
- users can control virtual objects to perform actions such as crouching, lying down, shooting, and running.
- UI controls (User Interface controls): multiple UI controls are distributed on the virtual environment screen according to a certain layout, and each UI control is used to control the virtual object to perform an action; for example, UI control 1 controls the virtual object to perform a squatting action, and UI control 2 controls the virtual object to perform another action, such as a shooting action.
- Embodiments of the present application provide a method, apparatus, device, and storage medium for displaying a virtual environment screen, which improve the efficiency of human-computer interaction by changing the layout of UI controls on the virtual environment screen.
- the technical solution is as follows:
- a method for displaying a virtual environment screen, applied to a computer device, the method including:
- displaying a virtual environment screen, where the virtual environment screen displays a first control and a second control, and the first control and the second control belong to different control types;
- receiving a merge setting operation for the first control and the second control; and
- merging the first control and the second control into a third control based on the merge setting operation.
- a device for displaying a virtual environment picture, comprising:
- a display module configured to display a virtual environment picture, where a first control and a second control are displayed on the virtual environment picture, and the first control and the second control belong to different control types;
- a receiving module configured to receive a merge setting operation for the first control and the second control; and
- a processing module configured to merge the first control and the second control into a third control based on the merge setting operation.
- a computer device comprising a processor and a memory, where the memory stores at least one instruction, at least one piece of program, a code set or an instruction set, and the at least one instruction, the at least one piece of program, the code set or the instruction set is loaded and executed by the processor to implement the method for displaying a virtual environment screen described above.
- a computer-readable storage medium storing at least one instruction, at least one piece of program, a code set or an instruction set, where the at least one instruction, the at least one piece of program, the code set or the instruction set is loaded and executed by a processor to implement the method for displaying a virtual environment screen described in the above aspect.
- a computer program product or computer program comprising computer instructions stored in a computer readable storage medium.
- the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the method for displaying a virtual environment screen as described above.
- the different types of controls displayed on the virtual environment screen are merged, so that the user can, through independent settings, merge less commonly used UI controls into the same UI control, or merge UI controls that need to be used in combination.
- merging the UI controls simplifies the layout of the UI controls on the virtual environment screen, thereby simplifying the process by which the user controls the virtual object and improving human-computer interaction efficiency.
- FIG. 1 is a block diagram of a computer system provided by an exemplary embodiment of the present application.
- FIG. 2 is a schematic diagram of a state synchronization technology provided by an exemplary embodiment of the present application.
- FIG. 3 is a schematic diagram of a frame synchronization technology provided by an exemplary embodiment of the present application.
- FIG. 4 is a flowchart of a method for displaying a virtual environment screen provided by an exemplary embodiment of the present application.
- FIG. 5 is a flowchart of a method for displaying a virtual environment screen provided by another exemplary embodiment of the present application.
- FIG. 6 is a schematic diagram of a virtual environment screen before merging controls provided by an exemplary embodiment of the present application.
- FIG. 7 is a schematic diagram of a setting interface corresponding to a merge setting operation provided by an exemplary embodiment of the present application.
- FIG. 8 is a schematic diagram of an updated virtual environment screen provided by an exemplary embodiment of the present application.
- FIG. 9 is a flowchart of a method for displaying a virtual environment screen provided by another exemplary embodiment of the present application.
- FIG. 10 is a flowchart of a method for displaying a virtual environment screen provided by another exemplary embodiment of the present application.
- FIG. 11 is a schematic diagram of a split control provided by an exemplary embodiment of the present application.
- FIG. 12 is a schematic diagram of an updated virtual environment screen provided by another exemplary embodiment of the present application.
- FIG. 13 is a flowchart of a method for displaying a virtual environment screen provided by another exemplary embodiment of the present application.
- FIG. 15 is a flowchart of a method for judging the state of a virtual object provided by an exemplary embodiment of the present application.
- FIG. 16 is a block diagram of a display device for a virtual environment screen provided by an exemplary embodiment of the present application.
- FIG. 17 is a schematic diagram of the structure of a computer device provided by an exemplary embodiment of the present application.
- Virtual environment is the virtual environment displayed (or provided) by the application when it is run on the terminal.
- the virtual environment may be a simulated environment of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment.
- the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
- the following embodiments are exemplified by the virtual environment being a three-dimensional virtual environment.
- Virtual object refers to the movable object in the virtual environment.
- the movable object may be a virtual character, a virtual animal, a cartoon character, or the like, such as a character, an animal, a plant, an oil barrel, a wall, or a stone displayed in a three-dimensional virtual environment.
- the virtual object is a three-dimensional solid model created based on animation skeleton technology.
- Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
- a virtual object generally refers to one or more virtual objects in a virtual environment.
- UI control refers to any visual control or element that can be seen on the user interface of an application, such as a picture, an input box, a text box, a button, or a label. Some UI controls respond to user operations; for example, the user can input text in an input box. The user interacts with the user interface through the above UI controls.
- the method provided in this application can be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooting (FPS) games, multiplayer online battle arena (MOBA) games, battle-royale shooting games, virtual reality (VR) applications, augmented reality (AR) applications, and the like.
- the following embodiments are examples of applications in games.
- a game based on a virtual environment consists of one or more maps of the game world.
- the virtual environment in the game simulates a real-world scene. The user can manipulate a virtual object in the game to walk, run, jump, shoot, fight, and drive, to be attacked by other virtual objects, to be injured in the virtual environment, to attack other virtual objects, to use disruptive throwing props, to rescue teammates in the same team, and so on. The interactivity is high, and multiple users can team up for online competitive play.
- a virtual environment picture corresponding to the game is displayed on the terminal used by the user, and the virtual environment picture is obtained by observing the virtual environment from the perspective of the virtual object controlled by the user.
- a plurality of UI controls are displayed on the virtual environment screen to form a user interface, and each UI control is used to control the virtual object to perform different actions. For example, the user triggers UI control 1 to control the virtual object to run forward.
- FIG. 1 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
- the computer system 100 includes: a first terminal 120 , a server 140 and a second terminal 160 .
- the first terminal 120 has an application program supporting a virtual environment installed and running.
- the first terminal 120 is a terminal used by the first user, and the first user uses the first terminal 120 to control a first virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, walking, running, jumping, riding, aiming, picking up, using throwing props, and attacking other virtual objects.
- the first virtual object is a first virtual character, such as a simulated character object or an anime character object.
- the first terminal 120 is connected to the server 140 through a wireless network or a wired network.
- the server 140 includes at least one of a server, multiple servers, a cloud computing platform and a virtualization center.
- the server 140 includes a processor 144 and a memory 142, and the memory 142 further includes a receiving module 1421, a control module 1422 and a sending module 1423.
- the receiving module 1421 is used to receive a request sent by a client, such as a team-up request; the control module 1422 is used to control the rendering of the virtual environment screen; and the sending module 1423 is used to send a response to the client, such as a prompt message indicating successful team formation.
- the server 140 is used to provide background services for applications supporting a three-dimensional virtual environment.
- the server 140 undertakes the main computing work while the first terminal 120 and the second terminal 160 undertake the secondary computing work; or the server 140 undertakes the secondary computing work while the first terminal 120 and the second terminal 160 undertake the main computing work; or the server 140, the first terminal 120, and the second terminal 160 perform collaborative computing using a distributed computing architecture.
- the server 140 may adopt a synchronization technology to make the display performance of multiple clients consistent.
- the synchronization technology adopted by the server 140 includes: a state synchronization technology or a frame synchronization technology.
- the server 140 uses the state synchronization technology to synchronize among multiple clients. As shown in FIG. 2, the battle logic runs in the server 140. When the state of a virtual object in the virtual environment changes, the server 140 sends the state synchronization result to all clients, such as client 1 to client 10.
- for example, client 1 sends a request to the server 140, the request being used to request that virtual object 1 perform an action of attacking virtual object 2. The server 140 determines whether virtual object 1 can attack virtual object 2 and, when virtual object 1 performs the attack action, determines the remaining health of virtual object 2 after the attack. The server 140 then synchronizes the remaining health of virtual object 2 to all clients, and all clients update their local data and interface representation according to the remaining health of virtual object 2.
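The state-synchronization flow above can be sketched as follows. This is an illustrative reconstruction, not code from the patent; all names (`StateSyncServer`, `Client`, `handle_attack`) are invented for exposition.

```python
class Client:
    def __init__(self, cid):
        self.cid = cid
        self.local_state = {}          # client-side mirror of the server state

    def on_state_sync(self, state):
        # Update local data and interface representation from the server result.
        self.local_state.update(state)

class StateSyncServer:
    def __init__(self, clients):
        self.clients = clients
        self.health = {}               # authoritative health values

    def spawn(self, obj_id, health):
        self.health[obj_id] = health

    def handle_attack(self, attacker, target, damage):
        # Server-side battle logic: validate and resolve the attack.
        if target not in self.health or self.health[target] <= 0:
            return False               # target cannot be attacked
        self.health[target] = max(0, self.health[target] - damage)
        # Broadcast the state change to every connected client.
        for c in self.clients:
            c.on_state_sync({target: self.health[target]})
        return True

clients = [Client(i) for i in range(3)]
server = StateSyncServer(clients)
server.spawn("virtual_object_2", 100)
server.handle_attack("virtual_object_1", "virtual_object_2", 30)
assert all(c.local_state["virtual_object_2"] == 70 for c in clients)
```

Note that the server is authoritative here: clients never compute the health themselves, they only mirror the synchronized result.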
- the server 140 uses frame synchronization technology to synchronize among multiple clients.
- the battle logic runs in each client.
- the client sends a frame synchronization request to the server, and the frame synchronization request carries the local data changes of the client.
- the server 140 forwards the frame synchronization request to all clients.
- after receiving the frame synchronization request, each client processes the request according to its local battle logic and updates its local data and interface representation.
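By contrast, the frame-synchronization flow can be sketched like this (again a hypothetical illustration, not code from the patent): the server only forwards input frames, and every client runs the same deterministic battle logic locally.

```python
class LockstepClient:
    def __init__(self):
        self.position = 0   # simplified local state driven by the battle logic

    def apply_frame(self, frame):
        # Deterministic local battle logic: identical on every client,
        # so identical frame sequences yield identical states.
        self.position += frame["move"]

class FrameSyncServer:
    def __init__(self, clients):
        self.clients = clients

    def on_frame_request(self, frame):
        # The server does not simulate anything; it forwards the frame
        # synchronization request to all clients.
        for c in self.clients:
            c.apply_frame(frame)

clients = [LockstepClient() for _ in range(3)]
server = FrameSyncServer(clients)
server.on_frame_request({"move": 2})
server.on_frame_request({"move": 3})
assert all(c.position == 5 for c in clients)  # all clients stay consistent
```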
- the second terminal 160 has an application program supporting a virtual environment installed and running.
- the second terminal 160 is a terminal used by the second user, and the second user uses the second terminal 160 to control a second virtual object located in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, walking, running, jumping, riding, aiming, picking up, using throwing props, and attacking other virtual objects.
- the second virtual object is a second virtual character, such as a simulated character object or an anime character object.
- the first virtual object and the second virtual object are in the same virtual environment.
- the first virtual object and the second virtual object may belong to the same team, the same organization, or the same camp, have a friend relationship, or have temporary communication rights; alternatively, the first virtual object and the second virtual object may belong to different camps, different teams, or different organizations, or have an adversarial relationship.
- the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms (Android or iOS).
- the first terminal 120 may generally refer to one of multiple terminals
- the second terminal 160 may generally refer to one of multiple terminals.
- This embodiment only takes the first terminal 120 and the second terminal 160 as examples for illustration.
- the device types of the first terminal 120 and the second terminal 160 are the same or different, and the device type includes at least one of: a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer.
- the following embodiments take the terminal including a smart phone as an example for illustration.
- the number of the above-mentioned terminals may be more or less.
- the above-mentioned terminal may be only one, or the above-mentioned terminal may be dozens or hundreds, or more.
- the embodiments of the present application do not limit the number of terminals and device types.
- FIG. 4 shows a flowchart of a method for displaying a virtual environment screen provided by an exemplary embodiment of the present application.
- the method can be applied to a computer device; the description takes as an example that the computer device is implemented as the first terminal 120 or the second terminal 160 shown in FIG. 1, or another terminal in the computer system 100.
- the method includes the following steps:
- Step 401: Display a virtual environment screen together with a first control and a second control, where the first control and the second control belong to different control types.
- the first control is used to control the first virtual object to perform the first action
- the second control is used to control the first virtual object to perform the second action.
- the first action and the second action belong to different types of actions; or, the first action and the second action are different actions.
- the first control is used to control the first virtual object to perform a first performance in the virtual environment; alternatively, the first control is used to control the first virtual object to use a first prop in the virtual environment; alternatively, the first control is used to control the first virtual object to trigger a first skill; alternatively, the first control is used to control the first virtual object to be in a first motion state.
- correspondingly, the second action corresponding to the second control includes performing a second performance, using a second prop, triggering a second skill, being in a second motion state, and the like.
- the control functions of the first control and the second control are not limited in this embodiment.
- the terminal used by the user runs an application that supports the virtual environment, such as a first-person shooting game.
- when the application supporting the virtual environment is run, a virtual environment picture is displayed, and the virtual environment picture is a picture obtained by observing the virtual environment from the perspective of the first virtual object.
- the virtual environment displayed on the virtual environment screen includes: at least one element of mountains, flats, rivers, lakes, oceans, deserts, sky, plants, buildings, and vehicles.
- the control type of the UI control is mainly used to indicate the type of function triggered by the UI control.
- UI controls are displayed on the virtual environment screen.
- the UI controls include at least one of: auxiliary-type UI controls, movement-type UI controls, aiming-type UI controls, and state-switching-type UI controls.
- auxiliary-type UI controls are used to assist the virtual object in performing activities.
- for example, an auxiliary-type UI control is used to control the virtual object to use an auxiliary-type virtual prop to assist in an activity, or to control the virtual object to trigger an auxiliary-type skill to assist in an activity.
- for example, the scope control belongs to the auxiliary type; it is used to control the virtual object to aim at a target with a scope prop during shooting activities.
- movement-type UI controls are used to control the movement of the virtual object; for example, the directional movement control belongs to the movement type.
- aiming-type UI controls are the UI controls corresponding to the virtual object using virtual props; for example, the aiming-type UI control is the UI control corresponding to the virtual object using an attack prop, and the shooting control belongs to the aiming type. When the shooting control is triggered, the virtual object shoots at the target.
- state-switching-type UI controls are used to switch the posture of the virtual object in the virtual environment; for example, the squat control belongs to the state-switching type. When the squat control is triggered, the virtual object switches from a standing state to a squatting state, or from another posture to the squatting state.
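The four control types listed above can be illustrated with a small classification map. The control names here (`scope_control`, `squat_control`, and so on) are hypothetical examples, not identifiers from the patent.

```python
from enum import Enum

class ControlType(Enum):
    AUXILIARY = "auxiliary"        # e.g. the scope control
    MOVEMENT = "movement"          # e.g. the directional movement control
    AIMING = "aiming"              # e.g. the shooting control
    STATE_SWITCH = "state_switch"  # e.g. the squat control

# Hypothetical layout: each control identifier mapped to its control type.
CONTROL_TYPES = {
    "scope_control": ControlType.AUXILIARY,
    "move_control": ControlType.MOVEMENT,
    "shoot_control": ControlType.AIMING,
    "squat_control": ControlType.STATE_SWITCH,
}

def are_different_types(a, b):
    # A merge setting operation in this document targets two controls
    # that belong to different control types.
    return CONTROL_TYPES[a] is not CONTROL_TYPES[b]

assert are_different_types("squat_control", "shoot_control")
```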
- Step 402: Receive a merge setting operation for the first control and the second control.
- in some embodiments, the terminal is a device with a touch display screen, such as a smart phone or a tablet computer.
- the user implements the merge setting operation by triggering a UI control corresponding to the merge setting operation, or by performing a gesture operation corresponding to the merge setting operation on the touch display screen, such as at least one of a single-click operation, a long-press operation, a double-click operation (including a single-finger double-click operation and a multi-finger double-click operation), a hover operation, a drag operation, and combinations thereof.
- the merge setting operation can also be performed through an external input device.
- for example, the terminal is implemented as a notebook computer connected to a mouse; the user moves the mouse pointer to the UI control corresponding to the merge setting operation and performs the merge setting operation by clicking the mouse.
- the user can also perform the merge setting operation by pressing keyboard keys in combination with mouse clicks.
- in some embodiments, a UI control corresponding to the merge setting operation is separately displayed on the virtual environment screen, and this UI control is named the merge UI control; in other embodiments, the virtual environment screen includes a setting page for setting the game application, and the setting page includes the UI control corresponding to the merge setting operation.
- Step 403: In response to the merge setting operation, merge the first control and the second control into a third control.
- the third control is used to control the first virtual object to perform a first action (corresponding to the first control) and a second action (corresponding to the second control).
- the third control is used to control the first virtual object to perform a third action independent of the first action and the second action.
- Merging refers to combining at least two controls into one control; only the merged control is displayed on the virtual environment screen, and the merged control has the functions of the controls before merging.
- the third control is displayed on the virtual environment screen, and the display of the first control and the second control is canceled.
- the third control has both the function corresponding to the first control and the function corresponding to the second control, and the user can control the first virtual object to perform the first action and the second action by triggering the third control.
- for example, when the user clicks the third control, the first virtual object performs the first action, and when the user long-presses the third control, the first virtual object performs the second action. Alternatively, when the user clicks the third control, the first virtual object performs the first action; when the user clicks the third control again, the game application determines that the first virtual object is performing the first action and controls the first virtual object to perform the second action while performing the first action. Alternatively, when the user clicks the third control, the first virtual object performs the first action; when the user clicks the third control again, the game application determines that the first virtual object has finished performing the first action, and then controls the first virtual object to perform the second action.
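The alternative trigger policies described above (click versus long press, and clicking again while the first action is in progress) can be sketched as one hypothetical dispatch routine. The press-duration threshold and the action names are invented for illustration and are not specified by the patent.

```python
class MergedControl:
    """A sketch of the merged (third) control's input dispatch."""

    LONG_PRESS_SECONDS = 0.5  # assumed threshold, not from the patent

    def __init__(self, first_action, second_action):
        self.first_action = first_action
        self.second_action = second_action
        self.performing_first = False

    def on_press(self, duration):
        if duration >= self.LONG_PRESS_SECONDS:
            return self.second_action          # long press -> second action
        if self.performing_first:
            # Click while the first action is running: layer the second
            # action on top of the first.
            return (self.first_action, self.second_action)
        self.performing_first = True
        return self.first_action               # plain click -> first action

ctrl = MergedControl("squat", "aim")
assert ctrl.on_press(0.1) == "squat"               # first click
assert ctrl.on_press(0.1) == ("squat", "aim")      # click again mid-action
assert MergedControl("squat", "aim").on_press(0.8) == "aim"  # long press
```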
- each UI control corresponds to a control identifier in the application. When the merge setting operation is received, the control identifiers corresponding to the first control and the second control are determined, and the control identifier corresponding to the third control is determined based on the control identifiers corresponding to the first control and the second control. When the interface is rendered, the third control is rendered according to the control identifier of the third control, and the rendering of the first control and the second control is canceled.
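This identifier bookkeeping can be sketched as follows, assuming a simple render list of control identifiers. The way the third identifier is derived from the first two is an invented placeholder; the patent does not specify the derivation.

```python
def merge_controls(render_list, first_id, second_id):
    # Derive the merged control's identifier from the two source identifiers
    # (placeholder scheme: concatenation).
    third_id = f"{first_id}+{second_id}"
    # Cancel rendering of the first and second controls and render the third.
    updated = [cid for cid in render_list if cid not in (first_id, second_id)]
    updated.append(third_id)
    return updated, third_id

render_list = ["move_ctrl", "squat_ctrl", "aim_ctrl", "shoot_ctrl"]
render_list, merged = merge_controls(render_list, "squat_ctrl", "aim_ctrl")
assert merged == "squat_ctrl+aim_ctrl"
assert "squat_ctrl" not in render_list and merged in render_list
```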
- the user performs a merge setting operation to merge at least two controls.
- in summary, in the method provided in this embodiment, different types of controls displayed on the virtual environment screen are merged through the received merge setting operation, so that the user can, through independent settings, merge less commonly used UI controls into the same UI control, or merge UI controls that need to be used in combination into one UI control. By changing the layout of UI controls on the virtual environment screen, this simplifies the process by which the user controls the virtual object and improves human-computer interaction efficiency.
- At least two controls are merged by performing a merge setting operation on the virtual environment screen, or one control is split into at least two controls by performing a split setting operation on the virtual environment screen.
- the process of merging controls and the process of splitting controls will be described below in combination with the user interface in the game application.
- FIG. 5 shows a flowchart of a method for displaying a virtual environment screen provided by another exemplary embodiment of the present application.
- the method can be applied to a computer device implemented as the first terminal 120 or the second terminal 160 as shown in FIG. 1 , or other terminals in the computer system 100 .
- the method includes the following steps:
- Step 501 displaying a virtual environment picture, where a first control and a second control are displayed on the virtual environment picture, and the first control and the second control belong to different control types.
- the first control is used to control the first virtual object to perform the first action
- the second control is used to control the first virtual object to perform the second action.
- a first control 11 and a second control 12 are displayed on the virtual environment screen.
- the first control 11 is a squat control and belongs to the state-switching type of control;
- the second control 12 is an aiming control; in another example, the second control 12 is a movement control;
- the first control 11 and the second control 12 are located on the right side of the virtual environment screen.
- Step 502 Receive a merge setting operation for the first control and the second control.
- the merge setting operation is implemented as dragging the controls that need to be merged to the same place.
- the user drags the first control 11 to the second control 12 , or the user drags the second control 12 to the first control 11 .
- in other embodiments, the merge setting operation is an operation enabled by the user in the setting interface: when the user sets the control 20 corresponding to the merge setting operation to the open state, the first control 11 and the second control 12 can be merged in the game application.
- Step 503 Obtain a first control type of the first control and a second control type of the second control.
- the game application acquires the control types of the first control and the second control according to the control selected by the user when dragging.
- control type of the control to be merged is acquired according to the user's operation on the setting interface.
- the user's operation on the setting interface is used to merge all controls of the same type, or to merge controls of preset types (for example, merging shooting-type controls with movement-type controls), or to merge preset controls (for example, merging the squat control and the lie-down control).
- Step 504 In response to the first control type and the second control type satisfying the preset condition, merge the first control and the second control into a third control.
- the preset conditions include at least one of the following conditions:
- the first control type is an auxiliary type, and the second control type is an aiming type.
- the first control is a mirror-opening control (a control for opening the scope of a firearm-type virtual prop), and the second control is a shooting control;
- the first control type is an auxiliary type
- the second control type is a mobile type
- the first control is a mirror-opening control
- the second control is a directional movement control (including a move-left control, a move-right control, a move-forward control, and a move-backward control);
- the first control type is a movement type, and the second control type is an aiming type; schematically, the first control is a directional movement control and the second control is a shooting control, or the first control is a directional movement control and the second control is a throwing control (a control for using throwing-type virtual props);
- the first control type is a movement type
- the second control type is a state switching type
- the first control is a directional movement control
- the second control is a squat control
- the first control type is a state switching type
- the second control type is an auxiliary type
- the first control is a lying down control (a control for controlling the virtual object to lie down)
- the second control is a mirror-on control
- the first control type is a state switching type
- the second control type is an aiming type.
- the first control is a crouch control
- the second control is a shooting control.
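The preset conditions above can be sketched as a simple type-pair lookup. The following is a hypothetical Python illustration: the type names and the `MERGEABLE_TYPE_PAIRS` table are assumptions derived from the six example conditions, not part of any disclosed implementation.

```python
# Each pair is (first control type, second control type) from the examples above.
MERGEABLE_TYPE_PAIRS = {
    ("auxiliary", "aiming"),
    ("auxiliary", "movement"),
    ("movement", "aiming"),
    ("movement", "state_switching"),
    ("state_switching", "auxiliary"),
    ("state_switching", "aiming"),
}

def can_merge(first_type: str, second_type: str) -> bool:
    """Return True when the two control types satisfy a preset condition."""
    return (first_type, second_type) in MERGEABLE_TYPE_PAIRS

print(can_merge("auxiliary", "aiming"))   # e.g. scope-opening control + shooting control
print(can_merge("aiming", "aiming"))      # same-type pair is not in the examples
```

Under this sketch, only the listed ordered pairs trigger the merge in step 504.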
- the game application merges the first control and the second control.
- Step 505 Updating and displaying a third control for replacing the first control and the second control.
- the third control is updated and displayed on the virtual environment screen, and the updated virtual environment screen does not include the first control and the second control.
- when the user performs the merge setting operation, the game application identifies the user account corresponding to the merge setting operation, and updates the virtual environment screen corresponding to that user account.
- a third control 13 is displayed on the updated virtual environment screen.
- FIG. 6 takes the case in which the control identifier of the first control 11 is used as the control identifier of the currently updated third control 13 as an example; alternatively, the control identifier of the second control is used as the control identifier of the third control; or a new control identifier is generated for the third control, this identifier being different from the control identifiers of the first control and the second control.
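The three identifier options just described (reuse the first control's identifier, reuse the second's, or generate a fresh one) might be sketched as follows; the function name, strategy names, and identifier format are hypothetical.

```python
def pick_identifier(first_id: str, second_id: str, strategy: str = "reuse_first") -> str:
    """Choose the control identifier for the merged third control."""
    if strategy == "reuse_first":
        return first_id
    if strategy == "reuse_second":
        return second_id
    # otherwise generate a new identifier distinct from both original identifiers
    return f"merged_{first_id}_{second_id}"

print(pick_identifier("squat_control", "aim_control"))                  # squat_control
print(pick_identifier("squat_control", "aim_control", strategy="new"))  # merged_squat_control_aim_control
```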
- the action performed by the first virtual object is related to the operation received by the third control.
- Step 507a in response to the first operation on the third control, controlling the first virtual object to perform the first action.
- the first operation includes at least one of a single-click operation, a long-press operation, a sliding operation, a hovering operation, a dragging operation, a double-click operation (including at least one of a single-finger double-click and a multi-finger double-click), and combinations thereof.
- in response to receiving a long-press operation on the third control, the first virtual object is controlled to perform a running action.
- Step 508a in response to the second operation on the third control, controlling the first virtual object to perform the second action.
- the second operation includes at least one of a single-click operation, a long-press operation, a sliding operation, a hovering operation, a dragging operation, a double-click operation (including at least one of a single-finger double-click and a multi-finger double-click), and combinations thereof.
- the first operation is different from the second operation.
- in response to receiving a double-click operation on the third control, the first virtual object is controlled to perform a mirror-opening action.
- the third control generates a control instruction according to the received operation type, and controls the first virtual object to perform different actions.
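As a rough sketch of this dispatch, the merged third control could map each received operation type to the action or actions it triggers. The mapping below follows the examples in the text (long press triggers running, double click triggers opening the scope, drag triggers running and reloading together); all operation and action names are illustrative assumptions.

```python
def handle_merged_control(operation: str) -> list[str]:
    """Return the action(s) the first virtual object performs for an operation."""
    dispatch = {
        "long_press":   ["run"],            # first operation -> first action
        "double_click": ["open_scope"],     # second operation -> second action
        "drag":         ["run", "reload"],  # third operation -> both actions at once
    }
    return dispatch.get(operation, [])

print(handle_merged_control("drag"))
```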
- Step 507a may be performed prior to step 507b, or step 507b may be performed prior to step 507a.
- the third control receives an execution operation corresponding to action b, and the first virtual object performs action b while performing action a.
- schematically, in response to receiving a double-click operation on the third control, the first virtual object performs a running action;
- in response to then receiving a long-press operation on the third control, the first virtual object performs an open-scope action while running.
- the first virtual object simultaneously executes the action corresponding to the third control.
- Step 507b in response to the third operation on the third control, controlling the first virtual object to perform the first action and the second action simultaneously.
- the third operation includes at least one of a single-click operation, a long-press operation, a sliding operation, a hovering operation, a dragging operation, a double-click operation (including at least one of a single-finger double-click and a multi-finger double-click), and combinations thereof.
- in response to the third control receiving a drag operation, the game application controls the first virtual object to perform a running action and a reloading action at the same time.
- the first virtual object executes the action according to the priority of the action.
- Step 507c in response to the fourth operation on the third control, obtain the priorities of the first action and the second action.
- the fourth operation includes at least one of a single-click operation, a long-press operation, a sliding operation, a hovering operation, a dragging operation, a double-click operation (including at least one of a single-finger double-click and a multi-finger double-click), and combinations thereof.
- in response to receiving a long-press operation on the third control, the game application obtains the priorities of the actions corresponding to the third control.
- a schematic priority ordering is: running action > shooting action (or throwing action) > squatting action (or lying-down action) > mirror-opening action (or reloading action).
- Step 508c controlling the first virtual object to perform the first action and the second action in a preset order based on the priority.
- the game application controls the first virtual object to perform the second action first, and then perform the first action.
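Steps 507c and 508c can be illustrated with a small priority table. The numeric ranks below encode the schematic ordering given above (running > shooting/throwing > squatting/lying down > mirror-opening/reloading); the values and action names themselves are assumptions.

```python
ACTION_PRIORITY = {
    "run": 4,
    "shoot": 3, "throw": 3,
    "squat": 2, "lie_down": 2,
    "open_scope": 1, "reload": 1,
}

def order_actions(first_action: str, second_action: str) -> list[str]:
    """Sort the two actions so the higher-priority action is performed first."""
    return sorted([first_action, second_action],
                  key=lambda action: ACTION_PRIORITY[action], reverse=True)

print(order_actions("open_scope", "run"))  # the running action is performed first
```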
- the method provided in this embodiment merges different types of controls displayed on the virtual environment screen through the received merge setting operation, so that the user can, through independent settings, merge rarely used UI controls into a single UI control, or combine UI controls that need to be used together into one UI control. By changing the layout of UI controls on the virtual environment screen, this simplifies the process of controlling virtual objects and improves the efficiency of human-computer interaction.
- the first control and the second control are merged into a third control, so that the user can combine different types of controls through the merge setting operation, making the layout of UI controls on the virtual environment screen more flexible.
- the user can determine the types of UI controls that can be merged, and the UI controls can be flexibly merged.
- the virtual environment screen is updated and displayed, and the updated virtual environment screen displays the merged third control, and the user can control the virtual object more intuitively through the updated virtual environment screen.
- the virtual object is controlled to perform different actions according to different rules, which makes the ways of controlling the virtual object more flexible and diverse, and helps users to set a layout of UI controls that suits their own preferences or usage habits.
- the process of splitting controls includes the following three situations, as shown in Figure 10:
- the split controls belong to different types of controls.
- the third control is split into the first control and the second control.
- Step 511a in response to the first split setting operation for the third control, obtain the action type corresponding to the third control.
- the split setting operation may be opposite to the implementation of the merge setting operation.
- the merge setting operation is a leftward drag operation
- the split setting operation is a right drag operation;
- if the merge setting operation is to drag the first control to the second control to form the third control, then the split setting operation starts from the third control 13: a control is dragged outward from the third control 13 to become the first control 11 or the second control 12, with the arrow indicating the dragging direction, as shown in FIG. 11 .
- the action type corresponds to the control type of the control.
- the control type of control 1 is an auxiliary type, and the virtual object performs an auxiliary type action when control 1 is triggered.
- control 1 is a mirror-opening control
- the virtual object performs the mirror-opening action
- the action type of the mirror-opening action is an auxiliary type.
- since the third control is a merged control, the third control has the functions of at least two controls.
- the game application acquires an action list corresponding to the third control, where the action list is used to provide the control composition of the third control.
- the game application or the background server establishes an action list corresponding to the merged control, and binds the action list to the merged control.
- the control identifier of the third control and the control identifiers of the at least two controls having a split relationship with the third control are stored in the action list, so that when the third control needs to be split, the rendering of the third control is canceled during the interface rendering process and replaced by the rendering of the at least two controls whose identifiers are recorded in the list for the third control.
- Step 512a splitting the third control into a first control and a second control based on the action type.
- the action list corresponding to the third control includes a squatting action and a mirror-opening action, and the third control is split into a squatting control and a mirror-opening control.
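Splitting via the action list (steps 511a and 512a) might look like the following sketch, where the merged control carries a list of the action types it was built from and each action type maps back to one control. The data layout and naming scheme are assumptions.

```python
def split_merged_control(action_list: dict) -> list[str]:
    """Return one control identifier per action type bound to the merged control."""
    return [f"{action}_control" for action in action_list["actions"]]

# Example matching the text: squat action + open-scope action
third_control = {"control_id": "third_control",
                 "actions": ["squat", "open_scope"]}
print(split_merged_control(third_control))
```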
- the split controls belong to the same type of controls.
- the third control is split into a fourth control and a fifth control, where the fourth control and the fifth control belong to the same type of control.
- Step 511b in response to the second split setting operation for the third control, obtain an association relationship between at least two actions corresponding to the third control.
- the association relationship refers to the hierarchical relationship between actions performed by the virtual object when the third control is triggered.
- the action corresponding to the third control is a posture switching action, and there is a hierarchical relationship between each posture switching action.
- schematically, the virtual object is controlled to perform a full squat action (the knees are bent and the legs are close to the hips) and a half squat action (on one knee), so as to obtain the hierarchical relationship between the full squat action and the half squat action.
- Step 512b splitting the third control into a fourth control and a fifth control based on the association relationship.
- in one example, the third control is split into a full squat control and a half squat control based on the association relationship; in another example, the game application divides the third control into a squat control and a prone control according to the association relationship; in yet another example, the game application splits the third control into a control for opening a high-power scope and a control for opening a low-power scope according to the association relationship.
- the virtual environment picture is a picture obtained by observing the virtual environment from the perspective of the first virtual object.
- a camera model is bound to the first virtual object, and the virtual environment is photographed through the camera model to obtain the virtual environment picture.
- the third control is split based on the number of virtual objects in the virtual environment screen.
- the third control is split based on the number of the second virtual objects.
- Step 511c in response to the viewing angle of the first virtual object including at least two second virtual objects, obtain the number of the second virtual objects.
- the first virtual object and the second virtual objects are in the same virtual environment, and the first virtual object can see the second virtual objects while moving about in the virtual environment.
- the second virtual object has a teammate relationship with the first virtual object.
- the viewing angle of the first virtual object includes three second virtual objects, and the game application binds the number of the second virtual objects to the virtual environment picture observed by the first virtual object.
- Step 512c splitting the third control based on the number of the second virtual objects.
- the third control is divided into three controls according to the three virtual objects.
- the third control is divided into control 1, control 2 and control 3.
- Control 1 is used to attack virtual object 1
- control 2 is used to attack virtual object 2
- control 3 is used to attack virtual object 3.
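Step 512c could be sketched as generating one attack control per second virtual object in view, as in the three-object example above. The identifiers and data structure below are hypothetical.

```python
def split_by_targets(target_ids: list[str]) -> list[dict]:
    """Create one attack control per visible second virtual object."""
    return [{"control_id": f"attack_control_{i + 1}", "target": target}
            for i, target in enumerate(target_ids)]

controls = split_by_targets(["virtual_object_1", "virtual_object_2", "virtual_object_3"])
print(len(controls))          # three targets -> three controls
print(controls[0]["target"])
```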
- in response to the second virtual object using a virtual prop, the third control is split based on the second virtual object and the virtual prop used by the second virtual object.
- the virtual prop used by the second virtual object is a shield (used to reduce damage to the virtual object)
- the third control is split into two controls: one control is used to destroy the virtual prop (shield) used by the second virtual object, and the other control is used to attack the second virtual object. In some embodiments, the trigger duration of the control for destroying the virtual prop is longer than that of the control for attacking the second virtual object.
- Step 513 update and display at least two split controls that are used to replace the third control after the third control is split.
- at least two split controls corresponding to the split third control are updated and displayed, and the updated virtual environment picture does not include the third control.
- when the user performs the split setting operation, the game application identifies the user account corresponding to the split setting operation, and updates the virtual environment screen corresponding to that user account.
- the split fourth control 14 and the fifth control 15 are displayed on the updated virtual environment screen.
- the fourth control 14 (full squat control) and the fifth control 15 (half squat control) result from splitting the squatting control 13.
- the squatting control 13 is not displayed on the updated virtual environment screen (the figure is for illustration only).
- the control identifier of the fourth control 14 and the control identifier of the fifth control 15 are newly generated control identifiers, both different from the control identifier of the third control 13; or the control identifier of the third control 13 is used as the control identifier of the fourth control 14, and the control identifier of the fifth control 15 is newly generated; or the control identifier of the third control 13 is used as the control identifier of the fifth control 15, and the control identifier of the fourth control 14 is newly generated.
- the first virtual object performs at least two actions simultaneously.
- Step 516a in response to the first operation on the first control, controlling the first virtual object to perform the first action.
- the control type of the first control is different from the control type of the second control.
- Step 517a in response to the second operation on the second control, controlling the first virtual object to perform the second action while performing the first action.
- schematically, the game application controls the first virtual object to throw virtual props during the squatting process; the user can control the first virtual object to accurately hit the target according to the squatting angle.
- the first virtual object performs at least two actions in sequence.
- Step 516b in response to the third operation on the fourth control, controlling the first virtual object to perform a fourth action.
- the fourth control is a lying down control
- the game application program controls the first virtual object to perform a lying down action
- Step 517b in response to the fourth operation on the fifth control, controlling the first virtual object to perform a fifth action, where the fourth action and the fifth action have an associated relationship.
- association relationship means that there is a certain hierarchical relationship between actions, and they belong to the same type of action.
- the fifth control is a crawl control (for controlling the first virtual object to crawl forward); in response to receiving a double-click operation on the crawl control, the game application controls the first virtual object to crawl in the virtual environment. It can be understood that step 516b may be performed before step 517b, or step 517b may be performed before step 516b.
- the third control is split into controls of the same action type or of different types, so that the user can split the controls in a targeted manner according to different battle situations; while conforming to the user's operating habits, this ensures that the virtual object controlled by the user fights more efficiently.
- the split controls can realize functions corresponding to different types of actions, which makes it convenient for users to split the third control according to different functions.
- by splitting the third control according to the association relationship between the at least two actions corresponding to the third control when it is triggered, the split controls can realize similar actions at different levels, which makes it convenient for users to perform actions of different levels according to different combat situations and is conducive to improving the combat efficiency of virtual objects.
- the split control can attack different virtual objects, which is beneficial to improve the combat efficiency of the virtual objects.
- the virtual environment screen is updated and displayed, the updated virtual environment screen displays the split controls, and the user can control the virtual object more intuitively through the updated virtual environment screen.
- the virtual objects can be controlled to perform different operations, which makes the methods of controlling virtual objects more flexible and diverse, and helps users to set a layout of UI controls that suits their own preferences or usage habits.
- control merging includes the following steps, as shown in Figure 14:
- Step 1402 determine whether the user performs a merge setting operation.
- when the user performs the merge setting operation, go to step 1403a; when the user does not perform the merge setting operation, go to step 1403b.
- Step 1403a enable triggering the lie-down operation based on whether the duration of pressing the merged control exceeds the time threshold.
- the game application merges the first control and the second control to obtain the third control.
- the first control is a crouching control
- the second control is a lying down control.
- the control identification of the first control (crouching control) is used as the control identification of the third control
- the control identifier of the second control (lying-down control) is hidden from display.
- schematically, the time threshold is 0.5 seconds, and whether the trigger operation received on the third control exceeds 0.5 seconds is determined.
- Step 1404a hide the lying-down control.
- the execution condition of the lying down action is satisfied, and the original second control (the lying down control) is not displayed on the virtual environment screen.
- Step 1405a enable the prone state prompt of the squat control.
- the merged third control has two functions of supporting squatting action and lying down action.
- the user is prompted by highlighting the squat control, which at this time is used to control the first virtual object to perform a squat action.
- the squat control is used to control the first virtual object to perform a squat action.
- Step 1403b disable triggering the lie-down operation by long-pressing the squat control.
- the first control is a crouching control;
- the second control is a lying-down control.
- Step 1404b display the lie down control.
- Step 1405b turning off the prone state prompt of the squatting control.
- the squatting control is used to control the first virtual object to perform the squatting action
- the lying-down control is used to control the first virtual object to perform the lying-down action.
- the functions of the two controls are not combined.
- steps 1402 to 1405a can be performed in a loop until the end of the game.
- the game application determines whether the user's operation is to control the first virtual object to perform a squatting action or a lying-down action, including the following steps, as shown in FIG. 15 :
- Step 1501 start.
- Step 1502 determine whether the merged control is triggered.
- the game application merges the first control and the second control to obtain the third control.
- when the user triggers the third control, go to step 1503a; when the user does not trigger the third control, go to step 1503b.
- the third control is triggered, and the game application records the initial time and the duration T corresponding to the triggering process.
- Step 1503b the virtual environment picture does not change.
- the third control is not triggered, and the virtual environment screen does not change.
- Step 1505 determine whether the user has released the control.
- the game application determines whether the user has released the control; if the user has released it, go to step 1506; if not, go back to step 1504 (continue timing the duration).
- Step 1506 determine whether Δt > 0.5 holds.
- the game application program determines whether the trigger duration of the third control is greater than the time threshold, if the trigger duration of the third control is greater than the time threshold, go to step 1507a; if the trigger duration of the third control is not greater than the time threshold, go to step 1507b.
- Step 1507a trigger the squat action.
- when the duration for which the third control is triggered is greater than the time threshold, the game application controls the first virtual object to perform the squatting action.
- Step 1507b triggering the lying down action.
- when the duration for which the third control is triggered is not greater than the time threshold, the game application controls the first virtual object to perform the lying-down action.
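The decision in steps 1506 to 1507b reduces to a single threshold comparison. The sketch below follows the flow as stated in the text (a trigger duration greater than 0.5 s leads to the squatting action, otherwise the lying-down action); the function name is an assumption.

```python
TIME_THRESHOLD = 0.5  # seconds, the time threshold given in the text

def resolve_action(press_duration: float) -> str:
    """Choose between squat and lie-down based on the press duration (Δt)."""
    return "squat" if press_duration > TIME_THRESHOLD else "lie_down"

print(resolve_action(0.8))  # squat
print(resolve_action(0.2))  # lie_down
```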
- the current option setting will be recorded in the memory, where the primary key value is SettingKeys.HideProneBtn, and the value content is the status of the current option (true or false).
- This part of the information will be stored in the local data of the computer device corresponding to the user.
- the value of this option will be obtained from the local data to maintain consistency of the user's action settings.
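The option persistence described here (the primary key SettingKeys.HideProneBtn recorded with a true/false value in local data) might be sketched as follows; the JSON file layout and helper names are assumptions, not the actual storage format.

```python
import json
import os
import tempfile

SETTING_KEY = "SettingKeys.HideProneBtn"  # primary key named in the text

def save_setting(path: str, hide_prone_btn: bool) -> None:
    """Record the current option state (true or false) in local data."""
    with open(path, "w") as f:
        json.dump({SETTING_KEY: hide_prone_btn}, f)

def load_setting(path: str) -> bool:
    """Read the option back so the user's action settings stay consistent."""
    with open(path) as f:
        return json.load(f)[SETTING_KEY]

path = os.path.join(tempfile.gettempdir(), "player_settings.json")
save_setting(path, True)
print(load_setting(path))  # True
```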
- steps 1502 to 1507a can be performed in a loop until the end of the game.
- the foregoing embodiments describe the foregoing method based on an application scenario of a game, and the foregoing method is exemplarily described below with an application scenario of military simulation.
- Simulation technology is a model technology that reflects system behavior or process by simulating real-world experiments using software and hardware.
- the military simulation program is a program specially constructed for military applications using simulation technology: it quantitatively analyzes combat elements such as sea, land, and air forces, the performance of weapons and equipment, and combat operations, and can thus accurately simulate the battlefield environment, present the battlefield situation, and support evaluation of the combat system and decision-making aids.
- soldiers build a virtual battlefield at the terminal where the military simulation program is located, and compete in teams.
- soldiers control virtual objects in the virtual environment of the battlefield to perform at least one of actions such as standing, squatting, sitting, lying on the back, lying prone, lying on the side, walking, running, climbing, driving, shooting, throwing, attacking, being wounded, reconnaissance, and close-quarters combat.
- the virtual environment of the battlefield includes at least one natural form among plains, mountains, plateaus, basins, deserts, rivers, lakes, oceans, and vegetation, as well as location forms such as buildings, vehicles, ruins, and training grounds.
- the virtual objects include: virtual characters, virtual animals, cartoon characters, etc. Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
- soldier A controls virtual object a
- soldier B controls virtual object b
- soldier C controls virtual object c
- soldier A and soldier B are soldiers in the same team
- soldier C does not belong to the same team as soldier A and soldier B.
- seeing virtual object a and virtual object b from the perspective of virtual object c, soldier C performs a split setting operation in the military simulation program and splits the shooting control into two shooting controls, namely shooting control 1 and shooting control 2,
- Shooting control 1 is used to attack virtual object 1
- shooting control 2 is used to attack virtual object 2, wherein, when shooting control 1 and shooting control 2 are triggered, the trigger duration of shooting control 1 is longer than that of shooting control 2 (virtual object 1 wears protective virtual props, so it takes longer to attack).
- the above-mentioned display method of the virtual environment screen is applied in the military simulation program, and the soldiers merge or split controls in combination with the tactical layout, so that the layout of the controls better matches their own usage habits, thereby improving the human-computer interaction efficiency of the soldiers and providing a more realistic simulation of the actual combat scene, so that the soldiers receive better training.
- FIG. 16 shows a schematic structural diagram of a display device for a virtual environment screen provided by an exemplary embodiment of the present application.
- the device can be implemented as all or a part of the terminal through software, hardware or a combination of the two, and the device includes:
- the display module 1610 is used to display the virtual environment picture as well as the first control and the second control, where the first control and the second control belong to different control types;
- a receiving module 1620 configured to receive a merge setting operation for the first control and the second control
- the processing module 1630 is configured to merge the first control and the second control into a third control in response to the merge setting operation.
- the apparatus includes an obtaining module 1640;
- the obtaining module 1640 is configured to obtain the first control type of the first control and the second control type of the second control in response to the merge setting operation; the processing module 1630 is configured to merge the first control and the second control into a third control in response to the first control type and the second control type satisfying the preset condition.
- the preset condition includes at least one of the following conditions:
- the first control type is an auxiliary type, and the second control type is an aiming type;
- the first control type is an auxiliary type, and the second control type is a mobile type;
- the first control type is a movement type
- the second control type is an aiming type
- the first control type is a mobile type, and the second control type is a state switching type;
- the first control type is a state switching type
- the second control type is an auxiliary type
- the first control type is a state switching type
- the second control type is an aiming type
- the display module 1610 is configured to update and display the third control for replacing the first control and the second control.
- the first control is used to control a first virtual object to perform a first action;
- the second control is used to control the first virtual object to perform a second action;
- the third control is used to control the first virtual object to perform the first action and the second action.
- the processing module 1630 is configured to: in response to a first operation on the third control, control the first virtual object to perform the first action, and in response to a second operation on the third control, control the first virtual object to perform the second action; or, in response to a third operation on the third control, control the first virtual object to perform the first action and the second action simultaneously; or, in response to a fourth operation on the third control, obtain the priorities of the first action and the second action and control the first virtual object to perform the first action and the second action in a preset order according to the priorities.
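The fourth-operation branch, which performs both actions in an order derived from their priorities, can be sketched as follows. The function name, data shapes, and the "smaller number runs first" convention are assumptions for illustration, not from the disclosure:

```python
def execute_by_priority(virtual_object, prioritized_actions):
    """Run each action on the virtual object in priority order.

    prioritized_actions: list of (priority, action_fn) pairs; a smaller
    priority number means the action is performed earlier (hypothetical
    convention for this sketch).
    """
    for _priority, action_fn in sorted(prioritized_actions, key=lambda p: p[0]):
        action_fn(virtual_object)

# Example: a merged "move + aim" control whose fourth operation runs
# the movement action before the aiming action.
performed = []
execute_by_priority(
    {"name": "hero"},
    [(2, lambda obj: performed.append("aim")),
     (1, lambda obj: performed.append("move"))],
)
```

After the call, `performed` holds the actions in priority order, movement first.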
- the processing module 1630 is configured to split the third control into the first control and the second control in response to a first split setting operation for the third control; or, in response to a second split setting operation for the third control, split the third control into a fourth control and a fifth control, the fourth control and the fifth control belonging to the same control type.
- the obtaining module 1640 is configured to obtain the action type corresponding to the third control in response to the first split setting operation for the third control; the processing module 1630 is configured to split the third control into the first control and the second control based on the action type.
- the obtaining module 1640 is configured to obtain the association relationship between at least two actions corresponding to the third control in response to the second split setting operation for the third control;
- the processing module 1630 is configured to split the third control into the fourth control and the fifth control based on the association relationship.
- the virtual environment picture is a picture obtained by observing the virtual environment from the perspective of the first virtual object
- the obtaining module 1640 is configured to obtain the number of second virtual objects in response to at least two second virtual objects being included within the perspective of the first virtual object; the processing module 1630 is configured to split the third control based on the number of second virtual objects.
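Splitting based on the number of second virtual objects in view might look like the following sketch, where the merged control produces one child control per visible enemy. The dictionary-based control representation is an assumption for illustration only:

```python
def split_by_visible_targets(third_control, enemies):
    """Split a merged control into one child control per visible enemy.

    Only splits when at least two second virtual objects are in view,
    mirroring the condition above; otherwise the merged control is kept.
    """
    if len(enemies) < 2:
        return [third_control]
    return [
        {"name": f"{third_control['name']}#{i}", "target": enemy}
        for i, enemy in enumerate(enemies, start=1)
    ]
```

With three enemies in view, the single merged control becomes three targeted controls; with one enemy, no split occurs.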
- the display module 1610 is configured to update the display to show at least two split controls that replace the third control after the third control is split.
- the processing module 1630 is configured to: in response to a first operation on the first control, control the first virtual object to perform a first action, and in response to a second operation on the second control, control the first virtual object to perform a second action while performing the first action; or, in response to a third operation on the fourth control, control the first virtual object to perform a fourth action, and in response to a fourth operation on the fifth control, control the first virtual object to perform a fifth action, the fourth action and the fifth action having an association relationship.
- the device provided in this embodiment merges different types of controls displayed on the virtual environment screen through the received merge setting operation, so that the user can, through independent settings, merge rarely used UI controls into a single UI control, or merge UI controls that need to be used in combination into one UI control. By changing the layout of UI controls on the virtual environment screen in this way, the device simplifies the process by which users control virtual objects and improves the efficiency of human-computer interaction.
- the first control and the second control are merged into a third control, so that the user can merge controls of different types, making the layout of UI controls on the virtual environment screen more flexible.
- the user can determine the types of UI controls that can be merged, and the UI controls can be flexibly merged.
- the virtual environment screen is updated so that it displays the merged third control, and the user can control the virtual object more intuitively through the updated screen.
- the virtual object is controlled to perform different actions according to different rules, which makes the ways of controlling the virtual object more flexible and diverse and helps users set a UI control layout that suits their own preferences or usage habits.
- the third control is split into controls of the same action type or of different types, so that the user can split controls in a targeted manner according to different battle situations, which helps the user-controlled virtual object improve its combat efficiency.
- the split controls can realize functions corresponding to different types of actions, making it convenient for the user to split the third control according to different functions.
- by splitting the third control according to the association relationship between the at least two actions it triggers, the split controls can realize similar actions of different levels, which makes it convenient for users to perform actions of different levels according to different combat situations and helps improve the combat efficiency of virtual objects.
- the split control can attack different virtual objects, which is beneficial to improve the combat efficiency of the virtual objects.
- the virtual environment screen is updated so that it displays the split controls, and the user can control the virtual object more intuitively through the updated screen.
- the virtual objects can be controlled to perform different operations, which makes the methods of controlling virtual objects more flexible and diverse and helps users set a UI control layout that suits their own preferences or usage habits.
- the display device for the virtual environment screen provided by the above embodiment is illustrated only by the division into the above functional modules; in practical applications, the internal structure may be divided into different functional modules as required to complete all or part of the functions described above.
- the device for displaying a virtual environment screen provided by the above embodiments belongs to the same concept as the embodiments of the method for displaying a virtual environment screen; its specific implementation process is detailed in the method embodiments and is not repeated here.
- FIG. 17 shows a structural block diagram of a computer device 1700 provided by an exemplary embodiment of the present application.
- the computer device 1700 may be a portable mobile terminal, such as a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player.
- Computer device 1700 may also be referred to by other names such as user equipment, portable terminal, and the like.
- computer device 1700 includes: processor 1701 and memory 1702 .
- the processor 1701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
- Memory 1702 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1702 may also include high-speed random access memory, as well as non-volatile memory such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1702 is used to store at least one instruction, and the at least one instruction is executed by the processor 1701 to implement the method for displaying a virtual environment screen provided in the embodiments of the present application.
- the computer device 1700 may also optionally include: a peripheral device interface 1703 and at least one peripheral device.
- the peripheral device includes: at least one of a radio frequency circuit 1704 , a touch display screen 1705 , a camera assembly 1706 , an audio circuit 1707 , a positioning assembly 1708 and a power supply 1709 .
- the peripheral device interface 1703 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1701 and the memory 1702 .
- the radio frequency circuit 1704 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
- the radio frequency circuit 1704 communicates with communication networks and other communication devices via electromagnetic signals.
- the radio frequency circuit 1704 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
- the touch display screen 1705 is used to display UI (User Interface, user interface).
- the camera assembly 1706 is used to capture images or video.
- Audio circuitry 1707 is used to provide an audio interface between the user and computer device 1700 .
- Audio circuitry 1707 may include a microphone and speakers.
- the positioning component 1708 is used to locate the current geographic location of the computer device 1700 to implement navigation or LBS (Location Based Service).
- Power supply 1709 is used to power various components in computer device 1700 .
- computer device 1700 also includes one or more sensors 1710 .
- the one or more sensors 1710 include, but are not limited to, an acceleration sensor 1711 , a gyro sensor 1712 , a pressure sensor 1713 , a fingerprint sensor 1714 , an optical sensor 1715 , and a proximity sensor 1716 .
- FIG. 17 does not constitute a limitation on the computer device 1700, which may include more or fewer components than shown, combine some components, or adopt a different component arrangement.
- An embodiment of the present application further provides a computer device, the computer device including a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for displaying a virtual environment screen provided by the above method embodiments.
- Embodiments of the present application further provide a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or an instruction set is stored; the at least one instruction, at least one program, code set, or instruction set is loaded and executed by a processor to implement the method for displaying a virtual environment picture provided by the above method embodiments.
- Embodiments of the present application further provide a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
- the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the method for displaying a virtual environment screen as described above.
- references herein to "a plurality" mean two or more.
- "and/or" describes the association relationship of associated objects and means that three kinds of relationships can exist; for example, A and/or B can mean that A exists alone, A and B exist at the same time, or B exists alone.
- the character "/" generally indicates that the associated objects are in an "or" relationship.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Claims (15)
- A method for displaying a virtual environment picture, applied to a computer device, the method comprising: displaying a virtual environment picture, a first control, and a second control, the first control and the second control belonging to different control types; receiving a merge setting operation for the first control and the second control; and merging the first control and the second control into a third control in response to the merge setting operation.
- The method according to claim 1, wherein the merging the first control and the second control into a third control in response to the merge setting operation comprises: obtaining a first control type of the first control and a second control type of the second control in response to the merge setting operation; and merging the first control and the second control into the third control in response to the first control type and the second control type satisfying a preset condition.
- The method according to claim 2, wherein the preset condition comprises at least one of the following conditions: the first control type is an auxiliary type and the second control type is an aiming type; the first control type is the auxiliary type and the second control type is a movement type; the first control type is the movement type and the second control type is the aiming type; the first control type is the movement type and the second control type is a state switching type; the first control type is the state switching type and the second control type is the auxiliary type; the first control type is the state switching type and the second control type is the aiming type.
- The method according to any one of claims 1 to 3, wherein after the merging the first control and the second control into a third control in response to the merge setting operation, the method further comprises: updating the display to show the third control in place of the first control and the second control.
- The method according to any one of claims 1 to 3, wherein the first control is used to control a first virtual object to perform a first action; the second control is used to control the first virtual object to perform a second action; and the third control is used to control the first virtual object to perform the first action and the second action.
- The method according to claim 5, further comprising: in response to a first operation on the third control, controlling the first virtual object to perform the first action; in response to a second operation on the third control, controlling the first virtual object to perform the second action; or, in response to a third operation on the third control, controlling the first virtual object to perform the first action and the second action simultaneously; or, in response to a fourth operation on the third control, obtaining priorities of the first action and the second action, and controlling the first virtual object to perform the first action and the second action in a preset order according to the priorities.
- The method according to any one of claims 1 to 3, further comprising: in response to a first split setting operation for the third control, splitting the third control into the first control and the second control; or, in response to a second split setting operation for the third control, splitting the third control into a fourth control and a fifth control, the fourth control and the fifth control belonging to the same control type.
- The method according to claim 7, wherein the splitting the third control into the first control and the second control in response to the first split setting operation for the third control comprises: obtaining an action type corresponding to the third control in response to the first split setting operation for the third control; and splitting the third control into the first control and the second control based on the action type.
- The method according to claim 7, wherein the splitting the third control into the fourth control and the fifth control in response to the second split setting operation for the third control comprises: obtaining an association relationship between at least two actions corresponding to the third control in response to the second split setting operation for the third control; and splitting the third control into the fourth control and the fifth control based on the association relationship.
- The method according to claim 7, wherein the virtual environment picture is a picture obtained by observing the virtual environment from the perspective of a first virtual object; the method further comprising: in response to at least two second virtual objects being included within the perspective of the first virtual object, splitting the third control based on the number of the second virtual objects.
- The method according to claim 7, further comprising: updating the display to show at least two split controls that replace the third control after the third control is split.
- The method according to claim 7, wherein after the splitting the third control into the first control and the second control, the method further comprises: in response to a first operation on the first control, controlling the first virtual object to perform a first action; and in response to a second operation on the second control, controlling the first virtual object to perform a second action while performing the first action; or, after the splitting the third control into the fourth control and the fifth control, the method further comprises: in response to a third operation on the fourth control, controlling the first virtual object to perform a fourth action; and in response to a fourth operation on the fifth control, controlling the first virtual object to perform a fifth action, the fourth action and the fifth action having an association relationship.
- An apparatus for displaying a virtual environment picture, the apparatus comprising: a display module, configured to display a virtual environment picture, a first control, and a second control, the first control and the second control belonging to different control types; a receiving module, configured to receive a merge setting operation for the first control and the second control; and a processing module, configured to merge the first control and the second control into a third control in response to the merge setting operation.
- A computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for displaying a virtual environment picture according to any one of claims 1 to 11.
- A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for displaying a virtual environment picture according to any one of claims 1 to 11.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020227040140A KR20230007392A (ko) | 2020-08-26 | 2021-08-20 | 가상 환경 픽처를 디스플레이하기 위한 방법 및 장치, 디바이스, 및 저장 매체 |
JP2022560942A JP7477640B2 (ja) | 2020-08-26 | 2021-08-20 | 仮想環境画面の表示方法、装置及びコンピュータプログラム |
US17/883,323 US20220379214A1 (en) | 2020-08-26 | 2022-08-08 | Method and apparatus for a control interface in a virtual environment |
JP2024067203A JP2024099643A (ja) | 2020-08-26 | 2024-04-18 | 仮想環境画面の表示方法、装置及びコンピュータプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010870309.1 | 2020-08-26 | ||
CN202010870309.1A CN111921194A (zh) | 2020-08-26 | 2020-08-26 | 虚拟环境画面的显示方法、装置、设备及存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/883,323 Continuation US20220379214A1 (en) | 2020-08-26 | 2022-08-08 | Method and apparatus for a control interface in a virtual environment |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022042435A1 true WO2022042435A1 (zh) | 2022-03-03 |
Family
ID=73305521
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/113710 WO2022042435A1 (zh) | 2020-08-26 | 2021-08-20 | 虚拟环境画面的显示方法、装置、设备及存储介质 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220379214A1 (zh) |
JP (2) | JP7477640B2 (zh) |
KR (1) | KR20230007392A (zh) |
CN (1) | CN111921194A (zh) |
WO (1) | WO2022042435A1 (zh) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111921194A (zh) * | 2020-08-26 | 2020-11-13 | 腾讯科技(深圳)有限公司 | 虚拟环境画面的显示方法、装置、设备及存储介质 |
CN112402960B (zh) * | 2020-11-19 | 2022-11-04 | 腾讯科技(深圳)有限公司 | 虚拟场景中状态切换方法、装置、设备及存储介质 |
CN113398564B (zh) * | 2021-07-12 | 2024-02-13 | 网易(杭州)网络有限公司 | 虚拟角色控制方法、装置、存储介质及计算机设备 |
CN113476823B (zh) * | 2021-07-13 | 2024-02-27 | 网易(杭州)网络有限公司 | 虚拟对象控制方法、装置、存储介质及电子设备 |
CN113926181A (zh) * | 2021-10-21 | 2022-01-14 | 腾讯科技(深圳)有限公司 | 虚拟场景的对象控制方法、装置及电子设备 |
CN118767426A (zh) * | 2023-04-21 | 2024-10-15 | 网易(杭州)网络有限公司 | 虚拟对象的控制方法、装置和电子设备 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105549888A (zh) * | 2015-12-15 | 2016-05-04 | 芜湖美智空调设备有限公司 | 组合控件生成方法和装置 |
CN106126064A (zh) * | 2016-06-24 | 2016-11-16 | 乐视控股(北京)有限公司 | 一种信息处理方法及设备 |
CN109078326A (zh) * | 2018-08-22 | 2018-12-25 | 网易(杭州)网络有限公司 | 游戏的控制方法和装置 |
WO2019201047A1 (zh) * | 2018-04-16 | 2019-10-24 | 腾讯科技(深圳)有限公司 | 在虚拟环境中进行视角调整的方法、装置及可读存储介质 |
CN111209000A (zh) * | 2020-01-08 | 2020-05-29 | 网易(杭州)网络有限公司 | 自定义控件的处理方法、装置、电子设备及存储介质 |
CN111921194A (zh) * | 2020-08-26 | 2020-11-13 | 腾讯科技(深圳)有限公司 | 虚拟环境画面的显示方法、装置、设备及存储介质 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7749089B1 (en) * | 1999-02-26 | 2010-07-06 | Creative Kingdoms, Llc | Multi-media interactive play system |
JP5137932B2 (ja) * | 2009-11-17 | 2013-02-06 | 株式会社ソニー・コンピュータエンタテインメント | 通信システム、端末装置、通信処理方法、通信処理プログラム、通信処理プログラムが記憶された記憶媒体、拡張機器 |
KR102109054B1 (ko) * | 2013-04-26 | 2020-05-28 | 삼성전자주식회사 | 애니메이션 효과를 제공하는 사용자 단말 장치 및 그 디스플레이 방법 |
KR101866198B1 (ko) * | 2016-07-06 | 2018-06-11 | (주) 덱스인트게임즈 | 터치스크린 기반의 게임제공방법 및 프로그램 |
US20190282895A1 (en) * | 2018-03-13 | 2019-09-19 | Microsoft Technology Licensing, Llc | Control sharing for interactive experience |
CN109701274B (zh) * | 2018-12-26 | 2019-11-08 | 网易(杭州)网络有限公司 | 信息处理方法及装置、存储介质、电子设备 |
US20200298110A1 (en) * | 2019-03-20 | 2020-09-24 | Eric Alan Koziel | Universal Game Controller Remapping Device |
-
2020
- 2020-08-26 CN CN202010870309.1A patent/CN111921194A/zh active Pending
-
2021
- 2021-08-20 JP JP2022560942A patent/JP7477640B2/ja active Active
- 2021-08-20 KR KR1020227040140A patent/KR20230007392A/ko not_active Application Discontinuation
- 2021-08-20 WO PCT/CN2021/113710 patent/WO2022042435A1/zh active Application Filing
-
2022
- 2022-08-08 US US17/883,323 patent/US20220379214A1/en active Pending
-
2024
- 2024-04-18 JP JP2024067203A patent/JP2024099643A/ja active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105549888A (zh) * | 2015-12-15 | 2016-05-04 | 芜湖美智空调设备有限公司 | 组合控件生成方法和装置 |
CN106126064A (zh) * | 2016-06-24 | 2016-11-16 | 乐视控股(北京)有限公司 | 一种信息处理方法及设备 |
WO2019201047A1 (zh) * | 2018-04-16 | 2019-10-24 | 腾讯科技(深圳)有限公司 | 在虚拟环境中进行视角调整的方法、装置及可读存储介质 |
CN109078326A (zh) * | 2018-08-22 | 2018-12-25 | 网易(杭州)网络有限公司 | 游戏的控制方法和装置 |
CN111209000A (zh) * | 2020-01-08 | 2020-05-29 | 网易(杭州)网络有限公司 | 自定义控件的处理方法、装置、电子设备及存储介质 |
CN111921194A (zh) * | 2020-08-26 | 2020-11-13 | 腾讯科技(深圳)有限公司 | 虚拟环境画面的显示方法、装置、设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
JP7477640B2 (ja) | 2024-05-01 |
US20220379214A1 (en) | 2022-12-01 |
JP2024099643A (ja) | 2024-07-25 |
CN111921194A (zh) | 2020-11-13 |
KR20230007392A (ko) | 2023-01-12 |
JP2023523157A (ja) | 2023-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022042435A1 (zh) | 虚拟环境画面的显示方法、装置、设备及存储介质 | |
JP7476235B2 (ja) | 仮想オブジェクトの制御方法、装置、デバイス及びコンピュータプログラム | |
WO2022151946A1 (zh) | 虚拟角色的控制方法、装置、电子设备、计算机可读存储介质及计算机程序产品 | |
WO2022057529A1 (zh) | 虚拟场景中的信息提示方法、装置、电子设备及存储介质 | |
WO2022134980A1 (zh) | 虚拟对象的控制方法、装置、终端及存储介质 | |
CN111399639B (zh) | 虚拟环境中运动状态的控制方法、装置、设备及可读介质 | |
CN112416196B (zh) | 虚拟对象的控制方法、装置、设备及计算机可读存储介质 | |
CN113440846B (zh) | 游戏的显示控制方法、装置、存储介质及电子设备 | |
WO2022105362A1 (zh) | 虚拟对象的控制方法、装置、设备、存储介质及计算机程序产品 | |
WO2021238870A1 (zh) | 信息显示方法、装置、设备及存储介质 | |
JP7492611B2 (ja) | バーチャルシーンにおけるデータ処理方法、装置、コンピュータデバイス、及びコンピュータプログラム | |
WO2022052831A1 (zh) | 应用程序内的控件位置调整方法、装置、设备及存储介质 | |
CN114225372B (zh) | 虚拟对象的控制方法、装置、终端、存储介质及程序产品 | |
CN112402960A (zh) | 虚拟场景中状态切换方法、装置、设备及存储介质 | |
WO2022227958A1 (zh) | 虚拟载具的显示方法、装置、设备以及存储介质 | |
WO2023010690A1 (zh) | 虚拟对象释放技能的方法、装置、设备、介质及程序产品 | |
CN113546422B (zh) | 虚拟资源的投放控制方法、装置、计算机设备及存储介质 | |
WO2023134272A1 (zh) | 视野画面的显示方法、装置及设备 | |
KR20220098355A (ko) | 가상 대상 상호작용 모드를 선택하기 위한 방법 및 장치, 디바이스, 매체, 및 제품 | |
CN111249726B (zh) | 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质 | |
KR20210144786A (ko) | 가상 환경 픽처를 디스플레이하기 위한 방법 및 장치, 디바이스, 및 저장 매체 | |
KR20230042517A (ko) | 연락처 정보 디스플레이 방법, 장치 및 전자 디바이스, 컴퓨터-판독가능 저장 매체, 및 컴퓨터 프로그램 제품 | |
CN111589129B (zh) | 虚拟对象的控制方法、装置、设备及介质 | |
WO2022170892A1 (zh) | 虚拟对象的控制方法、装置、设备、存储介质及程序产品 | |
CN115645923A (zh) | 游戏交互方法、装置、终端设备及计算机可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21860272; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2022560942; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 20227040140; Country of ref document: KR; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04-07-2023) |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21860272; Country of ref document: EP; Kind code of ref document: A1 |