WO2023020125A1 - Display method, apparatus, terminal, medium, and program product for a virtual environment picture - Google Patents
Display method, apparatus, terminal, medium, and program product for a virtual environment picture
- Publication number
- WO2023020125A1 (PCT/CN2022/102206)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- virtual environment
- functional control
- virtual
- virtual object
- Prior art date
Classifications
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- A63F13/2145—Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/525—Changing parameters of virtual cameras
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/837—Shooting of targets
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- This application relates to the technical fields of computer and Internet, and in particular to the display of virtual environment pictures.
- users can observe the virtual environment by swiping on the screen.
- Embodiments of the present application provide a display method, apparatus, terminal, medium, and program product for a virtual environment picture, which can simplify user operations and improve the adjustment efficiency of the virtual environment picture while controlling the behavior of a virtual object. The technical solution is as follows:
- a method for displaying a virtual environment picture is provided; the method is executed by a terminal and includes:
- displaying a first virtual environment picture, where the first virtual environment picture is a picture for observing the virtual environment from a first perspective of a virtual object;
- in response to a first touch operation on a first functional control, controlling the virtual object to perform a behavior corresponding to the first functional control, where the initial touch position of the first touch operation is at a first position within the response area of the first functional control;
- in response to the touch position of the first touch operation moving from the first position to a second position, displaying a second virtual environment picture, where the second position is outside the response area of the first functional control, the second virtual environment picture is a picture for observing the virtual environment from a second perspective of the virtual object, and the second perspective and the first perspective have different orientations.
- a display device for a virtual environment screen includes:
- a screen display module configured to display a first virtual environment screen, where the first virtual environment screen is a screen for observing the virtual environment from a first perspective of a virtual object;
- An object control module configured to control the virtual object to perform a behavior corresponding to the first functional control in response to a first touch operation on the first functional control; the initial touch position of the first touch operation is located at a first position within the response area of the first functional control;
- a screen adjustment module configured to display a second virtual environment picture in response to the touch position of the first touch operation moving from the first position to a second position, where the second position is outside the response area of the first functional control, the second virtual environment picture is a picture for observing the virtual environment from a second perspective of the virtual object, and the second perspective and the first perspective have different orientations.
- a terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set that is loaded and executed by the processor to implement the method for displaying a virtual environment picture described above.
- a computer-readable storage medium is provided, in which a computer program is stored, and the computer program is executed to implement the method for displaying a virtual environment picture described above.
- a computer program product or computer program includes computer instructions stored in a computer-readable storage medium;
- the processor of a server reads the computer instructions from the computer-readable storage medium and executes them, so that the server performs the method for displaying a virtual environment picture described above;
- the processor of a terminal reads the computer instructions from the computer-readable storage medium and executes them, so that the terminal performs the method for displaying a virtual environment picture described above.
- In response to the first touch operation on the first functional control, the virtual object is controlled to perform the behavior corresponding to the first functional control; and when the touch position of the first touch operation moves from the first position to a second position outside the response area, the virtual object's viewing angle is adjusted from the first perspective to the second perspective, and the second virtual environment picture is displayed based on the second perspective. That is, within the same touch operation, the virtual object can be controlled to perform the corresponding behavior, and the virtual environment picture can also be adjusted according to the change of the touch position.
- The client can therefore still respond to the user's sliding operation without the user lifting the finger, which simplifies user operations and improves the adjustment efficiency of the virtual environment picture while controlling the behavior of the virtual object.
- Fig. 1 is a schematic diagram of a virtual screen display system provided by an embodiment of the present application
- Fig. 2 exemplarily shows a schematic diagram of a display manner of a virtual environment picture
- FIG. 3 is a flowchart of a method for displaying a virtual environment screen provided by an embodiment of the present application
- Fig. 4 exemplarily shows a schematic diagram of a user interface
- Fig. 5 exemplarily shows a schematic diagram of a configuration interface
- Fig. 6 exemplarily shows a schematic diagram of another display manner of a virtual environment picture
- Fig. 7 exemplarily shows a schematic diagram of another display mode of a virtual environment screen
- Fig. 8 exemplarily shows a schematic diagram of a control manner of a virtual object
- FIG. 9 exemplarily shows a schematic diagram of another virtual object control manner
- Fig. 10 exemplarily shows a schematic diagram of a touch mode of a first functional control
- Fig. 11 is a block diagram of a display device for a virtual environment screen provided by an embodiment of the present application.
- Fig. 12 is a block diagram of a display device for a virtual environment screen provided by another embodiment of the present application.
- Fig. 13 is a structural block diagram of a terminal provided by an embodiment of the present application.
- FIG. 1 shows a schematic diagram of a virtual screen display system provided by an embodiment of the present application.
- the virtual screen display system may include: a terminal 10 and a server 20 .
- the terminal 10 may be an electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, a wearable device, and a PC (Personal Computer, personal computer).
- Clients of application programs can be installed in the terminal 10 .
- the application program may be an application program that needs to be downloaded and installed, or a click-to-run application program, which is not limited in this embodiment of the present application.
- the above-mentioned application program may be any application program that can provide a virtual environment for the user to assume and operate a virtual character to perform activities in the virtual environment.
- the application program can be a game application, such as a TPS (Third-Person Shooting) game, an FPS (First-Person Shooting) game, a BR (Battle Royale) survival game, a MOBA (Multiplayer Online Battle Arena) game, a multiplayer gun battle survival game, etc.
- In addition to game applications, other types of applications may also display virtual objects to users and provide corresponding functions to the virtual objects, such as VR (Virtual Reality) applications, AR (Augmented Reality) applications, three-dimensional map programs, military simulation programs, social applications, and interactive entertainment applications, which are not limited in this embodiment of the present application.
- Depending on the application, the forms of the virtual objects it provides and their corresponding functions will also differ, and these can be pre-configured according to actual needs.
- the terminal 10 runs the client of the above application program.
- the above-mentioned application program is an application program developed based on a three-dimensional virtual environment engine.
- The virtual environment engine is, for example, the Unity engine; the virtual environment engine can build the three-dimensional virtual environment, virtual objects, virtual props, and the like, bringing the user a more immersive gaming experience.
- The aforementioned virtual environment is a scene displayed (or provided) when the client of an application (such as a game application) runs on a terminal; the virtual environment refers to a scene created for virtual objects to perform activities (such as a game match), for example a virtual house, a virtual island, or a virtual map.
- the virtual environment can be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
- the virtual environment may be a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in this embodiment of the present application.
- the aforementioned virtual object may be a virtual character controlled by a user account in the application program, or may be a virtual character controlled by a computer program in the application program.
- the virtual object may be a game character controlled by a user account in the game application program, or a game monster controlled by a computer program in the game application program.
- the virtual object may be in the form of a character, an animal, a cartoon or other forms, which is not limited in this embodiment of the present application.
- the virtual object can be displayed in a three-dimensional form or in a two-dimensional form, which is not limited in this embodiment of the present application.
- the virtual object is a three-dimensional model created based on animation skeleton technology.
- Each virtual object has its own shape and volume in the 3D virtual environment, occupying a part of the space in the 3D virtual environment.
- When the application in the terminal 10 is running, the user can adjust the viewing angle of the virtual object in the virtual environment while controlling the virtual object to perform a behavior.
- The terminal 10 displays the first functional control and the first virtual environment picture in the user interface. The terminal 10 then detects whether a first touch operation occurs: if a first touch operation on the first functional control is detected, the virtual object is controlled to perform the behavior corresponding to the first functional control; if not, the viewing angle of the virtual object remains unchanged.
- The terminal 10 then judges whether the touch position of the first touch operation has moved from inside the response area of the first functional control to outside it. If so, the viewing angle of the virtual object is adjusted according to the change of the touch position, and the first virtual environment picture is changed to the second virtual environment picture based on the adjusted viewing angle; if not, the viewing angle of the virtual object remains unchanged.
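The terminal-side flow described above can be sketched as follows; this is a minimal illustration, and all class and method names (FunctionalControl, Client, on_touch_down, on_touch_move) are hypothetical rather than taken from the patent:

```python
# Minimal sketch of the touch-handling flow; names and values are illustrative.

class FunctionalControl:
    """A functional control with a rectangular response area."""

    def __init__(self, x, y, w, h, behavior):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.behavior = behavior  # behavior performed when the control is touched

    def contains(self, px, py):
        # True when (px, py) lies inside the response area.
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class Client:
    def __init__(self, control):
        self.control = control
        self.yaw = 0.0                 # current view-angle orientation, in degrees
        self.behavior_performed = None
        self._last_pos = None

    def on_touch_down(self, x, y):
        # Initial touch inside the response area: perform the control's behavior.
        if self.control.contains(x, y):
            self.behavior_performed = self.control.behavior
            self._last_pos = (x, y)

    def on_touch_move(self, x, y, sensitivity=0.5):
        # Same touch, without lifting the finger: once the touch position
        # leaves the response area, adjust the view angle by the movement.
        if self._last_pos is None:
            return
        if not self.control.contains(x, y):
            self.yaw += (x - self._last_pos[0]) * sensitivity
        self._last_pos = (x, y)
```

With such a client, a single touch that starts on a jump control and drags off it would both make the virtual object jump and rotate the view, matching the single-operation behavior described above.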
- the server 20 is used to provide background services for clients of applications in the terminal 10 .
- the server 20 may be a background server of the above-mentioned game application.
- the server 20 may be one server, or a server cluster composed of multiple servers, or a cloud computing service center.
- the server 20 provides background services for applications in multiple terminals 10 at the same time.
- the terminal 10 and the server 20 may communicate with each other through a network.
- FIG. 3 shows a flowchart of a method for displaying a virtual environment screen provided by an embodiment of the present application.
- This method can be applied to the terminal 10 of the virtual screen display system shown in FIG. 1.
- the method may include the following steps (301-303):
- Step 301: display a first virtual environment picture.
- the virtual environment picture refers to a display picture of the virtual environment displayed to the user on the user interface.
- the virtual environment picture may be a picture obtained by the virtual camera from the virtual environment.
- the virtual camera acquires a virtual environment picture from a third-person perspective of the virtual object.
- the virtual camera is set obliquely above the virtual object, and the client observes the virtual environment centered on the virtual object through the virtual camera, acquires and displays a virtual environment picture centered on the virtual object.
- the virtual camera acquires a virtual environment picture from a first-person perspective of the virtual object.
- the virtual camera is set directly in front of the virtual object, and the client observes the virtual environment from the first-person perspective of the virtual object through the virtual camera, acquires and displays the virtual environment picture from the first-person perspective of the virtual object.
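The two virtual camera placements described above (obliquely above the virtual object for the third-person perspective, at the virtual object itself for the first-person perspective) could be sketched as follows; the offset values are illustrative assumptions:

```python
def third_person_camera(obj_pos, height=3.0, back=4.0):
    # Camera placed obliquely above and behind the virtual object, so the
    # picture observes the environment centered on the object.
    x, y, z = obj_pos
    return (x, y + height, z - back)


def first_person_camera(obj_pos, eye_height=1.7):
    # Camera placed at the virtual object's eye position, observing the
    # environment from the object's own point of view.
    x, y, z = obj_pos
    return (x, y + eye_height, z)
```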
- the client side displays the first virtual environment screen.
- the first virtual environment screen is a screen for observing the virtual environment from a first perspective of a virtual object
- the virtual object refers to a virtual object controlled by the user in an application program
- the first perspective refers to any perspective.
- the above-mentioned first perspective is used to represent the orientation of the virtual object when observing the virtual environment, and is not limited to the above-mentioned first-person perspective and the above-mentioned third-person perspective.
- the first perspective may be any perspective under the first-person perspective, or any perspective under the third-person perspective, which is not limited in this embodiment of the present application.
- the orientation of the virtual camera is adjustable in real time, that is, the user can adjust the viewing angle of the virtual object according to the actual situation.
- the user can adjust the orientation of the virtual camera through a sliding operation on the user interface, so as to adjust the viewing angle of the virtual object.
- the placement position of the above virtual camera can be adjusted in real time.
- the user controls the virtual object to move in the virtual environment through the joystick, changes the position of the virtual object in the virtual environment, and then changes the placement position of the virtual camera in the virtual environment.
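The joystick behavior described here (the virtual object moves in the virtual environment and the virtual camera's placement position follows it) can be sketched as a pure function; the speed factor and camera offset are illustrative assumptions:

```python
def move_with_joystick(obj_pos, cam_offset, joy_dx, joy_dz, speed=0.1):
    # Joystick input changes the virtual object's position in the virtual
    # environment; the virtual camera's placement position follows it.
    x, y, z = obj_pos
    new_pos = (x + joy_dx * speed, y, z + joy_dz * speed)
    ox, oy, oz = cam_offset
    cam_pos = (new_pos[0] + ox, new_pos[1] + oy, new_pos[2] + oz)
    return new_pos, cam_pos
```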
- the above-mentioned map display control refers to a control for displaying the global map in the shooting application.
- Step 302: in response to the first touch operation on the first functional control, control the virtual object to perform the behavior corresponding to the first functional control.
- the user interface includes at least one functional control.
- the functional control is directly displayed in the user interface.
- The client displays, in the user interface, at least one functional control provided by the application for the user while displaying the above-mentioned first virtual environment picture.
- the functional controls are hidden and displayed in the user interface.
- The client determines the response area of at least one functional control in the user interface but, in order to keep the user interface clean, does not display the at least one functional control.
- the functional controls are half-hidden and displayed in the user interface.
- While displaying the first virtual environment picture, the user interface contains the response area of at least one functional control; when any touch operation by the user on the response area of a certain functional control is detected, that functional control is displayed in the user interface, so that the user can determine whether it is the desired functional control.
- a touch operation refers to an operation triggered and generated by a user.
- the touch operation may be an operation directed at any position in the user interface.
- the above-mentioned touch operation may be a single-click operation, a double-click operation, a slide operation, etc., which is not limited in this embodiment of the present application.
- different positions in the user interface correspond to different touch operations.
- the touch operation corresponding to the functional control in the user interface is a click operation, and the user controls the virtual object to perform the behavior corresponding to the functional control by clicking the functional control;
- The touch operation corresponding to the display area of the virtual environment picture in the user interface is a sliding operation, and the user adjusts the viewing angle of the virtual object through a sliding operation on the display area so that the user interface displays a new virtual environment picture.
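The position-dependent handling above (click operations inside a functional control's response area, sliding operations elsewhere in the display area) can be sketched as a simple dispatcher; the rectangular response areas are an illustrative assumption:

```python
def classify_touch(x, y, controls):
    # Touches inside a functional control's response area are click
    # operations on that control; touches elsewhere in the display area
    # are sliding operations that adjust the viewing angle.
    for name, (cx, cy, w, h) in controls.items():
        if cx <= x <= cx + w and cy <= y <= cy + h:
            return ("click", name)
    return ("slide", None)
```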
- the first touch operation refers to a touch operation on a first functional control
- the first functional control refers to any one of the above at least one functional control.
- After displaying the above-mentioned first virtual environment picture, the client monitors the user interface and, when the first touch operation on the first functional control is detected, controls the virtual object to perform the behavior corresponding to the first functional control. The initial touch position of the first touch operation is at a first position within the response area of the first functional control, where the first position refers to any position in that response area.
- the response area of the above-mentioned first functional control may be any area in the user interface.
- the first functional control is displayed directly or half-hidden in the user interface.
- the response area of the first functional control is centered on the display area of the first functional control and has an area larger than or equal to the display area.
- the response area of the first functional control is the display area of the first functional control.
- the shape of the response area of the first functional control may be the same as or different from the shape of the display area of the first functional control, which is not limited in this embodiment of the present application.
- the first functional control is hidden and displayed in the user interface.
- The response area of the first functional control may be an area set by the designer according to user habits. For example, since users are accustomed to controlling the movement of virtual objects with the left hand and releasing skills with the right hand, the designer can set the response area of the functional control for controlling the movement of the virtual object on the left side of the user interface, and the response area of the functional control for controlling the virtual object to release skills on the right side of the user interface.
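A response area centered on the control's display area but larger than it, as described above, can be modelled as a padded hit test; the circular shape and padding value are illustrative assumptions:

```python
def in_display_area(cx, cy, radius, px, py):
    # Circular display area of the control, centered at (cx, cy).
    return (px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2


def in_response_area(cx, cy, radius, px, py, padding=20.0):
    # Response area shares the display area's center but is larger, so a
    # touch slightly outside the drawn control still triggers it.
    return (px - cx) ** 2 + (py - cy) ** 2 <= (radius + padding) ** 2
```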
- the above-mentioned behaviors may be any behaviors, such as item usage behaviors, posture conversion behaviors, skill usage behaviors, etc., which are not limited in this embodiment of the present application.
- the above behavior is an item usage behavior.
- the client determines the virtual item corresponding to the first functional control, and controls the virtual object to use the virtual item.
- the aforementioned virtual items may also be called virtual weapons, such as virtual firearms, virtual bullets, virtual explosives, and the like.
- the functional controls corresponding to different virtual items are different; or, the functional controls corresponding to different virtual items are the same, but the corresponding touch operations are different.
- The above behavior is a posture conversion behavior.
- When the client detects the first touch operation on the above-mentioned first functional control, it determines the posture corresponding to the first functional control, and controls the virtual object to change from its current posture to the posture corresponding to the first functional control.
- the posture of the virtual object may include a standing posture, a squatting posture, a lying posture, a jumping posture, etc., which are not limited in this embodiment of the present application.
- different gestures correspond to different functional controls; or, different gestures correspond to the same functional controls, but the corresponding touch operations are different.
- the above behavior is a skill release behavior.
- the client determines the skill corresponding to the first functional control, and controls the virtual object to release the skill corresponding to the first functional control.
- the functional controls corresponding to different skills are different; or, the functional controls corresponding to different skills are the same, but the corresponding touch operations are different.
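The three behavior types above (item usage, posture conversion, skill release) all follow the same pattern: the client looks up the behavior bound to the touched control and executes it. A minimal illustrative sketch (names and bindings are assumptions, not from the disclosure):

```python
# Each functional control is bound to one behavior; on a first touch
# operation the client looks up and executes that behavior.
BEHAVIORS = {
    "item_button":   ("use_item", "virtual_firearm"),
    "crouch_button": ("change_posture", "squatting"),
    "skill_button":  ("release_skill", "skill_1"),
}

def on_first_touch(control_id: str) -> str:
    """Resolve the behavior bound to the touched control."""
    kind, arg = BEHAVIORS[control_id]
    return f"{kind}:{arg}"

print(on_first_touch("crouch_button"))  # -> change_posture:squatting
```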
- Step 303: in response to the touch position of the first touch operation moving from the first position to a second position, display a second virtual environment picture.
- the second position is a position outside the response area of the first functional control. That is to say, in this embodiment of the application, after the client detects the above-mentioned first touch operation, it detects the touch position of the first touch operation; when the touch position of the first touch operation moves from the above-mentioned first position to a second position outside the response area of the first functional control, the second virtual environment picture is displayed.
- the above-mentioned second position can be any position outside the response area of the first functional control.
- the above-mentioned second virtual environment picture is a picture for observing the virtual environment from the second viewing angle of the virtual object, and the second viewing angle and the first viewing angle have different orientations. Similar to the above-mentioned first perspective, the second perspective is only used to represent the orientation of the virtual object when observing the virtual environment, and does not limit the above-mentioned first-person perspective and the above-mentioned third-person perspective.
- the first touch operation may also be referred to as a sliding operation during the touch position change process.
- the user interface 40 includes a first virtual environment screen and a first functional control 41 (shooting button).
- the client determines that the first touch operation on the first functional control 41 is detected, and controls the virtual object to use the virtual weapon 42 to shoot at the enemy virtual object 43;
- the client determines that the first touch operation includes a sliding operation, adjusts the orientation of the virtual object's viewing angle based on the sliding operation, and then displays the second virtual environment picture on the user interface 40.
- the second virtual environment screen includes a new virtual object 44 .
- the virtual object is controlled to perform the behavior corresponding to the first functional control, and, when the touch position of the first touch operation moves from the first position to the second position outside the response area of the first functional control, the virtual object is adjusted from the first viewing angle to the second viewing angle, and the second virtual environment picture is then displayed based on the second viewing angle. That is to say, a single touch operation can not only control the virtual object to perform the corresponding behavior, but also adjust the virtual environment picture according to the change of the touch position while the user controls the virtual object to perform that behavior.
- the client can still respond to the user's sliding operation without requiring the user to lift a finger, which simplifies the user's operation and, while controlling the behavior execution of the virtual object, improves the adjustment efficiency of the virtual environment picture.
- step 303 includes the following steps:
- the touch position refers to the current contact position between the first touch operation and the user interface.
- if the first touch operation is a click operation, the touch position is the click position of the click operation; if the first touch operation is a slide operation, the touch position is the current end position of the slide operation.
- after detecting the above-mentioned first touch operation, the client detects the touch position of the first touch operation in real time, and obtains the touch position of the first touch operation.
- the client calculates the distance between the touch position of the first touch operation and the reference position of the first functional control.
- the reference position may be the center position of the first functional control.
- after acquiring the distance between the touch position and the reference position, the client compares the distance with a threshold. Further, when the distance is greater than the threshold, the current touch position is used as the second position, and the second virtual environment picture is displayed; when the distance is less than or equal to the threshold, the orientation of the first viewing angle of the virtual object remains unchanged.
- the aforementioned threshold is greater than or equal to the maximum distance between the center position of the first functional control and the edge position of the first functional control.
- by using the distance between the current touch position and the reference position as the judgment basis, it can be accurately determined whether the current touch position is outside the response area of the first functional control, improving the response speed and accuracy of viewing-angle adjustment.
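The distance-versus-threshold judgment above can be sketched as follows. This is an illustrative Python sketch under stated assumptions (the concrete coordinates and the equality of the threshold with the center-to-edge distance are examples, not mandated by the disclosure):

```python
import math

def has_left_response_area(touch, center, threshold):
    """Return True when the touch position counts as the 'second position'.

    Per the embodiment, the threshold is chosen >= the maximum distance
    from the control's center (the reference position) to its edge, so a
    distance greater than the threshold guarantees the touch position is
    outside the response area.
    """
    return math.dist(touch, center) > threshold

center = (900.0, 500.0)
print(has_left_response_area((980.0, 500.0), center, 60.0))  # 80 > 60 -> True
print(has_left_response_area((930.0, 500.0), center, 60.0))  # 30 <= 60 -> False
```

When the check returns False, the orientation of the first viewing angle is kept unchanged; when it returns True, the current touch position becomes the second position and the second virtual environment picture is displayed.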
- when the distance is determined to be greater than the threshold, the client determines that the touch position of the first touch operation has moved from within the response area of the first functional control to outside the response area of the first functional control.
- the client acquires the direction information of the touch position of the first touch operation relative to the reference position, determines the orientation of the second viewing angle according to the direction information, and then displays the second virtual environment picture based on the orientation of the second viewing angle.
- since the direction information of the second position relative to the reference position can intuitively reflect the orientation of the current touch position relative to the first functional control, the user can accurately control the viewing-angle adjustment direction based on this orientation, improving viewing-angle adjustment accuracy.
- the movement of the virtual object can also cause the virtual environment picture to change; therefore, even while the orientation of the first viewing angle of the virtual object is kept unchanged, the first virtual environment picture may still change due to the virtual object's movement in the virtual environment.
- the above-mentioned reference position may also be an edge point of the first functional control passed by the first touch operation during the position movement process, and correspondingly, the above-mentioned threshold value is any value greater than zero.
- the above direction information includes a moving direction and a moving distance. The client determines the orientation change direction of the first viewing angle according to the moving direction of the touch position relative to the reference position, and determines the orientation change angle of the first viewing angle according to the moving distance; further, the client adjusts the first viewing angle to the second viewing angle according to the orientation change direction and the orientation change angle.
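Mapping the moving direction and moving distance to an orientation change can be sketched as a yaw/pitch delta. The sensitivity constant and sign conventions below are assumptions for illustration, not values from the disclosure:

```python
SENSITIVITY = 0.25  # assumed degrees of orientation change per pixel of movement

def view_delta(reference, touch):
    """Map the touch position's offset from the reference position to a
    change in the first viewing angle's orientation (yaw, pitch)."""
    dx = touch[0] - reference[0]
    dy = touch[1] - reference[1]
    yaw = dx * SENSITIVITY      # horizontal movement turns the view left/right
    pitch = -dy * SENSITIVITY   # vertical movement tilts the view up/down
    return yaw, pitch

yaw, pitch = view_delta((900, 500), (980, 460))
print(yaw, pitch)  # 20.0 10.0
```

The moving direction (sign of dx/dy) fixes the orientation change direction; the moving distance (magnitude) fixes the orientation change angle, matching the two-part rule described above.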
- the user can customize and adjust the adjustment rules corresponding to the virtual environment screen (that is, the custom operation rules hereinafter) according to the actual situation, and the specific steps are as follows:
- a custom operation rule is an operation rule selected from various candidate operation rules.
- the multiple candidate operating rules include a first operating rule.
- the first operation rule is an operation rule for a touch operation whose initial touch position is located within the response area of a functional control, under which movement of the touch position within the response area of the functional control cannot trigger steering, while movement of the touch position outside the response area of the functional control can trigger steering.
- the above-mentioned plurality of candidate operation rules further include at least one of the second operation rule and the third operation rule.
- the second operation rule is an operation rule for a touch operation whose initial touch position is within the response area of a functional control, under which movement of the touch position cannot trigger steering; the third operation rule is an operation rule for a touch operation whose initial touch position is within the response area of a functional control, under which movement of the touch position triggers steering. It should be noted that the above-mentioned steering refers to a change in the orientation of the viewing angle of the virtual object.
- Table 1

| Custom operation rule | Touch position within the response area | Touch position outside the response area |
| --- | --- | --- |
| First operation rule | Cannot trigger steering | Can trigger steering |
| Second operation rule | Cannot trigger steering | Cannot trigger steering |
| Third operation rule | Can trigger steering | Can trigger steering |
- the trigger position in Table 1 refers to the trigger position of the first touch operation on the first functional control
- the response area refers to the response area of the first functional control
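The rule table above can be represented directly as data, with a single lookup deciding whether a given movement may steer the view. An illustrative sketch (rule names abbreviated; not part of the disclosure):

```python
# Table 1 as data: for each custom operation rule, whether movement of the
# touch position triggers steering, split by where the position currently is.
RULES = {
    "first":  {"inside": False, "outside": True},
    "second": {"inside": False, "outside": False},
    "third":  {"inside": True,  "outside": True},
}

def triggers_steering(rule: str, inside_response_area: bool) -> bool:
    """Decide whether this movement of the touch position may steer the view."""
    zone = "inside" if inside_response_area else "outside"
    return RULES[rule][zone]

print(triggers_steering("first", inside_response_area=False))  # True
```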
- the client obtains the above-mentioned custom operation rules through the user's selection operation in the configuration interface.
- when the client acquires the above-mentioned custom operation rule, it first displays a configuration interface, wherein the configuration interface includes options corresponding to the above-mentioned multiple candidate operation rules; further, when the client detects an operation on the option corresponding to a target operation rule among the multiple candidate operation rules, it sets the target operation rule as the custom operation rule.
- the above-mentioned operation on the selection item may also be referred to as a selection operation.
- the selectable candidate operation rules can be intuitively displayed through the configuration interface, which helps users accurately select the candidate operation rule that suits their own needs.
- the above configuration interface may also include options for other content, such as options for the response area of a functional control, options for the display rule of a functional control, options for the display area of a functional control, behavior operation rules of functional controls, and so on.
- the configuration interface 50 includes an adjustment rule 51 of a virtual environment screen, a behavior operation rule 52 of a first functional control, and a display rule 53 of each functional control.
- the adjustment rule 51 of the virtual environment screen includes the first operation rule, the second operation rule and the third operation rule
- the behavior operation rule 52 of the first functional control includes the behavior executed after pressing and the behavior executed after releasing; the display rules 53 of each functional control include direct display, hidden display and semi-hidden display.
- an option 54 is included before each selectable rule.
- step 303 in response to the touch position of the first touch operation moving from the first position to the second position, displaying the second virtual environment screen.
- the display mode of the virtual environment screen corresponding to the first operation rule is the display mode of the virtual environment screen introduced in the above-mentioned embodiment in FIG. 3 .
- through custom operation rules, users can select the corresponding operation rule based on their own usage habits, thereby improving flexibility.
- the client controls the virtual object to execute the behavior corresponding to the first functional control; further, when detecting that the touch position of the first touch operation has moved, the client controls the virtual object to keep the first viewing angle unchanged.
- when the client controls the virtual object to keep the first viewing angle unchanged, only the orientation of the first viewing angle remains unchanged. Therefore, in this case, when the position of the virtual object changes due to its movement, the first virtual environment picture will still change correspondingly as the position of the virtual object changes.
- the client receives the second touch operation after controlling the virtual object to perform the behavior corresponding to the first functional control.
- the initial touch position of the second touch operation is a third position outside the response area of the first functional control.
- the user-defined operation rule is the second operation rule
- when the client detects that the touch position of the second touch operation moves from the above-mentioned third position to a fourth position, it displays a third virtual environment picture.
- the third virtual environment picture is a picture for observing the virtual environment from a third perspective of the virtual object.
- the third viewing angle and the second viewing angle have different orientations. Similar to the above-mentioned first perspective, the third perspective is only used to represent the orientation of the virtual object when observing the virtual environment, and does not limit the above-mentioned first-person perspective and the above-mentioned third-person perspective.
- the third position may be any position outside the response area of the first functional control; or, the third position may be any position in the target area.
- the target area may be part of the remaining area of the user interface except the response area of each functional control, or may be all the remaining area of the user interface except the response area of each functional control.
- the above-mentioned fourth position may be any position outside the response area of the first functional control except the third position; or, the above-mentioned fourth position may be any position in the target area except the third position; or , the above-mentioned fourth position may be any position in the user interface except the third position.
- the second touch operation may also be referred to as a sliding operation.
- the user can also conveniently control the viewing angle orientation of the first virtual object based on the sliding operation without touching the first functional control, which simplifies the operation.
- Step 601 displaying the first virtual environment screen
- Step 602 in response to the first touch operation on the first functional control, control the virtual object to perform the behavior corresponding to the first functional control;
- Step 603 in response to the movement of the touch position of the first touch operation, control the virtual object to keep the first viewing angle unchanged;
- Step 604 in response to the second touch operation whose initial touch position is outside the response area of the first functional control, obtain the position change of the second touch operation;
- Step 605 adjusting the virtual object from the first viewing angle to the third viewing angle according to the position change of the touch position of the second touch operation;
- Step 606 adjust the first virtual environment picture based on the third viewing angle.
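Steps 601–606 above (the second operation rule) can be sketched as follows: the first touch executes the control's behavior but never steers, while a second touch starting outside the response area steers the view. Illustrative Python only; the geometry and event names are assumptions:

```python
def handle_touches(first_touch_path, second_touch_path, in_response_area):
    """Sketch of steps 601-606 under the second operation rule.

    first_touch_path / second_touch_path: lists of (x, y) touch positions.
    in_response_area: hit-test predicate for the first functional control.
    """
    events = []
    # Step 602: the first touch starts inside the control -> execute its behavior.
    if in_response_area(*first_touch_path[0]):
        events.append("execute_behavior")
    # Step 603: under the second rule, movement of the first touch never steers.
    # Steps 604-606: a second touch starting outside the response area steers
    # the view to the third viewing angle as its position changes.
    if not in_response_area(*second_touch_path[0]) and len(second_touch_path) > 1:
        events.append("adjust_view_to_third_angle")
    return events

inside = lambda x, y: (x - 900) ** 2 + (y - 500) ** 2 <= 60 ** 2
print(handle_touches([(900, 500), (990, 500)], [(400, 300), (450, 260)], inside))
```

Note that the first touch's slide to (990, 500), outside the response area, produces no steering event here, which is exactly what distinguishes the second rule from the first.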
- the user-defined operation rule is the third operation rule
- in response to the touch position of the first touch operation moving from the first position to a fifth position within the response area of the first functional control, the fourth virtual environment picture is displayed.
- the client controls the virtual object to execute the behavior corresponding to the first functional control; further, when detecting that the touch position of the first touch operation moves from the first position to a fifth position within the response area of the first functional control, it displays a fourth virtual environment picture.
- the fourth virtual environment picture is a picture for observing the virtual environment from the fourth viewing angle of the virtual object, and the fourth viewing angle and the first viewing angle have different orientations; the above-mentioned fifth position can be any position within the response area of the first functional control. Similar to the above-mentioned first perspective, the fourth perspective is only used to represent the orientation of the virtual object when observing the virtual environment, and does not limit the above-mentioned first-person perspective and the above-mentioned third-person perspective.
- that is to say, when detecting that the touch position of the first touch operation moves from the first position to any position other than the first position, the client can adjust the first viewing angle of the virtual object to adjust the first virtual environment picture.
- Step 701 displaying the first virtual environment screen
- Step 702 in response to the first touch operation on the first functional control, control the virtual object to perform the behavior corresponding to the first functional control;
- Step 703 in response to the movement of the touch position of the first touch operation, acquire the position change of the touch position of the first touch operation;
- Step 704 adjusting the virtual object from the first viewing angle to the fourth viewing angle according to the position change of the first touch operation;
- Step 705 adjust the first virtual environment picture based on the fourth viewing angle.
- the adjustment for the first viewing angle mentioned in this application is only the adjustment for the orientation of the first viewing angle, and does not limit the adjustment for the position of the first viewing angle.
- the position of the virtual camera changes following the position of the virtual object, which will cause the position of the first perspective of the virtual object to change.
- FIG. 8 shows a flowchart of a virtual object control method provided by an embodiment of the present application. This method can be applied to the terminal 10 of the virtual screen display system shown in FIG.
- the method may include the following steps (801-803):
- Step 801 displaying a first virtual environment screen.
- Step 802 in response to the first touch operation on the first functional control, control the virtual object to execute the behavior corresponding to the first functional control.
- steps 801 and 802 are the same as steps 301 and 302 in the embodiment in FIG. 3 .
- details refer to the embodiment in FIG. 3 , and details are not repeated here.
- Step 803: in response to the touch position of the first touch operation moving from the first position to a sixth position within the response area of the second functional control, control the virtual object to perform the behavior corresponding to the second functional control.
- after detecting the first touch operation, the client detects the touch position of the first touch operation. If the touch position of the first touch operation moves to a sixth position within the response area of the second functional control, the virtual object is controlled to perform the behavior corresponding to the second functional control.
- the second functional control refers to any functional control in the above at least one functional control except the first functional control.
- the sixth position refers to any position within the response area of the second functional control.
- the user can control the virtual object to perform multiple behaviors in one touch operation, such as the behavior corresponding to the first functional control and the behavior corresponding to the second functional control, which effectively reduces the number of touches and simplifies the difficulty of control.
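Executing multiple behaviors within one touch operation, as described above, amounts to tracking which controls' response areas the sliding touch position passes through. An illustrative sketch (control layout and names are assumptions):

```python
# Sliding one touch from the first control into a second control's
# response area executes both behaviors with a single touch operation.
def behaviors_for_path(path, controls):
    """controls: {name: in_area predicate}; returns controls hit, in order."""
    executed = []
    for x, y in path:
        for name, in_area in controls.items():
            if in_area(x, y) and name not in executed:
                executed.append(name)
    return executed

controls = {
    "shoot":  lambda x, y: abs(x - 900) <= 40 and abs(y - 500) <= 40,
    "crouch": lambda x, y: abs(x - 900) <= 40 and abs(y - 620) <= 40,
}
path = [(900, 500), (900, 560), (900, 620)]  # slide from shoot down to crouch
print(behaviors_for_path(path, controls))  # ['shoot', 'crouch']
```

A single downward slide here triggers both the first control's behavior (at the initial touch position) and the second control's behavior (at the sixth position), without a second touch.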
- before the client controls the virtual object to perform the behavior corresponding to the second functional control, it needs to judge whether the behavior corresponding to the second functional control and the behavior corresponding to the first functional control are mutually exclusive.
- in the case that the behavior corresponding to the second functional control and the behavior corresponding to the first functional control are non-mutually exclusive behaviors, when the client detects that the touch position of the first touch operation moves from the first position to the above-mentioned sixth position, it controls the virtual object to execute the behavior corresponding to the second functional control; in the case that the two behaviors are mutually exclusive, when the client detects that the touch position of the first touch operation moves from the first position to the sixth position, it controls the virtual object not to perform the behavior corresponding to the second functional control.
- mutually exclusive behavior refers to two behaviors that cannot be executed consecutively.
- for example, the prone posture can be converted into a standing posture, and the standing posture can be converted into a jumping posture, but the prone posture cannot be converted directly into a jumping posture. Therefore, the conversion behavior to the prone posture and the conversion behavior to the jumping posture are mutually exclusive behaviors. If the first functional control is used to instruct the virtual object to convert to the prone posture, and the second functional control is used to instruct the virtual object to convert to the jumping posture, then when the client detects that the touch position of the first touch operation moves from the first position to the sixth position, it cannot control the virtual object to transform into the jumping posture.
- the virtual object can be controlled to continue to execute the behavior corresponding to the second functional control after performing the behavior corresponding to the first functional control, improving the control efficiency of virtual objects.
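The mutual-exclusion judgment above can be sketched as a lookup over behavior pairs that cannot be executed in succession. The pairs below are only the example from this embodiment (prone vs. jump); other pairings are assumptions:

```python
# Posture transitions that cannot be executed in succession
# (mutually exclusive behaviors); pairs are treated as unordered here.
MUTUALLY_EXCLUSIVE = {frozenset({"prone", "jump"})}

def may_execute_next(current_behavior: str, next_behavior: str) -> bool:
    """Return True if next_behavior may follow current_behavior."""
    return frozenset({current_behavior, next_behavior}) not in MUTUALLY_EXCLUSIVE

print(may_execute_next("prone", "stand"))  # True: prone -> standing is allowed
print(may_execute_next("prone", "jump"))   # False: prone -> jumping is blocked
```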
- the first viewing angle of the virtual object can be adjusted to observe the virtual environment; at this time, while observing the virtual environment, the user can, according to the actual situation, continue to move the touch position of the first touch operation into the response area of the second functional control, and control the virtual object to perform the behavior corresponding to the second functional control. While improving the behavior execution efficiency of the virtual object, this simplifies user operations and makes the behavior execution of the virtual object more coherent.
- FIG. 9 shows a flowchart of a virtual object control method provided by another embodiment of the present application. This method can be applied to the terminal 10 of the virtual screen display system shown in FIG.
- the method may include the following steps (901-905):
- Step 901 displaying a first virtual environment screen.
- Step 902 in response to a first touch operation on the first functional control, control the virtual object to perform a behavior corresponding to the first functional control.
- steps 901 and 902 are the same as steps 301 and 302 in the embodiment in FIG. 3 .
- details refer to the embodiment in FIG. 3 , and details are not repeated here.
- Step 903: in response to the touch position of the first touch operation moving from the first position to a second position outside the response area of the first functional control, acquire the area where the second position is located.
- after detecting the first touch operation, the client detects the touch position of the first touch operation. If the touch position of the first touch operation moves to a second position outside the response area of the first functional control, the area where the second position is located is obtained.
- the first functional control corresponds to an extended response area
- the extended response area of the first functional control surrounds the response area of the first functional control
- Step 904 in the case that the second position is within the extended response area of the first functional control, periodically control the virtual object to execute the behavior corresponding to the first functional control.
- when the client detects that the touch position of the first touch operation moves from the first position to a second position within the extended response area of the first functional control, it periodically controls the virtual object to perform the behavior corresponding to the first functional control.
- Step 905 when the second position is outside the extended response area of the first functional control, control the virtual object not to execute the behavior corresponding to the first functional control.
- when the client detects that the touch position of the first touch operation moves from the first position to a second position outside the extended response area of the first functional control, it controls the virtual object not to perform the behavior corresponding to the first functional control.
- the above-mentioned virtual environment is a virtual environment in a shooting application.
- the client controls the virtual object to shoot; then, when the touch position of the first touch operation moves from position 101 to position 102 in the extended response area of the shooting button 100, the client periodically controls the virtual object to shoot while adjusting the virtual object's viewing angle; then, when the touch position of the first touch operation moves from position 102 to position 103 outside the extended response area of the shooting button 100, the client only adjusts the viewing angle of the virtual object, and controls the virtual object not to shoot.
- when the touch position of the first touch operation is within the extended response area of the first functional control, the virtual object is periodically controlled to perform the behavior corresponding to the first functional control, making the behavior execution mode of the virtual object more flexible, and improving the flexibility of the user when controlling the execution behavior of the virtual object.
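The shooting-button example above partitions the screen into three concentric zones around the control. A minimal illustrative sketch, assuming circular response and extended response areas (radii and coordinates are invented for the example):

```python
import math

def zone(touch, center, response_r, extended_r):
    """Classify a touch position relative to a control's two areas."""
    d = math.dist(touch, center)
    if d <= response_r:
        return "response"    # normal behavior execution
    if d <= extended_r:
        return "extended"    # behavior executed periodically while held
    return "outside"         # behavior stops; only the view is adjusted

c = (900.0, 500.0)
print(zone((920.0, 500.0), c, 60.0, 120.0))   # response
print(zone((1000.0, 500.0), c, 60.0, 120.0))  # extended
print(zone((1100.0, 500.0), c, 60.0, 120.0))  # outside
```

The three return values correspond to positions 101, 102 and 103 of the shooting-button example: continuous fire, periodic fire with view adjustment, and view adjustment only.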
- the first viewing angle of the virtual object can be adjusted to observe the virtual environment; in this case, while controlling the virtual object to observe the virtual environment, the user can, according to the actual situation, slide the touch position of the first touch operation into the extended response area of the first functional control, so as to timely control the virtual object to react to the virtual environment and execute the behavior corresponding to the first functional control.
- FIG. 11 shows a block diagram of a display device for a virtual environment screen provided by an embodiment of the present application.
- the device has the function of realizing the above-mentioned display method for a virtual environment picture, and the function can be realized by hardware, or by hardware executing corresponding software.
- the device may be a terminal, or be set in the terminal.
- the apparatus 1100 may include: a picture display module 1110, an object control module 1120 and a picture adjustment module 1130.
- the screen display module 1110 is configured to display a first virtual environment screen, where the first virtual environment screen is a screen for observing the virtual environment from a first perspective of a virtual object.
- the object control module 1120 is configured to control the virtual object to execute the behavior corresponding to the first functional control in response to the first touch operation on the first functional control; the initial touch position of the first touch operation is the first position within the response area of the first function control.
- the screen adjustment module 1130 is configured to, in response to the touch position of the first touch operation moving from the first position to a second position, display a second virtual environment picture; the second position is a position outside the response area of the first functional control, the second virtual environment picture is a picture for observing the virtual environment from a second viewing angle of the virtual object, and the second viewing angle and the first viewing angle have different orientations.
- the screen adjustment module 1130 is configured to obtain the touch position of the first touch operation; calculate the distance between the touch position of the first touch operation and the reference position of the first functional control; and, if the distance is greater than a threshold, determine that the touch position is outside the response area of the first functional control, use the touch position as the second position, and display the second virtual environment picture.
- the screen adjustment module 1130 is configured to acquire direction information of the second position relative to the reference position; determine the orientation of the second viewing angle according to the direction information; and display the second virtual environment picture based on the orientation of the second viewing angle.
- the apparatus 1100 further includes: a rule acquiring module 1140 .
- a rule acquisition module 1140, configured to acquire a custom operation rule, where the custom operation rule is an operation rule selected from multiple candidate operation rules; the multiple candidate operation rules include a first operation rule, and further include at least one of a second operation rule and a third operation rule;
- the first operation rule is for a touch operation whose initial touch position is within the response area of the functional control: movement of the touch position within the response area of the functional control cannot trigger steering, while movement of the touch position outside the response area of the functional control can trigger steering;
- the second operation rule is for a touch operation whose initial touch position is within the response area of the functional control: movement of the touch position cannot trigger steering;
- the third operation rule is for a touch operation whose initial touch position is within the response area of the functional control: movement of the touch position triggers steering.
- the screen adjustment module 1130 is further configured to, in the case that the custom operation rule is the first operation rule, display the second virtual environment picture in response to the touch position of the first touch operation moving from the first position to the second position.
- the device 1100 further includes: a viewing angle maintaining module 1150 .
- the viewing angle maintaining module 1150 is configured to, in the case that the custom operation rule is the second operation rule, control the virtual object to keep the first viewing angle unchanged in response to movement of the touch position of the first touch operation.
- the apparatus 1100 further includes: an operation receiving module 1160 .
- the operation receiving module 1160 is configured to receive a second touch operation, the initial touch position of the second touch operation is a third position outside the response area of the first functional control.
- the screen adjustment module 1130 is further configured to, in the case that the custom operation rule is the second operation rule, display a third virtual environment picture in response to the touch position of the second touch operation moving from the third position to a fourth position; the third virtual environment picture is a picture of observing the virtual environment from a third viewing angle of the virtual object, and the third viewing angle and the second viewing angle have different orientations.
- the screen adjustment module 1130 is further configured to, in the case that the custom operation rule is the third operation rule, display a fourth virtual environment picture in response to the touch position of the first touch operation moving from the first position to a fifth position within the response area of the first functional control; the fourth virtual environment picture is a picture of observing the virtual environment from a fourth viewing angle of the virtual object, and the fourth viewing angle and the first viewing angle have different orientations.
- the rule acquisition module 1140 is configured to display a configuration interface, the configuration interface including selection items respectively corresponding to the plurality of candidate operation rules; and, in response to an operation on the selection item corresponding to a target operation rule among the plurality of candidate operation rules, set the target operation rule as the custom operation rule.
- the object control module 1120 is further configured to, in response to the touch position of the first touch operation moving from the first position to a sixth position within the response area of a second functional control, control the virtual object to execute the behavior corresponding to the second functional control.
- the object control module 1120 is further configured to, in the case that the behavior corresponding to the second functional control and the behavior corresponding to the first functional control are non-mutually-exclusive behaviors, perform the step of controlling the virtual object to execute the behavior corresponding to the second functional control; or, in the case that the behavior corresponding to the second functional control and the behavior corresponding to the first functional control are mutually exclusive behaviors, control the virtual object not to execute the behavior corresponding to the second functional control.
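The mutual-exclusion check can be sketched as a lookup over behaviour pairs. The behaviour names and the pair set are hypothetical examples; the embodiment only states that mutually exclusive behaviours must not be executed together.

```python
# Hypothetical behaviour pairs that cannot run at the same time.
MUTUALLY_EXCLUSIVE = {
    frozenset({"crouch", "jump"}),
    frozenset({"reload", "fire"}),
}

def should_execute_second_behavior(first_behavior, second_behavior):
    """Only execute the second control's behaviour when it is not mutually
    exclusive with the behaviour the first control already triggered."""
    return frozenset({first_behavior, second_behavior}) not in MUTUALLY_EXCLUSIVE

print(should_execute_second_behavior("fire", "jump"))    # non-exclusive pair
print(should_execute_second_behavior("crouch", "jump"))  # exclusive pair
```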
- the object control module 1120 is further configured to, in the case that the second position is within the extended response area of the first functional control, periodically control the virtual object to execute the behavior corresponding to the first functional control; or, in the case that the second position is outside the extended response area of the first functional control, control the virtual object not to execute the behavior corresponding to the first functional control; wherein the extended response area of the first functional control surrounds the response area of the first functional control.
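The extended-response-area logic can be sketched as a three-way classification of the dragged touch position. The concentric-circle geometry, radii, and state names are assumptions for illustration; the embodiment only requires that the extended area surround the response area and gate the periodic execution.

```python
import math

def behavior_state(touch_pos, reference_pos, inner_radius, outer_radius):
    """Classify the dragged touch position against the control's response
    area (inner circle) and its surrounding extended response area (ring):
    inside the ring the behaviour keeps firing periodically; beyond it,
    the behaviour stops."""
    d = math.hypot(touch_pos[0] - reference_pos[0],
                   touch_pos[1] - reference_pos[1])
    if d <= inner_radius:
        return "continuous"  # still on the control itself
    if d <= outer_radius:
        return "periodic"    # extended response area: keep triggering
    return "stopped"         # outside both areas: stop the behaviour

print(behavior_state((140, 100), (100, 100), 50, 90))  # 40 px from center
print(behavior_state((170, 100), (100, 100), 50, 90))  # 70 px from center
print(behavior_state((220, 100), (100, 100), 50, 90))  # 120 px from center
```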
- the virtual environment is a virtual environment in a shooting application.
- in summary, with the technical solution provided by the embodiments of this application, the virtual object is controlled to execute the behavior corresponding to the first functional control, and, when the touch position of the first touch operation moves from the first position to the second position outside the response area of the first functional control, the viewing angle of the virtual object is adjusted from the first viewing angle to the second viewing angle, and the second virtual environment picture is displayed based on the second viewing angle.
- that is to say, a single touch operation can both control the virtual object to execute the corresponding behavior and adjust the virtual environment picture according to the change of the touch position. While the user controls the virtual object to execute a certain behavior, the client can still respond to the user's sliding operation without requiring the user to lift the finger, which simplifies the user's operation and, while controlling the behavior executed by the virtual object, improves the adjustment efficiency of the virtual environment picture.
- the division of the above functional modules is merely used as an example for illustration. In practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
- the device provided in the above embodiment belongs to the same concept as the method embodiment; for its specific implementation process, refer to the method embodiment, and details are not repeated here.
- FIG. 13 shows a structural block diagram of a terminal 1300 provided by an embodiment of the present application.
- the terminal 1300 may be an electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, a wearable device, and a PC.
- the terminal is used to implement the method for displaying a virtual environment screen provided in the foregoing embodiments.
- the terminal may be the terminal 10 in the game running environment shown in FIG. 1 . Specifically:
- the terminal 1300 includes: a processor 1301 and a memory 1302 .
- the processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
- the processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), and a PLA (Programmable Logic Array).
- the processor 1301 may also include a main processor and a coprocessor; the main processor is a processor for processing data in a wake-up state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state.
- the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display.
- the processor 1301 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
- Memory 1302 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 1302 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 1302 is used to store at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by one or more processors to implement the above method for displaying a virtual environment picture.
- the terminal 1300 may optionally further include: a peripheral device interface 1303 and at least one peripheral device.
- the processor 1301, the memory 1302, and the peripheral device interface 1303 may be connected through buses or signal lines.
- Each peripheral device can be connected to the peripheral device interface 1303 through a bus, a signal line or a circuit board.
- the peripheral device includes: at least one of a radio frequency circuit 1304 , a display screen 1305 , a camera 1306 , an audio circuit 1307 , a positioning component 1308 and a power supply 1309 .
- the structure shown in FIG. 13 does not constitute a limitation on the terminal 1300, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
- an embodiment of the present application further provides a storage medium, where the storage medium is used to store a computer program, and the computer program is used to execute the method provided in the foregoing embodiments.
- the embodiment of the present application also provides a computer program product including instructions, which, when run on a computer, causes the computer to execute the method provided in the foregoing embodiments.
- the computer-readable storage medium may include: a ROM (Read-Only Memory), a RAM (Random Access Memory), an SSD (Solid State Drive), an optical disc, or the like.
- the random access memory may include a ReRAM (Resistive Random Access Memory) and a DRAM (Dynamic Random Access Memory).
- a computer program product is also provided, and when the computer program product is executed by a processor, it is used to implement the above method for displaying a virtual environment screen.
- the "plurality” mentioned herein refers to two or more than two.
- “And/or” describes the association relationship of associated objects, indicating that there may be three types of relationships, for example, A and/or B may indicate: A exists alone, A and B exist simultaneously, and B exists independently.
- the character "/” generally indicates that the contextual objects are an "or” relationship.
- the numbering of the steps described herein only exemplarily shows a possible sequence of execution among the steps. In some other embodiments, the above-mentioned steps may not be executed according to the order of the numbers, such as two different numbers The steps are executed at the same time, or two steps with different numbers are executed in the reverse order as shown in the illustration, which is not limited in this embodiment of the present application.
Abstract
Description
Custom operation rule | Trigger position within response area | Trigger position outside response area |
---|---|---|
First operation rule | Cannot trigger steering | Can trigger steering |
Second operation rule | Cannot trigger steering | Cannot trigger steering |
Third operation rule | Can trigger steering | Can trigger steering |
Claims (16)
- A method for displaying a virtual environment picture, the method being executed by a terminal, the method comprising: displaying a first virtual environment picture, the first virtual environment picture being a picture of observing a virtual environment from a first viewing angle of a virtual object; in response to a first touch operation on a first functional control, controlling the virtual object to execute a behavior corresponding to the first functional control, an initial touch position of the first touch operation being a first position within a response area of the first functional control; and, in response to the touch position of the first touch operation moving from the first position to a second position, displaying a second virtual environment picture, the second position being a position outside the response area of the first functional control, the second virtual environment picture being a picture of observing the virtual environment from a second viewing angle of the virtual object, the second viewing angle and the first viewing angle having different orientations.
- The method according to claim 1, wherein displaying the second virtual environment picture in response to the touch position of the first touch operation moving from the first position to the second position comprises: acquiring the touch position of the first touch operation; calculating a distance between the touch position of the first touch operation and a reference position of the first functional control; and, if the distance is greater than a threshold, determining that the touch position is outside the response area of the first functional control, taking the touch position as the second position, and displaying the second virtual environment picture.
- The method according to claim 2, wherein taking the touch position as the second position and displaying the second virtual environment picture comprises: acquiring direction information of the second position relative to the reference position; determining an orientation of the second viewing angle according to the direction information; and displaying the second virtual environment picture based on the orientation of the second viewing angle.
- The method according to claim 1, further comprising: acquiring a custom operation rule, the custom operation rule being an operation rule selected from a plurality of candidate operation rules, the plurality of candidate operation rules comprising a first operation rule and further comprising at least one of a second operation rule and a third operation rule; the first operation rule applying to a touch operation whose initial touch position is within the response area of a functional control, and specifying that movement of the touch position within the response area of the functional control cannot trigger steering while movement of the touch position outside the response area of the functional control can trigger steering; the second operation rule applying to a touch operation whose initial touch position is within the response area of a functional control, and specifying that movement of the touch position cannot trigger steering; the third operation rule applying to a touch operation whose initial touch position is within the response area of a functional control, and specifying that movement of the touch position triggers steering; and, in a case that the custom operation rule is the first operation rule, performing the step of displaying the second virtual environment picture in response to the touch position of the first touch operation moving from the first position to the second position.
- The method according to claim 4, after controlling the virtual object to execute the behavior corresponding to the first functional control in response to the first touch operation on the first functional control, further comprising: in a case that the custom operation rule is the second operation rule, in response to movement of the touch position of the first touch operation, controlling the virtual object to keep the first viewing angle unchanged.
- The method according to claim 4, further comprising: receiving a second touch operation, an initial touch position of the second touch operation being a third position outside the response area of the first functional control; and, in a case that the custom operation rule is the second operation rule, in response to the touch position of the second touch operation moving from the third position to a fourth position, displaying a third virtual environment picture, the third virtual environment picture being a picture of observing the virtual environment from a third viewing angle of the virtual object, the third viewing angle and the second viewing angle having different orientations.
- The method according to claim 4, further comprising: in a case that the custom operation rule is the third operation rule, in response to the touch position of the first touch operation moving from the first position to a fifth position within the response area of the first functional control, displaying a fourth virtual environment picture, the fourth virtual environment picture being a picture of observing the virtual environment from a fourth viewing angle of the virtual object, the fourth viewing angle and the first viewing angle having different orientations.
- The method according to claim 4, wherein acquiring the custom operation rule comprises: displaying a configuration interface, the configuration interface comprising selection items respectively corresponding to the plurality of candidate operation rules; and, in response to an operation on the selection item corresponding to a target operation rule among the plurality of candidate operation rules, setting the target operation rule as the custom operation rule.
- The method according to any one of claims 1 to 8, after controlling the virtual object to execute the behavior corresponding to the first functional control in response to the first touch operation on the first functional control, further comprising: in response to the touch position of the first touch operation moving from the first position to a sixth position within a response area of a second functional control, controlling the virtual object to execute a behavior corresponding to the second functional control.
- The method according to claim 9, before controlling the virtual object to execute the behavior corresponding to the second functional control, further comprising: in a case that the behavior corresponding to the second functional control and the behavior corresponding to the first functional control are non-mutually-exclusive behaviors, performing the step of controlling the virtual object to execute the behavior corresponding to the second functional control; or, in a case that the behavior corresponding to the second functional control and the behavior corresponding to the first functional control are mutually exclusive behaviors, controlling the virtual object not to execute the behavior corresponding to the second functional control.
- The method according to any one of claims 1 to 8, further comprising: in a case that the second position is within an extended response area of the first functional control, periodically controlling the virtual object to execute the behavior corresponding to the first functional control; or, in a case that the second position is outside the extended response area of the first functional control, controlling the virtual object not to execute the behavior corresponding to the first functional control; wherein the extended response area of the first functional control surrounds the response area of the first functional control.
- The method according to any one of claims 1 to 8, wherein the virtual environment is a virtual environment in a shooting application.
- An apparatus for displaying a virtual environment picture, the apparatus comprising: a picture display module, configured to display a first virtual environment picture, the first virtual environment picture being a picture of observing a virtual environment from a first viewing angle of a virtual object; an object control module, configured to control, in response to a first touch operation on a first functional control, the virtual object to execute a behavior corresponding to the first functional control, an initial touch position of the first touch operation being a first position within a response area of the first functional control; and a picture adjustment module, configured to display, in response to the touch position of the first touch operation moving from the first position to a second position, a second virtual environment picture, the second position being a position outside the response area of the first functional control, the second virtual environment picture being a picture of observing the virtual environment from a second viewing angle of the virtual object, the second viewing angle and the first viewing angle having different orientations.
- A terminal, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for displaying a virtual environment picture according to any one of claims 1 to 12.
- A computer-readable storage medium, storing a computer program, the computer program being executed to implement the method for displaying a virtual environment picture according to any one of claims 1 to 12.
- A computer program product comprising instructions that, when run on a computer, cause the computer to execute the method for displaying a virtual environment picture according to any one of claims 1 to 12.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237034101A KR20230152753A (ko) | 2021-08-19 | 2022-06-29 | 가상 환경 화상 디스플레이 방법 및 디바이스, 단말기, 매체, 및 프로그램 제품 |
US18/199,217 US20230289054A1 (en) | 2021-08-19 | 2023-05-18 | Control mode selection to indicate whether simultaneous perspective change and function selection is enabled |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110956123.2 | 2021-08-19 | ||
CN202110956123.2A CN113633975B (zh) | 2021-08-19 | 2021-08-19 | 虚拟环境画面的显示方法、装置、终端及存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/199,217 Continuation US20230289054A1 (en) | 2021-08-19 | 2023-05-18 | Control mode selection to indicate whether simultaneous perspective change and function selection is enabled |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023020125A1 true WO2023020125A1 (zh) | 2023-02-23 |
Family
ID=78423068
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/102206 WO2023020125A1 (zh) | 2021-08-19 | 2022-06-29 | 虚拟环境画面的显示方法、装置、终端、介质及程序产品 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230289054A1 (zh) |
KR (1) | KR20230152753A (zh) |
CN (1) | CN113633975B (zh) |
WO (1) | WO2023020125A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113633975B (zh) * | 2021-08-19 | 2023-10-20 | 腾讯科技(深圳)有限公司 | 虚拟环境画面的显示方法、装置、终端及存储介质 |
CN116943214A (zh) * | 2022-04-14 | 2023-10-27 | 腾讯科技(深圳)有限公司 | 虚拟道具的使用方法、装置、设备、介质及程序产品 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105760076A (zh) * | 2016-02-03 | 2016-07-13 | 网易(杭州)网络有限公司 | 游戏控制方法及装置 |
CN107694087A (zh) * | 2017-10-23 | 2018-02-16 | 网易(杭州)网络有限公司 | 信息处理方法及终端设备 |
CN107773987A (zh) * | 2017-10-24 | 2018-03-09 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
CN109847370A (zh) * | 2019-03-26 | 2019-06-07 | 网易(杭州)网络有限公司 | 射击游戏的控制方法、装置、设备和存储介质 |
JP2019154668A (ja) * | 2018-03-09 | 2019-09-19 | 株式会社 ディー・エヌ・エー | ゲーム装置及びゲームプログラム |
CN112451969A (zh) * | 2020-12-04 | 2021-03-09 | 腾讯科技(深圳)有限公司 | 虚拟对象控制方法、装置、计算机设备及存储介质 |
CN112957729A (zh) * | 2021-02-25 | 2021-06-15 | 网易(杭州)网络有限公司 | 游戏中的射击瞄准方法、装置、设备及存储介质 |
CN113633975A (zh) * | 2021-08-19 | 2021-11-12 | 腾讯科技(深圳)有限公司 | 虚拟环境画面的显示方法、装置、终端及存储介质 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006146556A (ja) * | 2004-11-19 | 2006-06-08 | Nintendo Co Ltd | 画像表示処理プログラムおよび画像表示処理装置 |
- 2021-08-19: CN application CN202110956123.2A filed (granted as CN113633975B, status Active)
- 2022-06-29: PCT application PCT/CN2022/102206 filed (WO2023020125A1, Application Filing)
- 2022-06-29: KR application KR1020237034101 filed (KR20230152753A, Search and Examination)
- 2023-05-18: US application US18/199,217 filed (US20230289054A1, status Pending)
Also Published As
Publication number | Publication date |
---|---|
CN113633975A (zh) | 2021-11-12 |
US20230289054A1 (en) | 2023-09-14 |
CN113633975B (zh) | 2023-10-20 |
KR20230152753A (ko) | 2023-11-03 |
Legal Events

Code | Title | Details |
---|---|---|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22857438; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 20237034101; Country of ref document: KR; Kind code of ref document: A |
WWE | WIPO information: entry into national phase | Ref document number: 1020237034101; Country of ref document: KR |
WWE | WIPO information: entry into national phase | Ref document number: 2023571704; Country of ref document: JP |
NENP | Non-entry into the national phase | Ref country code: DE |