WO2022100339A1 - Method and Apparatus for Controlling a Virtual Object, Storage Medium, and Electronic Device - Google Patents


Info

Publication number
WO2022100339A1
WO2022100339A1 · PCT/CN2021/123270 · CN2021123270W
Authority
WO
WIPO (PCT)
Prior art keywords
target
virtual
action
virtual object
touch operation
Prior art date
Application number
PCT/CN2021/123270
Other languages
English (en)
French (fr)
Inventor
潘佳绮
杨泽锋
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Priority to EP21815878.0A priority Critical patent/EP4026595A4/en
Priority to BR112022001065A priority patent/BR112022001065A2/pt
Priority to CA3146804A priority patent/CA3146804A1/en
Priority to JP2022514177A priority patent/JP7418554B2/ja
Priority to KR1020227002073A priority patent/KR102721446B1/ko
Priority to AU2021307015A priority patent/AU2021307015B2/en
Priority to US17/585,331 priority patent/US12090404B2/en
Priority to SA522431616A priority patent/SA522431616B1/ar
Publication of WO2022100339A1 publication Critical patent/WO2022100339A1/zh

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle

Definitions

  • the present application relates to the field of computers, and in particular, to a method and apparatus for controlling a virtual object, a storage medium, an electronic device, and a computer program product.
  • In the related art, a separate button for triggering is arranged in the trigger button area for each interaction event, which leaves users at a loss as to which button to use, resulting in inefficient control.
  • An embodiment of the present application provides a method for controlling a virtual object, including:
  • displaying a target virtual object and a virtual action button in a display interface, wherein the virtual action button is used to control the target virtual object to perform a first action;
  • when the target virtual object is located within the triggering range of a target interaction event, displaying first prompt information in the display interface, wherein the first prompt information is used to prompt the user to perform a touch operation on the virtual action button;
  • when a first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform a second action in the target interaction event.
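The claimed steps can be sketched in a few lines of code. The sketch below is a minimal illustration under assumed names (`VirtualActionButton`, `update_prompt`, `on_first_touch`); it is not the patent's implementation.

```python
# Minimal sketch of the claimed control method. All class and method names
# are illustrative assumptions, not taken from the patent or any real engine.

class VirtualActionButton:
    def __init__(self, first_action, second_action):
        self.first_action = first_action    # action originally bound to the button
        self.second_action = second_action  # action used inside the target interaction event
        self.prompt_shown = False

    def update_prompt(self, in_trigger_range):
        # Display the first prompt information while the target virtual object
        # is within the triggering range of the target interaction event.
        self.prompt_shown = in_trigger_range

    def on_first_touch(self, in_trigger_range):
        # The same button performs a different action depending on whether the
        # target virtual object is within the trigger range.
        return self.second_action if in_trigger_range else self.first_action

button = VirtualActionButton(first_action="jump", second_action="ride_zipline")
button.update_prompt(in_trigger_range=True)
print(button.prompt_shown)                            # True
print(button.on_first_touch(in_trigger_range=True))   # ride_zipline
print(button.on_first_touch(in_trigger_range=False))  # jump
```

The point of the design is that one on-screen button covers several actions, so fewer buttons obscure the game screen.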
  • the embodiment of the present application provides a control device for a virtual object, including:
  • a first display module configured to display a target virtual object and a virtual action button in the display interface, wherein the virtual action button is used to control the target virtual object to perform a first action
  • the second display module is configured to display first prompt information in the display interface when the target virtual object is within the triggering range of the target interaction event, wherein the first prompt information is used to prompt the user to perform a touch operation on the virtual action button;
  • the execution module is configured to control the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected.
  • Embodiments of the present application provide a computer-readable storage medium, where computer instructions are stored in the computer-readable storage medium, wherein, when the computer instructions are executed by a processor, the virtual object control method provided by the embodiments of the present application is implemented.
  • An embodiment of the present application provides an electronic device, including a memory and a processor, where computer instructions are stored in the memory, and the processor is configured to implement the above-mentioned virtual object control method through the computer instructions.
  • The embodiments of the present application provide a computer program product, including computer instructions which, when executed by a processor, implement the virtual object control method provided by the embodiments of the present application.
  • FIG. 1 is a schematic diagram of an application environment of a method for controlling a virtual object according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a method for controlling a virtual object according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of a display interface according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of another display interface according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of yet another display interface according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of another display interface according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of yet another display interface according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of yet another display interface according to an embodiment of the present application.
  • FIG. 9A is a schematic diagram of yet another display interface according to an embodiment of the present application.
  • FIG. 9B is a schematic diagram of yet another display interface according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another display interface according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of yet another display interface according to an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another display interface according to an embodiment of the present application.
  • FIG. 13 is a schematic flowchart of another method for controlling a virtual object according to an embodiment of the present application.
  • FIG. 14 is a schematic structural diagram of a control device for a virtual object according to an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • Virtual scene: a scene output by an electronic device that differs from the real world. Visual perception of the virtual scene can be formed with the naked eye or with the assistance of devices, for example, through two-dimensional images output by a display screen, or through three-dimensional images output by stereoscopic display technologies such as stereoscopic projection, virtual reality, and augmented reality; in addition, various real-world-like perceptions, such as auditory, tactile, olfactory, and motion perception, can be formed through various possible hardware.
  • the virtual scene can be a simulated environment of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment.
  • the virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the embodiment of the present application does not limit the dimension of the virtual scene.
  • Virtual objects: the images of various people and objects that can interact in the virtual scene, or the movable objects in the virtual scene.
  • the movable objects may be virtual characters, virtual animals, cartoon characters, etc., for example, characters, animals, plants, oil barrels, walls, stones, etc. displayed in the virtual scene.
  • the virtual object may be a virtual avatar representing the user in the virtual scene.
  • the virtual scene may include multiple virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
  • the virtual object may be a virtual character in a virtual scene, which is controlled by a user or artificial intelligence (Artificial Intelligence, AI).
  • Virtual props: also known as interactive props, used for the interaction of virtual objects in the virtual scene.
  • virtual props may include a virtual zipline that connects two locations, and a player can ride the zipline to quickly move from one location to another.
  • Embodiments of the present application provide a method for controlling a virtual object.
  • the above-mentioned method for controlling a virtual object may be applied to a hardware environment composed of a server 101 and a terminal device 103 as shown in FIG. 1 .
  • the server 101 is connected to the terminal device 103 through the network and can be used to provide services for the terminal device 103 or for the application 107 installed in the terminal device 103; the application can be a video application, an instant messaging application, a browser application, an education application, a game application, etc., and may also include, but is not limited to, other applications capable of controlling virtual objects.
  • a database 105 may be provided in the server 101 or independently of the server 101, and the database 105 is used to provide the server 101 with data storage services, such as game data storage services.
  • the above-mentioned networks may include but are not limited to wired networks and wireless networks, wherein the wired networks include but are not limited to local area networks, metropolitan area networks and wide area networks, and the wireless networks include but are not limited to Bluetooth, WIFI and other networks that implement wireless communication.
  • the terminal device 103 may be a terminal device configured with a virtual object control application (i.e., the application 107), which may include, but is not limited to, at least one of the following: a mobile phone (such as an Android mobile phone or an iOS mobile phone), a laptop computer, a tablet computer, a handheld computer, a Mobile Internet Device (MID), a PAD, a desktop computer, a smart TV, etc.
  • the above-mentioned server 101 may be a single server, a server cluster composed of multiple servers, or a cloud server, and may also include, but is not limited to, a router or a gateway.
  • the application 107 can be started, and the virtual scene (including the target virtual object and the virtual action button) can be output in the display interface of the application 107; when the application 107 is a game application,
  • the display interface may include a game screen (or called a game interface) and an interactive interface.
  • this is only an example, which is not limited in this embodiment.
  • The control method for virtual objects can be implemented in the terminal device 103 through the following steps:
  • S1: start the application 107 on the terminal device 103, and display the target virtual object and the virtual action button in the display interface of the application 107, wherein the virtual action button is used to control the target virtual object to perform the first action;
  • S2: when the target virtual object is located within the triggering range of the target interaction event, display the first prompt information in the display interface, wherein the first prompt information is used to prompt the user to perform a touch operation on the virtual action button;
  • S3: when a first touch operation performed on the virtual action button is detected, control the target virtual object to perform the second action in the target interaction event.
  • the display interface may include a game screen and an interactive interface, and the virtual action button and the first prompt information may be displayed in the interactive interface.
  • the above-mentioned control method for virtual objects may also be implemented by an application program configured on the server 101, or implemented by the terminal device 103 and the server 101 in combination; for example, the server 101 may send the relevant display data of the virtual scene to the terminal device 103, so that the terminal device 103 displays the virtual scene according to the received display data.
  • the above is only an example, and this embodiment does not make a specific limitation.
  • control method for a virtual object includes:
  • the virtual object control application used to implement the virtual object control method may include, but is not limited to, game software, an app, an applet, etc., and the method may also be configured in, but is not limited to, any such software, app, or applet.
  • the above-mentioned target virtual object may include, but is not limited to, the virtual object controlled after registering with the background or server corresponding to the virtual object control application and logging in to that application.
  • the display interface may include a game screen and an interactive interface
  • the above-mentioned game screen and interactive interface may be, but are not limited to being, displayed in the same display area, with the interactive interface overlapping the game screen, or displayed in different display areas (that is, the interactive interface is displayed in an area of the display interface other than the above-mentioned game screen).
  • the game screen may be used to display the target virtual object
  • the interactive interface may be used to display the virtual action button and the first prompt information.
  • the display logic is not limited to this.
  • FIG. 3 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 3, a game screen 302 and an interactive interface 304 may be displayed in a game application, wherein a target virtual character 306 (an example of a target virtual object) is included in the game screen 302, and a virtual action button 308 is displayed in the interactive interface 304.
  • the above-mentioned virtual action button may be, but is not limited to being, configured to control the target virtual object to perform a preset first action when a touch operation performed on the button is detected.
  • For example, the virtual action button 308 is configured as a "jump" button; that is, when a touch operation performed on the virtual action button 308 is detected, the target avatar 306 is controlled to perform a "jump" action.
  • the first action can be flexibly configured according to the virtual object control application; for example, the first action may be configured as, but is not limited to, actions such as "jump", "squat", "get down", or "bend over", or actions such as pushing, pulling, lifting, raising, or pressing, and may also be configured to match a preset interactive event.
  • For example, when the interactive event is that the target virtual object is in an area that allows a predetermined operation to be performed, the target virtual object is controlled to perform the predetermined operation; when the interactive event is that the target virtual object is in an area where virtual props are allowed to be opened, the target virtual object is controlled to open the virtual props; and when the target virtual object is in an area where virtual props are allowed to be used, the target virtual object is controlled to use the virtual props.
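The area-matching examples above amount to a lookup from the kind of area the object occupies to the operation the button should trigger. A rough sketch (all names are illustrative assumptions):

```python
# Hypothetical dispatch from the kind of interactive area the target virtual
# object is in to the operation it performs; the area kinds mirror the
# examples in the text above.
AREA_ACTIONS = {
    "predetermined_op_area": "perform_predetermined_operation",
    "open_props_area": "open_virtual_props",
    "use_props_area": "use_virtual_props",
}

def action_for_area(area_kind, default="first_action"):
    # Outside any interactive area, the button keeps its original action.
    return AREA_ACTIONS.get(area_kind, default)

print(action_for_area("open_props_area"))  # open_virtual_props
print(action_for_area(None))               # first_action
```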
  • the first action is an action initially bound to the virtual action button
  • the second action is an action performed when the target virtual object is in the target interaction event.
  • the above-mentioned first prompt information is used to prompt to perform a touch operation on the virtual action button.
  • the above-mentioned touch operation may include, but is not limited to, clicking, long pressing, dragging, releasing, and double-clicking, and different effects can be configured for different touch operations, where a long press means that the pressing duration is greater than a duration threshold.
  • FIG. 4 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 4, when the target avatar is located within the triggering range of the target interaction event, the first prompt information 406 is displayed in an area 404 near the virtual action button 402, and the dragging direction corresponding to the dragging operation can be indicated in a form including, but not limited to, an arrow.
  • FIG. 5 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 5, the first prompt information 506 is displayed in an area 504 near the virtual action button 502, and the pressing force corresponding to the pressing operation can be indicated in a form including, but not limited to, a force bar graph, wherein the shaded part of the first prompt information 506 represents the current pressing force, and a corresponding force threshold identifier can be configured in the first prompt information 506 to inform the user that the corresponding second action is performed after a pressing force reaching the force threshold is detected; the pressing operation on the virtual action button can also be indicated by, including but not limited to, displaying a text identifier on the game screen, with different text identifiers corresponding to the effects of different pressing forces.
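The force-bar prompt described above reduces to a fill fraction (the shaded part) plus a threshold check. A minimal sketch, with assumed force units and bar scale:

```python
def force_bar_state(current_force, force_threshold, max_force):
    # Returns the fraction of the bar that is shaded (current pressing force
    # relative to the full bar) and whether the force threshold that triggers
    # the second action has been reached.
    fill = min(current_force / max_force, 1.0)
    return fill, current_force >= force_threshold

fill, triggered = force_bar_state(current_force=3.0, force_threshold=4.0, max_force=8.0)
print(fill, triggered)  # 0.375 False
fill, triggered = force_bar_state(current_force=5.0, force_threshold=4.0, max_force=8.0)
print(fill, triggered)  # 0.625 True
```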
  • a specific implementation manner may include, but is not limited to, a combination of one or more of the above, and this embodiment does not make any specific limitation.
  • target interaction events may include, but are not limited to, interaction events for interactable props, interaction events for interactable areas, and interaction events for interactable virtual objects.
  • target interaction events may include, but are not limited to, ziplining, rock climbing, swimming, riding, talking, shopping, and the like.
  • the virtual action button is configured to trigger a second action different from the preconfigured first action when the target virtual object is within the trigger range of the target interaction event, so that the same virtual action button can perform different virtual actions in different areas. This achieves the technical effect of improving the control efficiency of virtual objects, optimizing the user experience, and reducing the view obstruction caused by too many virtual buttons, thereby solving the technical problem of low control efficiency of virtual objects in the related art.
  • controlling the target virtual object to perform the second action in the target interaction event includes: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in a target area, controlling the target virtual object to perform the second action in the target interaction event.
  • the first touch operation may include, but is not limited to, a sliding operation.
  • when the first touch operation ends in the target area, the target virtual object is controlled to perform the second action in the target interaction event. In this way, the first touch operation can be flexibly and conveniently applied to the same virtual action button, and functions different from the button's original function can be realized according to the direction of the first touch operation; further, when the target virtual object is located in the trigger range of the target interaction event, the number of displayed virtual action buttons is reduced while multiple virtual actions can still be completed, which solves the technical problems of complex control methods and low control efficiency of virtual objects in the related art, optimizes the user experience, and improves control efficiency.
  • the above-mentioned target area may be, but is not limited to being, preset by the system or the server, and may also be flexibly configured by the user on the terminal device according to actual needs; this may include, but is not limited to, binding the target area to the virtual action button, and displaying the target area when a second touch operation performed on the virtual action button is detected.
  • the first touch operation is the same as or different from the second touch operation.
  • the first touch operation is a sliding operation
  • the second touch operation is any one of a sliding operation and a long press operation.
  • the target area may be displayed on the interactive interface.
  • FIG. 6 is a schematic diagram of a display interface according to an embodiment of the present application.
  • a virtual action button 604 is displayed in the display interface 602; when a sliding operation on the virtual action button 604 is detected, the target area 606 is pulled up and displayed, and when it is detected that the sliding operation ends on the target area 606, the target virtual character is controlled to perform the second action in the target interaction event.
  • the sliding operation is both the first touch operation and the second touch operation.
  • the second action corresponding to the target interaction event may be, but is not limited to being, set separately, and the target area may also, but is not limited to, display an identifier corresponding to the second action of the target interaction event.
  • control method of the virtual object further includes:
  • the above-mentioned virtual action button may be configured to display the above-mentioned target area through, including but not limited to, a sliding operation or a long-press operation, where the sliding operation is performed as shown in FIG.
  • FIG. 7 is a schematic diagram of a display interface according to an embodiment of the present application.
  • a target area 706 is displayed in the display interface 702 .
  • The method for controlling a virtual object further includes: when it is detected that the virtual action button is pressed and the offset distance of the pressing point in the target direction is greater than a distance threshold, determining that a sliding operation performed on the virtual action button is detected, and using the sliding operation as the first touch operation. For example, when it is detected that the virtual action button is pressed by the user's finger and the offset distance of the finger in the target direction is greater than a preset distance threshold, it is determined that a sliding operation performed on the virtual action button is detected, and the sliding operation is used as the first touch operation.
  • the target direction can be the direction indicated by the first prompt information, or can be any direction;
  • the distance threshold can be preset by the system or the server, and can also be flexibly configured based on the size of the display interface displayed by the current terminal device (at least one of the size of the game screen and the size of the interactive interface).
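The slide-detection rule (offset of the pressing point in the target direction greater than a distance threshold) can be sketched as a projection of the touch displacement onto the target direction. Coordinates, direction vectors, and the threshold below are assumptions for illustration:

```python
import math

def is_sliding(press_start, press_now, target_dir, distance_threshold):
    # Project the pressing point's displacement onto the target direction
    # (normalized); a slide is detected once the projected offset exceeds
    # the distance threshold.
    dx = press_now[0] - press_start[0]
    dy = press_now[1] - press_start[1]
    norm = math.hypot(*target_dir)
    offset = (dx * target_dir[0] + dy * target_dir[1]) / norm
    return offset > distance_threshold

# Upward target direction (screen y grows downward), 24-pixel threshold.
print(is_sliding((0, 0), (0, -40), target_dir=(0, -1), distance_threshold=24))  # True
print(is_sliding((0, 0), (0, -10), target_dir=(0, -1), distance_threshold=24))  # False
```

Projecting onto the prompted direction, rather than using raw displacement, ignores movement perpendicular to the direction the first prompt information indicates.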
  • it may include, but is not limited to, setting a press detection contact in the display area corresponding to the virtual action button to detect whether the virtual action button is pressed (eg, whether it is pressed by the user's finger). For example, the pressing force in the display area corresponding to the virtual action button can be acquired, and when the pressing force exceeds a preset force threshold and the holding time exceeds the preset holding time threshold, it is determined that the virtual action button is pressed.
  • the displacement distance of the pressing point in the display interface can be obtained as the offset distance.
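The press-confirmation rule just described (pressing force above a force threshold, sustained longer than a holding-time threshold) can be sketched over sampled force readings; the sampling rate, units, and thresholds are illustrative assumptions:

```python
def is_pressed(force_samples, force_threshold, hold_time_threshold, sample_dt):
    # force_samples: pressing force measured every sample_dt seconds at the
    # button's press detection contact. The press is confirmed once the force
    # stays above the force threshold for longer than the holding-time threshold.
    held = 0.0
    for f in force_samples:
        held = held + sample_dt if f > force_threshold else 0.0
        if held > hold_time_threshold:
            return True
    return False

samples = [0.2, 1.5, 1.6, 1.7, 1.8]  # assumed force units, sampled at 10 Hz
print(is_pressed(samples, force_threshold=1.0, hold_time_threshold=0.25, sample_dt=0.1))  # True
```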
  • FIG. 8 is a schematic diagram of a display interface according to an embodiment of the present application.
  • the first touch operation is a sliding operation
  • the method for controlling the virtual object further includes: when it is detected that the virtual action button is pressed and the pressing point disappears after sliding to the target area, determining that the sliding operation ends in the target area; or when it is detected that the virtual action button is dragged to overlap with the target area and then the dragging ends, determining that the sliding operation ends in the target area. For example, when it is detected that the virtual action button is pressed by the finger and the finger slides to the target area and is then released, it is determined that the sliding operation ends in the target area; or when it is detected that the virtual action button is dragged to overlap the target area and is then released, it is determined that the sliding operation ends in the target area.
  • multiple press detection contacts may be set at the bottom of the screen, and when multiple press detection contacts between the virtual action button and the target area all detect a press operation, it is determined that the detected press point slides to the target area, such as swiping your finger to the target area. Then, when no pressing operation is detected at the detection contact in the target area, it is determined that the sliding operation ends in the target area, for example, the finger is released from the target area.
  • Similarly, when the press detection contacts between the virtual action button and the target area successively detect a press operation, it is determined that the virtual action button is dragged to overlap the target area; then, when the press detection contact located in the target area no longer detects a pressing operation, it is determined that the drag ends (for example, the finger is released).
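Both cases above ultimately check whether the touch ends inside the target area's bounds. A minimal sketch, assuming an axis-aligned rectangle for the target area:

```python
def slide_ends_in_target(release_point, target_rect):
    # target_rect: (x, y, width, height) of the displayed target area.
    # The sliding operation ends in the target area when the pressing point
    # disappears (finger released) while inside that rectangle.
    x, y, w, h = target_rect
    px, py = release_point
    return x <= px <= x + w and y <= py <= y + h

print(slide_ends_in_target((110, 45), (100, 40, 60, 30)))  # True
print(slide_ends_in_target((10, 10), (100, 40, 60, 30)))   # False
```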
  • FIGS. 9A and 9B are schematic diagrams of a display interface according to an embodiment of the present application.
  • FIG. 9A shows that the virtual action button 904 in the display interface 902 is pressed by a finger, and the finger slides to the target area 906 and is then released;
  • FIG. 9B shows that the virtual action button 904 in the display interface 902 is dragged to overlap with the target area 906 and then released.
  • the method for controlling a virtual object further includes: when a first touch operation performed on the virtual action button is detected, and the execution object of the first touch operation is updated from the virtual action button to the target area, in the display Second prompt information is displayed on the interface, wherein the second prompt information is used to prompt the end of the first touch operation in the target area.
  • displaying the second prompt information on the display interface includes performing at least one of the following processes on the display interface: updating the display state of the target area; displaying at least one of a text mark and an animation effect; and updating the identification information of the virtual action button.
  • the prompt may be given by means including, but not limited to, highlighting the target area
  • the specific implementation process may include but not limited to one or more combinations of the above. The above is only an example, and this embodiment does not make any specific limitation.
  • the first prompt information includes direction prompt information, where the direction prompt information is used to prompt the target direction of the touch operation performed on the virtual action button.
  • the above-mentioned direction prompt information may include, but is not limited to, identification information such as arrows and text, wherein the arrows may indicate, by highlighting or flashing, that the target virtual object is within the triggering range corresponding to the target interaction event.
  • displaying the direction prompt information on the display interface helps the user intuitively learn the direction of the touch operation and then perform the touch operation accordingly, so that the target virtual object can be controlled to perform different actions based on the same virtual action button. This solves the technical problems of complex control methods and low control efficiency of virtual objects in the related art, and achieves the technical effect of optimizing the user experience and improving the efficiency with which users control virtual objects to complete multiple operations.
  • the target interactive event corresponds to an interactable prop, and the second action is an interactive action for the interactable prop; or the target interactive event corresponds to an interactable area, and the second action is an interactive action for the interactable area; or the target interactive event corresponds to an interactable virtual object, and the second action is an interactive action for the interactable virtual object.
  • the interactable prop corresponding to the target interaction event is a virtual zipline (that is, the target interaction event is a zipline riding event), and the second action is a zipline riding action on the virtual zipline, wherein the zipline riding action is used to make the target virtual object jump onto and hold the virtual zipline and slide along the virtual zipline.
  • the zipline riding event may include, but is not limited to, a target interaction event triggered when the target virtual object is located in an area that triggers the target virtual object to use the zipline or riding function.
  • FIG. 10 is a schematic diagram of a display interface according to an embodiment of the present application.
  • a target virtual character 1004 and a virtual zipline 1006 are displayed on the display interface 1002 .
  • the target avatar 1004 is controlled to jump on and hold the virtual zipline 1006 during the zipline riding event, and slide along the virtual zipline 1006 .
  • the target virtual object is controlled to perform a zipline riding action in the zipline riding event, so that the target virtual object jumps onto and holds the virtual zipline and slides along the virtual zipline, which increases the gameplay of the application and optimizes the user experience.
  • the interactable area corresponding to the target interaction event is a climbing area (ie, the target interaction event is a climbing event), and the second action is a climbing action on the climbing area.
  • the climbing event may include, but is not limited to, a target interaction event triggered when the target virtual object is located in an area corresponding to triggering the virtual object to use the climbing function.
  • FIG. 11 is a schematic diagram of a display interface according to an embodiment of the present application.
  • a target virtual character 1104 and a virtual climbing area 1106 are displayed in the display interface 1102 .
  • the target virtual character 1104 is controlled to perform a climbing action in the virtual climbing area 1106.
  • the above is only an example, and this embodiment imposes no other specific limitation.
  • when the target interaction event is a climbing event, the target virtual object is controlled to perform a climbing action in the climbing event, which enriches the gameplay of the application and improves the user experience.
  • the method for controlling a virtual object further includes: when the target virtual object is simultaneously within the trigger ranges of multiple target interaction events and a second touch operation performed on the virtual action button is detected, displaying, in the display interface, target areas corresponding to the multiple target interaction events, where each target area is used to trigger the target virtual object to perform the second action in the corresponding target interaction event; the first touch operation is the same as or different from the second touch operation.
  • the target interaction event includes a first interaction event and a second interaction event. When the target virtual object is simultaneously within the trigger range of the first interaction event and the trigger range of the second interaction event, and the second touch operation performed on the virtual action button is detected, a first area and a second area are displayed in the display interface, where the first area is the target area corresponding to the first interaction event and is used to trigger the target virtual object to perform second action A in the first interaction event, and the second area is the target area corresponding to the second interaction event and is used to trigger the target virtual object to perform second action B in the second interaction event.
  • the second action may include, but is not limited to, any interactive action such as riding a zipline, rock climbing, swimming, riding, talking, or shopping.
  • the settings may include, but are not limited to, setting second action A and second action B as different virtual actions, or setting second action A and second action B as the same virtual action corresponding to different interactive virtual objects.
  • the first area and the second area may be flexibly set by the system or the server, and the matching relationship between the first area and second action A and the matching relationship between the second area and second action B may likewise be flexibly set by the system or the server.
  • in this embodiment, when the target virtual object is simultaneously within the trigger ranges of multiple target interaction events and the second touch operation performed on the virtual action button is detected, the target areas corresponding to the multiple target interaction events are displayed in the display interface, where each target area is used to trigger the target virtual object to perform the second action in the corresponding target interaction event.
  • thus, the embodiment of this application realizes the technical solution of controlling the target virtual object to perform different actions based on the same virtual action button, solving the technical problems of complex control methods and low control efficiency of virtual objects in the related art, optimizing the user experience, and improving the efficiency with which users control virtual objects to complete multiple interactive operations.
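The multi-event behavior above can be sketched as a small function: whenever a second touch operation is detected, one target area is shown per interaction event whose trigger range currently contains the target virtual object. The area identifiers are hypothetical names chosen for illustration.

```python
# Hypothetical sketch: given the events whose trigger ranges contain the
# target virtual object, decide which target areas to display once a second
# touch operation on the virtual action button is detected.
def target_areas_to_display(active_events, second_touch_detected):
    """Return the list of target-area identifiers to show in the interface."""
    if not second_touch_detected:
        return []
    # one area per active event; each area later triggers that event's action
    return [f"area_for_{event}" for event in active_events]
```

With two simultaneous events (say a zipline and a climbing area) this yields two areas, matching the first/second area arrangement of FIG. 12; with one event it degenerates to the single target area of FIG. 6.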
  • controlling the target virtual object to perform the second action in the target interaction event includes: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in any target area, controlling the target virtual object to perform the second action in the target interaction event corresponding to that target area. For example, when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the first area, the target virtual object is controlled to perform second action A in the first interaction event; when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the second area, the target virtual object is controlled to perform second action B in the second interaction event.
  • FIG. 12 is a schematic diagram of a display interface according to an embodiment of the present application.
  • a virtual action button 1204 is displayed in the display interface 1202.
  • the first area 1206 and the second area 1208 are displayed in the display interface 1202 , such as the shaded parts in FIG. 12 .
  • when the sliding operation ends in the first area 1206, the target virtual character is controlled to perform second action A; when the sliding operation ends in the second area 1208, the target virtual character is controlled to perform second action B.
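The dispatch illustrated by FIG. 12 — the area in which the sliding operation ends selects which interaction event's second action is performed — can be sketched as a lookup. The area and action names below are hypothetical placeholders for the first/second areas and second actions A/B.

```python
# Hypothetical sketch of the FIG. 12 dispatch: map the area where the first
# touch operation ended to the (event, action) pair it triggers.
AREA_TO_ACTION = {
    "first_area": ("zipline_event", "second_action_A"),    # e.g. ride zipline
    "second_area": ("climbing_event", "second_action_B"),  # e.g. climb
}

def action_on_slide_end(end_area):
    """Return (event, action) for the end area, or None if it ended elsewhere."""
    return AREA_TO_ACTION.get(end_area)
```

Returning `None` models a slide that ends outside every target area, in which case no second action is triggered.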
  • the sliding operation is both the first touch operation and the second touch operation.
  • for example, the first interaction event is a zipline riding event, and second action A includes a zipline riding action used to make the target virtual object jump onto and hold the virtual zipline and slide along it; the second interaction event is a climbing event, and second action B includes a climbing action.
  • in this embodiment, when the target virtual object is simultaneously within the trigger ranges of multiple target interaction events and the second touch operation performed on the virtual action button is detected, the target areas corresponding to the multiple target interaction events are displayed in the display interface, where each target area is used to trigger the target virtual object to perform the second action in the corresponding target interaction event.
  • the method further includes: when a third touch operation performed on the virtual action button is detected, controlling the target virtual object to end the target interaction event, where the first touch operation is the same as or different from the third touch operation.
  • the third touch operation may be set to be the same as the first touch operation; for example, when the first touch operation is configured as a click operation, the third touch operation may also be configured as a click operation. The third touch operation may also be set to be different from the first touch operation; for example, the third touch operation may be configured as a release operation.
  • the above target interaction event can be ended by acquiring the third touch operation and responding to it.
  • controlling the target virtual object to end the target interaction event includes: when the target interaction event corresponds to an interactive prop, controlling the target virtual object to end the interactive action on the interactive prop; or when the target interaction event corresponds to an interactive area, controlling the target virtual object to end the interactive action on the interactive area; or when the target interaction event corresponds to an interactive virtual object, controlling the target virtual object to end the interactive action on the interactive virtual object.
  • controlling the target virtual object to end the interactive action on the interactive prop includes: when the interactive prop corresponding to the target interaction event is a virtual zipline, controlling the target virtual object to jump off the virtual zipline. That is, when the target interaction event is a zipline riding event, the target virtual object is controlled to jump off the virtual zipline.
  • controlling the target virtual object to end the interactive action on the interactive area includes: when the interactive area corresponding to the target interaction event is a climbing area, controlling the target virtual object to jump out of the climbing area. That is, when the target interaction event is a climbing event, the target virtual object is controlled to jump away from the climbing area.
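The ending logic above — a third touch operation on the same button ends the ongoing interaction, with an exit behavior that depends on the event kind — can be sketched as follows. The event names and exit-action strings are hypothetical illustrations.

```python
# Hypothetical sketch: the third touch operation on the virtual action
# button ends the current target interaction event; the exit behavior
# depends on what the event corresponds to (prop, area, ...).
END_BEHAVIOUR = {
    "zipline_event": "jump_off_zipline",            # interactive prop
    "climbing_event": "jump_away_from_climb_area",  # interactive area
}

def end_interaction(current_event, third_touch_detected):
    """Return the exit action to perform, or None if nothing ends."""
    if not third_touch_detected or current_event not in END_BEHAVIOUR:
        return None
    return END_BEHAVIOUR[current_event]
```

Whether the third touch operation is a click or a release (the two configurations mentioned above) only changes how `third_touch_detected` is produced, not this dispatch.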
  • FIG. 13 is a schematic flowchart of another method for controlling a virtual object according to an embodiment of the present application. As shown in FIG. 13 , the process includes but is not limited to the following steps:
  • the apparatus includes: a first display module 1402, configured to display a target virtual object and a virtual action button in a display interface, where the virtual action button is used to control the target virtual object to perform a first action; a second display module 1404, configured to display first prompt information in the display interface when the target virtual object is within the trigger range of a target interaction event, where the first prompt information is used to prompt a touch operation on the virtual action button; and an execution module 1406, configured to control the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected.
  • the execution module 1406 is further configured to: control the target virtual object to perform the second action in the target interaction event when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the target area.
  • the second display module 1404 is further configured to: when a second touch operation performed on the virtual action button is detected, display the target area in the display interface, where the first touch operation is the same as or different from the second touch operation.
  • the first touch operation is a sliding operation, and the second touch operation is either a sliding operation or a long-press operation, where a long-press operation is an operation whose press duration exceeds a duration threshold.
  • the first touch operation is a sliding operation; the execution module 1406 is further configured to: when it is detected that the virtual action button is pressed and the press point disappears after sliding to the target area, determine that the sliding operation ends in the target area; or, when it is detected that the virtual action button is dragged until it overlaps the target area and the drag then ends, determine that the sliding operation ends in the target area.
  • the execution module 1406 is further configured to: when it is detected that the virtual action button is pressed and the offset distance of the press point in the target direction is greater than the distance threshold, determine that a sliding operation performed on the virtual action button is detected, and use the sliding operation as the first touch operation.
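The detection rule above — promote a press to a sliding operation once the press point has moved farther than a distance threshold along the target direction — can be sketched with a simple vector projection. The coordinate convention and threshold value are assumptions made for illustration.

```python
import math

# Hypothetical sketch: a press on the virtual action button becomes the
# first touch operation (a slide) once the press point's offset along the
# target direction exceeds a distance threshold.
def is_sliding(press_start, press_now, target_dir, dist_threshold):
    """True if the press point's offset along target_dir exceeds the threshold."""
    dx = press_now[0] - press_start[0]
    dy = press_now[1] - press_start[1]
    norm = math.hypot(*target_dir)
    # projection of the offset vector onto the (normalised) target direction
    along = (dx * target_dir[0] + dy * target_dir[1]) / norm
    return along > dist_threshold
```

For example, with the target direction pointing straight up, a 30-pixel upward offset against a 20-pixel threshold counts as a slide, while a 10-pixel offset does not.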
  • the first prompt information includes direction prompt information, where the direction prompt information is used to prompt the target direction of the touch operation performed on the virtual action button.
  • the target interaction event corresponds to an interactive prop, and the second action is an interactive action on the interactive prop; or the target interaction event corresponds to an interactive area, and the second action is an interactive action on the interactive area; or the target interaction event corresponds to an interactive virtual object, and the second action is an interactive action on the interactive virtual object.
  • the interactive prop corresponding to the target interaction event is a virtual zipline, and the second action is a zipline riding action on the virtual zipline, where the zipline riding action makes the target virtual object jump onto and hold the virtual zipline and slide along it.
  • the interactive area corresponding to the target interaction event is a climbing area, and the second action is a climbing action on the climbing area.
  • the second display module 1404 is further configured to: when the target virtual object is simultaneously within the trigger ranges of multiple target interaction events and the second touch operation performed on the virtual action button is detected, display, in the display interface, target areas corresponding to the multiple target interaction events, where each target area is used to trigger the target virtual object to perform the second action in the corresponding target interaction event, and the first touch operation is the same as or different from the second touch operation.
  • the execution module 1406 is further configured to: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in any target area, control the target virtual object to perform the second action in the target interaction event corresponding to that target area.
  • the execution module 1406 is further configured to: when a third touch operation performed on the virtual action button is detected, control the target virtual object to end the target interaction event, where the first touch operation is the same as or different from the third touch operation.
  • the execution module 1406 is further configured to: when the target interaction event corresponds to the interactive prop, control the target virtual object to end the interactive action on the interactive prop; or when the target interaction event corresponds to the interactive area, control the target virtual object to end the interactive action on the interactive area; or when the target interaction event corresponds to the interactive virtual object, control the target virtual object to end the interactive action on the interactive virtual object.
  • when the interactive prop corresponding to the target interaction event is a virtual zipline, the target virtual object is controlled to jump off the virtual zipline; when the interactive area corresponding to the target interaction event is a climbing area, the target virtual object is controlled to jump out of the climbing area.
  • An embodiment of the present application provides an electronic device for implementing the above method for controlling a virtual object, where the electronic device may be a terminal device or a server as shown in FIG. 1 .
  • This embodiment is described by taking the electronic device as a terminal device as an example.
  • the electronic device includes a memory 1502 and a processor 1504, where a computer program is stored in the memory 1502, and the processor 1504 is configured to execute the steps in the above method embodiments through the computer program.
  • the aforementioned electronic device may be at least one network device among a plurality of network devices located in a computer network.
  • the above processor may be configured to perform the following steps through a computer program: displaying the target virtual object and a virtual action button in the display interface, where the virtual action button is used to control the target virtual object to perform the first action; when the target virtual object is within the trigger range of the target interaction event, displaying first prompt information in the display interface, where the first prompt information is used to prompt a touch operation on the virtual action button; and when the first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform the second action in the target interaction event.
  • FIG. 15 is only for illustration, and the electronic device may also be a smartphone (such as an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or other terminal equipment.
  • FIG. 15 does not limit the structure of the above-mentioned electronic device.
  • the electronic device may also include more or fewer components than those shown in FIG. 15 (eg, network interfaces, etc.), or have a different configuration than that shown in FIG. 15 .
  • the memory 1502 may be used to store software programs and modules, such as the program instructions/modules corresponding to the method and apparatus for controlling virtual objects in the embodiments of this application; the processor 1504 runs the software programs and modules stored in the memory 1502, thereby executing various functional applications and data processing, that is, implementing the above method for controlling a virtual object.
  • Memory 1502 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • memory 1502 may include memory located remotely from processor 1504, which may be connected to the terminal through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the memory 1502 may be, but is not limited to, used to store information such as virtual action buttons and virtual objects.
  • the above-mentioned memory 1502 may include, but is not limited to, the first display module 1402 , the second display module 1404 and the execution module 1406 in the control device of the above-mentioned virtual object.
  • it may also include, but is not limited to, other module units in the above-mentioned virtual object control device.
  • the transmission device 1506 described above is used to receive or transmit data via a network.
  • the above-mentioned networks may include wired networks and wireless networks.
  • the transmission device 1506 includes a network adapter (Network Interface Controller, NIC), which can be connected with other network devices and routers through a network cable to communicate with the Internet or a local area network.
  • the transmission device 1506 is a radio frequency (Radio Frequency, RF) module, which is used for wirelessly communicating with the Internet.
  • the above electronic device further includes: a display 1508 for displaying the display interface (for example, a game screen and an interactive interface) of the virtual-object control application (such as a game application); and a connection bus 1510 for connecting the modules and components of the above electronic device.
  • the above terminal device or server may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by multiple nodes connected through network communication.
  • a peer-to-peer (P2P) network can be formed between the nodes, and an electronic device of any form, such as a server or a terminal, can become a node of the blockchain system by joining the peer-to-peer network.
  • Embodiments of the present application provide a computer program product or computer program, where the computer program product or computer program includes computer instructions (executable instructions), and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the electronic device reads the computer instructions from the computer-readable storage medium and executes them, so that the electronic device performs the method for controlling a virtual object provided in the various implementations described above, where the computer program is configured to perform the steps in the above method embodiments when run.
  • the above computer-readable storage medium may be configured to store a computer program for performing the following steps: displaying a target virtual object and a virtual action button in a display interface, where the virtual action button is used to control the target virtual object to perform a first action; when the target virtual object is within the trigger range of a target interaction event, displaying first prompt information in the display interface, where the first prompt information is used to prompt a touch operation on the virtual action button; and when the first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform a second action in the target interaction event.
  • the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
  • if the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium.
  • based on this understanding, the technical solution of this application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing one or more electronic devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the methods of the embodiments of this application.
  • the disclosed client may be implemented in other manners.
  • the device embodiments described above are only illustrative; for example, the division into units is only a division by logical function, and in actual implementation there may be other divisions, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units, or modules, and may be in electrical or other forms.
  • Units described as separate components may or may not be physically separated, and components shown as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and apparatus for controlling a virtual object, a storage medium, an electronic device, and a computer program product. The method includes: displaying a target virtual object and a virtual action button in a display interface, where the virtual action button is used to control the target virtual object to perform a first action; when the target virtual object is within the trigger range of a target interaction event, displaying first prompt information in the display interface, where the first prompt information is used to prompt a touch operation on the virtual action button; and when a first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform a second action in the target interaction event.

Description

Method and apparatus for controlling a virtual object, storage medium, and electronic device

Cross-reference to related applications

This application is based on, and claims priority to, Chinese patent application No. 202011270984.7 filed on November 13, 2020, the entire content of which is incorporated herein by reference.

Technical field

This application relates to the computer field, and in particular to a method and apparatus for controlling a virtual object, a storage medium, an electronic device, and a computer program product.

Background

With the development of technology and the improved performance of electronic devices, the channels for perceiving the environment and obtaining information have expanded; virtual display technology in particular enables diverse interactions based on virtual objects according to actual needs.

In the related art, a dedicated button is usually displayed for interaction once the interaction condition is met, which adds button controls to the interface and thus blocks part of the view. Taking a game scene as an example, when multiple game operations can be triggered, many trigger buttons are arranged in the trigger region, leaving the user at a loss; moreover, because many trigger buttons are needed to control the virtual object to perform predetermined actions, the control efficiency is low.

No effective solution to the above problems has yet been proposed.

Summary

An embodiment of this application provides a method for controlling a virtual object, including:

displaying a target virtual object and a virtual action button in a display interface, where the virtual action button is used to control the target virtual object to perform a first action;

when the target virtual object is within the trigger range of a target interaction event, displaying first prompt information in the display interface, where the first prompt information is used to prompt a touch operation on the virtual action button; and

when a first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform a second action in the target interaction event.

An embodiment of this application provides an apparatus for controlling a virtual object, including:

a first display module, configured to display a target virtual object and a virtual action button in a display interface, where the virtual action button is used to control the target virtual object to perform a first action;

a second display module, configured to display first prompt information in the display interface when the target virtual object is within the trigger range of a target interaction event, where the first prompt information is used to prompt a touch operation on the virtual action button; and

an execution module, configured to control the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected.

An embodiment of this application provides a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the method for controlling a virtual object provided by the embodiments of this application.

An embodiment of this application provides an electronic device including a memory and a processor, where the memory stores computer instructions and the processor is configured to implement the above method for controlling a virtual object through the computer instructions.

An embodiment of this application provides a computer program product including computer instructions which, when executed by a processor, implement the method for controlling a virtual object provided by the embodiments of this application.
Brief description of the drawings

The drawings described here are provided for a further understanding of this application and form a part of it; the illustrative embodiments of this application and their descriptions are used to explain this application and do not unduly limit it. In the drawings:
FIG. 1 is a schematic diagram of an application environment of a method for controlling a virtual object according to an embodiment of this application;

FIG. 2 is a schematic flowchart of a method for controlling a virtual object according to an embodiment of this application;

FIG. 3 is a schematic diagram of a display interface according to an embodiment of this application;

FIG. 4 is a schematic diagram of another display interface according to an embodiment of this application;

FIG. 5 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 6 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 7 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 8 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 9A is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 9B is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 10 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 11 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 12 is a schematic diagram of yet another display interface according to an embodiment of this application;

FIG. 13 is a schematic flowchart of another method for controlling a virtual object according to an embodiment of this application;

FIG. 14 is a schematic structural diagram of an apparatus for controlling a virtual object according to an embodiment of this application;

FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed description

To help those skilled in the art better understand the solutions of this application, the technical solutions in the embodiments of this application are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application without creative effort shall fall within the scope of protection of this application.

It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments of this application described here can be implemented in orders other than those illustrated or described here. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that comprises a series of steps or units is not necessarily limited to those steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such process, method, product, or device. In the following description, the term "multiple" means at least two.
First, some of the nouns or terms appearing in the description of the embodiments of this application are explained as follows:

1) Virtual scene: a scene output by an electronic device that differs from the real world. Visual perception of the virtual scene can be formed with the naked eye or with the assistance of a device, for example a two-dimensional image output through a display screen, or a three-dimensional image output through stereoscopic display technologies such as stereoscopic projection, virtual reality, and augmented reality; in addition, various perceptions simulating the real world, such as auditory, tactile, olfactory, and motion perception, can be formed through various possible hardware. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment, and may be any of a two-dimensional, 2.5-dimensional, or three-dimensional virtual scene; the embodiments of this application do not limit the dimensionality of the virtual scene.

2) Virtual object: the image of any person or thing that can interact in the virtual scene, or a movable object in the virtual scene. The movable object may be a virtual character, a virtual animal, an anime character, etc., for example a person, animal, plant, oil drum, wall, or stone displayed in the virtual scene. The virtual object may be a virtual avatar representing the user in the virtual scene. The virtual scene may include multiple virtual objects, each of which has its own shape and volume and occupies part of the space in the virtual scene. For example, the virtual object may be a virtual character in the virtual scene controlled by a user or by artificial intelligence (AI).

3) Virtual prop: also called an interactive prop, provided for virtual objects in the virtual scene to interact with. For example, a virtual prop may include a virtual zipline connecting two locations; by riding the zipline, a player can move quickly from one location to the other.

This application is described below with reference to the embodiments.
An embodiment of this application provides a method for controlling a virtual object. In some embodiments, the method may be applied to the hardware environment formed by a server 101 and a terminal device 103 as shown in FIG. 1. As shown in FIG. 1, the server 101 is connected to the terminal device 103 through a network and may provide services for the terminal device 103 or for an application 107 installed on the terminal device 103; the application may be a video application, an instant-messaging application, a browser application, an educational application, a game application, or, without limitation, any other application capable of controlling virtual objects. A database 105 may be provided in, or independently of, the server 101 to provide data storage services for the server 101, for example game data storage services. The network may include, but is not limited to, wired networks (including, without limitation, local area networks, metropolitan area networks, and wide area networks) and wireless networks (including, without limitation, Bluetooth, Wi-Fi, and other networks enabling wireless communication). The terminal device 103 may be a terminal configured with the virtual-object control application (that is, the application 107) and may include, but is not limited to, at least one of: a mobile phone (such as an Android or iOS phone), a notebook computer, a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, a desktop computer, or a smart TV. The server 101 may be a single server, a server cluster composed of multiple servers, or a cloud server, and may include, but is not limited to, a router or a gateway. Through the entry of the application 107 configured on the terminal device 103, the application 107 can be started and a virtual scene (including the target virtual object and the virtual action button) output in the display interface of the application 107. Taking the application 107 as a game application, the display interface may include a game screen (or game interface) and an interactive interface; of course, this is only an example and is not limited in this embodiment.
With reference to FIG. 1, the above method for controlling a virtual object can be implemented on the terminal device 103 through the following steps:

S1: start the application 107 on the terminal device 103 and display the target virtual object and the virtual action button in the display interface of the application 107, where the virtual action button is used to control the target virtual object to perform a first action;

S2: when the target virtual object in the application 107 is within the trigger range of a target interaction event, display first prompt information in the display interface of the application 107, where the first prompt information is used to prompt a touch operation on the virtual action button;

S3: when a first touch operation performed on the virtual action button is detected in the application 107, control the target virtual object in the application 107 to perform a second action in the target interaction event.

Taking the application 107 as a game application, the display interface may include a game screen and an interactive interface, and the virtual action button and the first prompt information may be displayed in the interactive interface.

In some embodiments, the above method for controlling a virtual object may also be implemented by, but not limited to, an application configured on the server 101, or jointly by the terminal device 103 and the server 101; for example, the server 101 may send display data of the virtual scene to the terminal device 103, so that the terminal device 103 displays the virtual scene according to the received display data. The above is only an example and is not specifically limited in this embodiment.
In some embodiments, as shown in FIG. 2, the above method for controlling a virtual object includes:

S202: display a target virtual object and a virtual action button in a display interface, where the virtual action button is used to control the target virtual object to perform a first action;

S204: when the target virtual object is within the trigger range of a target interaction event, display first prompt information in the display interface, where the first prompt information is used to prompt a touch operation on the virtual action button;

S206: when a first touch operation performed on the virtual action button is detected, control the target virtual object to perform a second action in the target interaction event.
In some embodiments, the virtual-object control application implementing the method may include, but is not limited to, game software, apps, and mini-programs, and may also include, without limitation, control functions configured in any software, app, or mini-program. The target virtual object may include, but is not limited to, the virtual object controlled after registering with the backend or server corresponding to the control application and logging in to the control application.

Taking the case where the virtual-object control application is a game application, the display interface may include a game screen and an interactive interface; the game screen and interactive interface may be configured to be displayed in the same display region, displayed overlapping each other, or displayed in different display regions (that is, the interactive interface is displayed in a region of the display interface other than the game screen). The game screen may be used to display the target virtual object, and the interactive interface may be used to display the virtual action button and the first prompt information, although the display logic is not limited to this.

In some embodiments, FIG. 3 is a schematic diagram of a display interface according to an embodiment of this application. As shown in FIG. 3, a game screen 302 and an interactive interface 304 may be displayed in a game application, where the game screen 302 contains a target virtual character 306 (the target virtual character 306 is an example of the target virtual object) and the interactive interface 304 displays a virtual action button 308. The above is only an example and is not specifically limited in this embodiment.
In some embodiments, the virtual action button may be configured to control the target virtual object to perform a preset first action when a touch operation performed on the virtual action button is detected.

For example, taking the virtual action button 308 shown in FIG. 3, the virtual action button 308 is configured as a "jump" button; that is, when a touch operation performed on the virtual action button 308 is detected, the target virtual character 306 is controlled to perform a "jump" operation.

The first action may be flexibly configured according to the virtual-object control application. For example, the first action may be configured as, without limitation, a control mode such as "jump", "crouch", "lie down", or "bend over", or a control mode such as push, pull, lift, raise, or press down; it may also be configured, without limitation, to match a preset interaction event. For example, when the interaction event is the target virtual object being in an area where a predetermined operation is allowed, the target virtual object is controlled to perform the predetermined operation; when the interaction event is the target virtual object being in an area where a virtual prop may be opened, the target virtual object is controlled to open the virtual prop; and when the interaction event is the target virtual object being in an area where a virtual prop may be used, the target virtual object is controlled to use the virtual prop. The above is only an example and is not specifically limited in this embodiment.
In some embodiments, the first action is the action initially bound to the virtual action button, and the second action is the action performed while the target virtual object is in the target interaction event. The first prompt information is used to prompt a touch operation on the virtual action button; the touch operation may include, but is not limited to, click, long press, drag, release, or double click, and different effects may also be configured for different press forces, where a long press is a press whose duration exceeds a duration threshold.

For example, when the touch operation is a drag, the first prompt information may be configured as, without limitation, a drag-direction mark added around the virtual action button. FIG. 4 is a schematic diagram of a display interface according to an embodiment of this application. As shown in FIG. 4, when the target virtual character is within the trigger range of the target interaction event, first prompt information 406 is displayed in a region 404 near the virtual action button 402; the drag direction corresponding to the drag operation may be indicated in the form of, but not limited to, an arrow, and voice information prompting the drag operation on the virtual action button may also be played.

As another example, when the touch operation is a press, the first prompt information may be configured as, without limitation, a press-force mark added around the virtual action button. FIG. 5 is a schematic diagram of a display interface according to an embodiment of this application. As shown in FIG. 5, when the target virtual character is within the trigger range of the target interaction event, first prompt information 506 is displayed in a region 504 near the virtual action button 502; the press force corresponding to the press operation may be indicated in the form of, but not limited to, a force bar, where the shaded part of the first prompt information 506 represents the current press force. A force-threshold mark may be configured in the first prompt information 506 to tell the user that the corresponding second action is performed once a press force reaching the threshold is detected, and a text mark may also be displayed in the game screen to prompt the press operation, with different text marks for the effects corresponding to different press forces. The above is only an example; the implementation may include one or a combination of the above and is not specifically limited in this embodiment.
In some embodiments, the target interaction event may include, but is not limited to, an interaction event with an interactive prop, an interaction event with an interactive area, or an interaction event with an interactive virtual object. For example, in a game application, the target interaction event may include, but is not limited to, zipline riding, rock climbing, swimming, riding, talking, shopping, and the like.

In this embodiment, by configuring the virtual action button to perform, within the trigger range of the target interaction event, a second action different from the preconfigured first action, the same virtual action button can perform different virtual actions in different areas. This improves the control efficiency of the virtual object, optimizes the user experience, and reduces adverse effects such as the view being blocked by too many virtual buttons, thereby solving the technical problem of low control efficiency of virtual objects in the related art.
In some embodiments, controlling the target virtual object to perform the second action in the target interaction event when the first touch operation performed on the virtual action button is detected includes: controlling the target virtual object to perform the second action in the target interaction event when the first touch operation performed on the virtual action button is detected and the first touch operation ends in a target area. For example, the first touch operation may include, but is not limited to, a sliding operation.

In this embodiment, controlling the target virtual object to perform the second action in the target interaction event when the first touch operation ends in the target area makes it possible, flexibly and simply, to perform the first touch operation on the same virtual action button and, according to the direction of the first touch operation, realize functions other than the button's original function. Thus, when the target virtual object is within the trigger range of the target interaction event, the number of displayed virtual action buttons can be reduced while multiple virtual actions can still be completed, solving the technical problems of complex control methods and low control efficiency of virtual objects in the related art, optimizing the user experience, and improving the efficiency with which the user controls the target virtual object to complete multiple operations.
In some embodiments, the target area may be preset by the system or the server, or flexibly configured by the user on the terminal device as needed. The target area may be configured, without limitation, to be bound to the virtual action button, so that the target area is displayed when a second touch operation performed on the virtual action button is detected. The first touch operation is the same as or different from the second touch operation; for example, the first touch operation is a sliding operation, and the second touch operation is either a sliding operation or a long-press operation. It should be noted that, when the display interface includes a game screen and an interactive interface, the target area may be displayed in the interactive interface.

For example, FIG. 6 is a schematic diagram of a display interface according to an embodiment of this application. As shown in FIG. 6, a virtual action button 604 is displayed in the display interface 602. When a sliding operation performed on the virtual action button 604 is detected, the target area 606 is pulled up and displayed; when the sliding operation is detected to end on the target area 606, the target virtual character is controlled to perform the second action in the target interaction event. Here, the sliding operation is both the first touch operation and the second touch operation.

In some embodiments, the second action corresponding to the target interaction event may be set independently, or may be set as a second action corresponding both to the display identifier of the target area and to the target interaction event.
In some embodiments, the method for controlling a virtual object further includes:

S1: when a sliding operation performed on the virtual action button is detected, displaying the target area in the display interface; or

S2: when a long-press operation performed on the virtual action button is detected, displaying the target area in the display interface.

In some embodiments, the virtual action button may be configured, without limitation, to display the target area through a sliding operation or a long-press operation; the sliding case may be as shown in FIG. 6, and the long-press case may be as shown in FIG. 7. FIG. 7 is a schematic diagram of a display interface according to an embodiment of this application: when a long-press operation performed on the virtual action button 704 is detected, the target area 706 is displayed in the display interface 702. The above is only an example and is not specifically limited in this embodiment.

In this embodiment, displaying the target area when a second touch operation (such as a sliding or long-press operation) performed on the virtual action button is detected enables different operations to be performed based on the same virtual action button, reducing the number of displayed virtual action buttons, solving the technical problems of complex control methods and low control efficiency of virtual characters in the related art, optimizing the user experience, and improving the efficiency with which the user controls the virtual object to complete multiple operations.
In some embodiments, the method for controlling a virtual object further includes: when it is detected that the virtual action button is pressed and the offset distance of the press point in the target direction is greater than a distance threshold, determining that a sliding operation performed on the virtual action button is detected, and using the sliding operation as the first touch operation. For example, when it is detected that the virtual action button is held down by the user's finger and the finger's offset distance in the target direction exceeds a preset distance threshold, it is determined that a sliding operation performed on the virtual action button is detected, and the sliding operation is used as the first touch operation. It should be noted that the target direction may be the direction indicated by the first prompt information, or any direction; the distance threshold may be preset by the system or the server, or flexibly configured according to the size of the display interface currently shown on the terminal device (such as at least one of the size of the game screen and the size of the interactive interface).
In some embodiments, a press-detection touch point may be provided, without limitation, in the display region corresponding to the virtual action button to detect whether the virtual action button is pressed (for example, whether it is held down by the user's finger). For example, the press force in the display region corresponding to the virtual action button may be obtained, and the virtual action button is determined to be pressed when the press force exceeds a preset force threshold and is maintained for longer than a preset hold-duration threshold.

In some embodiments, the displacement distance of the press point in the display interface may be obtained as the offset distance.
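The press-detection rule just described (force above a force threshold, held longer than a hold-duration threshold) can be sketched as a tiny predicate. The threshold values below are arbitrary illustrative defaults, not values from the patent.

```python
# Hypothetical sketch: a touch inside the button's display region counts as
# a press only when its force exceeds a force threshold AND it has been
# held for longer than a hold-duration threshold.
def is_button_pressed(force, held_ms, force_threshold=0.3, hold_threshold_ms=50):
    """True once both the force and hold-duration thresholds are exceeded."""
    return force > force_threshold and held_ms > hold_threshold_ms
```

Requiring both conditions filters out grazing touches (low force) and momentary taps (short duration) before the slide-offset check is ever evaluated.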
For example, FIG. 8 is a schematic diagram of a display interface according to an embodiment of this application. As shown in FIG. 8, the process includes the following steps:

S1: detect, at the virtual action button 804 of the display interface 802, whether the button is pressed (FIG. 8 takes finger pressing as an example);

S2: when the virtual action button 804 is detected to be pressed, obtain the offset distance of the press point in the target direction 806;

S3: when the offset distance is greater than a predetermined threshold and the press point has not disappeared (the finger remains held down in FIG. 8), determine that a sliding operation performed on the virtual action button 804 is detected.

In this embodiment, determining that a sliding operation performed on the virtual action button is detected when the button is pressed and the press point's offset distance in the target direction exceeds the distance threshold, and using the sliding operation as the first touch operation, enables the virtual action button to be reused and reduces the number of displayed virtual action buttons, solving the technical problems of complex control methods and low control efficiency of virtual objects in the related art, optimizing the user experience, and improving the efficiency with which the user controls the virtual object to complete multiple operations through a single virtual action button.
在一些实施例中,第一触控操作为滑动操作;虚拟对象的控制方法还包括:当检测到虚拟动作按钮被按压、且按压点滑动到目标区域后消失时, 确定滑动操作结束于目标区域;或者当检测到虚拟动作按钮被拖拽到与目标区域重叠后结束拖拽时,确定滑动操作结束于目标区域。例如,当检测到虚拟动作按钮被手指按住、且手指滑动到目标区域后松开时,确定滑动操作结束于目标区域;或者当检测到虚拟动作按钮被拖拽到与目标区域重叠后松开时,确定滑动操作结束于目标区域。
在一些实施例中,可以在屏幕下方设置多个按压检测触点,当在虚拟动作按钮与目标区域之间的多个按压检测触点均检测到按压操作时,确定检测到按压点滑动到目标区域,如手指滑动到目标区域。然后,当在目标区域的检测触点未检测到按压操作时,确定滑动操作结束于目标区域,如手指在目标区域松开。
在一些实施例中,若检测到虚拟动作按钮被拖拽移动、且被移动至上述目标区域时位于目标区域的按压检测触点未检测到按压操作,则确定检测到虚拟动作按钮被拖拽到与目标区域重叠后结束拖拽(结束拖拽如手指松开)。
图9A以及图9B均是根据本申请实施例的显示界面的示意图,如图9A所示,图9A表示显示界面902中的虚拟动作按钮904被手指按住、且手指滑动到目标区域906后松开;图9B表示显示界面902中的虚拟动作按钮904被拖拽到与目标区域906重叠后松开。
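判断滑动操作是否“结束于目标区域”,本质上是在检测到松手时判断按压点坐标是否落入目标区域的范围内。下面给出一个以矩形表示目标区域的 Python 草图,矩形表示方式与函数名均为本说明假设:

```python
def point_in_rect(point, rect):
    """rect 为 (x, y, w, h) 形式的矩形;判断坐标点是否落在该矩形内。"""
    x, y, w, h = rect
    px, py = point
    return x <= px <= x + w and y <= py <= y + h

def slide_ends_in_region(release_point, target_rect):
    # 对应实施例:按压点滑动到目标区域后消失(如手指在目标区域松开),
    # 即判定滑动操作结束于目标区域
    return point_in_rect(release_point, target_rect)
```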
在一些实施例中,虚拟对象的控制方法还包括:当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作的执行对象从虚拟动作按钮更新为目标区域时,在显示界面中显示第二提示信息,其中,第二提示信息用于提示在目标区域结束第一触控操作。
在一些实施例中,在显示界面中显示第二提示信息,包括:在显示界面中执行以下至少一种处理:更新目标区域的显示状态;显示文字标识以及动画效果中的至少之一;更新虚拟动作按钮的标识信息。例如,当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作的执行对象从虚拟动作按钮更新为目标区域时,可以通过包括但不限于高亮显示目标区域的方式提示用户,或者,通过文字标识、语音播报、游戏画面的动画效果以及更换虚拟动作按钮的标识信息等方式实现,具体实现过程可以包括但不限于上述一种或者多种的组合。上述仅是一种示例,本实施例不做任何具体的限定。
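上述第二提示信息的几种呈现方式,可以统一为对界面提示状态的一次更新,示意如下(字典字段名与文字内容均为本说明假设):

```python
def show_second_prompt(ui_state):
    """当第一触控操作的执行对象从虚拟动作按钮更新为目标区域时,更新界面提示状态。"""
    ui_state['region_highlight'] = True      # 更新目标区域的显示状态(如高亮)
    ui_state['prompt_text'] = '松手即可触发'   # 显示文字标识,提示在目标区域结束第一触控操作
    ui_state['button_icon'] = 'active'       # 更新虚拟动作按钮的标识信息
    return ui_state
```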
通过本实施例,当检测到虚拟动作按钮被按压、且按压点滑动到目标区域后消失时,确定滑动操作结束于目标区域;或者当检测到虚拟动作按钮被拖拽到与目标区域重叠后结束拖拽时,确定滑动操作结束于目标区域。如此,能够通过不同方式来确定滑动操作在目标区域上结束,以实现根据不同应用的模式(如不同游戏应用的游戏模式)来配置对应的检测方式,降低虚拟动作按钮的显示数量,解决相关技术中存在的虚拟对象的控制方式复杂、控制效率较低的技术问题,达到优化用户的使用体验,提高用户控制虚拟对象完成多个操作的效率的技术效果。
在一些实施例中,第一提示信息包括方向提示信息,方向提示信息用于提示对虚拟动作按钮执行的触控操作的目标方向。
在一些实施例中,上述方向提示信息可以包括但不限于箭头、文字等标识信息,其中,上述箭头可以通过高亮、闪动等形式表示上述目标虚拟对象处于目标互动事件对应的触发范围中。
通过本实施例,采用在显示界面上显示方向提示信息,有助于用户直观获知触控操作的方向,进而,根据获知的方向执行触控操作,以实现基于同一个虚拟动作按钮控制目标虚拟对象执行不同动作的技术方案,解决相关技术中存在的虚拟对象的控制方式复杂、控制效率较低的技术问题,达到优化用户的使用体验,提高用户控制虚拟对象完成多个操作的效率的技术效果。
在一些实施例中,目标互动事件对应可互动道具,第二动作为对可互动道具的互动动作;或者目标互动事件对应可互动区域,第二动作为对可互动区域的互动动作;或者目标互动事件对应可互动虚拟对象,第二动作为对可互动虚拟对象的互动动作。
在一些实施例中,目标互动事件对应的可互动道具为虚拟滑索(即目标互动事件为滑索乘骑事件),第二动作为对虚拟滑索的乘骑滑索动作,其中,乘骑滑索动作用于使得目标虚拟对象跳上并拉住虚拟滑索,并沿着虚拟滑索滑动。
在一些实施例中,滑索乘骑事件可以包括但不限于目标虚拟对象位于能够触发目标虚拟对象使用滑索功能或者乘骑功能对应的区域中所触发的目标互动事件。
图10是根据本申请实施例的显示界面的示意图,如图10所示,例如,在显示界面1002中显示有目标虚拟角色1004以及虚拟滑索1006,当检测到对虚拟动作按钮1008执行的滑动操作移动至目标区域时,控制目标虚拟角色1004在滑索乘骑事件中跳上并拉住虚拟滑索1006,并沿着虚拟滑索1006滑动。上述仅是一种示例,本实施例不做任何其他具体的限定。
通过本实施例,在目标互动事件为滑索乘骑事件的情况下,控制目标虚拟对象在滑索乘骑事件中执行乘骑滑索动作,使得目标虚拟对象跳上并拉住虚拟滑索,并沿着虚拟滑索滑动,如此,增加了应用的玩法、优化了用户的使用体验。
在一些实施例中,目标互动事件对应的可互动区域为攀爬区域(即目标互动事件为攀爬事件),第二动作为对攀爬区域的攀爬动作。
在一些实施例中,攀爬事件可以包括但不限于目标虚拟对象位于能够触发虚拟对象使用攀爬功能对应的区域中所触发的目标互动事件。
图11是根据本申请实施例的显示界面的示意图,如图11所示,例如,在显示界面1102中显示有目标虚拟角色1104以及虚拟攀爬区域1106(即攀爬区域),当检测到对虚拟动作按钮1108执行的滑动操作移动至目标区域时,控制目标虚拟角色1104在虚拟攀爬区域1106执行攀爬动作。上述仅是一种示例,本实施例不做任何其他具体的限定。
通过本实施例,在目标互动事件为攀爬事件的情况下,控制目标虚拟对象在攀爬事件中执行攀爬动作,如此,增加了应用的玩法、优化了用户的使用体验。
在一些实施例中,虚拟对象的控制方法还包括:当目标虚拟对象同时位于多个目标互动事件的触发范围内、且检测到对虚拟动作按钮执行的第二触控操作时,在显示界面中显示多个目标互动事件分别对应的目标区域,其中,目标区域用于触发目标虚拟对象在对应的目标互动事件中执行第二动作;其中,第一触控操作与第二触控操作相同或不同。例如,目标互动事件包括第一互动事件以及第二互动事件,当目标虚拟对象同时位于第一互动事件的触发范围以及第二互动事件的触发范围内、且检测到对虚拟动作按钮执行的第二触控操作时,在显示界面中显示第一区域和第二区域,其中,第一区域是第一互动事件对应的目标区域,第一区域用于触发目标虚拟对象在第一互动事件中执行第二动作A,第二区域是第二互动事件对应的目标区域,第二区域用于触发目标虚拟对象在第二互动事件中执行第二动作B。
在一些实施例中,第二动作可以包括但不限于使用滑索、攀岩、游泳、乘骑、交谈、购物等任何互动动作。
在一些实施例中,可以包括但不限于将第二动作A以及第二动作B设置为不同的虚拟动作,还可以包括但不限于将第二动作A以及第二动作B设置为相同的虚拟动作,但分别对应于不同的可互动虚拟对象。
在一些实施例中,第一区域和第二区域可以由系统或者服务器进行灵活设置,第一区域和第二动作A之间的匹配关系以及第二区域和第二动作B之间的匹配关系可以包括但不限于由系统或者服务器进行灵活设置。
通过本实施例,当目标虚拟对象同时位于多个目标互动事件的触发范围内、且检测到对虚拟动作按钮执行的第二触控操作时,在显示界面中显示多个目标互动事件分别对应的目标区域,其中,目标区域用于触发目标虚拟对象在对应的目标互动事件中执行第二动作。如此,用户可以根据实际需要进行选择,避免在显示界面中显示分别对应于不同目标互动事件的虚拟动作按钮,本申请实施例能够实现基于同一个虚拟动作按钮控制目标虚拟对象执行不同动作的技术方案,解决相关技术中存在的虚拟对象的控制方式复杂、控制效率较低的技术问题,达到优化用户的使用体验,提高用户控制虚拟对象完成多个互动操作的效率的技术效果。
在一些实施例中,当检测到对虚拟动作按钮执行的第一触控操作时,控制目标虚拟对象在目标互动事件中执行第二动作,包括:当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作结束于任意一个目标区域时,控制目标虚拟对象在任意一个目标区域对应的目标互动事件中执行第二动作。例如,当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作结束于第一区域时,控制目标虚拟对象在第一互动事件中执行第二动作A;当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作结束于第二区域时,控制目标虚拟对象在第二互动事件中执行第二动作B。
在一些实施例中,图12是根据本申请实施例的显示界面的示意图,如图12所示,在显示界面1202中显示有虚拟动作按钮1204,当检测到对虚拟动作按钮执行的滑动操作时,在显示界面1202中显示第一区域1206以及第二区域1208,如图12中的阴影部分。当滑动操作结束于第一区域1206时,控制目标虚拟角色执行第二动作A;当滑动操作结束于第二区域1208时,控制目标虚拟角色执行第二动作B。这里,滑动操作既是第一触控操作,也是第二触控操作。上述仅是一种示例,本实施例不做任何具体的限定。
在一些实施例中,第一互动事件为滑索乘骑事件,第二动作A包括乘骑滑索动作,乘骑滑索动作用于使得目标虚拟对象跳上并拉住虚拟滑索,并沿着虚拟滑索滑动;第二互动事件为攀爬事件,第二动作B包括攀爬动作。
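当目标虚拟对象同时位于多个目标互动事件的触发范围内时,可以将各目标区域与对应的第二动作建立映射,在滑动结束时按松手点命中的区域分发动作。以下为一个假设性的 Python 示意,其中区域坐标与动作名称仅作举例:

```python
def dispatch_action(release_point, regions):
    """regions 为 (rect, action) 列表,rect 为 (x, y, w, h) 矩形。

    返回松手点命中的目标区域所对应的第二动作;未命中任何区域则不触发动作。
    """
    for (x, y, w, h), action in regions:
        if x <= release_point[0] <= x + w and y <= release_point[1] <= y + h:
            return action
    return None  # 滑动未结束于任何目标区域:不触发动作

# 例如:第一区域对应"乘骑滑索"(第二动作A),第二区域对应"攀爬"(第二动作B)
regions = [((100, 50, 80, 40), 'ride_zipline'),
           ((100, 100, 80, 40), 'climb')]
```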
通过本实施例,当目标虚拟对象同时位于多个目标互动事件的触发范围内、且检测到对虚拟动作按钮执行的第二触控操作时,在显示界面中显示多个目标互动事件分别对应的目标区域,其中,目标区域用于触发目标虚拟对象在对应的目标互动事件中执行第二动作。如此,避免了在显示界面中显示分别对应于不同目标互动事件的多个虚拟动作按钮,能够实现基于同一个虚拟动作按钮控制目标虚拟对象执行不同动作的技术方案,解决相关技术中存在的虚拟对象的控制方式复杂、控制效率较低的技术问题,达到优化用户的使用体验,提高用户控制虚拟对象完成多个互动操作的效率的技术效果。
在一些实施例中,控制目标虚拟对象在目标互动事件中执行第二动作之后,还包括:当检测到对虚拟动作按钮执行的第三触控操作时,控制目标虚拟对象结束目标互动事件;其中,第一触控操作与第三触控操作相同或不同。
在一些实施例中,上述第三触控操作可以包括但不限于设置为与第一触控操作相同,例如,在第一触控操作被配置为点击操作的情况下,可以将第三触控操作也配置为点击操作。上述第三触控操作还可以包括但不限于设置为与第一触控操作不同,例如,在第一触控操作被配置为按压操作的情况下,可以将第三触控操作配置为松开操作。
换言之,可以根据实际情况,通过获取第三触控操作,并响应于上述第三触控操作,以结束上述目标互动事件。
在一些实施例中,控制目标虚拟对象结束目标互动事件,包括:当目标互动事件对应可互动道具时,控制目标虚拟对象结束对可互动道具的互动动作;或者当目标互动事件对应可互动区域时,控制目标虚拟对象结束对可互动区域的互动动作;或者当目标互动事件对应可互动虚拟对象时,控制目标虚拟对象结束对可互动虚拟对象的互动动作。
在一些实施例中,当目标互动事件对应可互动道具时,控制目标虚拟对象结束对可互动道具的互动动作,包括:当目标互动事件对应的可互动道具为虚拟滑索时,控制目标虚拟对象跳下虚拟滑索。即在目标互动事件为滑索乘骑事件的情况下,控制目标虚拟对象跳下虚拟滑索。
在一些实施例中,当目标互动事件对应可互动区域时,控制目标虚拟对象结束对可互动区域的互动动作,包括:当目标互动事件对应的可互动区域为攀爬区域时,控制目标虚拟对象跳离攀爬区域。即在目标互动事件为攀爬事件的情况下,控制目标虚拟对象跳离攀爬区域。上述仅是一种示例,本实施例不做任何具体的限定。
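结束目标互动事件的处理同样可以按事件对应的可互动对象类型进行分发,以下为假设性的 Python 示意(类型与动作名称均为举例):

```python
def end_interaction(event_type):
    """根据目标互动事件对应的可互动对象类型,返回相应的结束动作。"""
    end_actions = {
        'zipline':    'jump_off_zipline',  # 可互动道具为虚拟滑索:控制目标虚拟对象跳下滑索
        'climb_area': 'jump_off_wall',     # 可互动区域为攀爬区域:控制目标虚拟对象跳离攀爬区域
        'npc':        'stop_interacting',  # 可互动虚拟对象:结束交谈等互动动作
    }
    return end_actions.get(event_type)
```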
需要说明的是,对于前述的各方法实施例,为了简单描述,故将其都表述为一系列的动作组合,但是本领域技术人员应该知悉,本申请并不受所描述的动作顺序的限制,因为依据本申请,某些步骤可以采用其他顺序或者同时进行。其次,本领域技术人员也应该知悉,说明书中所描述的实施例均属于示例实施例,所涉及的动作和模块并不一定是本申请所必须的。
下面结合游戏虚拟场景的示例,对本申请进行解释说明。
图13是根据本申请实施例的另一种虚拟对象的控制方法的流程示意图,如图13所示,该流程包括但不限于如下步骤:
S1302,在游戏应用中开始一局游戏;
S1304,检测到跳跃按钮被用户的手指按压(对应于前述的获取对虚拟动作按钮的第一触控操作);
S1306,判断手指在y轴的偏移距离是否超过x像素;
S1308,当偏移距离未超过x像素时,检测到松手(手指松开)时播放跳跃动作(对应于前述的第一动作);
S1310,结束当前流程;
S1312,当偏移距离已超过x像素时,触发“乘骑”区域(对应于前述的目标区域);
S1314,判断手指是否滑动至“乘骑”区域;
S1316,当手指未滑动至“乘骑”区域时,检测到松手时不触发任何动作;
S1318,结束当前流程;
S1320,当手指滑动至“乘骑”区域时,“乘骑”区域被激活,变为高亮状态;
S1322,判断手指松开时手指的位置是否仍在“乘骑”区域;
S1324,当手指松开、且松开的位置不在“乘骑”区域时,不触发任何动作;
S1326,当手指松开时、且松开的位置在“乘骑”区域时,触发“乘骑”滑索动作(对应于前述的第二动作);
S1328,结束当前流程。
通过本实施例,在用户按下跳跃按钮时判定手指是否在y轴有偏移,偏移距离未超过某个值x时松手判定为“点击”,触发跳跃行为;偏移距离超过某个值x时判定为“滑动”,则“乘骑”区域被触发显示出来。此时,判定手指是否滑动到“乘骑”区域上,如否,则此时松手不触发任何动作;如果是,则“乘骑”区域被激活高亮,提示玩家松手即将触发。最后判定手指是否在“乘骑”区域松开,如是,则触发“乘骑”滑索动作;如否,则不触发任何动作。因此,在游戏应用中,利用同一个虚拟动作按钮的不同交互方式(点击、滑动、不同方向的滑动)实现了多种操作,减少了多个按钮对界面的遮挡,使得视野更为集中,解决相关技术中存在的虚拟对象的控制方式复杂、控制效率较低的技术问题,达到优化用户的使用体验,提高用户控制虚拟对象完成多个操作的效率的技术效果。
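图13所示的“按压→偏移判定→区域激活→松手判定”流程,可以概括为如下 Python 示意代码,其中阈值与返回值均为本说明假设:

```python
OFFSET_X = 40  # 假设的 y 轴偏移阈值,对应流程中的"x 像素"

def resolve_release(press_y, release_point, ride_rect):
    """模拟图13流程:根据松手时的状态决定触发的动作。

    press_y 为按压起点的 y 坐标;release_point 为松手点 (x, y);
    ride_rect 为"乘骑"区域矩形 (x, y, w, h)。
    返回 'jump'(跳跃动作)、'ride'(乘骑滑索动作)或 None(不触发任何动作)。
    """
    if abs(release_point[1] - press_y) <= OFFSET_X:
        return 'jump'  # 偏移未超过阈值:判定为"点击",播放跳跃动作(第一动作)
    x, y, w, h = ride_rect
    in_region = x <= release_point[0] <= x + w and y <= release_point[1] <= y + h
    # 判定为"滑动"后,仅当手指在"乘骑"区域松开时才触发乘骑滑索动作(第二动作)
    return 'ride' if in_region else None
```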
本申请实施例提供了一种用于实施上述虚拟对象的控制方法的虚拟对象的控制装置。如图14所示,该装置包括:第一显示模块1402,配置为在显示界面中显示目标虚拟对象以及虚拟动作按钮,其中,虚拟动作按钮用于控制目标虚拟对象执行第一动作;第二显示模块1404,配置为当目标虚拟对象位于目标互动事件的触发范围内时,在显示界面中显示第一提示信息,其中,第一提示信息用于提示对虚拟动作按钮执行触控操作;执行模块1406,配置为当检测到对虚拟动作按钮执行的第一触控操作时,控制目标虚拟对象在目标互动事件中执行第二动作。
在一些实施例中,执行模块1406还配置为:当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作结束于目标区域时,控制目标虚拟对象在目标互动事件中执行第二动作。
在一些实施例中,第二显示模块1404还配置为:当检测到对虚拟动作按钮执行的第二触控操作时,在显示界面中显示目标区域;其中,第一触控操作与第二触控操作相同或不同。
在一些实施例中,第一触控操作为滑动操作,第二触控操作为滑动操作以及长按操作中的任意一种;长按操作表示按压时长大于时长阈值的操作。
在一些实施例中,第一触控操作为滑动操作;执行模块1406还配置为:当检测到虚拟动作按钮被按压、且按压点滑动到目标区域后消失时,确定滑动操作结束于目标区域;或者当检测到虚拟动作按钮被拖拽到与目标区域重叠后结束拖拽时,确定滑动操作结束于目标区域。
在一些实施例中,执行模块1406还配置为:当检测到虚拟动作按钮被按压、且按压点在目标方向上的偏移距离大于距离阈值时,确定检测到对虚拟动作按钮执行的滑动操作,并将滑动操作作为第一触控操作。
在一些实施例中,第一提示信息包括方向提示信息,方向提示信息用于提示对虚拟动作按钮执行的触控操作的目标方向。
在一些实施例中,目标互动事件对应可互动道具,第二动作为对可互动道具的互动动作;或者目标互动事件对应可互动区域,第二动作为对可互动区域的互动动作;或者目标互动事件对应可互动虚拟对象,第二动作为对可互动虚拟对象的互动动作。
在一些实施例中,目标互动事件对应的可互动道具为虚拟滑索,第二动作为对虚拟滑索的乘骑滑索动作,其中,乘骑滑索动作用于使得目标虚拟对象跳上并拉住虚拟滑索,并沿着虚拟滑索滑动。
在一些实施例中,目标互动事件对应的可互动区域为攀爬区域,第二动作为对攀爬区域的攀爬动作。
在一些实施例中,第二显示模块1404还配置为:当目标虚拟对象同时位于多个目标互动事件的触发范围内、且检测到对虚拟动作按钮执行的第二触控操作时,在显示界面中显示多个目标互动事件分别对应的目标区域,其中,目标区域用于触发目标虚拟对象在对应的目标互动事件中执行第二动作;其中,第一触控操作与第二触控操作相同或不同。
在一些实施例中,执行模块1406还配置为:当检测到对虚拟动作按钮执行的第一触控操作、且第一触控操作结束于任意一个目标区域时,控制目标虚拟对象在任意一个目标区域对应的目标互动事件中执行第二动作。
在一些实施例中,执行模块1406还配置为:当检测到对虚拟动作按钮执行的第三触控操作时,控制目标虚拟对象结束目标互动事件;其中,第一触控操作与第三触控操作相同或不同。
在一些实施例中,执行模块1406还配置为:当目标互动事件对应可互动道具时,控制目标虚拟对象结束对可互动道具的互动动作;或者当目标互动事件对应可互动区域时,控制目标虚拟对象结束对可互动区域的互动动作;或者当目标互动事件对应可互动虚拟对象时,控制目标虚拟对象结束对可互动虚拟对象的互动动作。
在一些实施例中,当目标互动事件对应的可互动道具为虚拟滑索时,控制目标虚拟对象跳下虚拟滑索;当目标互动事件对应的可互动区域为攀爬区域时,控制目标虚拟对象跳离攀爬区域。
本申请实施例提供了一种用于实施上述虚拟对象的控制方法的电子设备,该电子设备可以是图1所示的终端设备或服务器。本实施例以该电子设备为终端设备为例来说明。如图15所示,该电子设备包括存储器1502和处理器1504,该存储器1502中存储有计算机程序,该处理器1504被设置为通过计算机程序执行上述方法实施例中的步骤。
在一些实施例中,上述电子设备可以是位于计算机网络的多个网络设备中的至少一个网络设备。
在一些实施例中,上述处理器可以被设置为通过计算机程序执行以下步骤:在显示界面中显示目标虚拟对象以及虚拟动作按钮,其中,虚拟动作按钮用于控制目标虚拟对象执行第一动作;当目标虚拟对象位于目标互动事件的触发范围内时,在显示界面中显示第一提示信息,其中,第一提示信息用于提示对虚拟动作按钮执行触控操作;当检测到对虚拟动作按钮执行的第一触控操作时,控制目标虚拟对象在目标互动事件中执行第二动作。
在一些实施例中,本领域普通技术人员可以理解,图15所示的结构仅为示意,电子设备也可以是智能手机(如Android手机、iOS手机等)、平板电脑、掌上电脑以及移动互联网设备(Mobile Internet Devices,MID)、PAD等终端设备。图15并不对上述电子设备的结构造成限定。例如,电子设备还可包括比图15中所示更多或者更少的组件(如网络接口等),或者具有与图15所示不同的配置。
其中,存储器1502可用于存储软件程序以及模块,如本申请实施例中的虚拟对象的控制方法和装置对应的程序指令/模块,处理器1504通过运行存储在存储器1502内的软件程序以及模块,从而执行各种功能应用以及数据处理,即实现上述的虚拟对象的控制方法。存储器1502可包括高速随机存储器,还可以包括非易失性存储器,如一个或者多个磁性存储装置、闪存、或者其他非易失性固态存储器。在一些实例中,存储器1502可包括相对于处理器1504远程设置的存储器,这些远程存储器可以通过网络连接至终端。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。其中,存储器1502可以但不限于用于存储虚拟动作按钮与虚拟对象等信息。作为一种示例,如图15所示,上述存储器1502中可以但不限于包括上述虚拟对象的控制装置中的第一显示模块1402、第二显示模块1404及执行模块1406。此外,还可以包括但不限于上述虚拟对象的控制装置中的其他模块单元。
在一些实施例中,上述的传输装置1506用于经由一个网络接收或者发送数据。上述的网络可包括有线网络及无线网络。在一个示例中,传输装置1506包括一个网络适配器(Network Interface Controller,NIC),其可通过网线与其他网络设备以及路由器相连,从而可与互联网或局域网进行通讯。在一个示例中,传输装置1506为射频(Radio Frequency,RF)模块,其用于通过无线方式与互联网进行通讯。
此外,上述电子设备还包括:显示器1508,用于显示虚拟对象的控制应用(如游戏应用)的显示界面(例如可以包括游戏画面和交互界面);连接总线1510,用于连接上述电子设备中的各个模块部件。
在其他实施例中,上述终端设备或者服务器可以是一个分布式系统中的一个节点,其中,该分布式系统可以为区块链系统,该区块链系统可以是由多个节点通过网络通信的形式连接形成的分布式系统。其中,节点之间可以组成点对点(P2P,Peer To Peer)网络,任意形式的电子设备,比如服务器、终端等电子设备都可以通过加入该点对点网络而成为该区块链系统中的一个节点。
本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令(可执行指令),该计算机指令存储在计算机可读存储介质中。电子设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该电子设备执行上述虚拟对象的控制方面的各种实现方式中提供的虚拟对象的控制方法,其中,该计算机程序被设置为运行时执行上述方法实施例中的步骤。
在一些实施例中,上述计算机可读存储介质可以被设置为存储用于执行以下步骤的计算机程序:在显示界面中显示目标虚拟对象以及虚拟动作按钮,其中,虚拟动作按钮用于控制目标虚拟对象执行第一动作;当目标虚拟对象位于目标互动事件的触发范围内时,在显示界面中显示第一提示信息,其中,第一提示信息用于提示对虚拟动作按钮执行触控操作;当检测到对虚拟动作按钮执行的第一触控操作时,控制目标虚拟对象在目标互动事件中执行第二动作。
在一些实施例中,本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令终端设备相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,存储介质可以包括:闪存盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁盘或光盘等。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
上述实施例中的集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在上述计算机可读取的存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在存储介质中,包括若干指令用以使得一台或多台电子设备(可为个人计算机、服务器或者网络设备等)执行本申请各个实施例方法的全部或部分步骤。
在本申请的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的几个实施例中,应该理解到,所揭露的客户端,可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
以上仅是本申请的示例实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本申请原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本申请的保护范围。

Claims (22)

  1. 一种虚拟对象的控制方法,由电子设备执行,所述方法包括:
    在显示界面中显示目标虚拟对象以及虚拟动作按钮,其中,所述虚拟动作按钮用于控制所述目标虚拟对象执行第一动作;
    当所述目标虚拟对象位于目标互动事件的触发范围内时,在所述显示界面中显示第一提示信息,其中,所述第一提示信息用于提示对所述虚拟动作按钮执行触控操作;
    当检测到对所述虚拟动作按钮执行的第一触控操作时,控制所述目标虚拟对象在所述目标互动事件中执行第二动作。
  2. 根据权利要求1所述的方法,其中,所述当检测到对所述虚拟动作按钮执行的第一触控操作时,控制所述目标虚拟对象在所述目标互动事件中执行第二动作,包括:
    当检测到对所述虚拟动作按钮执行的第一触控操作、且所述第一触控操作结束于目标区域时,控制所述目标虚拟对象在所述目标互动事件中执行所述第二动作。
  3. 根据权利要求2所述的方法,其中,所述方法还包括:
    当检测到对所述虚拟动作按钮执行的第二触控操作时,在所述显示界面中显示所述目标区域;
    其中,所述第一触控操作与所述第二触控操作相同或不同。
  4. 根据权利要求3所述的方法,其中,所述第一触控操作为滑动操作,所述第二触控操作为所述滑动操作以及长按操作中的任意一种;所述长按操作表示按压时长大于时长阈值的操作。
  5. 根据权利要求2所述的方法,其中,所述第一触控操作为滑动操作;所述方法还包括:
    当检测到所述虚拟动作按钮被按压、且按压点滑动到所述目标区域后消失时,确定所述滑动操作结束于所述目标区域;或者
    当检测到所述虚拟动作按钮被拖拽到与所述目标区域重叠后结束拖拽时,确定所述滑动操作结束于所述目标区域。
  6. 根据权利要求2所述的方法,其中,所述方法还包括:
    当检测到对所述虚拟动作按钮执行的第一触控操作、且所述第一触控操作的执行对象从所述虚拟动作按钮更新为所述目标区域时,在所述显示界面中显示第二提示信息,其中,所述第二提示信息用于提示在所述目标区域结束所述第一触控操作。
  7. 根据权利要求6所述的方法,其中,所述在所述显示界面中显示第二提示信息,包括:
    在所述显示界面中执行以下至少一种处理:
    更新所述目标区域的显示状态;
    显示文字标识以及动画效果中的至少之一;
    更新所述虚拟动作按钮的标识信息。
  8. 根据权利要求1所述的方法,其中,所述方法还包括:
    当检测到所述虚拟动作按钮被按压、且按压点在目标方向上的偏移距离大于距离阈值时,确定检测到对所述虚拟动作按钮执行的滑动操作,并将所述滑动操作作为第一触控操作。
  9. 根据权利要求1所述的方法,其中,所述第一提示信息包括方向提示信息,所述方向提示信息用于提示对所述虚拟动作按钮执行的触控操作的目标方向。
  10. 根据权利要求1所述的方法,其中,所述目标互动事件对应可互动道具,所述第二动作为对所述可互动道具的互动动作;或者
    所述目标互动事件对应可互动区域,所述第二动作为对所述可互动区域的互动动作;或者
    所述目标互动事件对应可互动虚拟对象,所述第二动作为对所述可互动虚拟对象的互动动作。
  11. 根据权利要求10所述的方法,其中,所述目标互动事件对应的可互动道具为虚拟滑索,所述第二动作为对所述虚拟滑索的乘骑滑索动作,其中,所述乘骑滑索动作用于使得所述目标虚拟对象跳上并拉住所述虚拟滑索,并沿着所述虚拟滑索滑动。
  12. 根据权利要求10所述的方法,其中,所述目标互动事件对应的可互动区域为攀爬区域,所述第二动作为对所述攀爬区域的攀爬动作。
  13. 根据权利要求1所述的方法,其中,所述方法还包括:
    当所述目标虚拟对象同时位于多个目标互动事件的触发范围内、且检测到对所述虚拟动作按钮执行的第二触控操作时,在所述显示界面中显示所述多个目标互动事件分别对应的目标区域,其中,所述目标区域用于触发所述目标虚拟对象在对应的目标互动事件中执行第二动作;
    其中,所述第一触控操作与所述第二触控操作相同或不同。
  14. 根据权利要求13所述的方法,其中,所述当检测到对所述虚拟动作按钮执行的第一触控操作时,控制所述目标虚拟对象在所述目标互动事件中执行第二动作,包括:
    当检测到对所述虚拟动作按钮执行的第一触控操作、且所述第一触控操作结束于任意一个目标区域时,控制所述目标虚拟对象在所述任意一个目标区域对应的目标互动事件中执行第二动作。
  15. 根据权利要求1至14中任一项所述的方法,其中,所述控制所述目标虚拟对象在所述目标互动事件中执行第二动作之后,所述方法还包括:
    当检测到对所述虚拟动作按钮执行的第三触控操作时,控制所述目标虚拟对象结束所述目标互动事件;
    其中,所述第一触控操作与所述第三触控操作相同或不同。
  16. 根据权利要求15所述的方法,其中,所述控制所述目标虚拟对象结束所述目标互动事件,包括:
    当所述目标互动事件对应可互动道具时,控制所述目标虚拟对象结束对所述可互动道具的互动动作;或者
    当所述目标互动事件对应可互动区域时,控制所述目标虚拟对象结束对所述可互动区域的互动动作;或者
    当所述目标互动事件对应可互动虚拟对象时,控制所述目标虚拟对象结束对所述可互动虚拟对象的互动动作。
  17. 根据权利要求16所述的方法,其中,所述当所述目标互动事件对应可互动道具时,控制所述目标虚拟对象结束对所述可互动道具的互动动作,包括:
    当所述目标互动事件对应的可互动道具为虚拟滑索时,控制所述目标虚拟对象跳下所述虚拟滑索;
    所述当所述目标互动事件对应可互动区域时,控制所述目标虚拟对象结束对所述可互动区域的互动动作,包括:
    当所述目标互动事件对应的可互动区域为攀爬区域时,控制所述目标虚拟对象跳离所述攀爬区域。
  18. 根据权利要求1至14中任一项所述的方法,其中,所述第一提示信息包括拖拽方向标记以及按压力度标记中的至少之一;
    其中,所述拖拽方向标记用于提示对所述虚拟动作按钮执行拖拽操作;所述按压力度标记用于提示对所述虚拟动作按钮执行按压操作。
  19. 一种虚拟对象的控制装置,所述装置包括:
    第一显示模块,配置为在显示界面中显示目标虚拟对象以及虚拟动作按钮,其中,所述虚拟动作按钮用于控制所述目标虚拟对象执行第一动作;
    第二显示模块,配置为当所述目标虚拟对象位于目标互动事件的触发范围内时,在所述显示界面中显示第一提示信息,其中,所述第一提示信息用于提示对所述虚拟动作按钮执行触控操作;
    执行模块,配置为当检测到对所述虚拟动作按钮执行的第一触控操作时,控制所述目标虚拟对象在所述目标互动事件中执行第二动作。
  20. 一种计算机可读存储介质,所述计算机可读存储介质存储有计算机指令,其中,所述计算机指令被处理器执行时实现权利要求1至18任一项中所述的虚拟对象的控制方法。
  21. 一种电子设备,包括存储器和处理器,所述存储器中存储有计算机指令,所述处理器被设置为通过所述计算机指令实现权利要求1至18任一项中所述的虚拟对象的控制方法。
  22. 一种计算机程序产品,包括计算机指令,所述计算机指令被处理器执行时实现权利要求1至18任一项中所述的虚拟对象的控制方法。
PCT/CN2021/123270 2020-11-13 2021-10-12 虚拟对象的控制方法和装置、存储介质及电子设备 WO2022100339A1 (zh)

Priority Applications (8)

Application Number Priority Date Filing Date Title
EP21815878.0A EP4026595A4 (en) 2020-11-13 2021-10-12 CONTROL METHOD AND DEVICE FOR VIRTUAL OBJECTS, STORAGE MEDIUM AND ELECTRONIC DEVICE
BR112022001065A BR112022001065A2 (pt) 2020-11-13 2021-10-12 Método e aparelho de controle de objeto virtual, meio de armazenamento e dispositivo eletrônico
CA3146804A CA3146804A1 (en) 2020-11-13 2021-10-12 Virtual object control method and apparatus, storage medium, and electronic device
JP2022514177A JP7418554B2 (ja) 2020-11-13 2021-10-12 仮想オブジェクトの制御方法及び装置、電子機器並びにコンピュータプログラム
KR1020227002073A KR102721446B1 (ko) 2020-11-13 2021-10-12 가상 객체 제어 방법 및 장치, 저장 매체 및 전자 기기
AU2021307015A AU2021307015B2 (en) 2020-11-13 2021-10-12 Virtual object control method and apparatus, storage medium, and electronic device
US17/585,331 US12090404B2 (en) 2020-11-13 2022-01-26 Virtual object control method and apparatus, storage medium, and electronic device
SA522431616A SA522431616B1 (ar) 2020-11-13 2022-02-07 طريقة وجهاز للتحكم في جسم افتراضي، ووسط تخزين، وجهاز إلكتروني

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011270984.7 2020-11-13
CN202011270984.7A CN112245918B (zh) 2020-11-13 2020-11-13 虚拟角色的控制方法和装置、存储介质及电子设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/585,331 Continuation US12090404B2 (en) 2020-11-13 2022-01-26 Virtual object control method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2022100339A1 true WO2022100339A1 (zh) 2022-05-19

Family

ID=74265721

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/123270 WO2022100339A1 (zh) 2020-11-13 2021-10-12 虚拟对象的控制方法和装置、存储介质及电子设备

Country Status (3)

Country Link
CN (1) CN112245918B (zh)
TW (1) TW202218723A (zh)
WO (1) WO2022100339A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3146804A1 (en) 2020-11-13 2022-05-13 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, storage medium, and electronic device
CN112245918B (zh) * 2020-11-13 2023-01-31 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备
CN113082688B (zh) * 2021-03-31 2024-02-13 网易(杭州)网络有限公司 游戏中虚拟角色的控制方法、装置、存储介质及设备
CN113325951B (zh) * 2021-05-27 2024-03-29 百度在线网络技术(北京)有限公司 基于虚拟角色的操作控制方法、装置、设备以及存储介质
CN113318430B (zh) * 2021-05-28 2024-08-20 网易(杭州)网络有限公司 虚拟角色的姿态调整方法、装置、处理器及电子装置
CN113559516B (zh) * 2021-07-30 2023-07-14 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备
CN113891138B (zh) * 2021-09-27 2024-05-14 深圳市腾讯信息技术有限公司 互动操作提示方法和装置、存储介质及电子设备
CN113975803B (zh) * 2021-10-28 2023-08-25 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备
CN113986079B (zh) * 2021-10-28 2023-07-14 腾讯科技(深圳)有限公司 虚拟按钮的设置方法和装置、存储介质及电子设备
CN113893527B (zh) * 2021-11-01 2023-07-14 北京字跳网络技术有限公司 一种交互控制方法、装置、电子设备及存储介质
CN113975798B (zh) * 2021-11-09 2023-07-04 北京字跳网络技术有限公司 一种交互控制方法、装置以及计算机存储介质
CN118113179A (zh) * 2022-11-23 2024-05-31 腾讯科技(深圳)有限公司 虚拟对象的互动方法和装置、存储介质及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160199728A1 (en) * 2015-01-08 2016-07-14 LINE Plus Corporation Game methods for controlling game using virtual buttons and systems for performing the same
CN110013671A (zh) * 2019-05-05 2019-07-16 腾讯科技(深圳)有限公司 动作执行方法和装置、存储介质及电子装置
CN110215691A (zh) * 2019-07-17 2019-09-10 网易(杭州)网络有限公司 一种游戏中虚拟角色的移动控制方法及装置
CN110270086A (zh) * 2019-07-17 2019-09-24 网易(杭州)网络有限公司 一种游戏中虚拟角色的移动控制方法及装置
CN112245918A (zh) * 2020-11-13 2021-01-22 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4258850B2 (ja) * 2004-12-28 2009-04-30 株式会社セガ 画像処理装置およびその方法
CN108509139B (zh) * 2018-03-30 2019-09-10 腾讯科技(深圳)有限公司 虚拟对象的移动控制方法、装置、电子装置及存储介质
CN111773681B (zh) * 2020-08-03 2024-07-09 网易(杭州)网络有限公司 控制虚拟游戏角色的方法及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160199728A1 (en) * 2015-01-08 2016-07-14 LINE Plus Corporation Game methods for controlling game using virtual buttons and systems for performing the same
CN110013671A (zh) * 2019-05-05 2019-07-16 腾讯科技(深圳)有限公司 动作执行方法和装置、存储介质及电子装置
CN110215691A (zh) * 2019-07-17 2019-09-10 网易(杭州)网络有限公司 一种游戏中虚拟角色的移动控制方法及装置
CN110270086A (zh) * 2019-07-17 2019-09-24 网易(杭州)网络有限公司 一种游戏中虚拟角色的移动控制方法及装置
CN112245918A (zh) * 2020-11-13 2021-01-22 腾讯科技(深圳)有限公司 虚拟角色的控制方法和装置、存储介质及电子设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DAN WANG YI QIE YOU HE FANG [FORGET EVERYTHING]: "Easy Way to Rope Jump in Apex", 18 May 2019 (2019-05-18), CN, XP055924489, Retrieved from the Internet <URL:https://tieba.baidu.com/p/6135169914> *

Also Published As

Publication number Publication date
CN112245918A (zh) 2021-01-22
CN112245918B (zh) 2023-01-31
TW202218723A (zh) 2022-05-16

Similar Documents

Publication Publication Date Title
WO2022100339A1 (zh) 虚拟对象的控制方法和装置、存储介质及电子设备
CN105159687B (zh) 一种信息处理方法、终端及计算机存储介质
US10695674B2 (en) Information processing method and apparatus, storage medium and electronic device
CN112370781B (zh) 操作控制方法和装置、存储介质及电子设备
JP6637589B2 (ja) 情報処理方法、端末、及びコンピュータ記憶媒体
CN105148517B (zh) 一种信息处理方法、终端及计算机存储介质
CN109557998B (zh) 信息交互方法、装置、存储介质和电子装置
EP3385830B1 (en) Data transmission method and device
AU2021307015B2 (en) Virtual object control method and apparatus, storage medium, and electronic device
WO2017161904A1 (zh) 壁纸图片的显示方法和装置
CN105335064A (zh) 一种信息处理方法、终端和计算机存储介质
CN105915766B (zh) 基于虚拟现实的控制方法和装置
KR102700298B1 (ko) 가상 객체 제어 방법 및 장치, 저장 매체, 및 전자 디바이스
CN112370780B (zh) 虚拟控件的显示方法和装置、存储介质及电子设备
WO2023035812A1 (zh) 虚拟角色的控制方法和装置、存储介质及电子设备
CN111111219B (zh) 虚拟道具的控制方法和装置、存储介质及电子装置
KR102721446B1 (ko) 가상 객체 제어 방법 및 장치, 저장 매체 및 전자 기기
CN115350472A (zh) 一种云游戏控件配置方法、装置、设备和存储介质
CN113813599A (zh) 虚拟角色的控制方法和装置、存储介质及电子设备
WO2024041098A1 (zh) 虚拟角色状态的设置方法和装置、存储介质及电子设备
CN105630402B (zh) 图像数据存储位置的变更方法和变更装置
WO2024207873A1 (zh) 虚拟场景的交互方法、装置、电子设备、计算机可读存储介质及计算机程序产品

Legal Events

ENP Entry into the national phase: Ref document number: 2021815878; Country of ref document: EP; Effective date: 20211209
ENP Entry into the national phase: Ref document number: 2022514177; Country of ref document: JP; Kind code of ref document: A
REG Reference to national code: Ref country code: BR; Ref legal event code: B01A; Ref document number: 112022001065; Country of ref document: BR
ENP Entry into the national phase: Ref document number: 2021307015; Country of ref document: AU; Date of ref document: 20211012; Kind code of ref document: A
ENP Entry into the national phase: Ref document number: 112022001065; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20220119
NENP Non-entry into the national phase: Ref country code: DE
WWE Wipo information: entry into national phase: Ref document number: 522431616; Country of ref document: SA