WO2022100339A1 - Virtual object control method and apparatus, storage medium, and electronic device - Google Patents
Virtual object control method and apparatus, storage medium, and electronic device
- Publication number
- WO2022100339A1 (PCT/CN2021/123270)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
Definitions
- the present application relates to the field of computers, and in particular, to a method and apparatus for controlling a virtual object, a storage medium, an electronic device, and a computer program product.
- In the related art, when a virtual object can trigger many different actions, a corresponding number of trigger buttons is arranged in the button area of the interface, which leaves users at a loss as to which button to press and results in inefficient control of the virtual object.
- An embodiment of the present application provides a method for controlling a virtual object, including:
- when the target virtual object is located within the triggering range of a target interaction event, displaying first prompt information in the display interface, wherein the first prompt information is used to prompt performing a touch operation on the virtual action button;
- when a first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform a second action in the target interaction event.
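- To make the three steps above concrete, the following is a minimal sketch (not part of the application itself) of how a client could wire them together; all type and function names, such as `VirtualActionButton` and `InteractionEvent`, are illustrative assumptions rather than anything defined by this application.

```typescript
// Illustrative sketch of the three-step control flow described above.
// All names are hypothetical; the patent does not prescribe an API.

interface InteractionEvent {
  id: string;
  // Returns true when the given position lies inside the event's trigger range.
  isInTriggerRange(pos: { x: number; y: number }): boolean;
  secondAction(): void; // e.g. ride a zipline, start climbing
}

class VirtualActionButton {
  constructor(private firstAction: () => void) {}
  promptVisible = false;
  private activeEvent?: InteractionEvent;

  // Step 2: while the target object is inside a trigger range,
  // show the first prompt information next to the button.
  update(objectPos: { x: number; y: number }, events: InteractionEvent[]) {
    this.activeEvent = events.find(e => e.isInTriggerRange(objectPos));
    this.promptVisible = this.activeEvent !== undefined;
  }

  // Step 3: a "first touch operation" performs the second action when an
  // interaction event is active; otherwise the button keeps its first action.
  onFirstTouchOperation() {
    if (this.activeEvent) {
      this.activeEvent.secondAction();
    } else {
      this.firstAction();
    }
  }
}

// Usage: a "jump" button that rides a zipline while inside the zipline's range.
const button = new VirtualActionButton(() => console.log("jump"));
const zipline: InteractionEvent = {
  id: "zipline",
  isInTriggerRange: pos => pos.x > 10,
  secondAction: () => console.log("ride zipline"),
};
button.update({ x: 12, y: 0 }, [zipline]);
button.onFirstTouchOperation(); // -> "ride zipline"
```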
- the embodiment of the present application provides a control device for a virtual object, including:
- a first display module configured to display a target virtual object and a virtual action button in the display interface, wherein the virtual action button is used to control the target virtual object to perform a first action
- the second display module is configured to display first prompt information in the display interface when the target virtual object is within the triggering range of the target interaction event, wherein the first prompt information is used to prompt performing a touch operation on the virtual action button;
- the execution module is configured to control the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected.
- Embodiments of the present application provide a computer-readable storage medium, where computer instructions are stored in the computer-readable storage medium, wherein, when the computer instructions are executed by a processor, the virtual object control method provided by the embodiments of the present application is implemented.
- An embodiment of the present application provides an electronic device, including a memory and a processor, where computer instructions are stored in the memory, and the processor is configured to implement the above-mentioned virtual object control method through the computer instructions.
- the embodiments of the present application provide a computer program product, including computer instructions, wherein the computer instructions, when executed by a processor, implement the virtual object control method provided by the embodiments of the present application.
- FIG. 1 is a schematic diagram of an application environment of a method for controlling a virtual object according to an embodiment of the present application
- FIG. 2 is a schematic flowchart of a method for controlling a virtual object according to an embodiment of the present application
- FIG. 3 is a schematic diagram of a display interface according to an embodiment of the present application.
- FIG. 4 is a schematic diagram of another display interface according to an embodiment of the present application.
- FIG. 5 is a schematic diagram of yet another display interface according to an embodiment of the present application.
- FIG. 6 is a schematic diagram of another display interface according to an embodiment of the present application.
- FIG. 7 is a schematic diagram of yet another display interface according to an embodiment of the present application.
- FIG. 8 is a schematic diagram of yet another display interface according to an embodiment of the present application.
- FIG. 9A is a schematic diagram of yet another display interface according to an embodiment of the present application.
- FIG. 9B is a schematic diagram of yet another display interface according to an embodiment of the present application.
- FIG. 10 is a schematic diagram of another display interface according to an embodiment of the present application.
- FIG. 11 is a schematic diagram of yet another display interface according to an embodiment of the present application.
- FIG. 12 is a schematic diagram of another display interface according to an embodiment of the present application.
- FIG. 13 is a schematic flowchart of another method for controlling a virtual object according to an embodiment of the present application.
- FIG. 14 is a schematic structural diagram of a control device for a virtual object according to an embodiment of the present application.
- FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
- Virtual scene: a scene output by an electronic device that is different from the real world. Visual perception of the virtual scene can be formed with the naked eye or with the assistance of equipment, for example through a two-dimensional image output by a display screen, or through three-dimensional images output by stereoscopic display technologies such as stereoscopic projection, virtual reality, and augmented reality; in addition, various perceptions simulating the real world, such as auditory perception, tactile perception, olfactory perception, and motion perception, can be formed through various possible hardware.
- the virtual scene can be a simulated environment of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment.
- the virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the embodiment of the present application does not limit the dimension of the virtual scene.
- Virtual object: the image of a person or object that can interact in the virtual scene, or a movable object in the virtual scene.
- the movable objects may be virtual characters, virtual animals, cartoon characters, etc., for example, characters, animals, plants, oil barrels, walls, stones, etc. displayed in the virtual scene.
- the virtual object may be a virtual avatar representing the user in the virtual scene.
- the virtual scene may include multiple virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies a part of the space in the virtual scene.
- the virtual object may be a virtual character in a virtual scene, which is controlled by a user or by artificial intelligence (AI).
- Virtual prop: also known as an interactive prop, used by virtual objects for interaction in the virtual scene.
- virtual props may include a virtual zipline that connects two locations, and a player can ride the zipline to quickly move from one location to another.
- Embodiments of the present application provide a method for controlling a virtual object.
- the above-mentioned method for controlling a virtual object may be applied to a hardware environment composed of a server 101 and a terminal device 103 as shown in FIG. 1 .
- the server 101 is connected to the terminal device 103 through a network and can be used to provide services for the terminal device 103 or for the application 107 installed in the terminal device 103. The application may be a video application, an instant messaging application, a browser application, an education application, a game application, etc., and may also include, but is not limited to, other applications capable of virtual object control.
- a database 105 may be provided in the server 101 or independently of the server 101, and the database 105 is used to provide the server 101 with data storage services, such as game data storage services.
- the above-mentioned networks may include but are not limited to wired networks and wireless networks, wherein the wired networks include but are not limited to local area networks, metropolitan area networks and wide area networks, and the wireless networks include but are not limited to Bluetooth, WIFI and other networks that implement wireless communication.
- the terminal device 103 may be a terminal device configured with a virtual object control application (i.e., the application 107), which may include, but is not limited to, at least one of the following: a mobile phone (such as an Android phone or an iOS phone), a laptop computer, a tablet computer, a handheld computer, a mobile Internet device (MID), a PAD, a desktop computer, a smart TV, etc.
- the above-mentioned server 101 may be a single server, a server cluster composed of multiple servers, or a cloud server, which may include but not limited to a router or a gateway.
- the application 107 can be started, and the virtual scene (including the target virtual object and the virtual action button) can be output in the display interface of the application 107; when the application 107 is a game application, the display interface may include a game screen (also called a game interface) and an interactive interface.
- this is only an example, which is not limited in this embodiment.
- control method for virtual objects can be implemented in the terminal device 103 through the following steps:
- S1 start the application 107 on the terminal device 103, and display the target virtual object and the virtual action button in the display interface of the application 107, wherein the virtual action button is used to control the target virtual object to perform the first action;
- the display interface may include a game screen and an interactive interface, and the virtual action button and the first prompt information may be displayed in the interactive interface.
- the above-mentioned control method for virtual objects may also be implemented by an application program configured on the server 101, or the terminal device 103 and the server 101 may implement the above-mentioned control method in combination; for example, the server 101 may send the relevant display data of the virtual scene to the terminal device 103, so that the terminal device 103 displays the virtual scene according to the received display data.
- the above is only an example, and this embodiment does not make a specific limitation.
- control method for a virtual object includes:
- the virtual object control application used to implement the virtual object control method may include, but is not limited to, game software, an app, an applet, etc., and may also be configured, without limitation, in any other software, app, or applet.
- the above-mentioned target virtual object may include, but is not limited to, a virtual object controlled by a user who logs in to the virtual object control application after registering with the background or server corresponding to that application.
- the display interface may include a game screen and an interactive interface
- the above-mentioned game screen and interactive interface may be configured, without limitation, to be displayed in the same display area (with the interactive interface overlapping the game screen), or to be displayed in different display areas (that is, the interactive interface is displayed in the area of the display interface other than the above-mentioned game screen).
- the game screen may be used to display the target virtual object
- the interactive interface may be used to display the virtual action button and the first prompt information.
- the display logic is not limited to this.
- FIG. 3 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 3, a game screen 302 and an interactive interface 304 may be displayed in a game application, wherein a target virtual character 306 (an example of a target virtual object) is included in the game screen 302, and a virtual action button 308 is displayed in the interactive interface 304.
- the above-mentioned virtual action button may be configured, without limitation, to control the target virtual object to perform a preset first action when a touch operation performed on the button is detected.
- For example, the virtual action button 308 is configured as a "jump" button, that is, when a touch operation performed on the virtual action button 308 is detected, the target virtual character 306 is controlled to perform a "jump" operation.
- the first action can be flexibly configured according to the virtual object control application. For example, the first action may be configured, without limitation, as a virtual object action such as "jump", "squat", "get down", or "bend over", may also include, but is not limited to, virtual object actions such as pushing, pulling, lifting, or pressing, and may also be configured to match preset interactive events.
- For example, when the interactive event is that the target virtual object is in an area where a predetermined operation is allowed, the target virtual object is controlled to perform the predetermined operation; when the interactive event is that the target virtual object is in an area where virtual props are allowed to be opened, the target virtual object is controlled to perform the operation of opening the virtual props; and when the target virtual object is in an area where virtual props are allowed to be used, the target virtual object is controlled to perform the operation of using the virtual props.
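- As a purely illustrative sketch (not part of the application), the event-to-operation dispatch described above could be expressed as follows; the event kinds and handler names are assumptions made for readability.

```typescript
// Hypothetical mapping from the kind of interactive event the target virtual
// object is currently in to the operation it should perform.
type InteractiveEventKind = "predetermined-op" | "open-prop" | "use-prop";

interface VirtualObjectController {
  performPredeterminedOperation(): void;
  openVirtualProp(): void;
  useVirtualProp(): void;
}

function dispatchInteractiveEvent(
  kind: InteractiveEventKind,
  controller: VirtualObjectController,
): void {
  switch (kind) {
    case "predetermined-op":
      // Object is in an area where a predetermined operation is allowed.
      controller.performPredeterminedOperation();
      break;
    case "open-prop":
      // Object is in an area where virtual props are allowed to be opened.
      controller.openVirtualProp();
      break;
    case "use-prop":
      // Object is in an area where virtual props are allowed to be used.
      controller.useVirtualProp();
      break;
  }
}
```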
- the first action is an action initially bound to the virtual action button
- the second action is an action performed when the target virtual object is in the target interaction event.
- the above-mentioned first prompt information is used to prompt to perform a touch operation on the virtual action button.
- the above-mentioned touch operation may include, but is not limited to, clicking, long pressing, dragging, releasing, double-clicking, etc., and different effects may be configured for different touch operations, where a long press means an operation whose pressing duration is greater than a duration threshold.
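- The following is a minimal, assumption-laden sketch of how such touch operations could be distinguished from raw pointer events; the duration and movement thresholds shown are placeholder values, not values specified by this application.

```typescript
// Classify a completed pointer interaction as one of the touch operations
// mentioned above, using a duration threshold for long presses and a
// movement threshold for drags. Thresholds are illustrative only.
type TouchOperation = "click" | "long-press" | "drag";

const LONG_PRESS_MS = 500;   // assumed duration threshold
const DRAG_DISTANCE_PX = 20; // assumed movement threshold

function classifyTouch(
  downTimeMs: number,
  upTimeMs: number,
  downPos: { x: number; y: number },
  upPos: { x: number; y: number },
): TouchOperation {
  const moved = Math.hypot(upPos.x - downPos.x, upPos.y - downPos.y);
  if (moved > DRAG_DISTANCE_PX) {
    return "drag"; // significant movement before release
  }
  // Long press: pressing duration greater than the duration threshold.
  return upTimeMs - downTimeMs > LONG_PRESS_MS ? "long-press" : "click";
}

// Example: a 600 ms press with almost no movement is a long press.
console.log(classifyTouch(0, 600, { x: 0, y: 0 }, { x: 2, y: 1 })); // "long-press"
```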
- FIG. 4 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 4, when the target virtual character is located within the triggering range of the target interaction event, the first prompt information 406 is displayed in the area 404 near the virtual action button 402, and the dragging direction corresponding to the dragging operation can be indicated in a form including, but not limited to, an arrow.
- FIG. 5 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 5, the first prompt information 506 is displayed in the area 504 near the virtual action button 502, and the pressing force corresponding to the pressing operation can be indicated in a form including, but not limited to, a force bar graph, where the shaded part in the first prompt information 506 indicates the current pressing force, and a force threshold identifier can be configured in the first prompt information 506 to inform the user that the corresponding second action is performed after a pressing force reaching the force threshold is detected. The pressing operation on the virtual action button can also be indicated by, without limitation, displaying a text mark on the game screen, with different text marks corresponding to the effects of different pressing forces.
- a specific implementation manner may include, but is not limited to, a combination of one or more of the above, and this embodiment does not make any specific limitation.
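- Purely as an illustration of the force-bar prompt described above (not an implementation mandated by the application), the fill level and threshold check could look like the following; the threshold value and names are assumptions.

```typescript
// Hypothetical force-bar prompt: the shaded fraction reflects the current
// pressing force, and the second action fires once the force threshold is met.
const FORCE_THRESHOLD = 0.8; // assumed, normalized to [0, 1]

interface ForceBarPrompt {
  fillFraction: number;    // how much of the bar is shaded
  thresholdMarker: number; // where the threshold identifier is drawn
}

function updateForcePrompt(currentForce: number): ForceBarPrompt {
  return {
    fillFraction: Math.min(currentForce, 1),
    thresholdMarker: FORCE_THRESHOLD,
  };
}

function shouldPerformSecondAction(currentForce: number): boolean {
  // The second action is performed once the detected pressing force
  // reaches the configured force threshold.
  return currentForce >= FORCE_THRESHOLD;
}

console.log(updateForcePrompt(0.5));          // { fillFraction: 0.5, thresholdMarker: 0.8 }
console.log(shouldPerformSecondAction(0.85)); // true
```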
- target interaction events may include, but are not limited to, interaction events for interactable props, interaction events for interactable areas, and interaction events for interactable virtual objects.
- target interaction events may include, but are not limited to, ziplining, rock climbing, swimming, riding, talking, shopping, and the like.
- In the embodiment of the present application, the virtual action button is configured to perform a second action different from the preconfigured first action when the target interaction event is triggered within the trigger range, so that the same virtual action button can perform different virtual actions in different areas. This achieves the technical effects of improving the control efficiency of virtual objects, optimizing the user experience, and reducing the adverse effect of the field of view being obscured by too many virtual buttons, thereby solving the technical problem of low control efficiency of virtual objects in the related art.
- controlling the target virtual object to perform the second action in the target interaction event includes: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the target area, controlling the target virtual object to perform the second action in the target interaction event.
- the first touch operation may include, but is not limited to, a sliding operation.
- In this way, the first touch operation can be flexibly and conveniently implemented on the same virtual action button, and functions different from the original function of the virtual action button can be realized according to the direction of the first touch operation. Further, when the target virtual object is located within the trigger range of the target interaction event, the number of virtual action buttons that need to be displayed is reduced while multiple virtual actions can still be completed, which solves the technical problems of complex control methods and low control efficiency of virtual objects in the related art, optimizes the user experience, and improves control efficiency.
- the above-mentioned target area may be preset by the system or the server, or may be flexibly configured by the user on the terminal device according to actual needs, and may include, but is not limited to, being configured to be bound to the above-mentioned virtual action button. When a second touch operation performed on the virtual action button is detected, the above-mentioned target area is displayed.
- the first touch operation is the same as or different from the second touch operation.
- the first touch operation is a sliding operation
- the second touch operation is any one of a sliding operation and a long press operation.
- the target area may be displayed on the interactive interface.
- FIG. 6 is a schematic diagram of a display interface according to an embodiment of the present application.
- a virtual action button 604 is displayed in the display interface 602; when a sliding operation on the virtual action button 604 is detected, the target area 606 is displayed (for example, pulled up above the button), and when it is detected that the sliding operation ends on the target area 606, the target virtual character is controlled to perform the second action in the target interaction event.
- the sliding operation is both the first touch operation and the second touch operation.
- the second action corresponding to the target interaction event may be set separately, including but not limited to configuring a display identifier of the target area to correspond to the second action of the target interaction event.
- control method of the virtual object further includes:
- the above-mentioned virtual action button may be configured, without limitation, to display the above-mentioned target area through a sliding operation or a long-pressing operation performed on the button.
- FIG. 7 is a schematic diagram of a display interface according to an embodiment of the present application.
- a target area 706 is displayed in the display interface 702 .
- the method for controlling a virtual object further includes: when it is detected that the virtual action button is pressed and the offset distance of the pressing point in a target direction is greater than a distance threshold, determining that a sliding operation performed on the virtual action button is detected, and using the sliding operation as the first touch operation. For example, when it is detected that the virtual action button is pressed by the user's finger and the offset distance of the finger in the target direction is greater than a preset distance threshold, it is determined that a sliding operation performed on the virtual action button is detected, and the sliding operation is used as the first touch operation.
- the target direction can be the direction indicated by the first prompt information, or can be any direction;
- the distance threshold can be preset by the system or the server, and can also be flexibly configured based on the size of the display interface displayed by the current terminal device (such as at least one of the size of the game screen and the size of the interactive interface).
- it may include, but is not limited to, setting a press detection contact in the display area corresponding to the virtual action button to detect whether the virtual action button is pressed (eg, whether it is pressed by the user's finger). For example, the pressing force in the display area corresponding to the virtual action button can be acquired, and when the pressing force exceeds a preset force threshold and the holding time exceeds the preset holding time threshold, it is determined that the virtual action button is pressed.
- the displacement distance of the pressing point in the display interface can be obtained as the offset distance.
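- A minimal sketch of the slide-detection rule above (a press plus an offset in the target direction exceeding a distance threshold) might look like the following; the thresholds and vector math are assumptions made for illustration.

```typescript
// Hypothetical slide-start detection for the virtual action button:
// the button must be pressed, and the press point must have moved further
// than a distance threshold along the target direction.
const DISTANCE_THRESHOLD_PX = 40;   // assumed, could depend on screen size
const PRESS_FORCE_THRESHOLD = 0.2;  // assumed press-force threshold
const HOLD_TIME_THRESHOLD_MS = 50;  // assumed hold-time threshold

interface Vec2 { x: number; y: number; }

function isButtonPressed(force: number, holdTimeMs: number): boolean {
  // A press is recognized when force and hold time both exceed thresholds.
  return force > PRESS_FORCE_THRESHOLD && holdTimeMs > HOLD_TIME_THRESHOLD_MS;
}

function isSlideDetected(pressStart: Vec2, current: Vec2, targetDir: Vec2): boolean {
  // Project the press-point displacement onto the (unit-length) target direction.
  const len = Math.hypot(targetDir.x, targetDir.y) || 1;
  const offsetAlongTarget =
    ((current.x - pressStart.x) * targetDir.x + (current.y - pressStart.y) * targetDir.y) / len;
  return offsetAlongTarget > DISTANCE_THRESHOLD_PX;
}

// Example: pressed firmly, moved 60 px straight up toward the target area.
const pressed = isButtonPressed(0.5, 120);
const sliding = isSlideDetected({ x: 0, y: 0 }, { x: 5, y: -60 }, { x: 0, y: -1 });
console.log(pressed && sliding); // true -> treat as the first touch operation
```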
- FIG. 8 is a schematic diagram of a display interface according to an embodiment of the present application. As shown in FIG. 8 , the process includes the following steps:
- the first touch operation is a sliding operation
- the method for controlling the virtual object further includes: when it is detected that the virtual action button is pressed and the pressing point disappears after sliding to the target area, determining that the sliding operation ends in the target area; or when it is detected that the virtual action button is dragged to overlap with the target area and the dragging then ends, determining that the sliding operation ends in the target area. For example, when it is detected that the virtual action button is pressed by a finger and the finger slides to the target area and is then released, it is determined that the sliding operation ends in the target area; or when it is detected that the virtual action button is dragged to overlap the target area and is then released, it is determined that the sliding operation ends in the target area.
- multiple press detection contacts may be set under the screen; when the press detection contacts between the virtual action button and the target area all detect a pressing operation in sequence, it is determined that the detected pressing point slides to the target area (for example, the finger swipes to the target area); then, when the detection contact in the target area no longer detects a pressing operation, it is determined that the sliding operation ends in the target area (for example, the finger is released over the target area).
- Alternatively, when the virtual action button is moved to the target area and the press detection contact located in the target area then stops detecting a pressing operation, it is determined that the virtual action button is dragged to overlap with the target area and the dragging then ends (for example, the finger is released).
- FIG. 9A and 9B are schematic diagrams of a display interface according to an embodiment of the present application.
- FIG. 9A shows that the virtual action button 904 in the display interface 902 is pressed by a finger, and the finger slides to the target area 906 and is then released;
- FIG. 9B shows that the virtual action button 904 in the display interface 902 is dragged to overlap with the target area 906 and then released.
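- The following sketch (illustrative only, with assumed geometry helpers) shows the two end-of-slide conditions described above: the press point is released inside the target area, or the dragged button overlaps the target area when the drag ends.

```typescript
// Hypothetical end-of-slide checks for the two cases described above.
interface Rect { x: number; y: number; width: number; height: number; }
interface Point { x: number; y: number; }

function pointInRect(p: Point, r: Rect): boolean {
  return p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;
}

function rectsOverlap(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// Case 1: the pressing point disappears (finger released) inside the target area.
function slideEndsInTargetArea(releasePoint: Point, targetArea: Rect): boolean {
  return pointInRect(releasePoint, targetArea);
}

// Case 2: the button itself is dragged until it overlaps the target area,
// and the drag then ends.
function dragEndsOnTargetArea(buttonRect: Rect, targetArea: Rect): boolean {
  return rectsOverlap(buttonRect, targetArea);
}

const target: Rect = { x: 100, y: 0, width: 80, height: 80 };
console.log(slideEndsInTargetArea({ x: 120, y: 40 }, target));                        // true
console.log(dragEndsOnTargetArea({ x: 90, y: 10, width: 40, height: 40 }, target));   // true
```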
- the method for controlling a virtual object further includes: when a first touch operation performed on the virtual action button is detected, and the execution object of the first touch operation is updated from the virtual action button to the target area, in the display Second prompt information is displayed on the interface, wherein the second prompt information is used to prompt the end of the first touch operation in the target area.
- displaying the second prompt information on the display interface includes: performing at least one of the following processes on the display interface: updating the display state of the target area; displaying at least one of a text mark and an animation effect; updating the identification information of the virtual action button.
- the prompt may be given by means including, but not limited to, highlighting the target area.
- the specific implementation process may include, but is not limited to, a combination of one or more of the above. The above is only an example, and this embodiment does not impose any specific limitation.
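- As an illustrative sketch only (none of these helper names come from the application), the second prompt information could be rendered by any combination of the three updates listed above:

```typescript
// Hypothetical rendering of the "second prompt information" once the touch
// operation has moved from the virtual action button onto the target area.
interface PromptUi {
  setTargetAreaHighlighted(on: boolean): void; // update display state of the target area
  showTextMark(text: string): void;            // text mark
  playAnimation(name: string): void;           // animation effect
  setButtonLabel(label: string): void;         // update the button's identification info
}

function showSecondPromptInformation(ui: PromptUi): void {
  // Any one (or several) of these is enough to count as the second prompt.
  ui.setTargetAreaHighlighted(true);
  ui.showTextMark("Release here to interact");
  ui.playAnimation("target-area-pulse");
  ui.setButtonLabel("Interact");
}
```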
- the first prompt information includes direction prompt information, where the direction prompt information is used to prompt the target direction of the touch operation performed on the virtual action button.
- the above-mentioned direction prompt information may include, but is not limited to, arrows, characters and other identification information, wherein the above-mentioned arrows may indicate that the above-mentioned target virtual object is in the triggering range corresponding to the target interaction event by means of highlighting or flashing.
- Displaying the direction prompt information on the display interface helps the user to intuitively learn the direction of the touch operation and then execute the touch operation according to that direction. In this way, the technical solution of controlling the target virtual object to perform different actions based on the same virtual action button solves the technical problems of complex control methods and low control efficiency of virtual objects in the related art, and achieves the technical effects of optimizing the user experience and improving the efficiency with which users control virtual objects to complete multiple operations.
- the target interactive event corresponds to an interactable prop, and the second action is an interactive action on the interactable prop; or the target interactive event corresponds to an interactable area, and the second action is an interactive action on the interactable area; or the target interactive event corresponds to an interactable virtual object, and the second action is an interactive action on the interactable virtual object.
- the interactable prop corresponding to the target interaction event is a virtual zipline (that is, the target interaction event is a zipline riding event), and the second action is a zipline riding action on the virtual zipline, wherein the zipline riding action is used to make the target virtual object jump on and hold the virtual zipline, and slide along the virtual zipline.
- the zipline riding event may include, but is not limited to, a target interaction event triggered by the target virtual object being located in an area corresponding to triggering the target virtual object to use the zipline function or the riding function.
- FIG. 10 is a schematic diagram of a display interface according to an embodiment of the present application.
- a target virtual character 1004 and a virtual zipline 1006 are displayed on the display interface 1002 .
- the target avatar 1004 is controlled to jump on and hold the virtual zipline 1006 during the zipline riding event, and slide along the virtual zipline 1006 .
- the target virtual object is controlled to perform a zipline riding action in the zipline riding event, so that the target virtual object jumps on and pulls the virtual zipline, And slide along the virtual zipline, so that the gameplay of the application is increased and the user experience is optimized.
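- Purely as an illustration of the "slide along the virtual zipline" behaviour (not a detail of the application), movement between the two zipline anchor points could be modelled as a simple interpolation:

```typescript
// Hypothetical position update for a character riding a zipline between
// two anchor points; progress runs from 0 (start) to 1 (end).
interface Vec3 { x: number; y: number; z: number; }

function ziplinePosition(start: Vec3, end: Vec3, progress: number): Vec3 {
  const t = Math.min(Math.max(progress, 0), 1); // clamp to the rope
  return {
    x: start.x + (end.x - start.x) * t,
    y: start.y + (end.y - start.y) * t,
    z: start.z + (end.z - start.z) * t,
  };
}

// Example: halfway along a zipline that descends while moving forward.
console.log(ziplinePosition({ x: 0, y: 50, z: 0 }, { x: 100, y: 10, z: 0 }, 0.5));
// -> { x: 50, y: 30, z: 0 }
```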
- the interactable area corresponding to the target interaction event is a climbing area (ie, the target interaction event is a climbing event), and the second action is a climbing action on the climbing area.
- the climbing event may include, but is not limited to, a target interaction event triggered when the target virtual object is located in an area corresponding to triggering the virtual object to use the climbing function.
- FIG. 11 is a schematic diagram of a display interface according to an embodiment of the present application.
- a target virtual character 1104 and a virtual climbing area 1106 are displayed in the display interface 1102 .
- the target avatar 1104 is controlled to perform a climbing action in the virtual climbing area 1106 .
- the above is only an example, and this embodiment does not make any other specific limitations.
- the target interaction event is a climbing event
- the target virtual object is controlled to perform a climbing action in the climbing event, thus increasing the gameplay of the application and optimizing the user experience.
- the method for controlling a virtual object further includes: when the target virtual object is simultaneously within the triggering ranges of multiple target interaction events and a second touch operation performed on the virtual action button is detected, displaying, in the display interface, target areas corresponding to the multiple target interaction events, wherein each target area is used to trigger the target virtual object to perform a second action in the corresponding target interaction event; wherein the first touch operation is the same as or different from the second touch operation.
- For example, the target interactive events include a first interactive event and a second interactive event. When the target virtual object is located within the triggering range of the first interactive event and the triggering range of the second interactive event at the same time, and the second touch operation performed on the virtual action button is detected, a first area and a second area are displayed in the display interface, wherein the first area is the target area corresponding to the first interactive event and is used to trigger the target virtual object to perform a second action A in the first interactive event, and the second area is the target area corresponding to the second interactive event and is used to trigger the target virtual object to perform a second action B in the second interactive event.
- the second action may include, but is not limited to, any interactive action using a zipline, rock climbing, swimming, riding, talking, shopping, and the like.
- For example, the second action A and the second action B may be set as different virtual actions, or may be set as the same virtual action but corresponding to different interactable virtual objects.
- The first area and the second area may be flexibly set by the system or the server, and the matching relationship between the first area and the second action A and the matching relationship between the second area and the second action B may also be flexibly set, without limitation, by the system or the server.
- In this way, the target areas corresponding to the multiple target interaction events are displayed in the display interface, wherein each target area is used to trigger the target virtual object to perform the second action in the corresponding target interaction event.
- Thus, the embodiment of the present application realizes the technical solution of controlling the target virtual object to perform different actions based on the same virtual action button, which solves the technical problems of complex control methods and low control efficiency of virtual objects in the related art, and achieves the technical effects of optimizing the user experience and improving the efficiency with which users control virtual objects to complete multiple interactive operations.
- controlling the target virtual object to perform the second action in the target interaction event includes: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in any one of the target areas, controlling the target virtual object to perform the second action in the target interaction event corresponding to that target area. For example, when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the first area, the target virtual object is controlled to perform the second action A in the first interaction event; when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the second area, the target virtual object is controlled to perform the second action B in the second interaction event.
- FIG. 12 is a schematic diagram of a display interface according to an embodiment of the present application.
- a virtual action button 1204 is displayed in the display interface 1202.
- the first area 1206 and the second area 1208 are displayed in the display interface 1202 , such as the shaded parts in FIG. 12 .
- When the sliding operation ends in the first area 1206, the target virtual character is controlled to perform the second action A; when the sliding operation ends in the second area 1208, the target virtual character is controlled to perform the second action B.
- the sliding operation is both the first touch operation and the second touch operation.
- the first interactive event is a zipline riding event
- the second action A includes a zipline riding action
- the zipline riding action is used to make the target virtual object jump on and hold the virtual zipline, and slide along the virtual zipline;
- the second interactive event is a climbing event, and the second action B includes a climbing action.
- In this way, the target areas corresponding to the multiple target interaction events are displayed in the display interface, wherein each target area is used to trigger the target virtual object to perform the second action in the corresponding target interaction event.
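- The sketch below (illustrative, with assumed names) shows how the multi-event case could be handled: when several interaction events are active at once, one target area is shown per event, and the area in which the first touch operation ends selects which second action runs.

```typescript
// Hypothetical dispatch when the object is inside several trigger ranges at
// once: one target area per event; the area where the slide ends decides
// which second action is performed.
interface Area { x: number; y: number; width: number; height: number; }

interface ActiveInteraction {
  name: string;          // e.g. "zipline riding event", "climbing event"
  targetArea: Area;      // area shown when the second touch operation is detected
  secondAction(): void;  // e.g. second action A or second action B
}

function contains(r: Area, x: number, y: number): boolean {
  return x >= r.x && x <= r.x + r.width && y >= r.y && y <= r.y + r.height;
}

function onFirstTouchOperationEnd(
  endX: number,
  endY: number,
  interactions: ActiveInteraction[],
): void {
  // Perform the second action of whichever target area the operation ended in.
  const hit = interactions.find(i => contains(i.targetArea, endX, endY));
  hit?.secondAction();
}

const interactions: ActiveInteraction[] = [
  { name: "zipline", targetArea: { x: 0, y: 0, width: 60, height: 60 },
    secondAction: () => console.log("ride zipline (action A)") },
  { name: "climbing", targetArea: { x: 80, y: 0, width: 60, height: 60 },
    secondAction: () => console.log("start climbing (action B)") },
];
onFirstTouchOperationEnd(100, 30, interactions); // -> "start climbing (action B)"
```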
- the method further includes: when a third touch operation performed on the virtual action button is detected, controlling the target virtual object to end the target interaction event; wherein , the first touch operation is the same as or different from the third touch operation.
- the above-mentioned third touch operation may be set, without limitation, to be the same as the first touch operation. For example, the third touch operation may also be configured as a click operation.
- The above-mentioned third touch operation may also be set, without limitation, to be different from the first touch operation. For example, the third touch operation may be configured as a release operation.
- the above-mentioned target interaction event can be ended by acquiring the third touch operation and responding to the above-mentioned third touch operation.
- controlling the target virtual object to end the target interactive event includes: when the target interactive event corresponds to an interactive item, controlling the target virtual object to end the interactive action on the interactive item; or when the target interactive event corresponds to an interactive area , control the target virtual object to end the interactive action on the interactive area; or when the target interactive event corresponds to the interactive virtual object, control the target virtual object to end the interactive action on the interactive virtual object.
- controlling the target virtual object to end the interactive action on the interactive prop includes: when the interactive prop corresponding to the target interactive event is a virtual zipline, controlling the target virtual object Jump off the virtual zipline. That is, when the target interaction event is a zipline riding event, the target virtual object is controlled to jump off the virtual zipline.
- controlling the target virtual object to end the interactive action on the interactive area includes: when the interactive area corresponding to the target interactive event is a climbing area, controlling the target virtual object Jump out of the climbing area. That is, when the target interaction event is a climbing event, the target virtual object is controlled to jump away from the climbing area.
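- As a final illustrative sketch (names assumed, not defined by the application), ending the target interaction event on a third touch operation could be expressed as follows.

```typescript
// Hypothetical handling of the "third touch operation": it ends the current
// target interaction event, with an exit behaviour that depends on what the
// object was interacting with.
type TargetInteraction =
  | { kind: "prop"; prop: "virtual-zipline" }
  | { kind: "area"; area: "climbing-area" }
  | { kind: "virtual-object" };

interface TargetVirtualObject {
  jumpOffZipline(): void;
  jumpAwayFromClimbingArea(): void;
  endInteractionWithVirtualObject(): void;
}

function onThirdTouchOperation(obj: TargetVirtualObject, current: TargetInteraction): void {
  switch (current.kind) {
    case "prop":
      // Zipline riding event: the object jumps off the virtual zipline.
      if (current.prop === "virtual-zipline") obj.jumpOffZipline();
      break;
    case "area":
      // Climbing event: the object jumps away from the climbing area.
      if (current.area === "climbing-area") obj.jumpAwayFromClimbingArea();
      break;
    case "virtual-object":
      obj.endInteractionWithVirtualObject();
      break;
  }
}
```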
- FIG. 13 is a schematic flowchart of another method for controlling a virtual object according to an embodiment of the present application. As shown in FIG. 13 , the process includes but is not limited to the following steps:
- the apparatus includes: a first display module 1402 configured to display a target virtual object and a virtual action button in a display interface, wherein the virtual action button is used to control the target virtual object to perform a first action; a second display Module 1404, configured to display first prompt information in the display interface when the target virtual object is within the triggering range of the target interaction event, wherein the first prompt information is used to prompt to perform a touch operation on the virtual action button; execute module 1406 is configured to control the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected.
- the execution module 1406 is further configured to: control the target virtual object to execute in the target interaction event when the first touch operation performed on the virtual action button is detected and the first touch operation ends in the target area second action.
- the second display module 1404 is further configured to: when a second touch operation performed on the virtual action button is detected, display the target area in the display interface; wherein the first touch operation is the same as or different from the second touch operation.
- the first touch operation is a sliding operation
- the second touch operation is any one of a sliding operation and a long-press operation
- the long-press operation represents an operation whose pressing duration is greater than a duration threshold.
- the first touch operation is a sliding operation; the execution module 1406 is further configured to: when it is detected that the virtual action button is pressed and the pressing point disappears after sliding to the target area, determine that the sliding operation ends in the target area; Or when it is detected that the virtual action button is dragged to overlap with the target area and then the dragging ends, it is determined that the sliding operation ends in the target area.
- the execution module 1406 is further configured to: when it is detected that the virtual action button is pressed and the offset distance of the pressing point in the target direction is greater than the distance threshold, determine that a sliding operation performed on the virtual action button is detected, And use the sliding operation as the first touch operation.
- the first prompt information includes direction prompt information, where the direction prompt information is used to prompt the target direction of the touch operation performed on the virtual action button.
- the target interactive event corresponds to an interactable prop, and the second action is an interactive action on the interactable prop; or the target interactive event corresponds to an interactable area, and the second action is an interactive action on the interactable area; or the target interactive event corresponds to an interactable virtual object, and the second action is an interactive action on the interactable virtual object.
- the interactive prop corresponding to the target interaction event is a virtual zipline
- the second action is a zipline riding action on the virtual zipline, wherein the zipline riding action is used to make the target virtual object jump on and hold the virtual zipline and slide along the virtual zipline.
- the interactable area corresponding to the target interaction event is a climbing area
- the second action is a climbing action on the climbing area
- the second display module 1404 is further configured to: when the target virtual object is simultaneously within the triggering ranges of multiple target interaction events and the second touch operation performed on the virtual action button is detected, display, in the display interface, target areas corresponding to the multiple target interaction events, wherein each target area is used to trigger the target virtual object to perform a second action in the corresponding target interaction event; wherein the first touch operation is the same as or different from the second touch operation.
- the execution module 1406 is further configured to: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in any one of the target areas, control the target virtual object to perform the second action in the target interaction event corresponding to that target area.
- the execution module 1406 is further configured to: when a third touch operation performed on the virtual action button is detected, control the target virtual object to end the target interaction event; wherein the first touch operation and the third touch The operation is the same or different.
- the execution module 1406 is further configured to: when the target interaction event corresponds to an interactable prop, control the target virtual object to end the interactive action on the interactable prop; or when the target interaction event corresponds to an interactable area, control the target virtual object to end the interactive action on the interactable area; or when the target interaction event corresponds to an interactable virtual object, control the target virtual object to end the interactive action on the interactable virtual object.
- Specifically, when the interactable prop corresponding to the target interaction event is a virtual zipline, the target virtual object is controlled to jump off the virtual zipline; when the interactable area corresponding to the target interaction event is a climbing area, the target virtual object is controlled to jump away from the climbing area.
- An embodiment of the present application provides an electronic device for implementing the above method for controlling a virtual object, where the electronic device may be a terminal device or a server as shown in FIG. 1 .
- This embodiment is described by taking the electronic device as a terminal device as an example.
- the electronic device includes a memory 1502 and a processor 1504, where a computer program is stored in the memory 1502, and the processor 1504 is configured to execute the steps in the above method embodiments through the computer program.
- the aforementioned electronic device may be at least one network device among a plurality of network devices located in a computer network.
- the above-mentioned processor may be configured to perform the following steps through a computer program: displaying the target virtual object and a virtual action button in the display interface, wherein the virtual action button is used to control the target virtual object to perform the first action; when the target virtual object is within the triggering range of the target interaction event, displaying first prompt information in the display interface, wherein the first prompt information is used to prompt performing a touch operation on the virtual action button; and when the first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform the second action in the target interaction event.
- FIG. 15 is only for illustration, and the electronic device may also be a smart phone (such as an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, and a mobile Internet device (Mobile Internet Devices, MID), PAD and other terminal equipment.
- FIG. 15 does not limit the structure of the above-mentioned electronic device.
- the electronic device may also include more or fewer components than those shown in FIG. 15 (eg, network interfaces, etc.), or have a different configuration than that shown in FIG. 15 .
- the memory 1502 may be used to store software programs and modules, such as program instructions/modules corresponding to the method and apparatus for controlling virtual objects in the embodiments of the present application; the processor 1504 runs the software programs and modules stored in the memory 1502, thereby executing various functional applications and data processing, that is, implementing the above-described virtual object control method.
- Memory 1502 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
- memory 1502 may include memory located remotely from processor 1504, which may be connected to the terminal through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
- the memory 1502 may be, but is not limited to, used to store information such as virtual action buttons and virtual objects.
- the above-mentioned memory 1502 may include, but is not limited to, the first display module 1402 , the second display module 1404 and the execution module 1406 in the control device of the above-mentioned virtual object.
- it may also include, but is not limited to, other module units in the above-mentioned virtual object control device.
- the transmission means 1506 described above is used to receive or transmit data via a network.
- the above-mentioned networks may include wired networks and wireless networks.
- the transmission device 1506 includes a network adapter (Network Interface Controller, NIC), which can be connected with other network devices and routers through a network cable to communicate with the Internet or a local area network.
- the transmission device 1506 is a radio frequency (Radio Frequency, RF) module, which is used for wirelessly communicating with the Internet.
- RF Radio Frequency
- the above-mentioned electronic device also includes: a display 1508 for displaying a display interface (for example, a game screen and an interactive interface) of a virtual object control application (such as a game application); and a connection bus 1510 for connecting the modules of the above-mentioned electronic device.
- the above-mentioned terminal device or server may be a node in a distributed system, wherein the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by multiple nodes connected through network communication.
- a peer-to-peer (P2P, Peer To Peer) network can be formed between nodes, and electronic devices in any form, such as servers, terminals and other electronic devices, can become a node in the blockchain system by joining the peer-to-peer network.
- Embodiments of the present application provide a computer program product or computer program, where the computer program product or computer program includes computer instructions (executable instructions), and the computer instructions are stored in a computer-readable storage medium.
- the processor of the electronic device reads the computer instructions from the computer-readable storage medium and executes them, so that the electronic device executes the virtual object control method provided in the various implementations of the above-mentioned virtual object control aspect, wherein the computer program is configured to execute the steps in the above method embodiments when running.
- the above-mentioned computer-readable storage medium may be configured to store a computer program for performing the following steps: displaying a target virtual object and a virtual action button in a display interface, wherein the virtual action button is used to control the target virtual object to perform the first action; when the target virtual object is located within the triggering range of the target interaction event, displaying first prompt information in the display interface, wherein the first prompt information is used to prompt performing a touch operation on the virtual action button; and when the first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform the second action in the target interaction event.
- the storage medium may include: a flash disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
- If the integrated units in the above-mentioned embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above-mentioned computer-readable storage medium.
- Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the related art, or all or part of the technical solution, may be embodied in the form of a software product, and the computer software product is stored in a storage medium.
- The storage medium includes several instructions to cause one or more electronic devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the methods of the various embodiments of the present application.
- the disclosed client may be implemented in other manners.
- the device embodiments described above are only illustrative; for example, the division of units is only a logical functional division, and there may be other division methods in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the mutual coupling or direct coupling or communication connection shown or discussed may be implemented through some interfaces, or through indirect coupling or communication connection between units or modules, and may be in electrical or other forms.
- Units described as separate components may or may not be physically separated, and components shown as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Claims (22)
- A method for controlling a virtual object, performed by an electronic device, the method comprising: displaying a target virtual object and a virtual action button in a display interface, wherein the virtual action button is used to control the target virtual object to perform a first action; when the target virtual object is located within a trigger range of a target interaction event, displaying first prompt information in the display interface, wherein the first prompt information is used to prompt a touch operation on the virtual action button; and when a first touch operation performed on the virtual action button is detected, controlling the target virtual object to perform a second action in the target interaction event.
- The method according to claim 1, wherein the controlling the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected comprises: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in a target region, controlling the target virtual object to perform the second action in the target interaction event.
- The method according to claim 2, wherein the method further comprises: when a second touch operation performed on the virtual action button is detected, displaying the target region in the display interface; wherein the first touch operation and the second touch operation are the same or different.
- The method according to claim 3, wherein the first touch operation is a slide operation, and the second touch operation is either the slide operation or a long-press operation, the long-press operation being an operation whose press duration is greater than a duration threshold.
- The method according to claim 2, wherein the first touch operation is a slide operation, and the method further comprises: when it is detected that the virtual action button is pressed and the press point disappears after sliding to the target region, determining that the slide operation ends in the target region; or when it is detected that the virtual action button is dragged to overlap the target region and the dragging then ends, determining that the slide operation ends in the target region.
- The method according to claim 2, wherein the method further comprises: when the first touch operation performed on the virtual action button is detected and the execution object of the first touch operation is updated from the virtual action button to the target region, displaying second prompt information in the display interface, wherein the second prompt information is used to prompt ending the first touch operation in the target region.
- The method according to claim 6, wherein the displaying second prompt information in the display interface comprises performing at least one of the following in the display interface: updating a display state of the target region; displaying at least one of a text identifier and an animation effect; and updating identification information of the virtual action button.
- The method according to claim 1, wherein the method further comprises: when it is detected that the virtual action button is pressed and an offset distance of the press point in a target direction is greater than a distance threshold, determining that a slide operation performed on the virtual action button is detected, and using the slide operation as the first touch operation (a sketch of this threshold-based detection follows the claims).
- The method according to claim 1, wherein the first prompt information comprises direction prompt information, the direction prompt information being used to indicate a target direction of the touch operation to be performed on the virtual action button.
- The method according to claim 1, wherein the target interaction event corresponds to an interactive prop and the second action is an interaction action on the interactive prop; or the target interaction event corresponds to an interactive region and the second action is an interaction action on the interactive region; or the target interaction event corresponds to an interactive virtual object and the second action is an interaction action on the interactive virtual object.
- The method according to claim 10, wherein the interactive prop corresponding to the target interaction event is a virtual zipline, and the second action is a zipline-riding action on the virtual zipline, the zipline-riding action being used to make the target virtual object jump onto and grab the virtual zipline and slide along the virtual zipline.
- The method according to claim 10, wherein the interactive region corresponding to the target interaction event is a climbing region, and the second action is a climbing action on the climbing region.
- The method according to claim 1, wherein the method further comprises: when the target virtual object is simultaneously located within trigger ranges of a plurality of target interaction events and a second touch operation performed on the virtual action button is detected, displaying, in the display interface, target regions respectively corresponding to the plurality of target interaction events, wherein a target region is used to trigger the target virtual object to perform a second action in the corresponding target interaction event (a sketch of this multi-region selection follows the claims); and wherein the first touch operation and the second touch operation are the same or different.
- The method according to claim 13, wherein the controlling the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected comprises: when the first touch operation performed on the virtual action button is detected and the first touch operation ends in any one of the target regions, controlling the target virtual object to perform the second action in the target interaction event corresponding to that target region.
- The method according to any one of claims 1 to 14, wherein after the controlling the target virtual object to perform a second action in the target interaction event, the method further comprises: when a third touch operation performed on the virtual action button is detected, controlling the target virtual object to end the target interaction event (also sketched after the claims); wherein the first touch operation and the third touch operation are the same or different.
- The method according to claim 15, wherein the controlling the target virtual object to end the target interaction event comprises: when the target interaction event corresponds to an interactive prop, controlling the target virtual object to end the interaction action on the interactive prop; or when the target interaction event corresponds to an interactive region, controlling the target virtual object to end the interaction action on the interactive region; or when the target interaction event corresponds to an interactive virtual object, controlling the target virtual object to end the interaction action on the interactive virtual object.
- The method according to claim 16, wherein the controlling the target virtual object to end the interaction action on the interactive prop when the target interaction event corresponds to an interactive prop comprises: when the interactive prop corresponding to the target interaction event is a virtual zipline, controlling the target virtual object to jump off the virtual zipline; and the controlling the target virtual object to end the interaction action on the interactive region when the target interaction event corresponds to an interactive region comprises: when the interactive region corresponding to the target interaction event is a climbing region, controlling the target virtual object to jump away from the climbing region.
- The method according to any one of claims 1 to 14, wherein the first prompt information comprises at least one of a drag direction marker and a press force marker, the drag direction marker being used to prompt a drag operation on the virtual action button, and the press force marker being used to prompt a press operation on the virtual action button.
- An apparatus for controlling a virtual object, the apparatus comprising: a first display module, configured to display a target virtual object and a virtual action button in a display interface, wherein the virtual action button is used to control the target virtual object to perform a first action; a second display module, configured to display first prompt information in the display interface when the target virtual object is located within a trigger range of a target interaction event, wherein the first prompt information is used to prompt a touch operation on the virtual action button; and an execution module, configured to control the target virtual object to perform a second action in the target interaction event when a first touch operation performed on the virtual action button is detected.
- A computer-readable storage medium storing computer instructions, wherein the computer instructions, when executed by a processor, implement the virtual object control method according to any one of claims 1 to 18.
- An electronic device, comprising a memory and a processor, wherein the memory stores computer instructions, and the processor is configured to implement, by means of the computer instructions, the virtual object control method according to any one of claims 1 to 18.
- A computer program product, comprising computer instructions, wherein the computer instructions, when executed by a processor, implement the virtual object control method according to any one of claims 1 to 18.
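Claims 4, 5, and 8 describe how a press on the virtual action button is classified: a press held longer than a duration threshold is a long-press, a press point whose offset along a target direction exceeds a distance threshold is a slide, and a slide "ends in the target region" when the press point lifts inside that region. A hedged sketch follows; the threshold values and helper names are assumptions, not values from the disclosure.

```typescript
// Hedged sketch of the touch classification in claims 4, 5 and 8; the
// threshold values and helper names are assumptions, not disclosed values.

interface TouchSample { x: number; y: number; t: number } // t = timestamp in ms

const PRESS_DURATION_MS = 400;  // assumed duration threshold (claim 4)
const SLIDE_DISTANCE_PX = 24;   // assumed offset-distance threshold (claim 8)

type Gesture = "tap" | "longPress" | "slide";

function classifyGesture(samples: TouchSample[], targetDir: { x: number; y: number }): Gesture {
  if (samples.length < 2) return "tap";
  const first = samples[0];
  const last = samples[samples.length - 1];

  // Offset of the press point projected onto the target direction (claim 8).
  const len = Math.hypot(targetDir.x, targetDir.y) || 1;
  const offset =
    ((last.x - first.x) * targetDir.x + (last.y - first.y) * targetDir.y) / len;
  if (offset > SLIDE_DISTANCE_PX) return "slide";

  // A press held longer than the duration threshold is a long-press (claim 4).
  if (last.t - first.t > PRESS_DURATION_MS) return "longPress";

  return "tap";
}

// A slide "ends in the target region" when the press point lifts inside the
// region (claim 5); a rectangle stands in for any region shape here.
function endsInTargetRegion(
  last: TouchSample,
  region: { x: number; y: number; w: number; h: number },
): boolean {
  return (
    last.x >= region.x && last.x <= region.x + region.w &&
    last.y >= region.y && last.y <= region.y + region.h
  );
}
```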
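Claims 13 and 14 cover the case where the target virtual object sits inside several trigger ranges at once: one target region is shown per interaction event, and the slide that ends inside a region triggers that event's second action. The sketch below is self-contained and purely illustrative; the layout and names are assumptions.

```typescript
// Illustrative sketch for claims 13 and 14; names and layout are assumptions.

interface Point { x: number; y: number }
interface TargetRegion { eventId: string; x: number; y: number; w: number; h: number }

/** Lay out one target region per interaction event the object currently overlaps. */
function showTargetRegions(eventIds: string[]): TargetRegion[] {
  // A real UI would place the regions around the action button; simple
  // vertical stacking is used here only for illustration.
  return eventIds.map((id, i) => ({ eventId: id, x: 0, y: i * 80, w: 72, h: 72 }));
}

/** Return the event whose region the slide ended in, or null if it ended elsewhere. */
function resolveSlideEnd(end: Point, regions: TargetRegion[]): string | null {
  const hit = regions.find(
    (r) => end.x >= r.x && end.x <= r.x + r.w && end.y >= r.y && end.y <= r.y + r.h,
  );
  return hit ? hit.eventId : null; // caller triggers the second action for this event
}
```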
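Claims 15 to 17 add a third touch operation that ends the ongoing interaction, with the concrete exit depending on what the event targets: jumping off a virtual zipline, jumping away from a climbing region, or ending the interaction with an interactive virtual object. A small sketch under those assumptions:

```typescript
// Sketch for claims 15-17; the event kinds and returned strings are
// illustrative assumptions, not disclosed behaviour.

type OngoingInteraction =
  | { kind: "prop"; prop: string }     // e.g. "zipline"
  | { kind: "region"; region: string } // e.g. "climbing"
  | { kind: "object"; objectId: string };

function onThirdTouchOperation(current: OngoingInteraction | null): string {
  if (!current) return "no interaction to end";
  switch (current.kind) {
    case "prop":
      // Virtual zipline: the target virtual object jumps off the zipline (claim 17).
      return current.prop === "zipline" ? "jump off the virtual zipline" : "end prop interaction";
    case "region":
      // Climbing region: the target virtual object jumps away from the region (claim 17).
      return current.region === "climbing" ? "jump away from the climbing region" : "end region interaction";
    case "object":
      // Interactive virtual object: end the interaction action (claim 16).
      return "end interaction with the interactive virtual object";
  }
}
```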
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21815878.0A EP4026595A4 (en) | 2020-11-13 | 2021-10-12 | CONTROL METHOD AND DEVICE FOR VIRTUAL OBJECTS, STORAGE MEDIUM AND ELECTRONIC DEVICE |
BR112022001065A BR112022001065A2 (pt) | 2020-11-13 | 2021-10-12 | Método e aparelho de controle de objeto virtual, meio de armazenamento e dispositivo eletrônico |
CA3146804A CA3146804A1 (en) | 2020-11-13 | 2021-10-12 | Virtual object control method and apparatus, storage medium, and electronic device |
JP2022514177A JP7418554B2 (ja) | 2020-11-13 | 2021-10-12 | 仮想オブジェクトの制御方法及び装置、電子機器並びにコンピュータプログラム |
KR1020227002073A KR102721446B1 (ko) | 2020-11-13 | 2021-10-12 | 가상 객체 제어 방법 및 장치, 저장 매체 및 전자 기기 |
AU2021307015A AU2021307015B2 (en) | 2020-11-13 | 2021-10-12 | Virtual object control method and apparatus, storage medium, and electronic device |
US17/585,331 US12090404B2 (en) | 2020-11-13 | 2022-01-26 | Virtual object control method and apparatus, storage medium, and electronic device |
SA522431616A SA522431616B1 (ar) | 2020-11-13 | 2022-02-07 | طريقة وجهاز للتحكم في جسم افتراضي، ووسط تخزين، وجهاز إلكتروني |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011270984.7 | 2020-11-13 | ||
CN202011270984.7A CN112245918B (zh) | 2020-11-13 | 2020-11-13 | 虚拟角色的控制方法和装置、存储介质及电子设备 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/585,331 Continuation US12090404B2 (en) | 2020-11-13 | 2022-01-26 | Virtual object control method and apparatus, storage medium, and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022100339A1 true WO2022100339A1 (zh) | 2022-05-19 |
Family
ID=74265721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/123270 WO2022100339A1 (zh) | 2020-11-13 | 2021-10-12 | 虚拟对象的控制方法和装置、存储介质及电子设备 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN112245918B (zh) |
TW (1) | TW202218723A (zh) |
WO (1) | WO2022100339A1 (zh) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA3146804A1 (en) | 2020-11-13 | 2022-05-13 | Tencent Technology (Shenzhen) Company Limited | Virtual object control method and apparatus, storage medium, and electronic device |
CN112245918B (zh) * | 2020-11-13 | 2023-01-31 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法和装置、存储介质及电子设备 |
CN113082688B (zh) * | 2021-03-31 | 2024-02-13 | 网易(杭州)网络有限公司 | 游戏中虚拟角色的控制方法、装置、存储介质及设备 |
CN113325951B (zh) * | 2021-05-27 | 2024-03-29 | 百度在线网络技术(北京)有限公司 | 基于虚拟角色的操作控制方法、装置、设备以及存储介质 |
CN113318430B (zh) * | 2021-05-28 | 2024-08-20 | 网易(杭州)网络有限公司 | 虚拟角色的姿态调整方法、装置、处理器及电子装置 |
CN113559516B (zh) * | 2021-07-30 | 2023-07-14 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法和装置、存储介质及电子设备 |
CN113891138B (zh) * | 2021-09-27 | 2024-05-14 | 深圳市腾讯信息技术有限公司 | 互动操作提示方法和装置、存储介质及电子设备 |
CN113975803B (zh) * | 2021-10-28 | 2023-08-25 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法和装置、存储介质及电子设备 |
CN113986079B (zh) * | 2021-10-28 | 2023-07-14 | 腾讯科技(深圳)有限公司 | 虚拟按钮的设置方法和装置、存储介质及电子设备 |
CN113893527B (zh) * | 2021-11-01 | 2023-07-14 | 北京字跳网络技术有限公司 | 一种交互控制方法、装置、电子设备及存储介质 |
CN113975798B (zh) * | 2021-11-09 | 2023-07-04 | 北京字跳网络技术有限公司 | 一种交互控制方法、装置以及计算机存储介质 |
CN118113179A (zh) * | 2022-11-23 | 2024-05-31 | 腾讯科技(深圳)有限公司 | 虚拟对象的互动方法和装置、存储介质及电子设备 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4258850B2 (ja) * | 2004-12-28 | 2009-04-30 | 株式会社セガ | 画像処理装置およびその方法 |
CN108509139B (zh) * | 2018-03-30 | 2019-09-10 | 腾讯科技(深圳)有限公司 | 虚拟对象的移动控制方法、装置、电子装置及存储介质 |
CN111773681B (zh) * | 2020-08-03 | 2024-07-09 | 网易(杭州)网络有限公司 | 控制虚拟游戏角色的方法及装置 |
- 2020
  - 2020-11-13 CN CN202011270984.7A patent/CN112245918B/zh active Active
- 2021
  - 2021-10-12 WO PCT/CN2021/123270 patent/WO2022100339A1/zh active Application Filing
  - 2021-10-25 TW TW110139560A patent/TW202218723A/zh unknown
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160199728A1 (en) * | 2015-01-08 | 2016-07-14 | LINE Plus Corporation | Game methods for controlling game using virtual buttons and systems for performing the same |
CN110013671A (zh) * | 2019-05-05 | 2019-07-16 | 腾讯科技(深圳)有限公司 | 动作执行方法和装置、存储介质及电子装置 |
CN110215691A (zh) * | 2019-07-17 | 2019-09-10 | 网易(杭州)网络有限公司 | 一种游戏中虚拟角色的移动控制方法及装置 |
CN110270086A (zh) * | 2019-07-17 | 2019-09-24 | 网易(杭州)网络有限公司 | 一种游戏中虚拟角色的移动控制方法及装置 |
CN112245918A (zh) * | 2020-11-13 | 2021-01-22 | 腾讯科技(深圳)有限公司 | 虚拟角色的控制方法和装置、存储介质及电子设备 |
Non-Patent Citations (1)
Title |
---|
DAN WANG YI QIE YOU HE FANG [FORGET EVERYTHING]: "Easy Way to Rope Jump in Apex", 18 May 2019 (2019-05-18), CN, XP055924489, Retrieved from the Internet <URL:https://tieba.baidu.com/p/6135169914> * |
Also Published As
Publication number | Publication date |
---|---|
CN112245918A (zh) | 2021-01-22 |
CN112245918B (zh) | 2023-01-31 |
TW202218723A (zh) | 2022-05-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022100339A1 (zh) | 虚拟对象的控制方法和装置、存储介质及电子设备 | |
CN105159687B (zh) | 一种信息处理方法、终端及计算机存储介质 | |
US10695674B2 (en) | Information processing method and apparatus, storage medium and electronic device | |
CN112370781B (zh) | 操作控制方法和装置、存储介质及电子设备 | |
JP6637589B2 (ja) | 情報処理方法、端末、及びコンピュータ記憶媒体 | |
CN105148517B (zh) | 一种信息处理方法、终端及计算机存储介质 | |
CN109557998B (zh) | 信息交互方法、装置、存储介质和电子装置 | |
EP3385830B1 (en) | Data transmission method and device | |
AU2021307015B2 (en) | Virtual object control method and apparatus, storage medium, and electronic device | |
WO2017161904A1 (zh) | 壁纸图片的显示方法和装置 | |
CN105335064A (zh) | 一种信息处理方法、终端和计算机存储介质 | |
CN105915766B (zh) | 基于虚拟现实的控制方法和装置 | |
KR102700298B1 (ko) | 가상 객체 제어 방법 및 장치, 저장 매체, 및 전자 디바이스 | |
CN112370780B (zh) | 虚拟控件的显示方法和装置、存储介质及电子设备 | |
WO2023035812A1 (zh) | 虚拟角色的控制方法和装置、存储介质及电子设备 | |
CN111111219B (zh) | 虚拟道具的控制方法和装置、存储介质及电子装置 | |
KR102721446B1 (ko) | 가상 객체 제어 방법 및 장치, 저장 매체 및 전자 기기 | |
CN115350472A (zh) | 一种云游戏控件配置方法、装置、设备和存储介质 | |
CN113813599A (zh) | 虚拟角色的控制方法和装置、存储介质及电子设备 | |
WO2024041098A1 (zh) | 虚拟角色状态的设置方法和装置、存储介质及电子设备 | |
CN105630402B (zh) | 图像数据存储位置的变更方法和变更装置 | |
WO2024207873A1 (zh) | 虚拟场景的交互方法、装置、电子设备、计算机可读存储介质及计算机程序产品 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2021815878 Country of ref document: EP Effective date: 20211209 |
|
ENP | Entry into the national phase |
Ref document number: 2022514177 Country of ref document: JP Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112022001065 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 2021307015 Country of ref document: AU Date of ref document: 20211012 Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 112022001065 Country of ref document: BR Kind code of ref document: A2 Effective date: 20220119 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 522431616 Country of ref document: SA |