WO2023165184A1 - Method and apparatus for executing a control operation, storage medium, and electronic device - Google Patents

Method and apparatus for executing a control operation, storage medium, and electronic device

Info

Publication number
WO2023165184A1
Authority
WO
WIPO (PCT)
Prior art keywords: control, touch, response, target, target control
Application number
PCT/CN2022/134889
Other languages
English (en)
French (fr)
Inventor
何方 (He Fang)
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority to US18/215,211 (published as US20230342021A1)
Publication of WO2023165184A1

Classifications

    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A63F13/214: Input arrangements for video game devices characterised by their sensors, purposes or types, for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/22: Setup operations, e.g. calibration, key configuration or button assignment
    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/537: Controlling the output signals based on the game progress, involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/04842: Selection of displayed objects or displayed text elements
    • G06F3/0485: Scrolling or panning
    • A63F2300/1018: Calibration; key and button assignment
    • A63F2300/1068: Input arrangements for converting player-generated signals into game device control signals, specially adapted to detect the point of contact of the player on a surface, e.g. floor mat or touch pad
    • G06F2203/04808: Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously, e.g. using several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to the field of computers, and in particular to a method and apparatus for executing a control operation, a storage medium, and an electronic device.
  • In a virtual game scene, the controlled virtual character is usually operated through controls with different control functions arranged on the display interface, so that the character can complete the game tasks set in the virtual game scene.
  • Touch operations on the various controls of the display interface are often multi-touch operations; that is, multiple fingers are often detected touching the screen of the mobile phone at the same time.
  • For example, a steering control is used to adjust the orientation of the controlled virtual character to the target orientation selected by a touch adjustment operation on the control.
  • When a multi-touch operation is detected on the steering control, the steering speeds superpose or cancel each other: 1) if multiple fingers slide in different directions, the adjustment of the controlled virtual character's orientation fails due to the conflict; 2) if multiple fingers slide in the same direction, the controlled virtual character's orientation is adjusted too fast.
  • In other words, current multi-point-triggered control handling easily triggers control operations that were not intended to be executed, producing response results that cannot meet the user's real needs.
  • The multi-point-triggered control operations in the related art thus force the user to make multiple attempts before the control operation that is really intended is triggered, resulting in a complicated control operation process and slow response efficiency.
  • a method and device for executing a control operation, a storage medium, and an electronic device are provided.
  • A method for executing a control operation, performed by an electronic device, includes: acquiring a touch event, where the touch event carries touch operation information of at least two touch points, the touch operation information including the operation positions of the at least two touch points; when the operation positions match the control position of at least one object control in the display interface, determining a response priority label corresponding to the at least one object control; and, according to the response priority label, determining a target control from the at least one object control and executing the target control operation indicated by the target control (see the sketch after this item).
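The following is a minimal, self-contained sketch of this priority-based dispatch flow. The type names (TouchPoint, ObjectControl), the rectangle hit test, and the convention that a smaller label means a higher response priority are illustrative assumptions drawn from the examples later in the text, not definitions from the patent itself.

```cpp
#include <algorithm>
#include <iostream>
#include <optional>
#include <string>
#include <vector>

// Illustrative types; the patent does not prescribe concrete structures.
struct TouchPoint { float x, y; };

struct ObjectControl {
    std::string name;
    float x, y, w, h;   // control position and size in the display interface
    int priorityLabel;  // response priority label (assumed: smaller = higher)
    bool contains(const TouchPoint& p) const {
        return p.x >= x && p.x <= x + w && p.y >= y && p.y <= y + h;
    }
};

// Match each touch point of the touch event against the control positions,
// then pick the matched control whose response priority label ranks highest.
std::optional<ObjectControl> dispatch(const std::vector<TouchPoint>& touches,
                                      const std::vector<ObjectControl>& controls) {
    std::vector<ObjectControl> matched;
    for (const auto& t : touches)
        for (const auto& c : controls)
            if (c.contains(t)) matched.push_back(c);
    if (matched.empty()) return std::nullopt;
    return *std::min_element(matched.begin(), matched.end(),
        [](const ObjectControl& a, const ObjectControl& b) {
            return a.priorityLabel < b.priorityLabel;
        });
}

int main() {
    std::vector<ObjectControl> controls = {
        {"attack control", 0, 100, 80, 80, 2},
        {"non-control display area", 300, 0, 500, 400, 1},
    };
    std::vector<TouchPoint> touches = {{40, 120}, {540, 150}};
    if (auto target = dispatch(touches, controls))
        std::cout << "execute target control operation of: " << target->name << "\n";
}
```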
  • An apparatus for executing a control operation includes: an acquisition unit, configured to acquire a touch event, where the touch event carries touch operation information of at least two touch points, the touch operation information including the operation positions of the at least two touch points; a determining unit, configured to determine a response priority label corresponding to at least one object control when the operation positions match the control position of the at least one object control in the display interface; and an execution unit, configured to determine a target control from the at least one object control according to the response priority label, and to execute the target control operation indicated by the target control.
  • A computer-readable storage medium stores computer-readable instructions, where the computer-readable instructions are configured, when run, to perform the above method for executing a control operation.
  • a computer program product includes computer-readable instructions, and the computer-readable instructions are stored in a computer-readable storage medium.
  • The processor of the computer device reads the computer-readable instructions from the computer-readable storage medium and executes them, so that the computer device performs the above method for executing a control operation.
  • An electronic device includes a memory and a processor; the memory stores computer-readable instructions, and the processor is configured to perform the above method for executing a control operation by executing the computer-readable instructions.
  • FIG. 1 is a schematic diagram of a hardware environment of an optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of an optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of an optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 9 is a flowchart of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 11 is a flowchart of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 12 is a flowchart of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 13 is a flowchart of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 14 is a flowchart of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 15 is a flowchart of another optional method for executing a control operation according to an embodiment of the present invention.
  • FIG. 16 is a schematic structural diagram of an optional apparatus for executing a control operation according to an embodiment of the present invention.
  • FIG. 17 is a schematic structural diagram of an optional electronic device according to an embodiment of the present invention.
  • a method for executing a control operation is provided.
  • the above method for executing a control operation may be applied to, but not limited to, the hardware environment shown in FIG. 1
  • FIG. 1 shows a control operation execution system, which may include, but is not limited to, a terminal device 102, a network 104, a server 106, and a database 108.
  • a target client is running in the terminal device 102 (as shown in FIG. 1 , the target client is a shooting game application client as an example).
  • the terminal device 102 includes a human-computer interaction screen, a processor and a memory.
  • The human-computer interaction screen is used to display a virtual game scene (the virtual shooting game scene shown in FIG. 1) and to provide a human-computer interaction interface for receiving human-computer interaction operations that control the controlled virtual object in the virtual scene; the controlled virtual object completes the game tasks set in the virtual scene.
  • the processor is configured to generate an interaction instruction in response to the above human-computer interaction operation, and send the interaction instruction to the server.
  • the memory is used to store related attribute data, such as object attribute information of the controlled virtual object, and held prop attribute information.
  • a processing engine is included in the server 106 for performing store or read operations on the database 108 .
  • the processing engine reads from the database 108 the respective positions of each virtual object and the aiming and shooting information of the shooting props used.
  • A touch event is acquired in the terminal device 102, where the touch event carries touch operation information of at least two touch points; when the touch operation information indicates that the operation positions of the at least two touch points match the control position of at least one object control in the display interface, a response priority label corresponding to the at least one object control is determined; according to the response priority label, the target control is determined from the object controls, and the target control operation indicated by the target control is performed.
  • Then step S108 is executed: the terminal device 102 sends the target control operation result information to the server 106 through the network 104, where the target control operation result information indicates the operation result of the target control operation.
  • the server 106 will execute step S110.
  • the server 106 calculates the scene response information based on the target control operation result information. It can be understood that the scene response information is the response information determined according to the target control operation result information combined with the virtual scene. Then in step S112 , the server 106 sends the scene response information to the terminal device 102 through the network 104 .
  • step S110 may also be completed by the terminal device 102 .
  • The above terminal device may be a terminal device configured with the target client, including but not limited to at least one of the following: mobile phones (such as Android phones and iOS phones), notebook computers, tablet computers, handheld computers, MIDs (Mobile Internet Devices), PADs, desktop computers, and smart TVs.
  • the target client can be a video client, an instant messaging client, a browser client, an education client, and other clients that support the provision of shooting game tasks.
  • The above network may include, but is not limited to, a wired network or a wireless network, where the wired network includes a local area network, a metropolitan area network, and a wide area network, and the wireless network includes Bluetooth, Wi-Fi, and other networks that implement wireless communication.
  • the above server may be a single server, or a server cluster composed of multiple servers, or a cloud server. The foregoing is only an example, and no limitation is set in this embodiment.
  • The above method for executing a control operation can be applied, but is not limited, to a game-type terminal application (APP) in which a predetermined adversarial game task is completed in a virtual scene, such as a shooting game application within a multiplayer online battle arena (MOBA) application, where the adversarial game task may be, but is not limited to, a game task completed through adversarial interaction between a virtual object controlled by the current player through human-computer interaction operations and virtual objects controlled by other players.
  • The adversarial game tasks can run in the form of plug-ins or applets within applications (such as non-standalone game APPs), or in game engines (such as standalone game APPs).
  • The types of the aforementioned game applications may include, but are not limited to, at least one of the following: two-dimensional (2D) game applications, three-dimensional (3D) game applications, virtual reality (VR) game applications, augmented reality (AR) game applications, and mixed reality (MR) game applications.
  • In the embodiments of the present application, a touch event is obtained; when the touch operation information indicates that the respective operation positions of at least two touch points match the control position of at least one object control in the display interface, a response priority label corresponding to the at least one object control is determined; according to the response priority label, the target control is determined from the object controls and the target control operation indicated by the target control is executed. In this way, when multiple pieces of touch information contained in a touch event are detected, the control operation that is finally executed is determined according to the matching relationship between the touch event and the object controls, solving the technical problem that existing control operation execution is highly complex.
  • the method for performing the above control operation may be performed by an electronic device.
  • the method is applied to a terminal as an example for illustration, including the following steps:
  • the above-mentioned touch event may be a touch operation event received by the mobile terminal and applied to the display interface, such as a click operation event, a long press operation event, a drag operation event, a double click operation event, and the like.
  • the touch operation information may include, but not limited to, event marker information corresponding to the touch event, touch position information, touch pressing force information, touch pressing time information, and the like.
  • For example, the touch operation information of a long-press touch operation event may include: touch event A (the event marker information), (40px, 120px) (the position of the event on the operation interface), 5N (the touch pressing force information), and 0.1s (the touch pressing time information).
  • The operation position refers to the position where the touch point acts in the display interface; it can be obtained from the touch position information in the touch operation information, so as to determine the area touched by the touch point in the display interface. The specific content of the above touch event and touch operation information is only an example; the types of touch operation events and of touch operation information are not limited here.
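A minimal sketch of how this touch operation information could be laid out follows; the field names are assumptions for the sketch, not defined by the patent.

```cpp
#include <string>

// Illustrative layout of the touch operation information described above.
struct TouchOperationInfo {
    std::string eventMark;  // event marker information, e.g. "touch event A"
    float x = 0, y = 0;     // touch position on the display interface, in px
    float pressure = 0;     // touch pressing force, in newtons
    float pressTime = 0;    // touch pressing duration, in seconds
};

int main() {
    // The long-press example from the text: touch event A at (40px, 120px),
    // pressed with 5 N for 0.1 s.
    TouchOperationInfo info{"touch event A", 40, 120, 5.0f, 0.1f};
    (void)info;
}
```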
  • the above-mentioned touch events may also include touch operation events of multiple touch points.
  • For example, the electronic device detects a long-press operation on the control 301 and, at the same time, a leftward sliding operation of the right finger on the display interface; both of these touch operation events can be regarded as touch events in this embodiment.
  • the object control may be an operation control for triggering various control operations in the display interface, and may include but not limited to attack control, movement control, aiming control, reload control, and skill control.
  • The object controls may also include display areas in the display interface other than the above operation controls. In this embodiment, performing a touch operation on a display area outside the controls can also produce a corresponding control effect; for example, when a screen-sliding operation is detected in a non-control display area, the viewing angle of the virtual character in the display interface can be controlled to change correspondingly.
  • FIG. 3 shows the scene picture from the virtual perspective of the virtual character; in the virtual scene shown, there is a virtual wardrobe in the left corner. Assume the right hand in FIG. 3 presses touch point B in the non-control display area and swipes to the left; the display interface then becomes as shown in FIG. 4.
  • The virtual angle of view of the virtual character has changed; that is, the angle of view has correspondingly moved to the right, and another virtual wardrobe in the right corner of the virtual scene is displayed.
  • In this embodiment, a response priority label is pre-configured for each object control to indicate the response priority of touch operations acting on that control.
  • The response priority labels of different controls can be the same or different, and their configuration principles can be determined according to actual needs.
  • For example, when the electronic device detects that the player is controlling the virtual character to perform a "virtual pathfinding" task in the virtual scene, the player should be given a sensitive perspective-steering control effect, so the non-control display area can be given response priority.
  • That is, the response priority of each operation control is lower than that of the non-control display area.
  • The electronic device then responds preferentially to the "swipe screen" operation used to switch the virtual character's perspective, improving the response effect of perspective turning while the player controls the virtual character to perform the "virtual pathfinding" task.
  • the response priority of the "shooting control” can be configured as "1”
  • the response priority of the display area of the non-operation control can be configured as a value greater than "1”
  • the priority of the response is the highest, and the response priority of the touch operation in the display area of the non-operating control is lower than that of the display area of the shooting.
  • Then, when the player controls the virtual character to perform an "aim and shoot" task and triggers multiple touch operations, the electronic device responds to the touch operation on the "shooting control" first and delays the response to the "swipe screen" operation used to switch the virtual character's perspective, improving the shooting response effect while the player controls the virtual character to perform the shooting task. A sketch of such scene-dependent configuration follows.
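This sketch shows one way the scene-dependent priority tables described above could be expressed (assumed convention: smaller label = higher priority). The task names and table entries come from the examples in the text; the function itself is illustrative.

```cpp
#include <map>
#include <string>

using PriorityTable = std::map<std::string, int>;

PriorityTable priorityFor(const std::string& task) {
    if (task == "virtual pathfinding") {
        // Perspective turning should feel sensitive: the non-control display
        // area (swipe-to-turn) outranks the operation controls.
        return {{"non-control display area", 1},
                {"shooting control", 2},
                {"movement control", 3}};
    }
    if (task == "aim and shoot") {
        // Shooting responses come first; swipe-to-turn lags behind.
        return {{"shooting control", 1},
                {"non-control display area", 2},
                {"movement control", 3}};
    }
    return {};
}

int main() {
    PriorityTable table = priorityFor("aim and shoot");
    (void)table;
}
```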
  • S206 Determine the target control from at least one object control according to the response priority label, and execute the target control operation indicated by the target control.
  • The target control is the object control, determined according to the response priority labels, that needs to respond; for example, it may be the object control with the highest response priority.
  • The target control operation is the control operation to be executed, as triggered by the corresponding target control. The above method is described in detail below with reference to FIG. 3 and FIG. 4.
  • FIG. 3 is a schematic diagram of a game interface in which a player controls a virtual character to perform a shooting task in a virtual scene.
  • a plurality of operation controls for controlling the virtual character and a perspective scene picture corresponding to the current virtual character are shown.
  • a virtual wardrobe is displayed in the left corner of the virtual room.
  • The interface shows an attack control 301, used to control the virtual character to perform an attack operation; a movement control 302, used to control the virtual character to move; a skill control 303, used to control the virtual character to use virtual skills; a shooting control 304, used to control the virtual character to perform another attack operation; a reload control 305, used to control the virtual character to change magazines; and a scope control 306, used to control the virtual character to aim.
  • Assume the virtual character in the game interface is in the "virtual pathfinding" task state; the response priority of each of the above controls is then determined as follows:
  • the response priority of the display area of the non-operation control is "1”
  • the response priority of the attack control 301 is "2”
  • the response priority of the mobile control 302 is "3”
  • the response priority of the skill control 303 is "4"
  • the response priority of the shooting control 304 is "2”
  • the response priority of the reload control 305 is "5"
  • the response priority of the scope control 306 is "6".
  • Assume that the trigger events obtained in this interface include the touch operation information of touch point A and of touch point B. The touch operation of touch point A is a click operation, and its touch operation information includes: label information: touch event A; touch position: (40px, 120px); touch pressing force: 5N; touch duration: 0.1s. The touch operation of touch point B is a left-swipe operation, and its touch operation information includes: label information: touch event B; touch position: (540px, 150px); touch pressing force: 5N; touch duration: 0.5s.
  • In this case, the electronic device can determine from the touch operation information of touch point A that touch point A acts on the attack control 301, and can determine from the touch operation information of touch point B that touch point B acts on the non-control display area. Then, according to the response priority "2" of the attack control 301 and the response priority "1" of the non-control display area, the electronic device determines that the target control is the non-control display area; that is, it executes the touch response operation corresponding to touch point B.
  • Since the operation effect of "swiping to the left" on the non-control display area is "switching the viewing angle to the right", the display interface becomes as shown in FIG. 4: the electronic device displays the scene picture with the viewing angle switched to the right relative to the original viewing angle. That is, the figure shows not only the virtual wardrobe in the left corner of the room but also, owing to the rightward switch of the viewing angle, the virtual wardrobe in the right corner of the virtual room. The execution effect of the target control operation is thus achieved.
  • For another example, assume the trigger events acquired in this interface include the touch operation information of touch point C and of touch point B. The touch operation of touch point C is a left-swipe operation, and its touch operation information includes: label information: touch event C; touch position: (40px, 40px); touch pressing force: 5N; touch duration: 0.5s. The touch operation of touch point B is a left-swipe operation, and its touch operation information includes: label information: touch event B; touch position: (540px, 150px); touch pressing force: 5N; touch duration: 0.5s.
  • In this case, the electronic device can determine from the touch operation information of touch point C that touch point C acts on the movement control 302, whose effect is to control the virtual character to move to the left, and can determine from the touch operation information of touch point B that touch point B acts on the non-control display area. Then, according to the response priority "3" of the movement control 302 and the response priority "1" of the non-control display area, the electronic device determines that the target control is the non-control display area; that is, it executes the touch response operation corresponding to touch point B.
  • Since the operation effect of "swiping to the left" on the non-control display area is "switching the viewing angle to the right", the display interface becomes as shown in FIG. 4: the electronic device displays the scene picture with the viewing angle switched to the right relative to the original viewing angle. That is, the figure shows not only the virtual wardrobe in the left corner of the room but also, owing to the rightward switch of the viewing angle, the virtual wardrobe in the right corner of the virtual room. The execution effect of the target control operation is thus achieved.
  • If instead the electronic device determines that the target control is the movement control 302, it executes the control operation corresponding to the movement control 302. As shown in FIG. 6, the virtual character has moved a certain distance to the left from the original position, but the viewing direction of the virtual character does not change; therefore, no other viewing angle of the virtual scene is shown in FIG. 6.
  • Through the embodiments of the present application, the electronic device acquires a touch event; when the touch operation information indicates that the operation positions of at least two touch points match the control position of at least one object control in the display interface, it determines a response priority label corresponding to the at least one object control; according to the response priority label, it determines the target control from the object controls and executes the target control operation indicated by the target control. In this way, when multiple touch events are detected, the control operation that is finally executed is determined according to the matching relationship between the touch events and the object controls, solving the technical problem that existing control operation execution is highly complex.
  • determining the target control from at least one object control according to the response priority label above, and executing the target control operation indicated by the target control includes:
  • When the operation positions of the at least two touch points match the control positions of multiple object controls, that is, when the touch event acts on at least two object controls, the target control that needs to respond must be determined from among those object controls.
  • The above priority labels can be configured for each object control before the game starts, configured in the game according to the specific scene, configured in the game according to the player's setting operations, or configured according to the real-time game scene.
  • For example, assume the electronic device acquires a touch event containing three touch operations: touch operation D, touch operation E, and touch operation F, whose touch points act respectively on the non-control display area (priority "1"), the movement control 302 (priority "3"), and the scope control 306 (priority "4").
  • The response sorting result of the object controls corresponding to touch operations D, E, and F is then: the non-control display area (priority "1"), the movement control 302 (priority "3"), and the scope control 306 (priority "4").
  • The electronic device thus determines that the non-control display area is the target control, executes the operation indicated by touch operation D acting on that area, and then displays the scene after the corresponding viewing-angle switch.
  • Through the embodiments of the present application, the response order of the object controls is sorted according to the response priority labels to obtain a sorting result; when the sorting result indicates that a single object control has the highest response priority, that object control is determined as the target control and the target control operation it indicates is executed. When multiple touch operations exist in the current display interface, the control operation to be finally executed is thus determined according to the response priorities of the object controls corresponding to those touch operations, avoiding confused control responses in the presence of multiple touch operations and solving the technical problem that existing control operation execution is highly complex. A sorting sketch follows.
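A minimal sketch of sorting the response order by response priority label, assuming (as in the examples above) that a smaller label means a higher priority:

```cpp
#include <algorithm>
#include <string>
#include <vector>

struct MatchedControl {
    std::string name;
    int priorityLabel;
};

// Sort the matched object controls into their response order.
std::vector<MatchedControl> sortByPriority(std::vector<MatchedControl> matched) {
    std::stable_sort(matched.begin(), matched.end(),
        [](const MatchedControl& a, const MatchedControl& b) {
            return a.priorityLabel < b.priorityLabel;
        });
    return matched;  // front() is the target control when it is unique
}

int main() {
    // Touch operations D, E, and F from the example above.
    std::vector<MatchedControl> sorted = sortByPriority(
        {{"movement control 302", 3},
         {"non-control display area", 1},
         {"scope control 306", 4}});
    // sorted.front().name == "non-control display area"
    (void)sorted;
}
```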
  • the above further includes:
  • For example, assume touch point I acts on the attack control 301; the action time is 6:00.00 minutes after the game starts and the duration is 0.5 seconds, so it lasts until 6:00.50 minutes after the game starts. Touch point J acts on the shooting control 304; the action time is 6:00.10 minutes after the game starts and the duration is 0.3 seconds, so it lasts until 6:00.40 minutes after the game starts.
  • In other words, the touch operation of touch point I persists for a period of time, and another touch operation is detected within that period.
  • the electronic device determines the target control according to the priorities of the two controls corresponding to the touch point I and the touch point J.
  • Continuing the assumption, the response priority of the attack control 301 is "2" and the response priority of the shooting control 304 is "2"; the two controls therefore have the same priority.
  • In this case, the electronic device determines that the object control with the earliest control operation time is the attack control 301, and then executes the control operation corresponding to the attack control 301.
  • Assume the attack mode corresponding to the attack control 301 is sniper shooting and the attack mode corresponding to the shooting control 304 is shotgun shooting; the final control result is then to control the virtual character to perform sniper shooting.
  • Through the embodiments of the present application, when at least two object controls have the same response priority, the electronic device determines the control operation time of each of them, determines the object control with the earliest control operation time as the target control, and executes the target control operation indicated by that control, avoiding confused control responses in the case of multiple touch operations and solving the technical problem that existing control operation execution is highly complex. A sketch of this two-level selection follows.
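This sketch combines the two selection rules described above: choose by response priority label first, and break ties by the earliest control operation time. The field names and millisecond timestamps are illustrative assumptions.

```cpp
#include <algorithm>
#include <string>
#include <vector>

struct Candidate {
    std::string name;
    int priorityLabel;      // assumed: smaller = higher priority
    long long pressTimeMs;  // control operation time since game start
};

Candidate pickTarget(const std::vector<Candidate>& cands) {
    return *std::min_element(cands.begin(), cands.end(),
        [](const Candidate& a, const Candidate& b) {
            if (a.priorityLabel != b.priorityLabel)
                return a.priorityLabel < b.priorityLabel;
            return a.pressTimeMs < b.pressTimeMs;  // earliest wins the tie
        });
}

int main() {
    // Touch points I and J from the example: both controls have priority
    // "2"; the attack control is pressed first (6:00.00 vs 6:00.10).
    Candidate target = pickTarget({{"attack control 301", 2, 360000},
                                   {"shooting control 304", 2, 360100}});
    // target.name == "attack control 301" -> perform sniper shooting
    (void)target;
}
```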
  • the above-mentioned determining the target control from at least one object control according to the response priority label, and executing the target control operation indicated by the target control includes:
  • For example, assume the touch operations corresponding to touch point F and touch point G act on the non-control display area with the operation mode "swipe left", and the touch operation of touch point H acts on the movement control 302, also with the operation mode "swipe left".
  • Since the response priority of the non-control display area is "1" and the response priority of the movement control 302 is "3", the electronic device needs to further determine the operation corresponding to the target control from the two operations acting on the non-control display area.
  • Assume the operation time corresponding to touch point F is 5:00.00 minutes after the game starts with a duration of 0.5 seconds, and the operation time corresponding to touch point G is 5:00.15 minutes after the game starts with a duration of 0.5 seconds. The operation corresponding to touch point F is then determined to be the earliest, so the control operation corresponding to the area of touch point F is executed; that is, the virtual perspective of the game scene is switched to the right accordingly.
  • Through the embodiments of the present application, the electronic device determines one object control as the target control; it then determines the touch time of each touch point acting on the target control, determines the operation triggered by the touch point with the earliest touch time as the target control operation to be executed, and executes that operation. The control operation to be finally executed in the display interface is thus determined according to the order of the touch times of the touch points, avoiding confused control responses in the case of multiple touch operations.
  • the above further includes:
  • Sorting the response order of the object controls according to the updated response priority label to obtain an updated sorting result
  • As shown in FIG. 8, there are three touch operations in the display interface of the electronic device, corresponding to touch point F, touch point G, and touch point H. Touch points F and G act on the non-control display area with the operation mode "swipe left", and touch point H acts on the movement control 302, also with the operation mode "swipe left".
  • Since the response priority of the non-control display area is "1" and the response priority of the movement control 302 is "3", the sorting result is: the non-control display area and the non-control display area (tied for first place), followed by the movement control 302. The electronic device therefore further determines the operation corresponding to the target control from the two operations acting on the non-control display area.
  • As above, the electronic device determines that the operation corresponding to touch point F is the operation to be executed, and executes the control operation for the area corresponding to touch point F. Now assume the touch operation of touch point F ends; the remaining touch points in the display interface are touch point G and touch point H.
  • In this case, following the method described above, the electronic device can further determine, from the controls corresponding to touch point G and touch point H and according to their priority labels, that the target control is the non-control display area corresponding to touch point G, and execute the operation corresponding to touch point G, that is, continue to control the virtual character in the virtual scene to switch the viewing angle to the right.
  • Through the embodiments of the present application, when the number of the at least two touch points changes, the electronic device obtains updated response priority labels, sorts the response order of the object controls according to the updated labels to obtain an updated sorting result, and determines the target control operation to be executed according to the updated sorting result. When the number of touch points changes, the next control operation to be executed is thus determined dynamically from the updated response priority labels and sorting results, avoiding confused control responses in the case of multiple touch operations and solving the technical problem that existing control operation execution is highly complex. A sketch follows.
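This sketch shows one way the target could be re-determined after the number of touch points changes (e.g. touch point F is released): the remaining entries are re-sorted by response priority label and the new head becomes the next target control operation. The container layout is illustrative.

```cpp
#include <algorithm>
#include <map>
#include <string>

struct Entry { std::string control; int priorityLabel; };
using FingerIndex = int;

// Pick the next target from the touch points that remain on screen.
std::string nextTarget(const std::map<FingerIndex, Entry>& remaining) {
    auto it = std::min_element(remaining.begin(), remaining.end(),
        [](const auto& a, const auto& b) {
            return a.second.priorityLabel < b.second.priorityLabel;
        });
    return it == remaining.end() ? std::string{} : it->second.control;
}

int main() {
    // After F is released, G (non-control area, "1") and H (movement
    // control, "3") remain; G's area becomes the next target.
    std::map<FingerIndex, Entry> remaining = {
        {1, {"non-control display area (touch point G)", 1}},
        {2, {"movement control 302 (touch point H)", 3}},
    };
    std::string next = nextTarget(remaining);
    (void)next;
}
```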
  • the touch event includes at least one of a click operation event, a long press operation event, a drag operation event or a double click operation event.
  • The click operation event refers to an operation event in which the user clicks a touch point on the display interface; the long-press operation event refers to an operation event in which the user long-presses a touch point on the display interface; the drag operation event refers to an operation event in which the user presses a touch point on the display interface and drags it; and the double-click operation event refers to an operation event in which the user double-clicks a touch point on the display interface.
  • Touch events may include at least one type of operation event.
  • In this embodiment, the touch event includes at least one of a click operation event, a long-press operation event, a drag operation event, or a double-click operation event, so that when different types of operation events are triggered, the control operation to be finally executed is determined according to the matching relationship between the touch event and the object controls, solving the technical problem that existing control operation execution is highly complex.
  • the response priority label corresponding to each object control is configured according to the scene to which the display interface belongs.
  • In different scenes, the images displayed on the display interface may differ, and the types, quantities, and distributions of the object controls on the display interface may also differ.
  • the response priority labels corresponding to the object controls included in the display interface are configured according to the scene to which the display interface belongs.
  • the response priority label can be configured by default according to the scene to which the display interface belongs, or can be customized by the user according to the operation needs in the scene.
  • The response priority labels corresponding to the object controls are configured according to the scene to which the display interface belongs, so that the response priorities of the object controls in the display interface better match the needs of the scene, which helps improve the processing efficiency of control operations in different scenes.
  • determining the response priority label corresponding to the at least one object control includes:
  • a completed sliding screen operation includes three event phases, which can be described as pressing, sliding and releasing.
  • the above-mentioned area information of the object control may include but not limited to position information and range information of the object control in the display interface, so as to determine the object control corresponding to the touch operation.
  • The area information of an object control can be determined and saved before the game starts, or changed according to the player's settings during the game and then updated and saved. For example, before the game starts, the position information and range information of controls such as the attack control, movement control, and scope control, as well as of the non-control display area, are saved; if the player adjusts the position and size of the attack control and the movement control during the game, the saved position information and range information of those controls are updated correspondingly (see the sketch after the next item).
  • the method for determining the area information of the object control is not limited here.
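A minimal sketch of saving the response area information of the object controls and updating it when the player adjusts a control's layout in game; the structure, values, and function names are illustrative assumptions.

```cpp
#include <map>
#include <string>

struct AreaInfo {
    float x, y;  // position of the control in the display interface
    float w, h;  // range (size) of the control response area
};

std::map<std::string, AreaInfo> gAreaTable;  // saved before the game starts

void saveInitialAreas() {
    gAreaTable["attack control"]   = {0, 100, 80, 80};
    gAreaTable["movement control"] = {0, 200, 100, 100};
    gAreaTable["scope control"]    = {700, 150, 60, 60};
}

// Called when the player adjusts a control's position and size during the
// game; the saved information is updated correspondingly.
void onPlayerAdjust(const std::string& name, const AreaInfo& updated) {
    gAreaTable[name] = updated;
}

int main() {
    saveInitialAreas();
    onPlayerAdjust("attack control", {10, 110, 90, 90});
}
```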
  • the method for executing the control operation may also include:
  • touch operation information is acquired first.
  • the acquired touch operation information includes at least a "current finger index" (that is, identification information of the current touch operation), and screen position information corresponding to the touch operation.
  • The position information of the current touch operation is compared with the saved response area information of the object controls, and the object control corresponding to the current touch operation is thereby determined.
  • The control position, control size, and priority label can be packaged, and the "current finger index", that is, the identification information of the current touch operation, is used as a key to store them together in the data management container.
  • Through the embodiments of the present application, the electronic device obtains the response area information of the object controls in the display interface; the operation position of each touch point indicated by the touch operation information is compared in turn with the control response area indicated by each object control's response area information to obtain a comparison result; when the comparison result indicates that a touch point's operation position lies within an object control's response area, the touch point's operation position is determined to match that control's position, and the response priority label of the matched object control is obtained and saved in the management container. The trigger event of each touch operation is thereby obtained accurately, achieving precise detection of and response to touch operations. A sketch of this keyed container follows.
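This sketch shows the data management container keyed by the "current finger index" (the identification information of the current touch operation). Each record bundles the control position, control size, and response priority label, as described above; the names are illustrative assumptions.

```cpp
#include <map>

struct PackedControlData {
    float x, y;         // control position
    float w, h;         // control size
    int priorityLabel;  // response priority label
};

using FingerIndex = int;
std::map<FingerIndex, PackedControlData> gPressedContainer;

// On the press phase (e.g. IE_Pressed): once the operation position has been
// matched to a control, package that control's data under the finger index.
void onPressedMatched(FingerIndex finger, const PackedControlData& data) {
    gPressedContainer[finger] = data;
}

int main() {
    onPressedMatched(0, {300, 0, 500, 400, 1});  // non-control display area
    onPressedMatched(1, {0, 200, 100, 100, 3});  // movement control
}
```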
  • determining the target control from at least one object control according to the response priority label above, and executing the target control operation indicated by the target control includes:
  • Touch operation information is acquired, which includes at least the "current finger index", that is, the identification information of the current touch operation, and the screen position information corresponding to the touch operation.
  • the electronic device sorts the object controls corresponding to all the data in the Pressed management container according to the acquired priority labels.
  • Through the embodiments of the present application, the electronic device reads the response priority labels of the matched object controls from the management container; according to the response priority labels, it determines the target control from the matched object controls and executes the target control operation indicated by the target control. The target control to be finally executed is thus determined according to the saved response priority labels, achieving precise detection of and response to touch operations.
  • the above further includes: removing the response priority label of the target control from the management container.
  • the electronic device in addition to detecting the events IE_Pressed and IE_Repeat, the electronic device also detects the event IE_Released, that is, detects whether the touch event ends.
  • In this case, the touch operation information acquired by the electronic device also includes the "current finger index" and the "pressed screen position"; the "current finger index" is then used as the key to look up the data in the Pressed management container, and once the data is obtained, the electronic device deletes it from the management container.
  • Through the embodiments of the present application, after executing the target control operation indicated by the target control, the electronic device removes the response priority label of the target control from the management container; when the data of the touch events is subsequently sorted during IE_Repeat, the released touch operation can then simply be ignored, avoiding erroneous responses to multiple touch operations. A sketch of this release handling follows.
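A minimal sketch of handling the release phase (IE_Released): the entry stored under the "current finger index" is looked up and erased, so later IE_Repeat sorting no longer sees the released touch operation. The container layout matches the illustrative one in the earlier sketches.

```cpp
#include <map>

struct PackedControlData { float x, y, w, h; int priorityLabel; };
using FingerIndex = int;
std::map<FingerIndex, PackedControlData> gPressedContainer;

void onReleased(FingerIndex finger) {
    auto it = gPressedContainer.find(finger);  // key: current finger index
    if (it != gPressedContainer.end())
        gPressedContainer.erase(it);  // released touches are dropped here
}

int main() {
    gPressedContainer[0] = {300, 0, 500, 400, 1};
    onReleased(0);  // subsequent sorting ignores finger 0
}
```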
  • the above also includes:
  • Before obtaining the touch event, the electronic device needs to assign a response priority label to each object control and determine each object control's response position and response area.
  • The response priority labels and the response positions and areas of the object controls can be set according to the game configuration before the game starts, or set according to the player's setting operations during the game. After each setting, the above information is packaged and saved in the management container so that it can be extracted and used in subsequent operations.
  • Through the embodiments of the present application, the electronic device assigns a response priority label to each object control and obtains each object control's response area information, where the response area information indicates the control response area of the object control in the display interface; the control's response priority label and response area information are encapsulated and saved in the management container. The priority, position, and area range of each object control can thus be set as needed, making the response to touch operations more accurate and further solving the technical problem that existing control operation execution is highly complex.
  • Touch events include events triggered by swipe screen operations; management containers include swipe screen managers.
  • controls related to sliding screen operations include “skill control”, “scope control”, “shooting control” and “screen”.
  • the sliding screen operation on the “skill control” can be used to adjust the aiming direction of the virtual skill;
  • the sliding screen operation on the “scope control” can be used to adjust the aiming direction of the shooting operation;
  • the sliding screen operation on the "shooting control" can be used to adjust the shooting direction of the shooting prop, and the sliding screen operation performed on the "screen" can be used to adjust the viewing angle of the current virtual character.
  • the above four controls are all related to the slide screen operation, but the operation effects are different.
  • because a swipe screen operation is a long-lasting operation (compared with an instantaneous click operation), the player may perform multiple swipe screen operations within the same period of time, so multiple simultaneous operations need to be controlled and responded to.
  • the electronic device saves the control position, control size and priority label into the data management container in the sliding screen takeover manager, so that subsequent operations can retrieve the specific data.
  • a data refresh is triggered after the priority data is encapsulated and saved, and a data refresh is likewise triggered when the position of a control is customized and modified. For example, if the player adjusts the position and size of controls during the game, the data in the data management container is refreshed in response to the player's adjustment, ensuring the accuracy of subsequent control-operation management.
  • the sliding screen data includes the control position, control size and response priority label; encapsulating and saving the sliding screen data of the sliding screen operation includes: encapsulating the control position, control size and response priority label, and saving the result of the encapsulation to the data management container of the sliding screen manager.
  • the control position refers to the position in the display interface of the control used to respond to the sliding screen operation; the control size refers to the size of that control; and the response priority label is the label assigned to each of the controls used to respond to the sliding screen operation.
  • the electronic device encapsulates the position of the control, the size of the control, and the response priority tag, and saves the result of the encapsulation into the data management container of the slide screen manager.
  • the control position, control size and response priority label are encapsulated and stored in the data management container of the slide screen manager, achieving timely storage of the slide screen data, which helps ensure the accuracy of subsequent control-operation management.
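As a rough illustration of this encapsulation, the sketch below bundles control position, control size and response priority label into one record per control and saves it into a data management container. SlideScreenManager and SlideData are hypothetical names, and re-saving a record stands in for the data refresh described above.

```cpp
#include <iostream>
#include <map>
#include <string>

// One record per control: position, size and response priority label.
struct SlideData {
    float x, y;           // control position in the display interface
    float width, height;  // control size
    int priority;         // response priority label
};

class SlideScreenManager {
public:
    // Encapsulate and save; saving the same control again (e.g. after the
    // player moves or resizes it) refreshes the stored record.
    void Save(const std::string& control, const SlideData& data) {
        container_[control] = data;
    }
    const SlideData* Get(const std::string& control) const {
        auto it = container_.find(control);
        return it == container_.end() ? nullptr : &it->second;
    }

private:
    std::map<std::string, SlideData> container_;  // data management container
};

int main() {
    SlideScreenManager manager;
    manager.Save("skill control", {800.f, 300.f, 80.f, 80.f, 3});
    // The player drags and enlarges the control mid-game: save again.
    manager.Save("skill control", {760.f, 320.f, 96.f, 96.f, 3});
    if (const SlideData* d = manager.Get("skill control")) {
        std::cout << "skill control at (" << d->x << ", " << d->y
                  << "), priority " << d->priority << "\n";
    }
    return 0;
}
```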
  • the electronic device obtains the Index (i.e., the current finger index) and Location (i.e., the pressed screen position) of the touch operation and records this information; the electronic device then enters the relevant data into the slide screen takeover manager, where priority sorting is performed, and finally executes the perspective steering operation corresponding to the sliding screen operation according to the sorting result.
  • assume the right-swipe operation performed by touch point A is detected by IE_Pressed, with the recorded Index being "event A" and the position information being (40px, 120px).
  • the electronic device then inputs the Index "event A" and the position information (40px, 120px) into the slide screen takeover manager; in the manager, the saved control position information in the management container is compared with the position information (40px, 120px) of "event A", whereby the control corresponding to "event A" is determined to be the attack control 301, and its priority "2" is obtained from the response priority label configured for the attack control 301. Finally, the electronic device saves "event A", the position information (40px, 120px) and the priority label "2" into the Pressed container.
  • assume the left-swipe operation performed by touch point B is detected by IE_Pressed, with the recorded Index being "event B" and the position information being (540px, 150px). The electronic device then inputs the Index "event B" and the position information (540px, 150px) into the slide screen takeover manager; in the manager, the saved control position information in the management container is compared with the position information (540px, 150px) of "event B", whereby the control corresponding to "event B" is determined to be the screen, and its priority "1" is obtained from the response priority label configured for the screen. Finally, the electronic device saves "event B", the position information (540px, 150px) and the priority label "1" into the Pressed container.
  • next, IE_Repeat detects touch point B and records its Index as "event B" with position information (540px, 150px); the electronic device again inputs the Index "event B" of touch point B into the sliding screen takeover manager and fetches data from the Pressed container.
  • after obtaining the data corresponding to the Index "event B" ("event B", position information (540px, 150px) and priority label "1") and the data of touch point A ("event A", position information (40px, 120px) and priority label "2"), the electronic device compares the two events and their control priorities, determines that "event B" corresponding to touch point B is the target event, and executes "event B". The display interface then becomes as shown in FIG. 4, with the perspective of the virtual character turned to the right.
  • after determining that the event corresponding to the Index "event B" has finished executing, the electronic device deletes the data corresponding to the Index "event B" from the Pressed container. It then sorts and executes the remaining events in the Pressed container, determining that "event A" is to be executed; that is, in the display interface after the viewing-angle turn shown in FIG. 4, the virtual character is controlled to perform the shooting operation.
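The whole event A / event B walkthrough can be reproduced end to end in a few lines. The sketch below mirrors the three phases (IE_Pressed, IE_Repeat, IE_Released) with the positions and labels used in the example above; the types and container layout are illustrative assumptions, not the patent's implementation.

```cpp
#include <algorithm>
#include <iostream>
#include <map>
#include <string>

// One Pressed-container entry: pressed screen position plus priority label.
struct Entry {
    float x, y;    // pressed screen position
    int priority;  // response priority label (smaller = higher priority)
};

int main() {
    std::map<std::string, Entry> pressed;

    // IE_Pressed: record both touches with the labels from the example.
    pressed["event A"] = {40.f, 120.f, 2};   // attack control 301
    pressed["event B"] = {540.f, 150.f, 1};  // screen

    // IE_Repeat: sort by priority label and execute the best entry.
    auto best = std::min_element(
        pressed.begin(), pressed.end(),
        [](const auto& a, const auto& b) {
            return a.second.priority < b.second.priority;
        });
    std::cout << "execute " << best->first << "\n";  // event B: turn right

    // IE_Released for event B: delete its data, then re-run the selection
    // over the remaining entries, which now yields event A (shooting).
    pressed.erase("event B");
    if (!pressed.empty()) {
        std::cout << "execute " << pressed.begin()->first << "\n";  // event A
    }
    return 0;
}
```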
  • a method for executing a control operation is provided, which is executed by an electronic device, and the method includes:
  • the display interface includes an object control for responding to a touch event, and the object control is set at a control position in the display interface;
  • the touch event is used to trigger the execution of a control operation of the target type;
  • the target control is a control satisfying a response priority condition among at least one object control.
  • the display interface is the interface displayed on the screen of the electronic device; it includes at least one object control for responding to touch events, and the object control is set at a control position in the display interface, that is, the corresponding object control is displayed at the control position in the display interface of the electronic device.
  • Touch events can be triggered by various types of operation events.
  • a control operation refers to an operation that needs to be triggered by a user through a touch event.
  • the touch events triggered by the user for at least one object control are all used to trigger the execution of a target type of control operation, that is, to trigger the same type of control operation.
  • the user triggers a touch event for multiple object controls, such as triggering a sliding screen operation for multiple object controls, so as to trigger a control operation for executing camera steering.
  • for control operations of the target type, different object controls have different response priorities.
  • Response priority conditions are used to control the sequence of responses of object controls to touch events.
  • the response priority condition can be to respond in order of response priority from high to low; the object control with the highest response priority is then determined as the target control, and the target control operation of the target type indicated by that target control is executed.
  • the electronic device displays a display interface, and an object control for responding to a touch event is displayed at a control position of the display interface, and a user can trigger an interaction with respect to the object control.
  • when a touch event is triggered for at least two touch points in the display interface, and the operation positions of the at least two touch points match the control position of at least one object control in the display interface, the electronic device executes the target control operation of the target type indicated by the target control.
  • the user can trigger a touch event at at least two touch points in the display interface; the electronic device determines the respective operation positions of the at least two touch points and matches them against the control position of at least one object control in the display interface. When they match, the electronic device determines the target control from the at least one object control according to the response priority condition, and executes the target control operation of the target type indicated by the target control.
  • the electronic device executes the target control operation of the target type indicated by the target control satisfying the response priority condition, so that when multiple touch events are detected, the control operation finally executed is determined from the matching relationship between the touch events and the object controls, which further solves the technical problem that the existing mode of executing control operations is highly complex.
  • the touch event includes an event triggered by a sliding screen operation; the target control operation includes an angle of view steering operation.
  • the screen sliding operation is an interactive screen-sliding operation triggered by the user on an object control in the display interface; the angle-of-view steering operation is a control operation that changes the viewing angle corresponding to the displayed content of the display interface. By changing the viewing angle, the content shown in the display interface can be changed.
  • when events triggered by a sliding screen operation are triggered for at least two touch points in the display interface, and the respective operation positions of the at least two touch points match the control position of at least one object control in the display interface, the electronic device executes the view steering operation indicated by the target control that meets the response priority condition. Thus, when multiple sliding screen operations are detected, the view steering operation indicated by the target control meeting the response priority condition is selected for execution, solving the problem that the existing mode of performing angle-of-view steering operations is highly complex.
  • an apparatus for executing a control operation for implementing the above method for executing a control operation.
  • the device includes:
  • the acquiring unit 1602 is configured to acquire a touch event, wherein the touch event carries touch operation information of at least two touch points; the touch operation information includes the respective operations of the at least two touch points Location;
  • a determining unit 1604 configured to determine a response priority label corresponding to at least one object control when the operation position matches the control position of at least one object control in the display interface;
  • the execution unit 1606 is configured to determine a target control from at least one object control according to the response priority label, and execute the target control operation indicated by the target control.
  • an electronic device for implementing the method for executing the above control operation, and the electronic device may be a terminal device or a server as shown in FIG. 17 .
  • This embodiment is described by taking the electronic device as a terminal device as an example.
  • the electronic device includes a memory 1702 and a processor 1704; the memory 1702 stores computer-readable instructions, and the processor 1704 is configured to execute, through the computer-readable instructions, the steps in any one of the method embodiments described above.
  • the foregoing electronic device may be located in at least one network device among multiple network devices in the computer network.
  • the above-mentioned processor may be configured to execute the following steps through computer-readable instructions:
  • S1, acquiring a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points;
  • S2, when the operation positions match the control position of at least one object control in the display interface, determining the response priority label corresponding to the at least one object control; and
  • S3, determining a target control from the at least one object control according to the response priority label, and executing the target control operation indicated by the target control.
  • the structure shown in FIG. 17 is only illustrative; the electronic device can also be a vehicle-mounted terminal, a smartphone (such as an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or another terminal device.
  • FIG. 17 does not limit the structure of the above-mentioned electronic equipment.
  • the electronic device may also include more or fewer components than those shown in FIG. 17 (such as a network interface, etc.), or have a different configuration from that shown in FIG. 17 .
  • the memory 1702 can be used to store software programs and modules, such as the program instructions/modules corresponding to the control-operation execution method and apparatus in the embodiments of the present invention; the processor 1704 runs the software programs and modules stored in the memory 1702, thereby executing various functional applications and data processing, i.e., implementing the above control-operation execution method.
  • the memory 1702 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 1702 may further include a memory that is remotely located relative to the processor 1704, and these remote memories may be connected to the terminal through a network.
  • the memory 1702 may be specifically, but not limited to, used for storing operation information and other information.
  • the memory 1702 may include, but is not limited to, the acquisition unit 1602 , the determination unit 1604 , and the execution unit 1606 in the execution device for the above-mentioned control operations.
  • the memory may also include, but is not limited to, other module units of the above apparatus for executing the control operation, which are not repeated in this example.
  • the above-mentioned transmission device 1706 is configured to receive or send data via a network.
  • the specific examples of the above-mentioned network may include a wired network and a wireless network.
  • the transmission device 1706 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices and routers through a network cable so as to communicate with the Internet or a local area network.
  • the transmission device 1706 is a radio frequency (Radio Frequency, RF) module, which is used to communicate with the Internet in a wireless manner.
  • the above-mentioned electronic device further includes: a display 1708 for displaying a virtual scene for controlling the virtual character, and a connection bus 1710 for connecting various module components in the above-mentioned electronic device.
  • the above-mentioned terminal device or server may be a node in a distributed system, where the distributed system may be a blockchain system formed by multiple nodes connected through network communication.
  • nodes can form a peer-to-peer (P2P, Peer To Peer) network, and any form of computing equipment, such as servers, terminals and other electronic devices, can become a node in the blockchain system by joining the peer-to-peer network.
  • a computer program product includes computer readable instructions, and the computer readable instructions include program code for executing the method shown in the flowchart.
  • the computer readable instructions may be downloaded and installed from a network via the communications portion, and/or from removable media.
  • when the computer-readable instructions are executed by the central processing unit, the various functions provided by the embodiments of the present application are performed.
  • a computer-readable storage medium is provided; a processor of a computer device reads the computer-readable instructions from the computer-readable storage medium, and the processor executes the computer-readable instructions, so that the computer device executes the above control-operation execution method.
  • the above-mentioned computer-readable storage medium may be configured to store computer-readable instructions for performing the following steps:
  • S1, acquiring a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points;
  • S2, when the operation positions match the control position of at least one object control in the display interface, determining the response priority label corresponding to the at least one object control; and
  • S3, determining a target control from the at least one object control according to the response priority label, and executing the target control operation indicated by the target control.
  • the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and the like.
  • if the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium.
  • the essence of the technical solution of the present invention, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the above methods in the various embodiments of the present invention.
  • the disclosed client can be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the above-mentioned units is only a logical functional division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of units or modules may be in electrical or other forms.
  • the units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for executing a control operation, the method including: acquiring a touch event, where the touch event carries touch operation information of at least two touch points (S202); when the operation positions of the at least two touch points indicated by the touch operation information match the control position of at least one object control in a display interface, determining a response priority label corresponding to the at least one object control (S204); and determining a target control from the object controls according to the response priority label, and executing the target control operation indicated by the target control (S206).

Description

Control operation execution method and apparatus, storage medium, and electronic device
This application claims priority to the Chinese patent application No. 2022101999879, filed with the Chinese Patent Office on March 1, 2022 and entitled "Control operation execution method and apparatus, storage medium, and electronic device", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of computers, and in particular to a control operation management method and apparatus, a storage medium, and an electronic device.
Background
In many current three-dimensional simulation game applications, a controlled virtual character in a virtual game scene is usually controlled through controls with different control functions configured on the display interface, so that the character completes the virtual game tasks set in the virtual game scene.
In practice, touch operations on the controls of the display interface are often multi-touch operations; that is, multiple fingers are frequently detected touching the phone screen at the same time. Taking a steering control used to adjust the orientation of the controlled virtual character as an example, the steering control adjusts the character's orientation to the target orientation selected by a touch adjustment operation on the control. When a multi-touch operation is detected on the steering control, however, steering speeds are superimposed or cancel each other out: 1) if multiple fingers slide in different directions, the orientation adjustments of the controlled virtual character conflict and fail; 2) if multiple fingers slide in the same direction, the orientation of the controlled virtual character is adjusted too quickly.
That is, for multi-touch control operations at present, it is easy to trigger a control operation that was not actually intended, producing a response result that fails to meet the user's real needs. In other words, with the multi-touch control operations of the related art, the user has to make multiple attempts before the truly intended control operation is triggered, making the control operation process complex and the response inefficient.
Summary
According to various embodiments provided in this application, a control operation execution method and apparatus, a storage medium, and an electronic device are provided.
According to one aspect of the embodiments of the present invention, a control operation execution method is provided, executed by an electronic device, and including: acquiring a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points; when the operation positions match the control position of at least one object control in a display interface, determining a response priority label corresponding to the at least one object control; and determining a target control from the at least one object control according to the response priority label, and executing the target control operation indicated by the target control.
According to another aspect of the embodiments of the present invention, a control operation execution apparatus is further provided, including: an acquiring unit, configured to acquire a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points; a determining unit, configured to determine, when the operation positions match the control position of at least one object control in a display interface, a response priority label corresponding to the at least one object control; and an execution unit, configured to determine a target control from the at least one object control according to the response priority label, and to execute the target control operation indicated by the target control.
According to yet another aspect of the embodiments of the present invention, a computer-readable storage medium is further provided, storing computer-readable instructions that are configured to perform, when run, the above control operation execution method.
According to yet another aspect of the embodiments of this application, a computer program product is provided. The computer program product includes computer-readable instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer-readable instructions from the computer-readable storage medium and executes them, causing the computer device to perform the above control operation execution method.
According to yet another aspect of the embodiments of the present invention, an electronic device is further provided, including a memory and a processor, where the memory stores computer-readable instructions and the processor is configured to execute the above control operation execution method through the computer-readable instructions.
Details of one or more embodiments of this application are set forth in the following drawings and description. Other features, objects, and advantages of this application will become apparent from the specification, the drawings, and the claims.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from them without creative effort. In the drawings:
FIG. 1 is a schematic diagram of a hardware environment of an optional control operation execution method according to an embodiment of the present invention;
FIG. 2 is a flowchart of an optional control operation execution method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an optional control operation execution method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of another optional control operation execution method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 9 is a flowchart of another optional control operation execution method according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 11 is a flowchart of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 12 is a flowchart of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 13 is a flowchart of another optional control operation execution method according to an embodiment of the present invention;
FIG. 14 is a flowchart of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 15 is a flowchart of yet another optional control operation execution method according to an embodiment of the present invention;
FIG. 16 is a schematic structural diagram of an optional control operation execution apparatus according to an embodiment of the present invention; and
FIG. 17 is a schematic structural diagram of an optional electronic device according to an embodiment of the present invention.
Detailed Description
To enable a person skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments of the present invention. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the specification, the claims, and the above drawings of the present invention are used to distinguish similar objects, and are not necessarily used to describe a specific sequence or order. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the present invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to the process, method, product, or device.
It can be understood that the specific implementations of this application involve data related to operation information, control operations, and the like. When the above embodiments of this application are applied to specific products or technologies, the user's permission or consent needs to be obtained, and the collection, use, and processing of the relevant data need to comply with the relevant laws, regulations, and standards of the relevant countries and regions.
According to one aspect of the embodiments of the present invention, a control operation execution method is provided. As an optional implementation, the method may be applied, but is not limited, to the control operation execution system in the hardware environment shown in FIG. 1, where the system may include, but is not limited to, a terminal device 102, a network 104, a server 106, and a database 108. A target client runs in the terminal device 102 (in FIG. 1, the target client is a shooting game application client, as an example). The terminal device 102 includes a human-computer interaction screen, a processor, and a memory. The human-computer interaction screen is used to display a virtual game scene (such as the virtual shooting game scene shown in FIG. 1) and to provide a human-computer interaction interface for receiving human-computer interaction operations used to control a controlled virtual object in the virtual scene; the virtual object is to complete the game tasks set in the virtual scene. The processor is used to generate interaction instructions in response to the human-computer interaction operations and to send the interaction instructions to the server. The memory is used to store relevant attribute data, such as object attribute information of the controlled virtual object and attribute information of the props it holds.
In addition, the server 106 includes a processing engine, which is used to perform storage or read operations on the database 108. Specifically, the processing engine reads from the database 108 the position of each virtual object and the aiming and shooting information of the shooting props each uses.
The specific process is as follows. In steps S102-S106, a touch event is acquired in the terminal device 102, where the touch event carries touch operation information of at least two touch points; when the operation positions of the at least two touch points indicated by the touch operation information match the control position of at least one object control in the display interface, the response priority label corresponding to the at least one object control is determined; a target control is determined from the object controls according to the response priority label, and the target control operation indicated by the target control is executed. Then, in step S108, the terminal device 102 sends the target control operation result information to the server 106 through the network 104; it can be understood that the target control result information indicates the operation result of the target control operation. The server 106 executes step S110: based on the target control operation result information, the server 106 calculates scene response information; it can be understood that the scene response information is the response information determined from the target control operation result information in combination with the virtual scene. Then, in step S112, the server 106 sends the scene response information to the terminal device 102 through the network 104.
As another optional implementation, when the terminal device 102 has sufficiently strong computing capability, step S110 may also be completed by the terminal device 102. This is an example, and no limitation is imposed in this embodiment.
Optionally, in this embodiment, the terminal device may be a terminal device configured with the target client, and may include, but is not limited to, at least one of the following: a mobile phone (such as an Android or iOS phone), a notebook computer, a tablet computer, a palmtop computer, an MID (Mobile Internet Device), a PAD, a desktop computer, a smart TV, and the like. The target client may be a video client, an instant messaging client, a browser client, an education client, or another client supporting shooting game tasks. The network may include, but is not limited to, a wired network and a wireless network, where the wired network includes a local area network, a metropolitan area network, and a wide area network, and the wireless network includes Bluetooth, WIFI, and other networks implementing wireless communication. The server may be a single server, a server cluster composed of multiple servers, or a cloud server. The above is merely an example, and no limitation is imposed in this embodiment.
Optionally, in this embodiment, the control operation execution method may be applied, but is not limited, to game terminal applications (Applications, APPs) that complete given adversarial game tasks in a virtual scene, such as shooting game applications among multiplayer online battle arena (MOBA) applications. The adversarial game task may be, but is not limited to, a game task completed through adversarial interaction between a virtual object controlled in the virtual scene by the current player through human-computer interaction control operations and virtual objects controlled by other players. The adversarial game task may run, but is not limited to running, in an application (such as a non-standalone game APP) in the form of a plug-in or mini-program, or in an application (such as a standalone game APP) in a game engine. The types of the game applications may include, but are not limited to, at least one of the following: a two-dimensional (2D) game application, a three-dimensional (3D) game application, a virtual reality (VR) game application, an augmented reality (AR) game application, and a mixed reality (MR) game application. The above is merely an example, and no limitation is imposed in this embodiment.
In the embodiments of the present invention, a touch event is acquired; when the touch operation information indicates that the operation positions of the at least two touch points match the control position of at least one object control in the display interface, the response priority label corresponding to the at least one object control is determined; a target control is determined from the object controls according to the response priority label, and the target control operation indicated by the target control is executed. Thus, when multiple pieces of touch information included in the touch event are detected, the control operation finally executed is determined from the matching relationship between the touch event and the object controls, which solves the technical problem that the existing mode of executing control operations is highly complex.
The above is merely an example, and no limitation is imposed in this embodiment.
As an optional implementation, as shown in FIG. 2, the control operation execution method may be executed by an electronic device. In this embodiment, the method is described as applied to a terminal, and includes the following steps:
S202: Acquire a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points.
It should be noted that the touch event may be a touch operation event received by the mobile terminal and acting on the display interface, such as a click operation event, a long-press operation event, a drag operation event, or a double-click operation event. In this embodiment, the touch operation information may include, but is not limited to, event tag information corresponding to the touch event, touch position information, touch pressure information, touch press duration information, and the like. Specifically, in this embodiment, the touch operation information of one long-press touch operation event may include: touch event A (the event tag information), (40px, 120px) (the position information of the event in the operation interface), 5N (the touch pressure information), and 0.1s (the touch press duration information). The operation position refers to the position in the display interface where the touch point acts; it can be obtained from the touch position information in the touch operation information to determine the area touched by the touch point in the display interface. It can be understood that the above touch event and touch operation information are only examples, and the types of touch operation events and touch operation information are not limited here.
It should be understood that, in this embodiment, the touch event may further include touch operation events of multiple touch points. As shown in FIG. 3, in the display interface of the terminal, the electronic device detects a long-press operation of a left-hand finger on the attack control 301 and a leftward slide operation of a right-hand finger in the display interface. It can be understood that these two touch operation events can together be regarded as the touch event in this embodiment.
S204: When the operation positions match the control position of at least one object control in the display interface, determine the response priority label corresponding to the at least one object control.
The object controls in this embodiment are explained next. An object control may be an operation control used to trigger various control operations in the display interface, and may include, but is not limited to, an attack control, a movement control, an aiming control, a reload control, and a skill control. An object control may also include the display area of the display interface other than the above operation controls. It can be understood that, in this embodiment, performing a touch operation on the display area other than the above controls can also produce a corresponding control effect; for example, when a slide operation is detected in the non-control display area, the viewing angle of the virtual character in the display interface can be controlled to change accordingly. As shown in FIGS. 3 and 4, FIG. 3 shows the scene picture from the virtual perspective of the virtual character, in which a virtual wardrobe is displayed in the left corner of the virtual scene. Assume the right hand in FIG. 3 performs a "slide left" operation at touch point B in the non-control display area; the display interface then becomes as shown in FIG. 4, where the virtual perspective of the virtual character has changed, i.e., the viewing angle has moved correspondingly to the right, revealing another virtual wardrobe in the right corner of the virtual scene.
The response priority labels in this embodiment are explained next. It can be understood that, in this embodiment, a response priority label is pre-configured for each object control to indicate the priority with which touch operations acting on that control are responded to. Different object controls may have the same or different response priority labels, and the configuration principles can be determined according to actual needs.
For example, when the electronic device detects that the player is controlling the virtual character to perform a "virtual scouting" task in the virtual scene, a sensitive viewing-angle steering effect needs to be configured for the player. The response priority of the non-control display area can therefore be set to "1", indicating that touch operations in the non-control display area have the highest response priority, while the response priorities of the other operation controls are set to values greater than "1", indicating that touch operations on the operation controls have lower response priority than the non-control display area. Thus, while the player controls the virtual character to perform the "virtual scouting" task, when the player triggers multiple touch operations, the electronic device preferentially responds to the "slide" operation used to switch the virtual character's viewing angle, improving the responsiveness of viewing-angle steering during the task.
As another example, when it is detected that the player is controlling the virtual character to perform an "aimed shooting" task in the virtual scene, even a slight change in the aiming angle affects shooting accuracy, so a sensitive "shooting response" effect needs to be configured for the player while the "viewing-angle switching response" effect is reduced. The response priority of the "shooting control" can therefore be set to "1", and the response priority of the non-control display area set to a value greater than "1", indicating that touch operations in the display area corresponding to the shooting control have the highest response priority and that touch operations in the non-control display area have lower priority than the shooting area. Thus, while the player controls the virtual character to perform the "aimed shooting" task, when the player triggers multiple touch operations, the electronic device responds first to the touch operation on the "shooting control" and defers the response to the "slide" operation used to switch the viewing angle, improving the shooting responsiveness of the task.
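As a rough illustration of such scene-dependent configuration, the following C++ sketch keeps one priority table per task scene and swaps the active table when the task changes. It is only a sketch: the scene and control names come from the examples above, and the table layout is an assumption, not the patent's implementation.

```cpp
#include <iostream>
#include <map>
#include <string>

int main() {
    using PriorityTable = std::map<std::string, int>;  // control -> label

    // One response priority table per task scene ("1" = highest priority).
    std::map<std::string, PriorityTable> sceneConfig = {
        {"virtual scouting", {{"screen area", 1}, {"shooting control", 2}}},
        {"aimed shooting",   {{"shooting control", 1}, {"screen area", 2}}},
    };

    std::string activeScene = "virtual scouting";
    std::cout << "screen area priority: "
              << sceneConfig[activeScene]["screen area"] << "\n";  // 1

    activeScene = "aimed shooting";  // the task switches, labels are re-read
    std::cout << "shooting control priority: "
              << sceneConfig[activeScene]["shooting control"] << "\n";  // 1
    return 0;
}
```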
S206: Determine a target control from the at least one object control according to the response priority label, and execute the target control operation indicated by the target control.
Here, the target control is the object control, determined according to the response priority labels, that needs to be responded to, e.g., the object control with the highest response priority. The target control operation is the control operation, triggered by the corresponding target control, that needs to be executed. The method is described in detail below with reference to FIGS. 3 and 4.
FIG. 3 is a schematic diagram of a game interface in which the player controls a virtual character to perform a shooting task in the virtual scene. The diagram shows multiple operation controls for controlling the virtual character and the scene picture corresponding to the current virtual character's viewing angle. In the virtual scene, a virtual wardrobe can be observed in the left corner of the virtual room. The interface also shows: an attack control 301, used to control the virtual character to perform an attack operation; a movement control 302, used to perform a movement operation; a skill control 303, used to use a virtual skill; a shooting control 304, used to perform another kind of attack operation; a reload control 305, used to perform a magazine-change operation; and a scope control 306, used to perform an aiming operation.
Continuing with FIG. 3, the virtual character in the game interface is in the "virtual scouting" task state, from which the response priorities of the above controls are determined: the non-control display area has response priority "1", the attack control 301 "2", the movement control 302 "3", the skill control 303 "4", the shooting control 304 "2", the reload control 305 "5", and the scope control 306 "6".
As shown in FIG. 3, the trigger events acquired in this interface include the touch operation information of touch point A and of touch point B. The touch operation of touch point A is a click, whose touch operation information includes: tag information: touch event A; touch position: (40px, 120px); touch pressure: 5N; touch duration: 0.1s. The touch operation of touch point B is a leftward slide, whose operation information includes: tag information: touch event B; touch position: (540px, 150px); touch pressure: 5N; touch duration: 0.5s.
From the touch operation information of touch point A, the electronic device can determine that touch point A acts on the attack control 301; from that of touch point B, that touch point B acts on the non-control display area. According to the response priority "2" of the attack control 301 and the response priority "1" of the non-control display area, the electronic device determines the target control to be the non-control display area, i.e., the electronic device executes the touch response operation corresponding to touch point B.
Since the effect of performing a "slide left" operation on the non-control display area is "switch the viewing angle to the right", the display interface becomes as shown in FIG. 4: the electronic device displays the scene picture with the viewing angle switched to the right relative to the original viewing angle. That is, the figure shows not only the virtual wardrobe in the left corner of the room but also, because of the rightward switch of the viewing angle, the virtual wardrobe in the right corner of the virtual room, achieving the execution effect of the target control operation.
Another specific example of this implementation is described below with reference to FIGS. 5 and 4.
As shown in FIG. 5, the trigger events acquired in this interface include the touch operation information of touch point C and of touch point B. The touch operation of touch point C is a leftward slide, whose touch operation information includes: tag information: touch event C; touch position: (40px, 40px); touch pressure: 5N; touch duration: 0.5s. The touch operation of touch point B is a leftward slide, whose operation information includes: tag information: touch event B; touch position: (540px, 150px); touch pressure: 5N; touch duration: 0.5s.
From the touch operation information of touch point C, the electronic device can determine that touch point C acts on the movement control 302, whose effect is to control the virtual character to move left; from that of touch point B, that touch point B acts on the non-control display area. According to the response priority "3" of the movement control 302 and "1" of the non-control display area, the electronic device determines the target control to be the non-control display area, i.e., it executes the touch response operation corresponding to touch point B.
Since the effect of the "slide left" operation on the non-control display area is "switch the viewing angle to the right", the display interface becomes as shown in FIG. 4: the electronic device displays the scene picture with the viewing angle switched to the right relative to the original viewing angle, showing both the virtual wardrobe in the left corner of the room and, due to the rightward switch of the viewing angle, the virtual wardrobe in the right corner, achieving the execution effect of the target control operation.
Yet another specific example of this implementation is described below with reference to FIGS. 5 and 6.
Assume the task to be performed in the current game scene of the virtual character is "escape"; the response priority of the movement control 302 is therefore adjusted to "1" and that of the non-control display area to "2". When the touch information of the two touch points shown in FIG. 5 is acquired, the electronic device determines the target control to be the movement control 302, i.e., it executes the control operation corresponding to the movement control 302. Then, as shown in FIG. 6, the virtual character is displayed as having moved left a certain distance from its original position without changing its viewing direction, so no virtual scene from other viewing angles is shown in FIG. 6.
In the embodiments of the present invention, the electronic device acquires a touch event; when the touch operation information indicates that the operation positions of the at least two touch points match the control position of at least one object control in the display interface, it determines the response priority label corresponding to the at least one object control; according to the response priority label, it determines the target control from the object controls and executes the target control operation indicated by the target control. Thus, when multiple touch events are detected, the control operation finally executed is determined from the matching relationship between the touch events and the object controls, which solves the technical problem that the existing mode of executing control operations is highly complex.
As an optional implementation, determining the target control from the at least one object control according to the response priority label and executing the target control operation indicated by the target control includes:
S1: Sort the response order of the at least one object control according to the response priority labels to obtain a sorting result; and
S2: When the sorting result indicates that exactly one object control has the highest response priority, determine the object control with the highest response priority as the target control, and execute the target control operation indicated by it.
It should be noted that, in this embodiment, the operation positions of the at least two touch points match the control positions of multiple object controls; that is, the touch event acts on at least two object controls, so the target control that needs to be responded to must be determined from the at least two object controls. The priority labels may be configured for each object control before the game starts, configured during the game according to the specific scene, configured during the game according to the player's setting operations, or configured according to the real-time game scene; the timing of determining the priority labels is not limited here.
The method is described taking the priority labels of the above embodiment as an example. Continuing with FIG. 3, assume the non-control display area has response priority "1", the attack control 301 "2", the movement control 302 "3", the skill control 303 "4", the shooting control 304 "2", the reload control 305 "5", and the scope control 306 "6".
Further, assume the electronic device acquires a touch event with three touch operations — touch operation D, touch operation E, and touch operation F — whose touch points correspond, in order, to the non-control display area (priority "1"), the movement control 302 (priority "3"), and the scope control 306 (priority "6"). The response sorting result of the object controls corresponding to touch operations D, E, and F is then: non-control display area (priority "1"), movement control 302 (priority "3"), scope control 306 (priority "6").
According to the above sorting result, the electronic device determines the non-control display area as the target control, executes the indicated touch operation D acting on the non-control display area, and then displays the corresponding scene after the viewing-angle switch.
Through the method described in the above embodiment of this application, the response order of the object controls is sorted according to the response priority labels to obtain a sorting result; when the sorting result indicates that one object control has the highest response priority, that control is determined as the target control and the target control operation indicated by it is executed. Thus, when multiple touch operations exist in the current display interface, the control operation finally executed is determined from the response priorities of the multiple target controls corresponding to the multiple touch operations, which avoids chaotic control responses when multiple touch operations exist and solves the technical problem that the existing mode of executing control operations is highly complex.
As an optional implementation, after sorting the response order of the at least one object control according to the response priority labels to obtain the sorting result, the method further includes:
S1: When the sorting result indicates that at least two object controls share the highest response priority, determine the respective control operation times of the at least two object controls; and
S2: Determine the object control with the earliest control operation time as the target control, and execute the target control operation indicated by it.
Another optional example of this implementation is described below with reference to FIG. 7.
As shown in FIG. 7, two touch operations corresponding to touch point I and touch point J exist in the interface. Touch point I acts on the attack control 301 starting at 6:00.00 after the game starts and lasts 0.5 seconds, i.e., until 6:00.50 after the game starts; touch point J acts on the shooting control 304 starting at 6:00.10 and lasts 0.3 seconds, i.e., until 6:00.40 after the game starts. It can be seen that, in this embodiment, the touch operation of touch point I lasts for a period of time during which another touch operation is detected.
Next, the electronic device determines the target control from the priorities of the two controls corresponding to touch points I and J. Assume again that the attack control 301 has response priority "2" and the shooting control 304 has response priority "2"; the two controls have the same priority. The electronic device therefore determines, from the operation times, that the earliest object control is the attack control 301, and executes the control operation corresponding to the attack control 301. Assume the attack mode corresponding to the attack control 301 is sniper shooting and that of the shooting control 304 is shotgun shooting; the final control result is then to control the virtual character to perform sniper shooting.
Through the above implementation of this application, when the sorting result indicates that at least two object controls share the highest response priority, the electronic device determines the respective control operation times of the at least two object controls, determines the object control with the earliest control operation time as the target control, and executes the target control operation indicated by it, which avoids chaotic control responses when multiple touch operations exist and solves the technical problem that the existing mode of executing control operations is highly complex.
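A minimal sketch of this tie-break, using the times from the FIG. 7 example: candidates are compared first by response priority label (a smaller value is assumed to mean higher priority, consistent with the examples in this document) and then by control operation time. All field names are illustrative assumptions.

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// One matched control per touch operation, with its label and press time.
struct Candidate {
    std::string control;
    int priority;      // response priority label (smaller = higher priority)
    double pressTime;  // control operation time, seconds since game start
};

int main() {
    std::vector<Candidate> candidates = {
        {"attack control 301",   2, 360.00},  // pressed at 6:00.00
        {"shooting control 304", 2, 360.10},  // pressed at 6:00.10
    };

    // Highest priority wins; among equal priorities, the earliest press wins.
    auto target = std::min_element(
        candidates.begin(), candidates.end(),
        [](const Candidate& a, const Candidate& b) {
            if (a.priority != b.priority) return a.priority < b.priority;
            return a.pressTime < b.pressTime;  // tie-break by earliest time
        });

    std::cout << "target control: " << target->control << "\n";
    return 0;
}
```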
As an optional implementation, determining the target control from the at least one object control according to the response priority label and executing the target control operation indicated by the target control includes:
S1: When the operation positions of the at least two touch points match the control position of one object control in the display interface, determine that one object control as the target control;
S2: Determine the touch time of each touch point on the target control;
S3: Determine the operation triggered by the touch point with the earliest touch time as the target control operation to be executed; and
S4: Execute the target control operation.
An optional example of this implementation is described below with reference to FIG. 8.
As shown in FIG. 8, three touch operations exist in the display interface, corresponding to touch points F, G, and H. The touch operations of touch points F and G act on the non-control display area with the operation mode "slide left"; the touch operation of touch point H acts on the movement control 302 with the operation mode "slide left". Given that the non-control display area has response priority "1" and the movement control 302 "3", the sorting result is: non-control display area and non-control display area (the two controls tied for first), then the movement control 302. The electronic device therefore needs to further determine the operation corresponding to the target control from the two operations on the non-control display area.
Assume the operation time of touch point F is 5:00.00 after the game starts, with a duration of 0.5 seconds, and that of touch point G is 5:00.15, also with a duration of 0.5 seconds. The operation of touch point F is determined to be the earliest, so the control operation corresponding to the area of touch point F is executed, i.e., the virtual viewing angle of the game scene is correspondingly switched to the right.
Through the above implementation of this application, when the operation positions of the at least two touch points match the control position of one object control in the display interface, the electronic device determines that one object control as the target control; determines the touch time of each touch point on the target control; determines the operation triggered by the touch point with the earliest touch time as the target control operation to be executed; and executes it. Thus, when multiple touch points exist in the same touch area of the display interface, the control operation finally executed is determined from the order of the touch times of the touch points, which avoids chaotic control responses when multiple touch operations exist and solves the technical problem that the existing mode of executing control operations is highly complex.
As an optional implementation, after executing the target control operation indicated by the target control, the method further includes:
S1: When the number of the at least two touch points changes, acquire updated response priority labels;
S2: Sort the response order of the object controls according to the updated response priority labels to obtain an updated sorting result; and
S3: Determine the target control operation to be executed according to the updated sorting result.
It can be understood that, in this implementation, if multiple touch points exist in the interface and their number changes, re-sorting is performed and the corresponding control operation is executed according to the re-sorted result.
Specifically, continuing with FIG. 8: three touch operations exist in the display interface of the electronic device, corresponding to touch points F, G, and H. Touch points F and G act on the non-control display area with the operation mode "slide left"; touch point H acts on the movement control 302 with the operation mode "slide left". Given the response priority "1" of the non-control display area and "3" of the movement control 302, the sorting result is: non-control display area and non-control display area (tied for first), then the movement control 302. The electronic device then further determines the operation corresponding to the target control from the two operations on the non-control display area.
After the electronic device determines the operation corresponding to touch point F as the operation to be executed and executes the control operation corresponding to the area of touch point F, assume the touch operation of touch point F ends, i.e., the touch points remaining in the display interface are touch points G and H. Following the method described above, the electronic device can further determine, from the priority labels and among the controls corresponding to touch points G and H, the target control to be the non-control display area corresponding to touch point G, and execute the operation corresponding to touch point G, i.e., continue controlling the virtual character in the virtual scene to switch its viewing angle to the right.
Through the above implementation of this application, when the number of the at least two touch points changes, the electronic device acquires updated response priority labels; sorts the response order of the object controls according to the updated response priority labels to obtain an updated sorting result; and determines the target control operation to be executed according to the updated sorting result. Thus, when the number of touch points changes, the next control operation to be executed is determined dynamically from the updated response priority labels and sorting result, which avoids chaotic control responses when multiple touch operations exist and solves the technical problem that the existing mode of executing control operations is highly complex.
In one embodiment, the touch event includes at least one of a click operation event, a long-press operation event, a drag operation event, or a double-click operation event.
Here, a click operation event is an operation event in which the user clicks a touch point on the display interface; a long-press operation event is one in which the user long-presses a touch point on the display interface; a drag operation event is one in which the user drags a touch point on the display interface; and a double-click operation event is one in which the user double-clicks a touch point on the display interface. A touch event may include at least one type of operation event.
In this embodiment, the touch event includes at least one of a click, long-press, drag, or double-click operation event, so that when different types of operation events are triggered, the control operation finally executed can be determined from the matching relationship between the touch event and the object controls, solving the technical problem that the existing mode of executing control operations is highly complex.
In one embodiment, the response priority label corresponding to each object control is configured according to the scene to which the display interface belongs.
Here, in different scenes the pictures shown in the display interface may differ, and so may the kinds, number, and distribution of object controls in the display interface. Specifically, the response priority labels of the object controls included in the display interface are configured according to the scene to which the display interface belongs. The response priority labels may be configured by default according to the scene, or personalized by the user according to operational needs in the scene.
In this embodiment, the response priority labels of the object controls are configured according to the scene of the display interface, which makes the response priorities of the object controls in the display interface better match the needs of the scene and helps improve the processing efficiency of control operations in different scenes.
As an optional implementation, when the operation positions match the control position of at least one object control in the display interface, determining the response priority label corresponding to the at least one object control includes:
S1: Acquire the response area information of each object control in the display interface;
S2: Compare the operation position of each touch point indicated by the touch operation information, in turn, with the control response area indicated by the response area information of each object control, to obtain a comparison result;
S3: When the comparison result indicates that the operation position of a touch point lies within the control response area of an object control, determine that the operation position of the touch point matches the control position of the object control, and acquire the response priority label of the matched object control; and
S4: Save the response priority label of the matched object control into the management container.
In this implementation, the specific realization of the control operation execution method is described taking the execution of a slide operation as an example.
It can be understood that a complete slide operation consists of three event phases, which can be described as press, slide, and release. First, before the game starts, the three events IE_Pressed (press), IE_Repeat (slide), and IE_Released (release) are registered with the touch component of the game engine to detect the touch events of the press, slide, and release phases.
In this embodiment, the area information of an object control may include, but is not limited to, the position information and range information of the object control in the display interface, used to determine the object control corresponding to a touch operation. It can be understood that the area information of the object controls may be determined and saved before the game starts, or changed during the game according to the player's settings and then updated and saved. For example, before the game starts, the position and range information of the multiple controls in the interface, such as the attack control, movement control, and scope control, as well as of the non-control display area, are saved; during the game, if the player adjusts the positions and sizes of the attack and movement controls, the saved position and range information of the attack and movement controls is updated correspondingly. The method of determining the area information of object controls is not limited here.
In a specific application, the control operation execution method may further include:
S1: Acquire touch operation information.
As shown in FIG. 9, during detection of the IE_Pressed event, the touch operation information is acquired first. In this embodiment, the acquired touch operation information includes at least the "current finger index" (i.e., the identification information of the current touch operation) and the screen position information corresponding to the touch operation.
S2: Determine the control corresponding to the screen position information.
Specifically, after the saved response area information of the object controls is acquired, the position information of the current touch operation is compared with the saved response area information of the object controls, thereby determining the object control corresponding to the current touch operation.
S3: Acquire the priority label matching the object control; and
S4: Finally, perform data processing.
Specifically, as shown in FIG. 10, the control position, control size, and priority label are encapsulated and saved together in the data management container, with the "current finger index", i.e., the identification information of the current touch operation, as the key.
Through the above implementation of this application, the electronic device acquires the response area information of the object controls in the display interface; compares the operation position of each touch point indicated by the touch operation information, in turn, with the control response area indicated by each object control's response area information to obtain a comparison result; when the comparison result indicates that the operation position of a touch point lies within the control response area of an object control, determines that the operation position of the touch point matches the control position of the object control and acquires the response priority label of the matched object control; and saves the response priority label of the matched object control into the management container, thereby accurately capturing the trigger event of each touch operation and achieving precise detection of and response to touch operations.
As an optional implementation, determining the target control from the at least one object control according to the response priority label and executing the target control operation indicated by the target control includes:
S1: Read the response priority labels of the matched object controls from the management container; and
S2: Determine the target control from the matched object controls according to the response priority labels, and execute the target control operation indicated by the target control.
An implementation of the above method is described with reference to FIG. 11.
It can be understood that, in this embodiment, for the execution of slide operations, the IE_Repeat event is detected in addition to the IE_Pressed event. The steps are as follows:
S1: Acquire touch operation information.
During detection of the IE_Repeat event, touch operation information likewise needs to be acquired; the operation information includes at least the "current finger index", i.e., the identification information of the current touch operation, and the screen position information corresponding to the touch operation.
S2: Acquire data from Pressed.
Specifically, data is fetched from the Pressed management container with the "current finger index" as the key.
S3: If no corresponding data is found, perform no operation.
S4: If corresponding data is found, sort all the data obtained in the previous step.
It can be understood that, in this embodiment, since a "slide" operation is being responded to, to ensure the accuracy of "slide" event detection the electronic device must match and fetch data in the Pressed management container using the same "current finger index", thereby ensuring that every "slide" event is triggered by a "press" event.
Next, since the data saved in the Pressed management container includes the priorities of the object controls corresponding to the touch operations, the electronic device sorts the object controls corresponding to all the data in the Pressed management container according to the acquired priority labels.
S5: If only one control has the determined highest priority, execute the slide viewing-angle steering operation corresponding to the object control with the highest priority.
S6: If multiple controls share the same highest priority, sort them by the order of the screen touches, and execute the slide viewing-angle steering corresponding to the most recent touch operation.
Through the above embodiment of this application, the electronic device reads the response priority labels of the matched object controls from the management container; according to the response priority labels, it determines the target control from the matched object controls and executes the target control operation indicated by the target control, thereby determining the target control that finally needs to be executed from the saved response priority labels and achieving precise detection of and response to touch operations.
As an optional implementation, after executing the target control operation indicated by the target control, the method further includes: removing the response priority label of the target control from the management container.
As shown in FIG. 12, in this embodiment, in addition to detecting the events IE_Pressed and IE_Repeat, the electronic device also detects the event IE_Released, i.e., it detects whether the touch event has ended.
It can be understood that, during detection of the IE_Released event, the touch operation information acquired by the electronic device also includes the "current finger index" and the "pressed screen position"; data is then fetched from the Pressed management container with the "current finger index" as the key, and when the data is found, the electronic device deletes the fetched data from the management container.
Through the above embodiment of this application, by removing the response priority label of the target control from the management container after executing the target control operation indicated by the target control, the electronic device can simply ignore released touch operations when sorting the touch-event data in IE_Repeat, avoiding erroneous responses to multiple touch operations.
As an optional implementation, before acquiring the touch event, the method further includes:
S1: Assign a response priority label to each object control;
S2: Acquire the response area information of the object control, where the response area information indicates the control response area of the object control in the display interface; and
S3: Encapsulate the response priority label and the response area information of the object control and save them into the management container.
It can be understood that, in this embodiment, before acquiring the touch event, the electronic device needs to assign a response priority label to each object control and determine the response position and response area of each object control. As an optional manner, the response priority labels and the response positions and response areas of the object controls may be set according to the game configuration before the game starts, or according to the player's setting operations during the game. After each setting, the above information is encapsulated and saved into the management container so that it can be extracted and used in subsequent operations.
Through the above embodiment of this application, the electronic device assigns a response priority label to each object control; acquires the response area information of the object controls, where the response area information indicates the control response area of the object control in the display interface; and encapsulates the response priority labels and response area information of the object controls and saves them into the management container. The priority, position, and area range of the object controls can thus be set as needed, making the response to touch operations more accurate and solving the technical problem that the existing mode of executing control operations is highly complex.
A specific embodiment of this application is described below with reference to FIGS. 13 and 14.
In this embodiment, the processing and execution of slide operations in a game interface is taken as an example. The touch event includes an event triggered by a slide operation; the management container includes a slide screen manager.
FIG. 13 shows the preprocessing flow of the touch operation management method.
S1: Add multiple controls used to respond to slide operations to the slide screen manager.
Assume that, in the game, the controls related to slide operations include the "skill control", the "scope control", the "shooting control", and the "screen". Correspondingly, a slide operation on the "skill control" can be used to adjust the aiming direction of a virtual skill; a slide operation on the "scope control" can be used to adjust the aiming direction of a shooting operation; a slide operation performed on the "shooting control" can be used to adjust the shooting direction of the shooting prop; and a slide operation performed on the "screen" can be used to adjust the viewing angle of the current virtual character. All four controls relate to slide operations, but their operation effects differ. Moreover, since a slide operation is a long-lasting operation (compared with an instantaneous click), the player may perform multiple slide operations within a period of time, so multiple simultaneous operations need to be controlled and responded to.
S2: Assign respective response priority labels to the multiple controls used to respond to slide operations.
Specifically, to avoid interference between the effects of multiple slide operations, the "skill control", "scope control", "shooting control", and "screen" can first be assigned the response priorities "3", "2", "2", and "1" respectively. That is, in this embodiment, slide operations on the "screen" have the highest response priority, followed by those on the "scope control" and "shooting control" (which share the same priority), and finally the "skill control".
S3: Encapsulate and save the slide data of the slide operation.
As shown in FIG. 10, the electronic device saves the control position, control size, and priority label into the data management container in the slide screen takeover manager so that subsequent operations can retrieve the specific data.
S4: After saving, refresh the data in the data management container.
It can be understood that a data refresh is triggered after the electronic device encapsulates and saves the existing priority data, and a data refresh is likewise triggered when the control position is customized and modified. For example, the player adjusts the position and size of controls during the game; in response to the player's adjustment operation, the data in the data management container is refreshed to ensure the accuracy of subsequent control-operation management.
In one embodiment, the slide data includes the control position, the control size, and the response priority label; encapsulating and saving the slide data of the slide operation includes: encapsulating the control position, the control size, and the response priority label, and saving the result of the encapsulation into the data management container of the slide screen manager.
Here, the control position refers to the position in the display interface of the control used to respond to the slide operation; the control size refers to the size of that control; and the response priority labels are the labels assigned to each of the controls used to respond to slide operations. Specifically, the electronic device encapsulates the control position, control size, and response priority label and saves the result of the encapsulation into the data management container of the slide screen manager.
In this embodiment, the control position, control size, and response priority label are encapsulated and stored in the data management container of the slide screen manager, achieving timely storage of the slide data and helping ensure the accuracy of subsequent control-operation management.
After the preliminary data preparation, the specific implementation of this embodiment is described below with reference to FIG. 14.
It can be understood that, in the specific implementation, the three detection events related to the "slide operation" — IE_Pressed (press), IE_Repeat (slide), and IE_Released (release) — need to be registered with the touch component of the game engine to detect the events of the press, slide, and release phases and thereby accurately identify the "slide operation".
Next, through the above three detection events, the electronic device acquires the Index (i.e., the current finger index) and Location (i.e., the pressed screen position) of the touch operation and records the relevant information; the electronic device then feeds the relevant data into the slide screen takeover manager for priority sorting, and finally executes the viewing-angle steering operation corresponding to the slide operation according to the sorting result.
The method is described in detail below with reference to FIGS. 3 and 4.
As shown in FIG. 3, two touch operations on the interface are shown, where the control corresponding to touch point A is the attack control 301 and the control corresponding to touch point B is the screen.
Assume the rightward slide operation performed by touch point A is detected by IE_Pressed, with the recorded Index being "event A" and the position information being (40px, 120px). The electronic device then feeds the Index "event A" and the position information (40px, 120px) into the slide screen takeover manager, where the saved control position information in the management container is compared with the position information (40px, 120px) of "event A", determining that the control corresponding to "event A" is the attack control 301, whose priority "2" is obtained from the response priority label configured for the attack control 301. Finally, the electronic device saves "event A", the position information (40px, 120px), and the priority label "2" into the Pressed container.
Assume the leftward slide operation performed by touch point B is detected by IE_Pressed, with the recorded Index being "event B" and the position information being (540px, 150px). The electronic device then feeds the Index "event B" and the position information (540px, 150px) into the slide screen takeover manager, where the saved control position information in the management container is compared with the position information (540px, 150px) of "event B", determining that the control corresponding to "event B" is the screen, whose priority "1" is obtained from the response priority label configured for the screen. Finally, the electronic device saves "event B", the position information (540px, 150px), and the priority label "1" into the Pressed container.
Next, IE_Repeat detects touch point B and records its Index as "event B" and its position information as (540px, 150px). The electronic device then feeds the Index "event B" of touch point B into the slide screen takeover manager again and fetches data from the Pressed container.
Having obtained the data corresponding to the Index "event B" — "event B", the position information (540px, 150px), and the priority label "1" — and the data of touch point A — "event A", the position information (40px, 120px), and the priority label "2" — the electronic device compares the two events and the control priorities, determines that "event B" corresponding to touch point B is the target event, and executes "event B". The display interface then becomes as shown in FIG. 4, with the viewing angle of the virtual character turned to the right.
After determining that the event corresponding to the Index "event B" has finished executing, the electronic device deletes the data corresponding to the Index "event B" from the Pressed container. The electronic device then sorts and executes the remaining events in the Pressed container, determining that "event A" is to be executed, i.e., in the display interface after the viewing-angle turn shown in FIG. 4, the virtual character is controlled to perform the shooting operation.
In one embodiment, as shown in FIG. 15, a control operation execution method is provided, executed by an electronic device, the method including:
S1502: Present a display interface, where the display interface includes an object control for responding to a touch event, and the object control is set at a control position in the display interface; and
S1504: When a touch event is triggered for at least two touch points in the display interface and the operation positions of the at least two touch points match the control position of at least one object control in the display interface, execute the target control operation of the target type indicated by the target control;
where the touch event is used to trigger execution of a control operation of the target type, and the target control is the control, among the at least one object control, that satisfies the response priority condition.
Here, the display interface is the interface displayed on the screen of the electronic device; it includes at least one object control for responding to touch events, and the object control is set at a control position in the display interface, i.e., the corresponding object control is displayed at the control position in the display interface of the electronic device. Touch events can be triggered by various types of operation events. A control operation is an operation that the user needs to trigger through a touch event. In this embodiment, the touch events triggered by the user for the at least one object control are all used to trigger execution of a control operation of the target type, i.e., to trigger the same type of control operation. For example, in the game interface, the user triggers touch events for multiple object controls, such as slide operations, to trigger execution of a camera steering control operation. For control operations of the target type, different object controls have different response priorities. The response priority condition is used to control the order in which the object controls respond to the touch event. For example, the response priority condition may be to respond in order of response priority from high to low, in which case the object control with the highest response priority can be determined as the target control and the target control operation of the target type indicated by that target control can be executed.
Specifically, the electronic device presents the display interface, with an object control for responding to touch events displayed at a control position of the display interface, and the user can trigger interactions with the object control. When a touch event is triggered for at least two touch points in the display interface and the operation positions of the at least two touch points match the control position of at least one object control in the display interface, the electronic device executes the target control operation of the target type indicated by the target control. In a specific application, the user can trigger a touch event at at least two touch points in the display interface; the electronic device determines the respective operation positions of the at least two touch points and matches them against the control position of at least one object control in the display interface; when they match, the electronic device determines the target control from the at least one object control according to the response priority condition and executes the target control operation of the target type indicated by the target control.
In this embodiment, when a touch event is triggered for at least two touch points in the display interface and the operation positions of the at least two touch points match the control position of at least one object control in the display interface, the electronic device executes the target control operation of the target type indicated by the target control satisfying the response priority condition. Thus, when multiple touch events are detected, the control operation finally executed is determined from the matching relationship between the touch events and the object controls, which solves the technical problem that the existing mode of executing control operations is highly complex.
In one embodiment, the touch event includes an event triggered by a slide operation, and the target control operation includes a viewing-angle steering operation.
Here, the slide operation is an interactive screen-sliding operation triggered by the user on an object control in the display interface; the viewing-angle steering operation is a control operation that changes the viewing angle corresponding to the displayed content of the display interface. By changing the viewing angle, the display content in the display interface can be changed.
In this embodiment, when events triggered by slide operations are triggered for at least two touch points in the display interface and the operation positions of the at least two touch points match the control position of at least one object control in the display interface, the electronic device executes the viewing-angle steering operation indicated by the target control satisfying the response priority condition. Thus, when multiple slide operations are detected, the viewing-angle steering operation indicated by the target control satisfying the response priority condition is selected for execution, solving the problem that the existing mode of performing viewing-angle steering operations is highly complex.
It should be noted that, for brevity of description, the foregoing method embodiments are all expressed as combinations of series of actions; however, a person skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously. Furthermore, a person skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
According to another aspect of the embodiments of the present invention, a control operation execution apparatus for implementing the above control operation execution method is further provided. As shown in FIG. 16, the apparatus includes:
an acquiring unit 1602, configured to acquire a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points;
a determining unit 1604, configured to determine, when the operation positions match the control position of at least one object control in the display interface, the response priority label corresponding to the at least one object control; and
an execution unit 1606, configured to determine a target control from the at least one object control according to the response priority label, and to execute the target control operation indicated by the target control.
Optionally, in this embodiment, for the embodiments to be implemented by the above unit modules, reference may be made to the above method embodiments; details are not repeated here.
According to yet another aspect of the embodiments of the present invention, an electronic device for implementing the above control operation execution method is further provided; the electronic device may be the terminal device or the server shown in FIG. 17. This embodiment is described taking the electronic device as a terminal device as an example. As shown in FIG. 17, the electronic device includes a memory 1702 and a processor 1704; the memory 1702 stores computer-readable instructions, and the processor 1704 is configured to execute, through the computer-readable instructions, the steps in any one of the above method embodiments.
Optionally, in this embodiment, the electronic device may be located in at least one of multiple network devices in a computer network.
Optionally, in this embodiment, the processor may be configured to execute the following steps through the computer-readable instructions:
S1: Acquire a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points;
S2: When the operation positions match the control position of at least one object control in the display interface, determine the response priority label corresponding to the at least one object control; and
S3: Determine a target control from the at least one object control according to the response priority label, and execute the target control operation indicated by the target control.
Optionally, a person of ordinary skill in the art can understand that the structure shown in FIG. 17 is only illustrative; the electronic device may also be a vehicle-mounted terminal, a smartphone (such as an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or another terminal device. FIG. 17 does not limit the structure of the above electronic device; for example, the electronic device may include more or fewer components (such as a network interface) than shown in FIG. 17, or have a configuration different from that shown in FIG. 17.
The memory 1702 can be used to store software programs and modules, such as the program instructions/modules corresponding to the control operation execution method and apparatus in the embodiments of the present invention; the processor 1704 runs the software programs and modules stored in the memory 1702, thereby executing various functional applications and data processing, i.e., implementing the above control operation execution method. The memory 1702 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 1702 may further include memories remotely located relative to the processor 1704, and these remote memories may be connected to the terminal through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof. The memory 1702 may specifically be used, but is not limited, to store information such as operation information. As an example, as shown in FIG. 17, the memory 1702 may include, but is not limited to, the acquiring unit 1602, the determining unit 1604, and the execution unit 1606 of the above control operation execution apparatus. It may further include, but is not limited to, other module units of the above control operation execution apparatus, which are not repeated in this example.
Optionally, the above transmission apparatus 1706 is configured to receive or send data via a network. Specific examples of the network may include wired and wireless networks. In one example, the transmission apparatus 1706 includes a network interface controller (NIC), which can be connected to other network devices and routers through a network cable so as to communicate with the Internet or a local area network. In one example, the transmission apparatus 1706 is a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
In addition, the above electronic device further includes: a display 1708, used to display the virtual scene in which the virtual character is controlled, and a connection bus 1710, used to connect the module components of the above electronic device.
In other embodiments, the above terminal device or server may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by the multiple nodes connected through network communication. The nodes may form a peer-to-peer (P2P) network, and any form of computing device, such as a server, a terminal, or another electronic device, can become a node in the blockchain system by joining the peer-to-peer network.
According to one aspect of this application, a computer program product is provided; the computer program product includes computer-readable instructions, and the computer-readable instructions contain program code for executing the method shown in the flowchart. In such embodiments, the computer-readable instructions may be downloaded and installed from a network through the communication part, and/or installed from a removable medium. When the computer-readable instructions are executed by the central processing unit, the various functions provided by the embodiments of this application are performed.
The sequence numbers of the above embodiments of the present invention are merely for description and do not represent the superiority or inferiority of the embodiments.
According to one aspect of this application, a computer-readable storage medium is provided; a processor of a computer device reads the computer-readable instructions from the computer-readable storage medium, and the processor executes the computer-readable instructions, causing the computer device to perform the above control operation execution method.
Optionally, in this embodiment, the above computer-readable storage medium may be configured to store computer-readable instructions for performing the following steps:
S1: Acquire a touch event, where the touch event carries touch operation information of at least two touch points, and the touch operation information includes the respective operation positions of the at least two touch points;
S2: When the operation positions match the control position of at least one object control in the display interface, determine the response priority label corresponding to the at least one object control; and
S3: Determine a target control from the at least one object control according to the response priority label, and execute the target control operation indicated by the target control.
Optionally, in this embodiment, a person of ordinary skill in the art can understand that all or some of the steps in the various methods of the above embodiments can be completed by instructing hardware related to the terminal device through a program; the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
If the integrated units in the above embodiments are implemented in the form of software functional units and sold or used as independent products, they may be stored in the above computer-readable storage medium. Based on this understanding, the essence of the technical solution of the present invention, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or some of the steps of the above methods in the various embodiments of the present invention.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not detailed in one embodiment, reference may be made to the relevant descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed client can be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Moreover, the mutual coupling or direct coupling or communication connection shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between units or modules may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated units may be implemented in the form of hardware or in the form of software functional units.
The above are merely preferred implementations of the present invention. It should be noted that a person of ordinary skill in the art may make several improvements and refinements without departing from the principles of the present invention, and these improvements and refinements shall also be regarded as falling within the protection scope of the present invention.

Claims (20)

  1. A control operation execution method, executed by an electronic device, comprising:
    acquiring a touch event, wherein the touch event carries touch operation information of at least two touch points, and the touch operation information comprises the respective operation positions of the at least two touch points;
    when the operation positions match a control position of at least one object control in a display interface, determining a response priority label corresponding to the at least one object control; and
    determining a target control from the at least one object control according to the response priority label, and executing a target control operation indicated by the target control.
  2. The method according to claim 1, wherein determining the target control from the at least one object control according to the response priority label and executing the target control operation indicated by the target control comprises:
    sorting a response order of the at least one object control according to the response priority label to obtain a sorting result; and
    when the sorting result indicates that one object control has the highest response priority, determining the object control with the highest response priority as the target control, and executing the target control operation indicated by the target control.
  3. The method according to claim 2, further comprising, after sorting the response order of the at least one object control according to the response priority label to obtain the sorting result:
    when the sorting result indicates that at least two object controls share the highest response priority, determining respective control operation times of the at least two object controls; and
    determining the object control with the earliest control operation time as the target control, and executing the target control operation indicated by the target control.
  4. The method according to claim 1, wherein determining the target control from the at least one object control according to the response priority label and executing the target control operation indicated by the target control comprises:
    when the operation positions of the at least two touch points match a control position of one object control in the display interface, determining the one object control as the target control;
    determining a touch time of each touch point on the target control;
    determining an operation triggered by the touch point with the earliest touch time as the target control operation to be executed; and
    executing the target control operation.
  5. The method according to claim 1, further comprising, after executing the target control operation indicated by the target control:
    when the number of the at least two touch points changes, acquiring updated response priority labels;
    sorting the response order of the object controls according to the updated response priority labels to obtain an updated sorting result; and
    determining the target control operation to be executed according to the updated sorting result.
  6. The method according to claim 1, wherein the touch event comprises at least one of a click operation event, a long-press operation event, a drag operation event, or a double-click operation event.
  7. The method according to claim 1, wherein the response priority label corresponding to each object control is configured according to a scene to which the display interface belongs.
  8. The method according to any one of claims 1 to 7, wherein, when the operation positions match the control position of at least one object control in the display interface, determining the response priority label corresponding to the at least one object control comprises:
    acquiring response area information of each object control in the display interface;
    comparing the operation position of each touch point indicated by the touch operation information, in turn, with a control response area indicated by the response area information of each object control, to obtain a comparison result;
    when the comparison result indicates that the operation position of a touch point lies within the control response area of an object control, determining that the operation position of the touch point matches the control position of the object control, and acquiring the response priority label of the matched object control; and
    saving the response priority label of the matched object control into a management container.
  9. The method according to claim 8, wherein determining the target control from the at least one object control according to the response priority label and executing the target control operation indicated by the target control comprises:
    reading the response priority labels of the matched object controls from the management container; and
    determining the target control from the matched object controls according to the response priority labels, and executing the target control operation indicated by the target control.
  10. The method according to claim 9, further comprising, after executing the target control operation indicated by the target control:
    removing the response priority label of the target control from the management container.
  11. The method according to claim 8, further comprising, before acquiring the touch event:
    assigning the response priority label to each object control;
    acquiring the response area information of the object control, wherein the response area information indicates the control response area of the object control in the display interface; and
    encapsulating the response priority label and the response area information of the object control, and saving them into the management container.
  12. The method according to claim 8, wherein the touch event comprises an event triggered by a slide operation, the management container comprises a slide screen manager, and the method further comprises:
    adding multiple controls for responding to the slide operation to the slide screen manager;
    assigning respective response priority labels to the multiple controls for responding to the slide operation; and
    encapsulating and saving slide data of the slide operation.
  13. The method according to claim 12, wherein the slide data comprises a control position, a control size, and the response priority label; and encapsulating and saving the slide data of the slide operation comprises:
    encapsulating the control position, the control size, and the response priority label, and saving a result of the encapsulation into a data management container of the slide screen manager.
  14. A control operation execution method, executed by an electronic device, the method comprising: presenting a display interface, wherein the display interface comprises an object control for responding to a touch event, and the object control is set at a control position in the display interface; and
    when a touch event is triggered for at least two touch points in the display interface and the operation positions of the at least two touch points match a control position of at least one object control in the display interface, executing a target control operation of a target type indicated by a target control;
    wherein the touch event is used to trigger execution of a control operation of the target type, and the target control is a control, among the at least one object control, that satisfies a response priority condition.
  15. The method according to claim 14, wherein the touch event comprises an event triggered by a slide operation, and the target control operation comprises a viewing-angle steering operation.
  16. A control operation execution apparatus, comprising:
    an acquiring unit, configured to acquire a touch event, wherein the touch event carries touch operation information of at least two touch points, and the touch operation information comprises the respective operation positions of the at least two touch points;
    a determining unit, configured to determine, when the operation positions match a control position of at least one object control in a display interface, a response priority label corresponding to the at least one object control; and
    an execution unit, configured to determine a target control from the at least one object control according to the response priority label, and to execute a target control operation indicated by the target control.
  17. The apparatus according to claim 16, wherein
    the execution unit is further configured to sort a response order of the at least one object control according to the response priority label to obtain a sorting result, and, when the sorting result indicates that one object control has the highest response priority, to determine the object control with the highest response priority as the target control and execute the target control operation indicated by the target control.
  18. A computer-readable storage medium, comprising a stored program, wherein the program, when run by a processor, performs the method according to any one of claims 1 to 15.
  19. A computer program product, comprising a computer program/instructions, wherein the computer program/instructions, when executed by a processor, implement the steps of the method according to any one of claims 1 to 15.
  20. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to perform, through the computer program, the method according to any one of claims 1 to 15.
PCT/CN2022/134889 2022-03-01 2022-11-29 控制操作的执行方法和装置、存储介质及电子设备 WO2023165184A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/215,211 US20230342021A1 (en) 2022-03-01 2023-06-28 Performing a control operation based on multiple touch points

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210199987.9A CN116726475A (zh) 2022-03-01 2022-03-01 控制操作的执行方法和装置、存储介质及电子设备
CN202210199987.9 2022-03-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/215,211 Continuation US20230342021A1 (en) 2022-03-01 2023-06-28 Performing a control operation based on multiple touch points

Publications (1)

Publication Number Publication Date
WO2023165184A1 true WO2023165184A1 (zh) 2023-09-07

Family

ID=87882896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/134889 WO2023165184A1 (zh) 2022-03-01 2022-11-29 控制操作的执行方法和装置、存储介质及电子设备

Country Status (3)

Country Link
US (1) US20230342021A1 (zh)
CN (1) CN116726475A (zh)
WO (1) WO2023165184A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120105481A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co. Ltd. Touch control method and portable terminal supporting the same
JP2013196582A (ja) * 2012-03-22 2013-09-30 Sharp Corp タッチパネル入力装置、タッチパネルの入力方法及びプログラム
CN107562263A (zh) * 2017-08-18 2018-01-09 维沃移动通信有限公司 数据输入方法、移动终端以及计算机可读存储介质
CN107562262A (zh) * 2017-08-14 2018-01-09 维沃移动通信有限公司 一种响应触控操作的方法、终端及计算机可读存储介质
CN108073347A (zh) * 2017-12-15 2018-05-25 掌阅科技股份有限公司 多点触控的处理方法、计算设备及计算机存储介质
CN111314441A (zh) * 2020-01-21 2020-06-19 维达力实业(深圳)有限公司 基于多区域控制的终端控制方法、装置和终端
CN112817483A (zh) * 2021-01-29 2021-05-18 网易(杭州)网络有限公司 多点触控的处理方法、装置、设备及存储介质


Also Published As

Publication number Publication date
CN116726475A (zh) 2023-09-12
US20230342021A1 (en) 2023-10-26

Similar Documents

Publication Publication Date Title
JP6529659B2 (ja) 情報処理方法、端末及びコンピュータ記憶媒体
US20230168796A1 (en) Augmented reality for the internet of things
US10768812B2 (en) Method, terminal, and storage medium for operating objects displayed on a graphical user interface
KR101996978B1 (ko) 정보 처리 방법, 단말 및 컴퓨터 저장 매체
JP6628443B2 (ja) 情報処理方法、端末、およびコンピュータ記憶媒体
JP2022527502A (ja) 仮想オブジェクトの制御方法及び装置、モバイル端末及びコンピュータプログラム
CN110703966A (zh) 文件共享方法、装置、系统、相应设备及存储介质
JP2018525050A (ja) 情報処理方法、端末、およびコンピュータ記憶媒体
CN110870975B (zh) 游戏直播的处理方法、装置、设备及计算机可读存储介质
CN114332417B (zh) 一种多人场景交互的方法、设备、存储介质及程序产品
WO2022247181A1 (zh) 游戏场景的处理方法、装置、存储介质及电子设备
CN110473097A (zh) 交易监控方法、终端和计算机可读存储介质
CN110928397B (zh) 用户界面刷新方法、装置、存储介质及电子装置
US20160192117A1 (en) Data transmission method and first electronic device
WO2023165184A1 (zh) 控制操作的执行方法和装置、存储介质及电子设备
EP3014483B1 (en) Augmented reality
CN112131240A (zh) 脏数据的处理方法和装置、存储介质及电子设备
WO2023035619A1 (zh) 一种场景渲染方法、装置、设备及系统
CN114257827B (zh) 游戏直播间显示方法、装置、设备及存储介质
CN113633974B (zh) 用户实时对局信息的显示方法、装置、终端及存储介质
CN113144606B (zh) 虚拟对象的技能触发方法及相关设备
CN113813599A (zh) 虚拟角色的控制方法和装置、存储介质及电子设备
CN112947748A (zh) 增强现实ar远程交互方法及其系统
CN114968053B (zh) 操作处理方法及装置、计算机可读存储介质和电子设备
CN113112613B (zh) 模型显示方法、装置、电子设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22929612

Country of ref document: EP

Kind code of ref document: A1