CN111214825A - Game control method, device and storage medium - Google Patents

Game control method, device and storage medium

Info

Publication number
CN111214825A
Authority
CN
China
Prior art keywords
trigger
area
trigger area
display
instruction
Prior art date
Legal status
Granted
Application number
CN202010041039.3A
Other languages
Chinese (zh)
Other versions
CN111214825B (en)
Inventor
陈茹沂
陈千举
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010041039.3A
Publication of CN111214825A
Application granted
Publication of CN111214825B
Current legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/22 Setup operations, e.g. calibration, key configuration or button assignment
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/814 Musical performances, e.g. by evaluating the player's ability to follow a notation
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1037 Features of games with input arrangements specially adapted for converting control signals received from the game device into a haptic signal, e.g. using force feedback
    • A63F2300/1068 Features of games with input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games with input arrangements specially adapted to detect the point of contact of the player using a touch screen
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/64 Methods for processing data for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/80 Features of games specially adapted for executing a specific type of game
    • A63F2300/8047 Music games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the field of computer technology and discloses a game control method, device, and storage medium for improving a user's control efficiency when playing a game. In the method, multimedia data is played in response to a performance request for a song; at least three trigger keys are displayed on a display interface, the at least three trigger keys being arranged in an arc; and in response to an interaction instruction for at least one trigger key, an interaction operation corresponding to the interaction instruction is executed, the interaction instruction being used to interact with the multimedia data. Because the trigger keys are arranged on the display interface in an arc, the layout better conforms to ergonomics, so control efficiency is improved when the user plays the game on a terminal.

Description

Game control method, device and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a game control method, apparatus, and storage medium.
Background
Existing landscape-mode music-and-dance (rhythm) mobile games are usually operated by holding the device with both hands and using the two thumbs, or by placing the terminal on a flat surface. With the layouts used in the related art, a user's control efficiency while playing is low.
Disclosure of Invention
The embodiment of the application provides a game control method, a game control device and a storage medium, which are used for improving the control efficiency of a user during game playing.
In a first aspect, a game control method is provided, including:
playing multimedia data in response to a performance request for a song to be performed;
displaying at least three trigger keys on a display interface, wherein the at least three trigger keys are arranged in an arc shape;
and responding to an interaction instruction aiming at least one trigger key, and executing an interaction operation corresponding to the interaction instruction, wherein the interaction instruction is used for interacting with the multimedia data.
In a second aspect, there is provided a game control apparatus comprising:
the playing module is used for playing multimedia data in response to a performance request for a song to be performed;
the first display module is used for displaying at least three trigger keys on a display interface, and the at least three trigger keys are arranged in an arc shape;
the first execution module is used for responding to an interaction instruction aiming at least one trigger key and executing the interaction operation corresponding to the interaction instruction, and the interaction instruction is used for interacting with the multimedia data.
In one embodiment, the apparatus further comprises:
an obtaining module, configured to obtain set display mode indication information in response to a setting operation of a display mode input on the display interface, where the display mode includes a first mode and a second mode, and when the display mode is in the first mode, the first trigger area is displayed at a display position where a sector area is formed with the first interior angle, and when the display mode is in the second mode, the first trigger area is displayed at a display position where a sector area is formed with the second interior angle, where the first interior angle and the second interior angle are two adjacent interior angles on the display interface;
the determining module is configured to determine a display position of the first trigger area according to set display mode indication information when the first trigger area and the second trigger area are displayed on the display interface, and determine a display position of the second trigger area according to the display position of the first trigger area.
In an embodiment, the second execution module is specifically configured to trigger the pressing instruction if it is detected that the pressing pressure in the second trigger area is greater than a preset pressure.
In one embodiment, the interactive operation includes at least one of a sound effect and an animation indicating whether the interactive instruction succeeded; and
the response to the interaction instruction aiming at the at least one triggering key comprises an interaction instruction for triggering the at least three triggering keys simultaneously.
In a third aspect, a computing device is provided, comprising at least one processing unit, and at least one memory unit, wherein the memory unit stores a computer program that, when executed by the processing unit, causes the processing unit to perform the steps of any of the game control methods described above.
In one embodiment, the computing device may be a server or a terminal device.
In a fourth aspect, there is provided a computer readable medium storing a computer program executable by a terminal device, the program, when run on the terminal device, causing the terminal device to perform the steps of any of the game control methods described above.
The embodiments of the application provide a game control method, device, and storage medium. The trigger keys on the display interface are adjusted so that at least three trigger keys are arranged in an arc, and interaction is performed in response to an interaction instruction for at least one trigger key. Because the trigger keys are arranged on the display interface in an arc, the layout better conforms to ergonomics, so control efficiency is improved when the user plays the game on a terminal.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic diagram of a first display page of a dance-type hand game;
FIG. 2 is a diagram of a second display page of a dance-type hand game;
FIG. 3 is a schematic diagram of a first display interface according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a user playing a game in the embodiment of the present application;
FIG. 5 is a diagram illustrating a second display interface according to an embodiment of the present application;
FIG. 6 is a diagram illustrating the transmission of a first type note symbol according to an embodiment of the present application;
FIG. 7 is a diagram illustrating the transmission of a second type of token in an embodiment of the present application;
FIG. 8 is a flow chart illustrating a game control method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a first triggering area in the embodiment of the present application;
FIG. 10 is a diagram illustrating a second exemplary trigger zone in an embodiment of the present application;
FIG. 11 is a schematic diagram of a third display interface in an embodiment of the present application;
FIG. 12 is a diagram illustrating a first display mode according to an embodiment of the present application;
FIG. 13 is a diagram illustrating a second display mode according to an embodiment of the present application;
FIG. 14 is a schematic structural diagram of an image display apparatus according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a terminal device in an embodiment of the present application.
Detailed Description
In order to improve the efficiency of operation and control when a user uses a terminal to play a game, embodiments of the present application provide a game control method, an apparatus, and a storage medium. In order to better understand the technical solution provided by the embodiments of the present application, the following brief description is made on the basic principle of the solution:
the music-and-dance (rhythm) genre is a type of online game; the main play mode of a rhythm mobile game is to interact according to the prompts shown on the display interface while music is played.
When a user plays a game, the user needs to select music which the user wants to listen to on a client and start a game stage. The client plays the music selected by the user, plays the animation corresponding to the music and displays the control area on the display interface.
After the game stage is started, along with the playing of music, the note symbols move on the display interface according to the note tracks, and when the notes reach the preset positions, a user needs to interact by clicking the trigger keys in the control area. As shown in fig. 1, it is a display interface of a dance-like hand game in the related art. Wherein 101 and 102 in fig. 1 are both trigger keys, the trigger key 101 is located at the lower left of the display interface, the trigger key 102 is located at the lower right of the display interface, 103 represents a predetermined position, 104 represents a note symbol, and 105 represents a note track. In fig. 1, each note symbol moves from right to left on the display interface, and when one note symbol 104 moves to the predetermined position 103, the user needs to click the trigger key corresponding to the note symbol (as the note symbol is represented as a left arrow in fig. 1, and therefore needs to click the corresponding trigger key 101), thereby completing the interaction. The note symbol is data moving on the display interface, and the note track is a track on which the note symbol moves.
In the related art, a music-and-dance mobile game generally displays two trigger keys on the display interface, such as the trigger key 101 and the trigger key 102 shown in fig. 1. When playing, the user holds the device with both hands and clicks the trigger keys with the two thumbs to interact.
Alternatively, in the related art, music-and-dance mobile games that support multi-touch generally arrange the trigger keys in a straight line. As shown in fig. 2, it is a schematic diagram of a display page of such a game. In fig. 2, 201, 202, 203 and 204 are all trigger keys, 205 denotes a note track, and 206 denotes a note symbol; it should be noted that the predetermined position in fig. 2 coincides with the positions of the trigger keys. In fig. 2, each note symbol moves from top to bottom on the display interface. Since four trigger keys arranged in a straight line are displayed, the user needs to place the terminal on a flat surface and cannot pick it up while playing. The terminal may also be held, but in the held state one thumb has to cover two trigger keys. For example, the left thumb covers trigger keys 201 and 202, and the right thumb covers trigger keys 203 and 204; if note symbols reach trigger key 201 and trigger key 202 (or trigger key 203 and trigger key 204) at the same time, the interaction cannot be performed properly. Moreover, when playing the music-and-dance mobile games corresponding to fig. 1 or fig. 2, the user's fingers become fatigued after operating for some time, which results in low control efficiency.
For this reason, referring to fig. 3, the embodiment of the present application takes the ergonomic principle into account: when three or four fingers of one hand operate a plurality of trigger keys simultaneously, an arc-shaped arrangement of the keys is the most convenient. Therefore, the present application adjusts the trigger keys on the display interface, sets at least three trigger keys, and arranges them on the display interface in an arc. In fig. 3, 301, 302, 303 and 304 are all trigger keys, and the four trigger keys are arranged in an arc, so that the user can operate all the trigger keys with one hand while playing. Because the arrangement is ergonomic, finger fatigue is reduced and control efficiency is improved when the user plays the game on the terminal. Further, the embodiment of the present application can support triggering a plurality of note symbols simultaneously.
As shown in fig. 3, the trigger keys in fig. 3 are circular buttons; of course, in the embodiment of the present application, the trigger-key buttons may also be square, triangular, or other shapes, which is not limited in the present application. To reduce finger muscle fatigue when the user plays, the trigger keys in the embodiment of the present application are arranged in an arc (as shown in fig. 3). In the application, an arc line can be preset and the center of each trigger key placed on that arc line; the arc may be part of a circle or part of an ellipse. Alternatively, the position of each finger when a user's four fingers touch the screen together can be determined from samples, an arc line can be fitted to that position information, and the trigger-key positions determined accordingly (a placement sketch follows).
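For illustration, the following minimal Python sketch places key centers evenly along a preset circular arc; the center point, radius, angle range, and number of keys are assumed values, not parameters specified by the application.

import math

def arc_key_positions(center, radius, start_deg, end_deg, num_keys):
    # Place num_keys trigger-key centers evenly along a circular arc.
    # center is the (x, y) of the arc's circle center; radius and the two
    # angles select which part of the circle is used. All values here are
    # illustrative assumptions.
    positions = []
    for i in range(num_keys):
        t = i / (num_keys - 1) if num_keys > 1 else 0.0
        angle = math.radians(start_deg + t * (end_deg - start_deg))
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        positions.append((x, y))
    return positions

# Example: four keys on an arc anchored near the lower-right corner of a
# hypothetical 1920 x 1080 landscape screen.
keys = arc_key_positions(center=(1920, 1080), radius=500,
                         start_deg=180, end_deg=270, num_keys=4)

Spacing the keys by equal angular steps keeps neighbouring keys at a uniform, comfortable distance for the three or four fingers of one hand.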
Arranging the buttons in this way conforms to ergonomics, makes the user more comfortable when holding the phone and playing, and improves control efficiency.
After introducing the design idea of the embodiment of the present application, an application scenario related to the method is briefly described below.
As shown in fig. 4, it is a schematic flow chart of the user when playing the game. The following explains an overall application scenario of the user playing a game in the embodiment of the present application with reference to fig. 4.
In step 401, after the user selects the music he wants to listen to on the client, he clicks the button for starting the game stage, and the game is initialized.
In the initialization stage, the basic data of the game, the note manager and the trigger point manager are initialized, so as to facilitate the subsequent playing. Wherein the basic data includes pictures, animations, sound effects, musical notes, etc. of the game.
The note manager includes a note emitter and a track controller. The note emitter is used for emitting note symbols, and the emitting speed of the note symbols is different according to the selected different levels of music; the track controller is for controlling the note track so that the emitted note symbol moves in accordance with the note track.
The trigger point manager is used for detecting whether the trigger point is in an activated state; wherein, the trigger point is the target point of the movement of the note symbol. When the user clicks the trigger key, the trigger point is in an activated state, and when the trigger key is not clicked, the trigger point is in an inactivated state.
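As a rough illustration of the initialization described above, the following Python sketch models the basic managers; all class and field names are hypothetical and only mirror the roles described in the text (note emitter, track controller, trigger points).

from dataclasses import dataclass, field

@dataclass
class TriggerPoint:
    # Target point toward which note symbols move; activated while the
    # corresponding trigger key is being clicked.
    position: tuple
    active: bool = False

@dataclass
class NoteEmitter:
    # Emission speed differs with the difficulty level of the selected music.
    emit_interval: float

@dataclass
class TrackController:
    # One note track per trigger key; each emitted note follows its track.
    tracks: list = field(default_factory=list)

@dataclass
class NoteManager:
    emitter: NoteEmitter
    controller: TrackController

def init_stage(emit_interval, key_positions):
    # Step 401: build the managers and one trigger point per trigger key.
    manager = NoteManager(NoteEmitter(emit_interval), TrackController())
    points = [TriggerPoint(position=(x, y, 0.0)) for (x, y) in key_positions]
    return manager, points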
In step 402, after the initialization is completed, music is played and basic data is displayed on the display interface, and the note manager starts to emit notes and starts the trigger point manager to detect the status of the trigger point.
In the embodiment of the present application, in order to enrich the diversity of game control, there are different types of note symbols, so each note symbol carries a label from which its type can be determined. All notes to be emitted are placed in a note set, so that when a note symbol is to be emitted it can be taken directly from the note set.
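A minimal Python sketch of such a labelled note entry follows; the field names are assumptions used to keep the later sketches consistent, not identifiers from the application.

from dataclasses import dataclass

@dataclass
class Note:
    # One entry of the note set.
    track_index: int       # which note track / trigger key it belongs to
    position: tuple        # current (x, y, z) while moving along the track
    label: str = "common"  # type label, e.g. "common" or "special"

    @property
    def is_special(self):
        return self.label == "special"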
In step 403, when the note symbol moves within the trigger point range, the user completes the interaction of the game by clicking the trigger key. When the trigger point is in an activated state, it is determined whether a distance between the note symbol and the trigger point is less than a preset distance.
In the embodiment of the application, to detect whether an interaction succeeded, the distance between the note symbol and the trigger point is evaluated at the moment the trigger key is clicked. The detection uses:

d = √((x1 - x2)² + (y1 - y2)² + (z1 - z2)²)

where (x1, y1, z1) are the coordinates of the trigger point and (x2, y2, z2) are the coordinates of the note symbol. In three-dimensional space, if the distance d between the note symbol and the trigger point falls within a reasonable value range, the detection passes; otherwise it does not.
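The check can be expressed as a short Python helper (a sketch; the threshold max_distance stands in for the "reasonable value range" and is an assumed tunable parameter):

import math

def hit_detected(trigger_pos, note_pos, max_distance):
    # Step 403: pass only if the clicked trigger point and the note symbol
    # are close enough in 3D space.
    dx = trigger_pos[0] - note_pos[0]
    dy = trigger_pos[1] - note_pos[1]
    dz = trigger_pos[2] - note_pos[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= max_distance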
If the detection passes, step 404 is executed, and if the detection does not pass, feedback is directly given to the user.
In step 404, since the note set contains different types of note symbols and the operations corresponding to the different types differ, when the distance between the note symbol and the trigger point is determined to be less than the preset distance it is also necessary to determine whether the note symbol is a special note.
In the embodiment of the present application, the type of the note symbol is determined according to the label of the note symbol.
If it is a special note, step 405 is executed, and if it is not a special note, the user is given feedback directly.
In step 405, if it is determined that the note symbol is a special note, it is determined whether a press command is triggered.
To enrich the variety of user control, the present application adds a pressing area on the basis of fig. 3, as shown in fig. 5. In fig. 5, 501 is the added pressing area, which can be operated by the thumb of the left hand. Therefore, when a common note is triggered, the interaction can be completed simply by clicking the trigger key with the right hand; when a special note is triggered, the left and right hands must cooperate: the right hand clicks the trigger key corresponding to the special note while the left hand presses the pressing area. If the pressing instruction is triggered, the interaction is completed; if the pressing instruction is not triggered, the interaction fails. Whether the interaction succeeded is fed back to the user.
In step 406, feedback is given to the user's operation. For example: each operation of the user is scored, and different performance animations or sound effects and the like can be generated when the user successfully clicks or unsuccessfully clicks.
In step 407, after feedback has been given, it is determined whether any notes in the note set have not yet been emitted. If notes remain, the flow returns to step 402 and continues; if none remain, the emission of note symbols is finished.
Finally, after the music has finished playing and all note symbols have been emitted, the game stage ends.
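The overall flow of steps 402 to 407 can be sketched in Python as follows; hit_detected and the Note and TriggerPoint classes are the hypothetical helpers defined above, and press_triggered / give_feedback are assumed stand-ins for the pressing check and the feedback of step 406.

def run_stage(notes, trigger_points, press_triggered, max_distance,
              give_feedback=print):
    # notes: the note set, in emission order (step 407 loops until empty).
    for note in notes:
        point = trigger_points[note.track_index]  # step 402: note moves on its track
        hit = point.active and hit_detected(point.position, note.position,
                                            max_distance)   # step 403
        if hit and note.is_special:               # steps 404-405: special notes also
            hit = press_triggered()               # need the pressing instruction
        give_feedback("hit" if hit else "miss")   # step 406: score / effects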
In the embodiment of the present application, steps 402 to 406 may be specifically implemented as steps A1 to A3:
step A1: and following the audio data, controlling at least one note symbol to move towards the first trigger area along a set note track, wherein the note track of each note symbol corresponds to one trigger key.
Here, the audio data is the song being played, and the first trigger area contains the trigger keys operated by the right hand.
Step A2: and when the position relation between each note symbol and the first trigger area reaches a set condition and the trigger key corresponding to each note symbol is triggered, executing the interactive operation corresponding to the interactive instruction matched with the trigger operation.
As shown in fig. 6, it is a schematic diagram of emitting a note symbol of the first type. Since a common note symbol is emitted, the interaction can be completed through the first trigger area alone. In fig. 6, when the note symbol moves to the trigger key 302, the user clicks the trigger key 302 and the interaction is completed.
Step A3: and when the position relation between each note symbol and the first trigger area reaches a set condition, the trigger key corresponding to each note track is triggered and the pressing instruction is effective, executing the interactive operation corresponding to the interactive instruction matched with the triggering operation.
Fig. 7 is a schematic diagram of emitting a note symbol of the second type. The second type differs from the first type: because it is a special note, in addition to clicking the corresponding trigger key 302 in the first trigger area, the pressing instruction must also be triggered by pressing the pressing area.
Therefore, control can be completed through the first trigger area alone, or through the cooperation of the first trigger area and the pressing area, which enriches the means of control.
The method of the present application is explained with reference to the application scenario described in fig. 4, and as shown in fig. 8, the method may specifically include the following steps:
step 801: the multimedia data is played in response to a performance request for a performance song.
Wherein the multimedia data includes audio data and image data.
Step 802: and displaying at least three trigger keys on the display interface, wherein the at least three trigger keys are arranged in an arc shape.
In the embodiment of the present application, in order to facilitate a user to find a trigger area and improve the trigger efficiency of the user during a holding operation, specifically, a first trigger area forming an arc-shaped strip area is displayed on a display interface, and at least three trigger keys are displayed in the first trigger area in an arc-shaped arrangement manner.
In this embodiment, the display interface is a rectangular display interface including four inner angles, and two ends of the first trigger area respectively extend to two adjacent edges of the display interface and form a fan-shaped area with the first inner angle.
As shown in fig. 9, it is a schematic diagram of the trigger area. In fig. 9, the trigger area is an arc-shaped strip, and the trigger keys divide the trigger area into four parts. The trigger area meets the lower side and the right side of the display interface, and the first inner angle is the lower left corner of the display interface. Of course, for the sake of visual appeal, the specific shape of the trigger area is not limited in the present application; as shown in fig. 10, the boundary of the trigger area may be an irregular pattern or have another shape.
In the embodiment of the present application, in order to enrich the triggering means, a second trigger area (i.e., a pressing area) is added to the display interface. Specifically, the second trigger area is displayed in an area outside the first trigger area, and the display positions of the second trigger area and the first trigger area satisfy the following condition: a set isolation area is provided between the second trigger area and the end of the first trigger area that is closer to it.
Here, "the closer end" means the end of the first trigger area to which the second trigger area is closer.
As shown in fig. 11, it is a schematic diagram of the display interface after the second trigger area is added. In fig. 11, one end of the first trigger area is on the right side of the display interface, and the other end is in the middle portion of the display interface, meeting the lower side. The second trigger area is near the end of the first trigger area that lies in the middle portion of the display interface. The second trigger area is drawn in the shape of a left thumb to guide the user's operation; of course, the second trigger area may have other shapes, which is not limited in this application.
While playing, the user can rest the left thumb within the second trigger area and interact by pressing the second trigger area firmly during operation. Specifically: in response to a pressing instruction for the second trigger area, the operation corresponding to the pressing instruction is executed. In this way the first trigger area and the second trigger area cooperate, which enriches the triggering means.
To improve control efficiency and user experience, the embodiment of the present application applies 3D Touch (a pressure-sensitive touch technology) to the triggering method. In the present application, the control area of the display interface includes two parts, the first trigger area and the second trigger area, through which the user interacts. To hold the terminal more steadily, the user can place the left thumb on the second trigger area while playing and the other fingers of the left hand on the back of the terminal, so the terminal can be held with the left hand alone; meanwhile, the fingers of the right hand interact with the trigger keys of the first trigger area. If the second trigger area needs to be interacted with, the pressure at the second trigger area can be detected with 3D Touch. Specifically: in response to pressing pressure on the second trigger area, if the pressing pressure is detected to be greater than a preset pressure, a pressing instruction is generated and the operation corresponding to the pressing instruction is executed.
The 3D Touch technique can distinguish the user's pressing operation (such as a light press or a hard press) according to how firmly the user presses the screen. When the second trigger area does not need to be interacted with, the user's left thumb may rest in the second trigger area, but it only serves to hold the terminal and does not press the screen firmly; the press is detected as a light press and the pressing instruction is not triggered. When the second trigger area does need to be interacted with, the user presses the screen harder; when the pressing pressure is detected to be greater than the preset pressure, the operation is determined to be a hard press and the pressing instruction is triggered, completing the interaction.
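A minimal Python sketch of this threshold check follows; the normalized force value and the preset threshold are assumptions for illustration, since the actual pressure scale depends on the device's touch API.

PRESET_PRESSURE = 0.5   # assumed threshold on a normalized 0..1 force scale

def second_area_press(force):
    # Light press: the thumb is only holding the phone, no instruction fires.
    # Hard press: force exceeds the preset pressure, so the pressing
    # instruction is generated.
    return "press_instruction" if force > PRESET_PRESSURE else None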
When the user presses the second trigger area, different feedback can be given to the user according to different pressing force degrees of the user. For example: the displayed color is different according to the pressure. If the pressure intensity pressed by the user is not greater than the preset pressure intensity, the second trigger area is unchanged, and if the pressure intensity pressed by the user is greater than the preset pressure intensity, the color of the second trigger area is changed into pink. Alternatively, the frequency of the vibration may be different depending on the pressure. And if the pressing pressure intensity of the user is not greater than the preset pressure intensity, the second trigger area is unchanged, and if the pressing pressure intensity is greater than the preset pressure intensity, the second trigger area carries out vibration prompt.
In this way, by detecting the user's pressing operation through the 3D Touch technique, the user can operate more comfortably, control efficiency is improved, and user experience is enhanced.
In the embodiment of the application, some users are accustomed to operating with the left hand and others with the right hand. To meet the needs of different users, two different operation modes are provided. Specifically: set display mode indication information is obtained in response to a setting operation of the display mode input on the display interface;
the display mode includes a first mode and a second mode; when displaying in the first mode, the first trigger area is displayed at the display position where it forms a fan-shaped area with the first interior angle, and when displaying in the second mode, the first trigger area is displayed at the display position where it forms a fan-shaped area with the second interior angle, the first interior angle and the second interior angle being two adjacent interior angles on the display interface.
When the first trigger area and the second trigger area are displayed on the display interface, the display position of the first trigger area is determined according to the set display mode indication information, and the display position of the second trigger area is determined according to the display position of the first trigger area.
As shown in fig. 12 and 13, which are schematic views of two display modes. The user may select the display mode through settings in the client. In fig. 12, the first display mode is that the second trigger area is operated by the left hand and the first trigger area is operated by the right hand. In fig. 13, the second display mode is that the left hand operates the first trigger area and the right hand operates the second trigger area.
Therefore, the user can select a different display mode according to personal habit, which makes control easier and improves control efficiency.
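As an illustration, the following Python sketch mirrors the layout between the two modes; which physical corners serve as the first and second interior angles, and the isolation margin, are assumptions of the sketch rather than values fixed by the application.

def layout_for_mode(mode, screen_width, margin=80):
    # mode 1: the first trigger area forms its sector with one lower corner
    #         and the second (pressing) area sits on the opposite side;
    # mode 2: the mirrored layout for users who prefer the other hand.
    if mode == 1:
        first_corner = (screen_width, 0)        # assumed: one lower corner
        second_area_x = margin                  # pressing area on the far side
    else:
        first_corner = (0, 0)                   # assumed: the adjacent corner
        second_area_x = screen_width - margin
    return {"first_area_corner": first_corner,
            "second_area_position": (second_area_x, margin)}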
Step 803: in response to the interaction instruction for the at least one trigger key, the interaction operation corresponding to the interaction instruction is executed, the interaction instruction being used to interact with the multimedia data.
In the embodiment of the application, corresponding feedback is given to the user according to the operation of the user. Specifically, the interactive operation includes at least one of a sound effect and an animation indicating whether the interactive instruction is successful.
For example: if the user clicks the trigger key of the note track corresponding to the note symbol when the note symbol reaches the preset position, a sound effect for prompting success and an animation effect corresponding to success are given. If the user does not click the trigger key of the note track corresponding to the note symbol when the note symbol reaches the preset position, a sound effect for prompting failure and an animation effect corresponding to failure are given.
In the embodiment of the application, the response to the interaction instruction aiming at the at least one triggering key comprises the interaction instruction of simultaneously triggering at least three triggering keys. In the related art, since only two fingers can be used for manipulation, at most two points are simultaneously touched. In the embodiment of the application, the number of the triggering keys is increased to at least three, and the interaction instruction for simultaneously triggering at least three triggering keys is supported.
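For illustration, the simple Python helper below maps simultaneous touch points to trigger keys so that a chord of three or more keys can be recognized in one frame; the function name and the circular hit test are assumptions.

def keys_triggered(touch_points, key_centers, key_radius):
    # Return the indices of every trigger key covered by a current touch
    # point; a result such as {0, 1, 2} means three keys triggered at once.
    pressed = set()
    for (tx, ty) in touch_points:
        for index, (kx, ky) in enumerate(key_centers):
            if (tx - kx) ** 2 + (ty - ky) ** 2 <= key_radius ** 2:
                pressed.add(index)
    return pressed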
Therefore, the trigger keys are arranged on the display interface in an arc shape, and the ergonomics is better met, so that the control efficiency can be improved when a user uses the terminal to play games.
Based on the same inventive concept, the embodiment of the application also provides a game control device. As shown in fig. 14, the apparatus includes:
a playing module 1401 for playing multimedia data in response to a playing request for a playing tune;
the first display module 1402 is configured to display at least three trigger keys on a display interface, where the at least three trigger keys are arranged in an arc shape;
the first executing module 1403 is configured to respond to an interaction instruction for the at least one trigger key, and execute an interaction operation corresponding to the interaction instruction, where the interaction instruction is used for interacting with multimedia data.
In one embodiment, the first display module 1402 is specifically configured to display a first trigger area forming an arc-shaped strip area on the display interface and to display the at least three trigger keys in an arc-shaped arrangement within the first trigger area.
In one embodiment, the display interface is a rectangular display interface including four inner angles, and two ends of the first trigger area respectively extend to two adjacent edges of the display interface and form a fan-shaped area with the first inner angle.
In one embodiment, the apparatus further comprises:
the second display module is used for displaying a second trigger area in an area outside the first trigger area, the display positions of the second trigger area and the first trigger area satisfying the following condition: a set isolation area is arranged between the second trigger area and the end of the first trigger area that is closer to it;
and the second execution module is used for responding to the pressing instruction aiming at the second trigger area and executing the operation corresponding to the pressing instruction.
In an embodiment, the second execution module is specifically configured to trigger the pressing instruction if it is detected that the pressing pressure in the second trigger area is greater than a preset pressure.
In one embodiment, the multimedia data includes audio data, the apparatus further comprising:
the control module is used for controlling at least one note symbol to move to the first trigger area along a set note track along with the audio data, and the note track of each note symbol corresponds to one trigger key;
the first execution module 1403 includes at least one of:
the first execution unit is used for executing the interaction operation corresponding to the interaction instruction matched with the triggering operation when the position relation between each note symbol and the first triggering area reaches the set condition and the triggering key corresponding to each note symbol is triggered;
and the second execution unit is used for executing the interactive operation corresponding to the interactive instruction matched with the triggering operation when the position relation between each note symbol and the first triggering area reaches the set condition, the triggering key corresponding to each note track is triggered and the pressing instruction is effective.
In one embodiment, the apparatus further comprises:
the display device comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for responding to the setting operation of a display mode input on a display interface and acquiring the set display mode indication information, the display mode comprises a first mode and a second mode, when the display mode is in the first mode, a first trigger area is displayed at the display position where a fan-shaped area is formed with a first internal angle, when the display mode is in the second mode, the first trigger area is displayed at the display position where the fan-shaped area is formed with a second internal angle, and the first internal angle and the second internal angle are two adjacent internal angles on the display interface;
and the determining module is used for determining the display position of the first trigger area according to the set display mode indication information when the first trigger area and the second trigger area are displayed on the display interface, and determining the display position of the second trigger area according to the display position of the first trigger area.
In one embodiment, the interactive operation includes at least one of sound effects and animation indicating whether the interactive instruction is successful;
the response to the interaction instruction aiming at the at least one triggering key comprises the interaction instruction of simultaneously triggering at least three triggering keys.
Based on the same technical concept, the present application further provides a terminal device 1500, referring to fig. 15, where the terminal device 1500 is configured to implement the methods described in the above various method embodiments, for example, implement the embodiment shown in fig. 8, and the terminal device 1500 may include a memory 1501, a processor 1502, an input unit 1503, and a display panel 1504.
A memory 1501 for storing computer programs executed by the processor 1502. The memory 1501 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the terminal apparatus 1500, and the like. The processor 1502 may be a Central Processing Unit (CPU), a digital processing unit, or the like. The input unit 1503 may be used to obtain a user instruction input by a user. The display panel 1504 is configured to display information input by a user or information provided to the user, and in this embodiment of the present application, the display panel 1504 is mainly used to display a display interface of each application program in the terminal device and a control entity displayed in each display interface. Alternatively, the display panel 1504 may be configured by a Liquid Crystal Display (LCD) or an organic light-emitting diode (OLED) or the like.
The embodiment of the present application does not limit a specific connection medium among the memory 1501, the processor 1502, the input unit 1503, and the display panel 1504. In the embodiment of the present invention, the memory 1501, the processor 1502, the input unit 1503, and the display panel 1504 are connected by a bus 1505 in fig. 15, the bus 1505 is shown by a thick line in fig. 15, and the connection manner between other components is merely schematic illustration and is not limited thereto. The bus 1505 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 15, but this is not intended to represent only one bus or type of bus.
The memory 1501 may be a volatile memory (volatile memory), such as a random-access memory (RAM); the memory 1501 may also be a non-volatile memory (non-volatile) such as, but not limited to, a read-only memory (rom), a flash memory (flash memory), a hard disk (HDD) or a solid-state drive (SSD), or the memory 1501 may be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 1501 may be a combination of the above memories.
The processor 1502 is configured to implement the embodiment shown in fig. 8 by invoking the computer program stored in the memory 1501.
The embodiment of the present application further provides a computer-readable storage medium storing the computer-executable instructions required by the above processor, including the program to be executed by the processor.
In some possible embodiments, aspects of a game control method provided by the present application may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps of a game control method according to various exemplary embodiments of the present application described above in this specification when the program product is run on the terminal device. For example, the terminal device may perform the embodiment as shown in fig. 3.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A game control program product for embodiments of the present application may employ a portable compact disk read-only memory (CD-ROM) and include program code, and may be run on a computing device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device over any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, over the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functions of two or more units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable game control device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable game control device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable game control apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable game control apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (15)

1. A game control method, the method comprising:
playing multimedia data in response to a performance request for a song to be performed;
displaying at least three trigger keys on a display interface, wherein the at least three trigger keys are arranged in an arc shape;
and responding to an interaction instruction for at least one trigger key, and executing an interaction operation corresponding to the interaction instruction, wherein the interaction instruction is used for interacting with the multimedia data.
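
By way of illustration only and not forming part of the claims, the following Kotlin sketch shows one possible reading of the flow in claim 1: playing media on a play request, laying out at least three trigger keys along an arc, and executing an interaction operation. All class names, coordinates, and the key count are assumptions, not the actual implementation.

    import kotlin.math.cos
    import kotlin.math.sin

    // A trigger key rendered somewhere on the display interface.
    data class TriggerKey(val id: Int, val x: Double, val y: Double)

    // Place `count` (>= 3) keys along a circular arc centred at (cx, cy).
    fun layoutKeysOnArc(count: Int, cx: Double, cy: Double, radius: Double,
                        startDeg: Double, endDeg: Double): List<TriggerKey> {
        require(count >= 3) { "claim 1 requires at least three trigger keys" }
        return (0 until count).map { i ->
            val angle = Math.toRadians(startDeg + (endDeg - startDeg) * i / (count - 1))
            TriggerKey(i, cx + radius * cos(angle), cy + radius * sin(angle))
        }
    }

    fun onPlayRequest(track: String): List<TriggerKey> {
        println("playing multimedia data: $track")          // play the requested media
        return layoutKeysOnArc(5, cx = 1920.0, cy = 1080.0, // arc anchored near one corner (assumed screen size)
                               radius = 400.0, startDeg = 180.0, endDeg = 270.0)
    }

    // Execute the interaction operation bound to an interaction instruction on one key.
    fun onInteraction(key: TriggerKey) = println("interaction operation on trigger key ${key.id}")

    fun main() {
        val keys = onPlayRequest("demo-song")
        onInteraction(keys.first())
    }
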
2. The method of claim 1, wherein displaying at least three trigger keys on a display interface comprises:
displaying, on the display interface, a first trigger area forming an arc-shaped strip area, and displaying the at least three trigger keys in the first trigger area in an arc-shaped arrangement.
3. The method according to claim 2, wherein the display interface is a rectangular display interface including four inner corners, and two ends of the first trigger area respectively extend to two adjacent edges of the display interface and form a fan-shaped area with the first inner corner.
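
One way to read the geometry of claim 3 is that the first trigger area is an annular band around one inner corner of a rectangular screen, spanning the quarter circle between the two adjacent edges. The hit test below is an illustrative Kotlin sketch under that assumption; the corner position and radii are placeholder values, not values from the application.

    import kotlin.math.hypot

    // Is the touch point inside the arc-shaped strip (annular sector) centred on
    // the first inner corner (cornerX, cornerY) of the rectangular display?
    // Because the band's two ends reach the two screen edges that meet at the
    // corner, the angular span is the full quarter circle, so only the radial
    // distance needs to be checked for an on-screen touch.
    fun inFirstTriggerArea(touchX: Double, touchY: Double,
                           cornerX: Double, cornerY: Double,
                           innerRadius: Double, outerRadius: Double): Boolean {
        val d = hypot(touchX - cornerX, touchY - cornerY)
        return d in innerRadius..outerRadius
    }

    fun main() {
        // Corner assumed at the bottom-right of a 2400x1080 landscape screen.
        println(inFirstTriggerArea(2100.0, 900.0, 2400.0, 1080.0, 250.0, 450.0)) // true
        println(inFirstTriggerArea(1200.0, 540.0, 2400.0, 1080.0, 250.0, 450.0)) // false
    }
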
4. The method of claim 3, further comprising:
displaying a second trigger area in an area outside the first trigger area, wherein the display position of the second trigger area and the display position of the first trigger area satisfy the following condition: the end of the second trigger area closer to the first trigger area is separated from the first trigger area by a set isolation area;
and responding to a pressing instruction aiming at the second trigger area, and executing an operation corresponding to the pressing instruction.
5. The method according to claim 4, wherein the performing, in response to the pressing instruction for the second trigger area, an operation corresponding to the pressing instruction comprises:
in response to a pressing operation on the second trigger area, if the detected pressing pressure is greater than a preset pressure, generating the pressing instruction, and executing the operation corresponding to the pressing instruction.
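
An illustrative, non-limiting sketch of the pressure check in claim 5: a pressing instruction is generated only when the detected pressure on the second trigger area exceeds a preset threshold. The threshold value and function names are assumptions.

    // Preset pressure threshold for the second trigger area (placeholder value).
    const val PRESET_PRESSURE = 0.6

    // Called with the normalized pressure reported for a touch inside the second
    // trigger area; returns true when a pressing instruction was generated.
    fun onSecondAreaPressure(pressure: Double, execute: () -> Unit): Boolean {
        if (pressure <= PRESET_PRESSURE) return false   // below threshold: no instruction
        execute()                                       // run the operation bound to the pressing instruction
        return true
    }

    fun main() {
        onSecondAreaPressure(0.8) { println("pressing instruction executed") }  // fires
        onSecondAreaPressure(0.3) { println("never printed") }                  // ignored
    }
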
6. The method of claim 4, wherein the multimedia data comprises audio data, the method further comprising:
in accordance with the audio data, controlling at least one note symbol to move to the first trigger area along a set note track, wherein the note track of each note symbol corresponds to one trigger key;
the responding to the interaction instruction for the at least one trigger key and executing the interaction operation corresponding to the interaction instruction comprises at least one of the following:
when the positional relationship between each note symbol and the first trigger area reaches a set condition and the trigger key corresponding to each note symbol is triggered, executing the interaction operation corresponding to the interaction instruction matching the trigger operation;
and when the positional relationship between each note symbol and the first trigger area reaches a set condition, the trigger key corresponding to each note track is triggered, and the pressing instruction is in effect, executing the interaction operation corresponding to the interaction instruction matching the trigger operation.
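
A minimal, non-limiting sketch of the hit judgment described in claim 6, assuming the "set condition" is the note symbol's distance to the first trigger area falling inside a timing window; the window size, field names, and the optional pressing requirement are illustrative assumptions.

    // A note symbol travelling along its track toward the first trigger area.
    data class NoteSymbol(val trackKeyId: Int, val distanceToArea: Double)

    const val HIT_WINDOW = 40.0   // "set condition": note is close enough to the area (assumption)

    // Returns the interaction to perform, or null when the input does not count.
    fun judge(note: NoteSymbol, triggeredKeyId: Int?, pressActive: Boolean,
              requirePress: Boolean): String? {
        val positionOk = note.distanceToArea <= HIT_WINDOW
        val keyOk = triggeredKeyId == note.trackKeyId
        val pressOk = !requirePress || pressActive
        return if (positionOk && keyOk && pressOk) "hit on track ${note.trackKeyId}" else null
    }

    fun main() {
        val note = NoteSymbol(trackKeyId = 2, distanceToArea = 25.0)
        println(judge(note, triggeredKeyId = 2, pressActive = false, requirePress = false)) // hit
        println(judge(note, triggeredKeyId = 2, pressActive = false, requirePress = true))  // null: press needed
    }
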
7. The method of claim 4, further comprising:
obtaining set display mode indication information in response to a setting operation of a display mode input on the display interface, wherein the display mode comprises a first mode and a second mode; when the display mode is the first mode, the first trigger area is displayed at a display position forming a fan-shaped area with the first inner angle; when the display mode is the second mode, the first trigger area is displayed at a display position forming a fan-shaped area with the second inner angle; and the first inner angle and the second inner angle are two adjacent inner angles of the display interface;
when the first trigger area and the second trigger area are displayed on the display interface, the display position of the first trigger area is determined according to the set display mode indication information, and the display position of the second trigger area is determined according to the display position of the first trigger area.
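
Claim 7 can be read as a handedness setting: the first mode anchors the fan-shaped first trigger area at one inner angle (for example, the bottom-right corner) and the second mode at the adjacent inner angle (for example, bottom-left), with the second trigger area then placed relative to it. The Kotlin sketch below is illustrative only; the screen size, corner coordinates, and isolation gap are assumptions.

    enum class DisplayMode { FIRST, SECOND }

    data class Point(val x: Double, val y: Double)

    // Anchor corner of the fan-shaped first trigger area on an assumed 2400x1080 screen:
    // FIRST mode uses the bottom-right inner angle, SECOND the adjacent bottom-left one.
    fun firstAreaAnchor(mode: DisplayMode) = when (mode) {
        DisplayMode.FIRST  -> Point(2400.0, 1080.0)
        DisplayMode.SECOND -> Point(0.0, 1080.0)
    }

    // The second trigger area sits outside the first one, offset toward the
    // opposite side of the screen and separated by an isolation gap (assumption).
    fun secondAreaAnchor(first: Point, gap: Double = 60.0): Point {
        val dx = if (first.x == 0.0) 500.0 + gap else -(500.0 + gap)
        return Point(first.x + dx, first.y)
    }

    fun main() {
        val first = firstAreaAnchor(DisplayMode.SECOND)
        println(first)                    // Point(x=0.0, y=1080.0)
        println(secondAreaAnchor(first))  // placed beyond the isolation gap
    }
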
8. The method according to any one of claims 1 to 7, wherein the interaction operation comprises at least one of a sound effect and an animation indicating whether the interaction instruction succeeded; and
the interaction instruction for the at least one trigger key comprises an interaction instruction for triggering the at least three trigger keys simultaneously.
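
A small, non-limiting sketch of the simultaneous-trigger case in claim 8, assuming that "simultaneously" means all required key-down events fall within a short time window; the window length is a placeholder, not a value from the application.

    const val SIMULTANEOUS_WINDOW_MS = 50L   // assumption: what counts as "simultaneous"

    // Key-down timestamps in milliseconds, one entry per trigger key id.
    fun isSimultaneousTrigger(downTimes: Map<Int, Long>, requiredKeys: Set<Int>): Boolean {
        if (requiredKeys.isEmpty()) return false
        val times = requiredKeys.map { downTimes[it] ?: return false }  // a missing key means no simultaneous trigger
        return times.maxOrNull()!! - times.minOrNull()!! <= SIMULTANEOUS_WINDOW_MS
    }

    fun main() {
        val downs = mapOf(0 to 1000L, 1 to 1020L, 2 to 1035L)
        println(isSimultaneousTrigger(downs, setOf(0, 1, 2)))  // true: all within 50 ms
        println(isSimultaneousTrigger(downs, setOf(0, 1, 3)))  // false: key 3 never pressed
    }
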
9. A game control apparatus, characterized in that the apparatus comprises:
the playing module is used for responding to a play request for music gameplay and playing the multimedia data;
the first display module is used for displaying at least three trigger keys on a display interface, and the at least three trigger keys are arranged in an arc shape;
the first execution module is used for responding to an interaction instruction for at least one trigger key and executing the interaction operation corresponding to the interaction instruction, wherein the interaction instruction is used for interacting with the multimedia data.
10. The device of claim 9, wherein the first display module is used for displaying, on the display interface, a first trigger area forming an arc-shaped strip area, and displaying the at least three trigger keys in the first trigger area in an arc-shaped arrangement.
11. The device according to claim 10, wherein the display interface is a rectangular display interface including four inner corners, and two ends of the first trigger area respectively extend to two adjacent edges of the display interface and form a fan-shaped area with the first inner corner.
12. The apparatus of claim 11, further comprising:
the second display module is used for displaying a second trigger area in an area outside the first trigger area, wherein the display position of the second trigger area and the display position of the first trigger area satisfy the following condition: the end of the second trigger area closer to the first trigger area is separated from the first trigger area by a set isolation area;
and the second execution module is used for responding to the pressing instruction aiming at the second trigger area and executing the operation corresponding to the pressing instruction.
13. The apparatus of claim 10, wherein the multimedia data comprises audio data, the apparatus further comprising:
the control module is used for controlling, in accordance with the audio data, at least one note symbol to move to the first trigger area along a set note track, wherein the note track of each note symbol corresponds to one trigger key;
the first execution module includes at least one of:
the first execution unit is used for executing the interaction operation corresponding to the interaction instruction matching the trigger operation when the positional relationship between each note symbol and the first trigger area reaches the set condition and the trigger key corresponding to each note symbol is triggered;
and the second execution unit is used for executing the interaction operation corresponding to the interaction instruction matching the trigger operation when the positional relationship between each note symbol and the first trigger area reaches a set condition, the trigger key corresponding to each note track is triggered, and the pressing instruction is in effect.
14. A computer-readable medium having stored thereon computer-executable instructions for performing the method of any one of claims 1-8.
15. A computing device, comprising:
at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
CN202010041039.3A 2020-01-15 2020-01-15 Game control method, device and storage medium Active CN111214825B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010041039.3A CN111214825B (en) 2020-01-15 2020-01-15 Game control method, device and storage medium

Publications (2)

Publication Number Publication Date
CN111214825A true CN111214825A (en) 2020-06-02
CN111214825B (en) 2021-08-17

Family

ID=70826083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010041039.3A Active CN111214825B (en) 2020-01-15 2020-01-15 Game control method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111214825B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112698757A (en) * 2020-12-25 2021-04-23 北京小米移动软件有限公司 Interface interaction method and device, terminal equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105813700A (en) * 2013-12-11 2016-07-27 科乐美数码娱乐株式会社 Game program, game system, and game method
JP5962726B2 (en) * 2014-09-16 2016-08-03 株式会社Cygames GAME DEVICE AND PROGRAM
CN106362393A (en) * 2015-07-20 2017-02-01 新游网络科技有限公司 Method, apparatus, and recording medium for controlling game
CN107008008A (en) * 2015-12-25 2017-08-04 株式会社万代南梦宫娱乐 Server

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
无: "《音乐游戏别踩白块儿,新手赛单手9.164》", 《HTTPS://V.QQ.COM/X/PAGE/E0517ZIM4S2.HTML?FROMVSOGOU=1》 *
晓峰BOSS: "劲舞时代P Love Perfeet*N劲舞团十年重逢再遇绝版梦蝶翅膀+[友谊]紫色猫咪喵喵", 《HTTPS://V.YOUKU.COM/V_SHOW/ID_XMJC4MTI3MDI5NG==》 *

Also Published As

Publication number Publication date
CN111214825B (en) 2021-08-17

Similar Documents

Publication Publication Date Title
KR101052393B1 (en) Techniques for Interactive Input to Portable Electronic Devices
KR101319264B1 (en) Method for providing UI according to multi touch pressure and electronic device using the same
KR101657963B1 (en) Operation Method of Device based on a alteration ratio of touch area And Apparatus using the same
US20170046121A1 (en) Method and apparatus for providing user interface in an electronic device
KR20210062640A (en) Techniques for implementing graphic overlays for streaming games based on current game scenarios
TW200949666A (en) Accessing a menu utilizing a drag-operation
JP2008200295A (en) Game device, program and information storage medium
CN108369456A (en) Touch feedback for touch input device
US9530399B2 (en) Electronic device for providing information to user
JP2012053532A (en) Information processing apparatus and method, and program
US10496199B2 (en) Device and method for controlling playback of digital multimedia data as well as a corresponding computer-readable storage medium and a corresponding computer program
WO2016158219A1 (en) Game device and game program
WO2016158213A1 (en) Game device and game program
US11896900B2 (en) Game console application with action card strand
JP2017138738A (en) Input device, display device, and method for controlling input device
CN111214825B (en) Game control method, device and storage medium
US20120223891A1 (en) Electronic percussion gestures for touchscreens
JP2023517283A (en) Methods for caching and presenting interactive menus for heterogeneous applications
CN113198179A (en) Virtual object steering control method and device, storage medium and electronic equipment
US20140143303A1 (en) Information processing system, information processing device, information processing method, and storage medium having stored therein computer program
JP2016193052A (en) Game device and game program
JP2016179152A (en) Game device and program
CN103853414B (en) Information processing method, electronic equipment and touch control input equipment
US9741326B2 (en) Electronic device supporting music playing function and method for controlling the electronic device
JP2020049337A (en) Game device and game program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40024150

Country of ref document: HK

GR01 Patent grant