WO2022199082A1 - Game interface interaction method, system, and computer-readable storage medium - Google Patents

Game interface interaction method, system, and computer-readable storage medium

Info

Publication number
WO2022199082A1
WO2022199082A1 (application PCT/CN2021/132258)
Authority
WO
WIPO (PCT)
Prior art keywords
game
audio
operation instruction
display interface
volume
Prior art date
Application number
PCT/CN2021/132258
Other languages
English (en)
French (fr)
Inventor
姜程瀚
薛乔
孙凯男
郭晓畅
Original Assignee
上海莉莉丝计算机技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海莉莉丝计算机技术有限公司 (Shanghai Lilith Computer Technology Co., Ltd.)
Priority to US18/283,382 priority Critical patent/US20240211128A1/en
Priority to CN202180077409.1A priority patent/CN116940921A/zh
Publication of WO2022199082A1 publication Critical patent/WO2022199082A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to the field of game control, and in particular, to a game interface interaction method, system and computer-readable storage medium.
  • switching by clicking a button is too simple an operation mode, and during switching the transition of the background music is not smooth, giving the user a poor game experience.
  • the purpose of the present invention is to provide a game interface interaction method, system, and computer-readable storage medium in which the background music follows the game scene seamlessly and gives users a better visual and auditory experience.
  • the invention discloses a game interface interaction method, comprising the following steps:
  • each game interface picture includes at least two game scene units
  • a display interface is configured, the display interface corresponds to the display screen of the smart terminal, and when the display screen receives a sliding operation, the display interface moves laterally;
  • the control module of the smart terminal forms an audio control command based on the distance vector to change the audio parameters of each game audio.
  • the vertical calibration base of each game scene unit is defined as the starting point
  • the step of calculating the distance vector between the display interface and each starting point includes:
  • the central axis of each game scene unit is defined as the vertical calibration basis, and the central axis of the display interface is defined as the vertical reference basis;
  • the step of the control module of the smart terminal forming an audio control instruction based on the distance vector to change the audio parameters of each game audio includes:
  • the control module forms an audio control instruction including volume control information based on the first ratio and the second ratio of the first distance scalar and the second distance scalar to the distance between two adjacent vertical calibration bases, wherein the volume of each game audio is adjusted separately based on the volume control information.
  • the step of separately adjusting the volume of each game audio comprises:
  • the control module obtains the current volume of the smart terminal and changes the game audio volume based on:
  • volume control information = (1 - first ratio) * 100% * current volume
  • volume control information = (1 - second ratio) * 100% * current volume
  • the vertical calibration base of each game scene unit is defined as the starting point, and the step of calculating the distance vector between the display interface and each starting point further includes:
  • the steps of separately adjusting the volume of each game audio include:
  • the control module obtains the current volume of the smart terminal and changes the volume of each game audio on different channels based on:
  • volume control information in the first direction = (1 - first ratio) * 100% * current volume
  • the step of changing the audio parameters of each game audio comprises:
  • the interactive method also includes the following steps:
  • a sliding threshold and an audio adjustment rate threshold are set in the game application.
  • the control module changes the audio parameters of each game audio based on the audio adjustment rate threshold.
  • the steps of selecting any operation instruction in the operation instruction group and applying the operation instruction to the game object include:
  • the operation command corresponding to the determined operation command interval is applied to the first game object in the weight ranking.
  • the invention also discloses an interactive system for a game interface, comprising:
  • a storage module which stores at least one game interface picture in the intelligent terminal running the game application program, and each game interface picture includes at least two game scene units;
  • the configuration module configures a display interface, the display interface corresponds to the display screen of the smart terminal, and when the display screen receives the sliding operation, the display interface moves laterally;
  • the calculation module defines the vertical calibration base of each game scene unit as the starting point, and calculates the distance vector between the display interface and each starting point;
  • an audio module running and playing at least two game audios in the game application, wherein each game audio corresponds to a game scene unit;
  • the control module, when the display interface moves laterally within the game interface picture, forms an audio control instruction based on the distance vector to change the audio parameters of each game audio.
  • it also includes:
  • an acquisition module to acquire a game object in any game scene unit, and an operation instruction group of the operation object, wherein the operation instruction group includes at least one operation instruction;
  • the execution module selects any operation instruction in the operation instruction group, and applies the operation instruction to the game object, wherein the execution module includes:
  • the statistical unit, based on the remaining life value of the game object and the weights of the operation instructions, forms a line segment with a total length of l in which each operation instruction interval has a length of i_n;
  • the execution unit applies an operation instruction corresponding to the determined operation instruction interval to the first game object in the weighting order based on the weight ordering of the game objects.
  • the present invention further discloses a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the above steps are implemented.
  • FIG. 1 is a schematic flowchart of an interaction method for a game interface in accordance with a preferred embodiment of the present invention
  • FIG. 2 is a schematic diagram of switching between game scene units in accordance with a preferred embodiment of the present invention.
  • terms such as first, second, third, etc. may be used in this disclosure to describe various pieces of information, but such information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another.
  • the first information may also be referred to as the second information, and similarly, the second information may also be referred to as the first information, without departing from the scope of the present disclosure.
  • word "if” as used herein can be interpreted as "at the time of” or "when” or "in response to determining.”
  • FIG. 1 it is a schematic flowchart of an interaction method for a game interface in accordance with a preferred embodiment of the present invention.
  • the interaction method for a game interface includes the following steps:
  • S100 Store at least one game interface picture in the smart terminal running the game application, and each game interface picture includes at least two game scene units;
  • when running, the game application will display the game's interface to the user.
  • the main interface and main operation interaction displayed after entering the game will be presented in a game interface picture as a game scene unit.
  • the entire game interface is not displayed to the user as a whole, and includes at least two game scene units.
  • S200 Configure a display interface, the display interface corresponds to the display screen of the smart terminal, and when the display screen receives a sliding operation, the display interface moves laterally;
  • a display interface is configured, and the display interface can be regarded as the display screen of the smart terminal; that is, what the user sees is exactly the content within the display interface.
  • the size of the display interface is smaller than the game interface picture. If the user needs to switch the content displayed on the display screen, the user can slide on the display screen: for example, up and down when the smart terminal is held vertically, and left and right when it is held horizontally.
  • the display interface slides over the game interface picture, so that the portion of the game interface picture framed by the display interface is the portion displayed to the user; the displayed game content therefore contains no interface transition, giving users a seamless interactive experience.
  • S300 Define the vertical calibration base of each game scene unit as the starting point, and calculate the distance vector between the display interface and each starting point;
  • each game scene unit is defined with a vertical calibration basis as a starting point, and the position of the game scene unit is defined in a manner similar to a "mass point"; that is, the position of each game scene unit is determined first. Then the distance vector between the display interface and the starting point is calculated at any moment. It can be understood that the distance vector refers to the distance between the display interface and the vertical calibration basis of any game scene unit (it can be calculated whether or not that game scene unit is shown in the display interface), together with the direction of the display interface relative to the game scene unit, so as to spatially reflect the relationship between the display interface and each game scene unit.
  • on the other hand, the audio being played is acquired, and the played audio is associated with the sliding of the display interface.
  • the game application program runs and plays at least two game audios, each of which corresponds to a game scene unit. That is, when the display interface completely displays a certain game scene unit, the game audio corresponding to the game scene unit will be played. Therefore, there is a one-to-one correspondence between game scene units and game audio.
  • the control module in the smart terminal forms an audio control command based on the distance vector calculated above, and the audio control command controls the audio parameters of the game audio. For example, when 50% of the display interface displays the first game scene unit and the remaining 50% displays the second game scene unit, the game audio corresponding to the first game scene unit will be played, and the game audio corresponding to the second game scene unit will also be played.
  • the game audio corresponding to the first game scene unit will be played, and the game audio corresponding to the second game scene unit will also be played.
  • the volume of the game audio corresponding to the first game scene unit is lower, and the volume of the game audio corresponding to the second game scene unit is higher, so that the content displayed on the display interface matches the playback mode of the game audio.
  • the user can receive the sliding effect of the display interface visually and audibly.
  • the seamless switching method saves the user's waiting time and improves the game experience.
  • the perfect fit between picture and sound gives the user a more immersive feeling.
  • step S300 specifically includes:
  • S310 Define the central axis of each game scene unit as the vertical calibration basis, and define the central axis of the display interface as the vertical reference basis;
  • each game scene unit is rectangular so as to match the display screen of the smart terminal. The central axis of the rectangular game scene unit is therefore used as the vertical calibration basis; that is, all distances computed from the other game scene units or from the display interface refer to this central axis.
  • the central axis of the display interface is likewise defined as its vertical reference basis, which reduces the two rectangular display regions to a measurement between "lines".
  • S320 Calculate the first distance scalar and the second distance scalar between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
  • the first distance scalar and the second distance scalar between the vertical reference basis of the display interface and the two adjacent vertical calibration bases are calculated. More specifically, it can be understood that the distance between the vertical calibration bases of two adjacent game scene units is fixed, i.e., the sum of the first distance scalar and the second distance scalar is fixed; therefore, when calculating the first distance scalar or the second distance scalar, the already computed one can be subtracted from this constant sum.
  • the specific values of the first distance scalar and the second distance scalar represent the distances between the vertical reference basis and the two adjacent vertical calibration bases. Since the vertical calibration bases and the vertical reference basis are parallel or nearly parallel lines, the first distance scalar and the second distance scalar are the distances between these parallel lines.
  • when measuring, any line perpendicular to both the vertical calibration bases and the vertical reference basis can be chosen, and the distance between the resulting intersection points is the first distance scalar or the second distance scalar.
  • step S500 includes:
  • the control module forms an audio control instruction including volume control information based on the first ratio and the second ratio of the first distance scalar and the second distance scalar to the distance between two adjacent vertical calibration bases, wherein the volume of each game audio is adjusted separately based on the volume control information.
  • after the values of the first distance scalar and the second distance scalar are available, the control module compares each of them with the distance between the two adjacent vertical calibration bases; the resulting first ratio and second ratio indicate the proportions of the two adjacent game scene units currently shown in the display interface. To coordinate the audio control, and in particular to give the user the immersive sense that "the more is displayed, the louder the volume", the resulting audio control instruction includes the volume control information, and the volume of the different game audio is adjusted accordingly as the display interface slides.
  • the step S510 of respectively adjusting the volume of each game audio includes:
  • the control module obtains the current volume of the smart terminal based on:
  • volume control information = (1 - first ratio) * 100% * current volume
  • volume control information = (1 - second ratio) * 100% * current volume
  • the volume currently set by the user on the smart terminal is obtained and used as the upper volume limit to calculate the volume of each of the two game audio tracks, so that the volume control information is consistent with the proportions of the content shown in the display interface.
  • the user can hear the volume of one game audio track gradually decrease while the volume of the other gradually increases, which corresponds to the visual perception that the content of one game scene unit appears less and less while the content of another game scene unit appears more and more.
  • the vertical calibration base of each game scene unit is defined as the starting point
  • the step S300 of calculating the distance vector between the display interface and each starting point also includes:
  • S330 Calculate the first direction and the second direction between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
  • the first direction and the second direction between the vertical reference basis and the two adjacent vertical calibration bases can be obtained.
  • they can be obtained from the intersections of a line perpendicular to both the vertical calibration bases and the vertical reference basis with those bases.
  • taking the intersection with the vertical reference basis as the start point and the intersection with a vertical calibration basis as the end point defines a vector; the directions of these vectors are the first direction and the second direction.
  • the step S510 of respectively adjusting the volume of each game audio includes:
  • the control module obtains the current volume of the smart terminal and changes the volume of each game audio on different channels based on:
  • volume control information in the first direction = (1 - first ratio) * 100% * current volume
  • the volume on each channel is adjusted differently according to the direction.
  • the first direction or the second direction corresponds to the left channel or the right channel.
  • each channel plays one game audio, so that under two-channel playback the game audio on the left and right channels differs.
  • the volume of the game audio on one channel gradually increases while, conversely, the other gradually decreases.
  • alternatively, both the left and right channels play both game audio tracks, but the same game audio is played at different levels on different channels.
  • for example, if the volume of the first game audio on the left channel is 70% of the current volume, the first game audio on the right channel is 30% of the current volume (the two sum to 100% of full volume); conversely, if the volume of the second game audio on the left channel is 30% of the current volume, the second game audio on the right channel is 70% of the current volume (within each channel, full volume is allocated between the different game audio tracks). As the display interface slides, these ratios change continuously, giving users an immersive feeling.
  • step S500 further includes:
  • S520 Change one or more of the volume, frequency band, phase or reverberation of each of the game audios.
  • the interactive method also includes the following steps:
  • the sliding threshold and the audio adjustment rate threshold are set in the game application.
  • when it is detected that the lateral movement speed of the display interface is greater than the sliding threshold, the sliding action itself is not restricted, but changes to the audio parameters, such as the rate of volume adjustment and the rate of phase adjustment, are limited to the audio adjustment rate threshold to avoid a poor user experience.
  • if the sliding speed is below the sliding threshold, the control logic in which both change at the same rate can still be used.
  • the interaction method further includes the following steps:
  • S700 Acquire a game object in any game scene unit, and an operation instruction group of the operation object, wherein the operation instruction group includes at least one operation instruction;
  • one of the skills is selected first and then applied to the opponent's game object. Which operation instruction is selected, and which game object it is applied to, is further determined by the following steps:
  • each operation instruction corresponds to a line segment of length i_n. It can be understood that the higher the weight, the longer the segment.
  • S820 Randomly select a point on the line segment; the interval into which the point falls determines the operation instruction interval.
  • to achieve randomness, or, in the game application, to achieve a balance between automatic operation and manual operation (if automatic operation always made the best choice, its results would be far better than those of manual operation), an operation instruction is chosen at random.
  • randomness is realized by randomly selecting a point on the segment of total length l; whichever instruction interval the point falls into determines the operation instruction. Up to this point, the skill to perform is determined first (very different from the prior-art approach of first determining the target of the operation).
  • the operation instruction is applied to the first game object.
  • with this operation logic, which has some intelligence but limited intelligence and may even make mistakes, manual operation can be simulated as closely as possible.
  • the invention also discloses an interactive system for a game interface, comprising:
  • the storage module stores at least one game interface picture in the smart terminal running the game application, and each game interface picture includes at least two game scene units;
  • the configuration module configures a display interface, the display interface corresponds to the display screen of the smart terminal, and when the display screen receives the sliding operation, the display interface moves laterally;
  • the calculation module defines the vertical calibration basis of each game scene unit as the starting point, and calculates the distance vector between the display interface and each starting point;
  • the audio module runs and plays at least two game audio tracks within the game application, wherein each game audio corresponds to a game scene unit;
  • the control module, when the display interface moves laterally within the game interface picture, forms audio control instructions based on the distance vector to change the audio parameters of each game audio.
  • it also includes: an acquisition module, which acquires a game object in any game scene unit and an operation instruction group of the operation object, wherein the operation instruction group includes at least one operation instruction; and an execution module, which selects any operation instruction in the operation instruction group and applies the operation instruction to the game object,
  • wherein the execution module includes: a statistical unit that, based on the remaining life value of the game object and the weights of the operation instructions, forms a line segment with a total length of l in which each operation instruction interval has a length of i_n;
  • the determination unit randomly selects a point on the line segment, the interval into which it falls being the determined operation instruction interval; the execution unit, based on the weight ranking of the game objects, applies the operation instruction corresponding to the determined operation instruction interval to the first game object in the weight ranking.
  • the present invention also discloses a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, the above steps are implemented.
  • Smart terminals can be implemented in various forms.
  • the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as stationary terminals such as digital TVs and desktop computers.
  • in the following, it is assumed that the terminal is a smart terminal.
  • apart from elements intended specifically for mobile purposes, the configuration according to the embodiments of the present invention can also be applied to stationary-type terminals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a game interface interaction method and system, and a computer-readable storage medium. The interaction method comprises the following steps: storing game interface pictures in a smart terminal running a game application, each game interface picture including at least two game scene units; configuring a display interface corresponding to the display screen of the smart terminal, the display interface moving laterally when the display screen receives a sliding operation; defining the vertical calibration basis of each game scene unit as a starting point and calculating the distance vector between the display interface and each starting point; running and playing at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit; and, when the display interface moves laterally within a game interface picture, the control module of the smart terminal forming an audio control instruction based on the distance vector to change the audio parameters of each game audio track. With this technical solution, the background music follows the game scene seamlessly, giving the user a better visual and auditory experience.

Description

Game interface interaction method, system, and computer-readable storage medium
Technical Field
The present invention relates to the field of game control, and in particular to a game interface interaction method, system, and computer-readable storage medium.
Background Art
With the rapid development of smart terminals and users' growing pursuit of recreation, many users rely on a smart terminal's hardware and its installed game applications for entertainment. When running a game, many games support switching among multiple scenes; switching is performed by clicking a button to refresh the interface, or by clicking a corresponding switching entry (presented as a button). In addition, different scenes have their own background music, and switching the scene also switches the corresponding background music.
In existing interaction schemes, switching by clicking a button is overly simplistic, and during switching the transition of the background music is not smooth, giving the user a poor gaming experience.
Therefore, a new game interface interaction method is needed that supports richer scene-switching operations and gives the user a better gaming experience.
Summary of the Invention
To overcome the above technical drawbacks, the purpose of the present invention is to provide a game interface interaction method, system, and computer-readable storage medium in which the background music follows the game scene seamlessly, giving the user a better visual and auditory experience.
The present invention discloses a game interface interaction method comprising the following steps:
storing at least one game interface picture in a smart terminal running a game application, each game interface picture including at least two game scene units;
configuring a display interface, the display interface corresponding to the display screen of the smart terminal, and the display interface moving laterally when the display screen receives a sliding operation;
defining the vertical calibration basis of each game scene unit as a starting point, and calculating the distance vector between the display interface and each starting point;
running and playing at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit;
when the display interface moves laterally within a game interface picture, the control module of the smart terminal forming an audio control instruction based on the distance vector to change the audio parameters of each game audio track.
Preferably, the step of defining the vertical calibration basis of each game scene unit as a starting point and calculating the distance vector between the display interface and each starting point includes:
defining the central axis of each game scene unit as the vertical calibration basis, and defining the central axis of the display interface as the vertical reference basis;
calculating a first distance scalar and a second distance scalar between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
and the step of the control module of the smart terminal forming an audio control instruction based on the distance vector to change the audio parameters of each game audio track includes:
the control module forming an audio control instruction that includes volume control information, based on a first ratio and a second ratio of the first distance scalar and the second distance scalar to the distance between the two adjacent vertical calibration bases, wherein the volume of each game audio track is adjusted separately based on the volume control information.
Preferably, the step of separately adjusting the volume of each game audio track based on the volume control information includes:
the control module obtaining the current volume of the smart terminal and changing the volume of the game audio based on:
volume control information = (1 - first ratio) * 100% * current volume;
volume control information = (1 - second ratio) * 100% * current volume.
Preferably, the step of defining the vertical calibration basis of each game scene unit as a starting point and calculating the distance vector between the display interface and each starting point further includes:
calculating a first direction and a second direction between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
and the step of separately adjusting the volume of each game audio track based on the volume control information includes:
the control module obtaining the current volume of the smart terminal, and changing the volume of each game audio track on different channels based on:
volume control information in the first direction = (1 - first ratio) * 100% * current volume; volume control information in the second direction = (1 - second ratio) * 100% * current volume.
Preferably, the step of changing the audio parameters of each game audio track includes:
changing one or more of the volume, frequency band, phase, or reverberation of each game audio track;
and the interaction method further includes the following step:
setting a sliding threshold and an audio adjustment rate threshold in the game application, and, when the lateral movement speed of the display interface exceeds the sliding threshold, the control module changing the audio parameters of each game audio track subject to the audio adjustment rate threshold.
Preferably, the method further includes the following steps:
acquiring the game objects in any game scene unit and an operation instruction group of the operation object, the operation instruction group including at least one operation instruction;
selecting any operation instruction from the operation instruction group and applying the operation instruction to a game object.
Preferably, the step of selecting any operation instruction from the operation instruction group and applying the operation instruction to a game object includes:
forming, based on the remaining health of the game objects and the weights of the operation instructions, a line segment of total length l in which each operation instruction interval has a length of i_n;
randomly selecting a point on the line segment, the interval into which the point falls being the determined operation instruction interval;
applying the operation instruction corresponding to the determined operation instruction interval to the first game object in a weight-based ranking of the game objects.
The present invention also discloses a game interface interaction system comprising:
a storage module that stores at least one game interface picture in a smart terminal running a game application, each game interface picture including at least two game scene units;
a configuration module that configures a display interface, the display interface corresponding to the display screen of the smart terminal, and the display interface moving laterally when the display screen receives a sliding operation;
a calculation module that defines the vertical calibration basis of each game scene unit as a starting point and calculates the distance vector between the display interface and each starting point;
an audio module that runs and plays at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit;
a control module that, when the display interface moves laterally within a game interface picture, forms an audio control instruction based on the distance vector to change the audio parameters of each game audio track.
Preferably, the system further includes:
an acquisition module that acquires the game objects in any game scene unit and an operation instruction group of the operation object, the operation instruction group including at least one operation instruction;
an execution module that selects any operation instruction from the operation instruction group and applies the operation instruction to a game object, wherein the execution module includes:
a statistics unit that, based on the remaining health of the game objects and the weights of the operation instructions, forms a line segment of total length l in which each operation instruction interval has a length of i_n;
a determination unit that randomly selects a point on the line segment, the interval into which the point falls being the determined operation instruction interval;
an execution unit that applies the operation instruction corresponding to the determined operation instruction interval to the first game object in a weight-based ranking of the game objects.
The present invention further discloses a computer-readable storage medium having a computer program stored thereon; when the computer program is executed by a processor, the steps described above are implemented.
Compared with the prior art, the above technical solution has the following beneficial effects:
1. Switching between interaction pictures feels seamless and is accompanied by matching music, giving the user an immersive experience;
2. During gameplay, the logic for automatically casting skills is more realistic.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of a game interface interaction method in accordance with a preferred embodiment of the present invention;
FIG. 2 is a schematic diagram of switching between game scene units in accordance with a preferred embodiment of the present invention.
Detailed Description of the Embodiments
The advantages of the present invention are further described below with reference to the accompanying drawings and specific embodiments.
Exemplary embodiments are described in detail here, and examples thereof are shown in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The terms used in the present disclosure are for the purpose of describing particular embodiments only and are not intended to limit the disclosure. The singular forms "a", "said", and "the" used in the present disclosure and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any or all possible combinations of one or more associated listed items.
It should be understood that although the terms first, second, third, and so on may be used in the present disclosure to describe various pieces of information, such information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, the first information may also be referred to as the second information, and similarly the second information may also be referred to as the first information. Depending on the context, the word "if" as used herein may be interpreted as "at the time of", "when", or "in response to determining".
In the description of the present invention, it should be understood that orientation or positional terms such as "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" indicate orientations or positional relationships based on those shown in the drawings; they are used only to facilitate and simplify the description of the invention, and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation, and therefore cannot be understood as limiting the present invention.
In the description of the present invention, unless otherwise specified and limited, it should be noted that the terms "mounted", "connected", and "coupled" are to be understood broadly; for example, a connection may be mechanical or electrical, may be internal communication between two elements, and may be direct or indirect through an intermediate medium. For those of ordinary skill in the art, the specific meanings of the above terms can be understood according to the specific circumstances.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are intended only to facilitate the description of the present invention and have no specific meaning in themselves. Therefore, "module" and "component" may be used interchangeably.
Referring to FIG. 1, which is a schematic flowchart of a game interface interaction method in accordance with a preferred embodiment of the present invention, in this embodiment the game interface interaction method includes the following steps:
S100: Store at least one game interface picture in a smart terminal running a game application, each game interface picture including at least two game scene units;
Take a smart terminal and run a game application on it. When running, the game application displays the game's interface to the user. To achieve smooth switching among the game's different interaction scenes, preferably, the main interface and main operation interactions shown after entering the game are presented within one game interface picture as game scene units. The entire game interface picture, however, is not shown to the user at once; it includes at least two game scene units, and when the user switches the interaction interface, they are switching between game scene units.
S200: Configure a display interface corresponding to the display screen of the smart terminal; when the display screen receives a sliding operation, the display interface moves laterally;
Within the game application, a display interface is configured. The display interface can be regarded as the display screen of the smart terminal; that is, what the user sees is exactly the content within the display interface. Referring to FIG. 2, in this embodiment the display interface is smaller than the game interface picture. If the user needs to change what is shown on the screen, the user can slide on the screen: for example, up and down when the smart terminal is held vertically, or left and right when it is held horizontally. Driven by this sliding operation, the display interface slides over the game interface picture, so that the portion of the game interface picture framed by the display interface is the portion shown to the user. The displayed game content therefore contains no interface transition, giving the user a seamless interaction experience.
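For illustration only (not part of the original disclosure), the following Python sketch shows one way the framed portion could be derived from a lateral scroll offset; the function name `framed_region` and the pixel figures are hypothetical.

```python
# Minimal sketch, assuming the game interface picture and the display interface
# are measured in pixels along the sliding axis: given the lateral offset of the
# display interface, return the span of the picture that the user actually sees.

def framed_region(picture_width: float, viewport_width: float, offset: float):
    """Return the [left, right) horizontal span of the picture shown on screen."""
    # Clamp the offset so the display interface never leaves the picture.
    offset = max(0.0, min(offset, picture_width - viewport_width))
    return offset, offset + viewport_width

# Example: a 3840-px picture holding two 1920-px game scene units, viewed through
# a 1920-px display interface that has been slid 600 px to the right.
left, right = framed_region(3840.0, 1920.0, 600.0)
print(left, right)  # 600.0 2520.0 -> parts of both scene units are visible
```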
S300: Define the vertical calibration basis of each game scene unit as a starting point, and calculate the distance vector between the display interface and each starting point;
To match the music output to the user during a scene transition, the exact position of the game interface picture framed by the display interface is identified. Specifically, each game scene unit is defined with a vertical calibration basis as a starting point, and the position of the game scene unit is defined in a manner similar to a "mass point"; that is, the position of each game scene unit is determined first. Then the distance vector between the display interface and each starting point is calculated at any moment. It can be understood that the distance vector refers to the distance between the display interface and the vertical calibration basis of any game scene unit (it can be calculated whether or not that game scene unit is currently shown in the display interface), together with the direction of the display interface relative to the game scene unit, so as to spatially reflect the relationship between the display interface and each game scene unit.
S400: Run and play at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit;
After the exact position of the display interface within the game interface picture has been determined, the audio being played is acquired on the other hand, and the played audio is associated with the sliding of the display interface. Specifically, at least two game audio tracks are played within the game application, each corresponding to a game scene unit. That is, when the display interface fully displays a certain game scene unit, the game audio corresponding to that game scene unit is played. There is therefore a one-to-one correspondence between game scene units and game audio tracks.
S500: When the display interface moves laterally within a game interface picture, the control module of the smart terminal forms an audio control instruction based on the distance vector to change the audio parameters of each game audio track.
When the display interface moves laterally within the game interface picture, its displayed content changes with the movement; for example, it may show the junction of two adjacent game scene units, i.e., part of one game scene unit together with part of another. The control module in the smart terminal forms an audio control instruction based on the previously calculated distance vector, and this instruction controls the audio parameters of the game audio. For example, when 50% of the display interface shows the first game scene unit and the remaining 50% shows the second game scene unit, the game audio corresponding to the first game scene unit is played and the game audio corresponding to the second game scene unit is also played. As another example, when 30% of the display interface shows the first game scene unit and the remaining 70% shows the second game scene unit, both game audio tracks are played, but the volume of the game audio corresponding to the first game scene unit is lower and the volume of the game audio corresponding to the second game scene unit is higher, so that the content shown in the display interface matches the playback mode of the game audio.
With the above configuration, the user perceives the sliding of the display interface both visually and audibly. On the one hand, the seamless switching saves the user's waiting time and improves the gaming experience; on the other hand, the perfect fit between picture and sound gives the user a more immersive feeling.
In a preferred embodiment, step S300 specifically includes:
S310: Define the central axis of each game scene unit as the vertical calibration basis, and define the central axis of the display interface as the vertical reference basis;
To determine the distance between each game scene unit and the display interface accurately, a reference belonging to each of them is used for the calculation. Specifically, each game scene unit is rectangular so as to match the display screen of the smart terminal; the central axis of this rectangular game scene unit is therefore used as the vertical calibration basis, meaning that all distances computed from the other game scene units or from the display interface refer to this central axis. Likewise, the central axis of the display interface is defined as its vertical reference basis, which reduces the two rectangular display regions to a measurement between "lines".
S320: Calculate a first distance scalar and a second distance scalar between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
Thus, when calculating the distance vector between the display interface and each starting point, the first distance scalar and the second distance scalar between the vertical reference basis of the display interface and the two adjacent vertical calibration bases are calculated. More specifically, it can be understood that the distance between the vertical calibration bases of two adjacent game scene units is fixed, i.e., the sum of the first distance scalar and the second distance scalar is fixed; therefore, when calculating the first distance scalar or the second distance scalar, the already computed one can be subtracted from this constant sum.
The specific values of the first distance scalar and the second distance scalar represent the distances between the vertical reference basis and the two adjacent vertical calibration bases. Since the vertical calibration bases and the vertical reference basis are parallel or nearly parallel straight lines, the first distance scalar and the second distance scalar are the distances between these parallel lines. When measuring, any straight line perpendicular to both the vertical calibration bases and the vertical reference basis can be chosen, and the distance between the resulting intersection points is the first distance scalar or the second distance scalar.
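As a hedged illustration of S310–S320 (assuming each axis is represented by its coordinate along the sliding direction, which the patent does not prescribe), the distance scalars and the two ratios could be computed as follows.

```python
# Sketch under stated assumptions: the vertical calibration bases of two adjacent
# game scene units and the vertical reference basis of the display interface are
# given as coordinates along the sliding axis, so each distance scalar is an
# absolute difference and the ratios are taken against the fixed spacing.

def distance_ratios(reference_axis: float, first_axis: float, second_axis: float):
    spacing = abs(second_axis - first_axis)   # constant for adjacent scene units
    d1 = abs(reference_axis - first_axis)     # first distance scalar
    d2 = spacing - d1                         # second scalar via the constant sum
    return d1 / spacing, d2 / spacing         # first ratio, second ratio

# Example: the display interface's axis sits 30% of the way from the first unit's
# axis to the second unit's axis.
first_ratio, second_ratio = distance_ratios(576.0, 0.0, 1920.0)
print(round(first_ratio, 2), round(second_ratio, 2))  # 0.3 0.7
```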
Further, step S500 includes:
S510: The control module forms an audio control instruction including volume control information, based on the first ratio and the second ratio of the first distance scalar and the second distance scalar to the distance between the two adjacent vertical calibration bases, wherein the volume of each game audio track is adjusted separately based on the volume control information.
Once the values of the first distance scalar and the second distance scalar are available, the control module compares each of them with the distance between the two adjacent vertical calibration bases to obtain the first ratio and the second ratio, which indicate the proportions of the two adjacent game scene units currently shown in the display interface. To coordinate the audio control, and in particular to give the user the immersive sense that "the more a scene is shown, the louder its audio", the resulting audio control instruction includes volume control information; as the display interface slides, the volumes of the different game audio tracks are adjusted accordingly.
Further, the step S510 of separately adjusting the volume of each game audio track based on the volume control information includes:
S511: The control module obtains the current volume of the smart terminal and changes the volume of the game audio based on:
volume control information = (1 - first ratio) * 100% * current volume;
volume control information = (1 - second ratio) * 100% * current volume.
The volume currently set by the user on the smart terminal is obtained first, and this current volume is used as the upper volume limit when computing the volumes of the two game audio tracks separately, so that the volume control information is consistent with the proportions of the content shown in the display interface. For the user, as the display interface slides, the volume of one game audio track is heard gradually decreasing while the volume of the other gradually increases, matching the visual perception that the content of one game scene unit appears less and less while the content of the other appears more and more.
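A minimal sketch of the S511 volume rule follows (the helper name is assumed, not from the patent): each track's volume is (1 − its ratio) × the current volume, so the two volumes always sum to the user's current volume and cross-fade as the interface slides.

```python
# Sketch of the S511 formulas: the track whose scene unit fills more of the
# display interface (smaller ratio) plays louder, and the user's current volume
# acts as the shared upper limit for the two tracks.

def crossfade_volumes(first_ratio: float, second_ratio: float, current_volume: float):
    first_volume = (1.0 - first_ratio) * current_volume    # audio of the first scene unit
    second_volume = (1.0 - second_ratio) * current_volume  # audio of the second scene unit
    return first_volume, second_volume

volumes = crossfade_volumes(0.3, 0.7, current_volume=80.0)
print(tuple(round(v, 2) for v in volumes))  # (56.0, 24.0)
```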
Further, the step S300 of defining the vertical calibration basis of each game scene unit as a starting point and calculating the distance vector between the display interface and each starting point further includes:
S330: Calculate a first direction and a second direction between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
In addition to the first distance scalar and the second distance scalar, the first direction and the second direction between the vertical reference basis and the two adjacent vertical calibration bases are obtained, for example when the smart terminal is held horizontally. They can be obtained, as described above, from the intersections of a straight line perpendicular to both the vertical calibration bases and the vertical reference basis with those bases: taking the intersection with the vertical reference basis as the start point and the intersection with a vertical calibration basis as the end point defines a vector, and the directions of these vectors are the first direction and the second direction.
The step S510 of separately adjusting the volume of each game audio track based on the volume control information includes:
S512: The control module obtains the current volume of the smart terminal and changes the volume of each game audio track on the different channels based on:
volume control information in the first direction = (1 - first ratio) * 100% * current volume; volume control information in the second direction = (1 - second ratio) * 100% * current volume.
In addition to adjusting the volume, the volume on each channel is adjusted differently according to the direction. When the first direction or the second direction corresponds to the left channel or the right channel, in one option each channel plays one game audio track, so that under two-channel playback the left and right channels carry different game audio; as the display interface slides, the volume of the game audio on one channel gradually increases while, conversely, the other gradually decreases. Alternatively, under two-channel playback both the left and right channels play both game audio tracks, but the same game audio track is played at different levels on different channels: for example, if the volume of the first game audio on the left channel is 70% of the current volume, then on the right channel the first game audio is 30% of the current volume (the two sum to 100% of full volume); conversely, if the volume of the second game audio on the left channel is 30% of the current volume, then on the right channel the second game audio is 70% of the current volume (within each channel, full volume is allocated between the different game audio tracks). As the display interface slides, these ratios change continuously, giving the user an immersive feeling.
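The following sketch illustrates only the second two-channel option described above, under the assumption (not stated in the patent) that the first and second ratios also drive each track's left/right split.

```python
# Hedged sketch: both game audio tracks play on both channels, but each track's
# full volume is divided between left and right so that, e.g., a 70%/30% split of
# the first track mirrors a 30%/70% split of the second track, and the split
# changes continuously as the display interface slides.

def channel_mix(first_ratio: float, second_ratio: float, current_volume: float):
    first_left = (1.0 - first_ratio) * current_volume
    first_right = current_volume - first_left
    second_left = (1.0 - second_ratio) * current_volume
    second_right = current_volume - second_left
    return {"first": (first_left, first_right), "second": (second_left, second_right)}

mix = channel_mix(0.3, 0.7, current_volume=100.0)
print({name: (round(left, 1), round(right, 1)) for name, (left, right) in mix.items()})
# {'first': (70.0, 30.0), 'second': (30.0, 70.0)}
```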
Preferably or optionally, step S500 further includes:
S520: Change one or more of the volume, frequency band, phase, or reverberation of each game audio track.
The interaction method further includes the following step:
S600: Set a sliding threshold and an audio adjustment rate threshold in the game application; when the lateral movement speed of the display interface exceeds the sliding threshold, the control module changes the audio parameters of each game audio track subject to the audio adjustment rate threshold.
Different users may slide the display interface slowly to enjoy the game scene pictures, or slide it as quickly as possible to jump to another game scene unit right away. Under fast operation, the rate at which the audio parameters are adjusted needs to be controlled to prevent overly rapid audio changes from having an adverse effect on the user. Therefore, a sliding threshold and an audio adjustment rate threshold are set in the game application. When the lateral movement speed of the display interface is detected to exceed the sliding threshold, the sliding action itself is not restricted, but changes to the audio parameters, such as the rate of volume adjustment and the rate of phase adjustment, are limited to the audio adjustment rate threshold to avoid a poor user experience. Of course, if the sliding speed is below the sliding threshold, the control logic in which both change at the same rate can still be used.
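A minimal sketch of the S600 rate limit follows, with parameter names and numbers that are assumptions rather than values from the patent: when the slide speed exceeds the sliding threshold, per-frame changes to an audio parameter are clamped to the audio adjustment rate threshold.

```python
# Sketch: below the sliding threshold the audio parameter simply tracks its
# target; above it, the change per frame is limited by the adjustment rate.

def step_audio_param(current: float, target: float, slide_speed: float,
                     slide_threshold: float, max_rate: float, dt: float) -> float:
    if slide_speed <= slide_threshold:
        return target                              # follow the slide directly
    max_step = max_rate * dt                       # largest allowed change this frame
    return current + max(-max_step, min(max_step, target - current))

volume = 20.0
for _ in range(3):  # fast slide: volume approaches 80 by at most 15 per frame
    volume = step_audio_param(volume, 80.0, slide_speed=3000.0,
                              slide_threshold=1200.0, max_rate=900.0, dt=1 / 60)
    print(round(volume, 1))  # 35.0, 50.0, 65.0
```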
In yet another preferred embodiment, the interaction method further includes the following steps:
S700: Acquire the game objects in any game scene unit and an operation instruction group of the operation object, the operation instruction group including at least one operation instruction;
After switching to a game scene unit, when the user needs to control, or send attack instructions to, the game objects within that game scene unit, such as enemies or icons, all the game objects within it are acquired, together with the group of operation instructions that can be executed on them, such as a normal attack, an attack skill, or a buff skill.
S800: Select any operation instruction from the operation instruction group and apply the operation instruction to a game object.
According to the user's choice, or the game application's own operation logic, one of the skills is selected first and then applied to an opposing game object. Which operation instruction is selected and which game object it is applied to are further determined by the following steps:
S810: Based on the remaining health of the game objects and the weights of the operation instructions, form a line segment of total length l in which each operation instruction interval has a length of i_n;
Based on the remaining health of the opposing game objects and the preset weight of each skill (for example, attack instructions have a higher weight and buff instructions a lower weight, or the lower the remaining health, the higher the weight of attack instructions), a line segment of total length l is formed in which each operation instruction interval has a length of i_n. It can be understood that the higher the weight, the longer the corresponding segment.
S820: Randomly select a point on the line segment; the interval into which the point falls is the determined operation instruction interval;
To achieve randomness, or, within the game application, to achieve a balance between automatic operation and manual operation (if automatic operation always made the optimal choice, its results would far surpass those of manual operation), the operation instruction is chosen at random. The randomness is realized by randomly picking a point on the segment of total length l; whichever instruction interval the point falls into determines the operation instruction. Up to this point, the skill to be performed is determined first (very different from the prior-art approach of first determining the target of the operation).
S830: Based on the weight-based ranking of the game objects, apply the operation instruction corresponding to the determined operation instruction interval to the first game object in the ranking.
Then a game object is selected by ranking the opposing game objects by weight (factors determining the weight may include a game object's remaining health, its probability of dying when hit, attribute counters, and buff or non-buff bonuses). Once the ranking is complete, the operation instruction is applied to the first game object. With this operation logic, which has some intelligence but limited intelligence and may even make mistakes, manual operation can be simulated as closely as possible.
It can be understood that, for game applications, if the difficulty needs to be tuned, the logic of the automated operation can be controlled more finely; for example, the weight of attack instructions can be tied to the weight derived from remaining health, making it more likely that a low-health opposing game object is finished off. By controlling the logic of the interaction in this way, various play experiences can be accommodated.
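As an illustrative sketch of S810–S830 (the data model and names are hypothetical), weighted instruction intervals on a segment of total length l can be sampled with a random point, and the chosen instruction applied to the highest-weight opposing object.

```python
# Sketch: the weights define interval lengths i_n on a segment of total length l;
# a uniformly random point on the segment selects the operation instruction, and
# the instruction is applied to the first game object in the weight ranking.

import random

def choose_instruction(interval_lengths: dict) -> str:
    total_length = sum(interval_lengths.values())      # total length l
    point = random.uniform(0.0, total_length)          # random point on the segment
    edge = 0.0
    for name, length in interval_lengths.items():      # each interval length i_n
        edge += length
        if point <= edge:
            return name
    return name                                        # guard against rounding at the end

def choose_target(game_objects: list) -> dict:
    # Rank opposing game objects by a weight that may combine remaining health,
    # kill probability, attribute counters, buffs, etc.; pick the first.
    return max(game_objects, key=lambda obj: obj["weight"])

instruction = choose_instruction({"attack": 5.0, "attack_skill": 3.0, "buff": 2.0})
target = choose_target([{"id": 1, "weight": 0.4}, {"id": 2, "weight": 0.9}])
print(instruction, "->", target["id"])
```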
The present invention also discloses a game interface interaction system comprising:
a storage module that stores at least one game interface picture in a smart terminal running a game application, each game interface picture including at least two game scene units; a configuration module that configures a display interface corresponding to the display screen of the smart terminal, the display interface moving laterally when the display screen receives a sliding operation; a calculation module that defines the vertical calibration basis of each game scene unit as a starting point and calculates the distance vector between the display interface and each starting point; an audio module that runs and plays at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit; and a control module that, when the display interface moves laterally within a game interface picture, forms an audio control instruction based on the distance vector to change the audio parameters of each game audio track.
Preferably or optionally, the system further includes: an acquisition module that acquires the game objects in any game scene unit and an operation instruction group of the operation object, the operation instruction group including at least one operation instruction; and an execution module that selects any operation instruction from the operation instruction group and applies the operation instruction to a game object, wherein the execution module includes: a statistics unit that, based on the remaining health of the game objects and the weights of the operation instructions, forms a line segment of total length l in which each operation instruction interval has a length of i_n; a determination unit that randomly selects a point on the line segment, the interval into which the point falls being the determined operation instruction interval; and an execution unit that applies the operation instruction corresponding to the determined operation instruction interval to the first game object in a weight-based ranking of the game objects.
The present invention further discloses a computer-readable storage medium having a computer program stored thereon; when the computer program is executed by a processor, the steps described above are implemented.
Smart terminals can be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a smart terminal. However, those skilled in the art will understand that, apart from elements intended specifically for mobile use, configurations according to the embodiments of the present invention can also be applied to fixed-type terminals.
It should be noted that the embodiments of the present invention are preferred implementations and are not intended to limit the present invention in any form. Any person skilled in the art may use the technical content disclosed above to make changes or modifications into equivalent effective embodiments; any modification, equivalent change, or variation made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical solution of the present invention, still falls within the scope of the technical solution of the present invention.

Claims (10)

  1. A game interface interaction method, characterized by comprising the following steps:
    storing at least one game interface picture in a smart terminal running a game application, each game interface picture including at least two game scene units;
    configuring a display interface, the display interface corresponding to the display screen of the smart terminal, and the display interface moving laterally when the display screen receives a sliding operation;
    defining the vertical calibration basis of each of the game scene units as a starting point, and calculating the distance vector between the display interface and each of the starting points;
    running and playing at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit;
    when the display interface moves laterally within a game interface picture, the control module of the smart terminal forming an audio control instruction based on the distance vector to change the audio parameters of each of the game audio tracks.
  2. The interaction method according to claim 1, characterized in that
    the step of defining the vertical calibration basis of each of the game scene units as a starting point and calculating the distance vector between the display interface and each of the starting points includes:
    defining the central axis of each game scene unit as the vertical calibration basis, and defining the central axis of the display interface as the vertical reference basis;
    calculating a first distance scalar and a second distance scalar between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
    and the step of the control module of the smart terminal forming an audio control instruction based on the distance vector to change the audio parameters of each of the game audio tracks includes:
    the control module forming an audio control instruction including volume control information, based on a first ratio and a second ratio of the first distance scalar and the second distance scalar to the distance between the two adjacent vertical calibration bases, wherein the volume of each game audio track is adjusted separately based on the volume control information.
  3. The interaction method according to claim 2, characterized in that
    the step of separately adjusting the volume of each game audio track based on the volume control information includes:
    the control module obtaining the current volume of the smart terminal, and changing the volume of the game audio based on:
    volume control information = (1 - first ratio) * 100% * current volume;
    volume control information = (1 - second ratio) * 100% * current volume.
  4. The interaction method according to claim 3, characterized in that
    the step of defining the vertical calibration basis of each of the game scene units as a starting point and calculating the distance vector between the display interface and each of the starting points further includes:
    calculating a first direction and a second direction between the vertical reference basis and the two adjacent vertical calibration bases, respectively;
    and the step of separately adjusting the volume of each game audio track based on the volume control information includes:
    the control module obtaining the current volume of the smart terminal, and changing the volume of each game audio track on different channels based on:
    volume control information in the first direction = (1 - first ratio) * 100% * current volume; volume control information in the second direction = (1 - second ratio) * 100% * current volume.
  5. The interaction method according to claim 1, characterized in that
    the step of changing the audio parameters of each of the game audio tracks includes:
    changing one or more of the volume, frequency band, phase, or reverberation of each of the game audio tracks;
    and the interaction method further includes the following step:
    setting a sliding threshold and an audio adjustment rate threshold in the game application, and, when the lateral movement speed of the display interface exceeds the sliding threshold, the control module changing the audio parameters of each of the game audio tracks subject to the audio adjustment rate threshold.
  6. The interaction method according to claim 1, characterized by further comprising the following steps:
    acquiring the game objects in any game scene unit and an operation instruction group of the operation object, wherein the operation instruction group includes at least one operation instruction;
    selecting any operation instruction from the operation instruction group, and applying the operation instruction to a game object.
  7. The interaction method according to claim 6, characterized in that the step of selecting any operation instruction from the operation instruction group and applying the operation instruction to a game object includes:
    forming, based on the remaining health of the game objects and the weights of the operation instructions, a line segment of total length l in which each operation instruction interval has a length of i_n;
    randomly selecting a point on the line segment, the interval into which the point falls being the determined operation instruction interval;
    applying the operation instruction corresponding to the determined operation instruction interval to the first game object in a weight-based ranking of the game objects.
  8. A game interface interaction system, characterized by comprising:
    a storage module that stores at least one game interface picture in a smart terminal running a game application, each game interface picture including at least two game scene units;
    a configuration module that configures a display interface, the display interface corresponding to the display screen of the smart terminal, and the display interface moving laterally when the display screen receives a sliding operation;
    a calculation module that defines the vertical calibration basis of each of the game scene units as a starting point, and calculates the distance vector between the display interface and each of the starting points;
    an audio module that runs and plays at least two game audio tracks within the game application, each game audio track corresponding to a game scene unit;
    a control module that, when the display interface moves laterally within a game interface picture, forms an audio control instruction based on the distance vector to change the audio parameters of each of the game audio tracks.
  9. The interaction system according to claim 8, characterized by further comprising:
    an acquisition module that acquires the game objects in any game scene unit and an operation instruction group of the operation object, wherein the operation instruction group includes at least one operation instruction;
    an execution module that selects any operation instruction from the operation instruction group and applies the operation instruction to a game object, wherein the execution module includes:
    a statistics unit that, based on the remaining health of the game objects and the weights of the operation instructions, forms a line segment of total length l in which each operation instruction interval has a length of i_n;
    a determination unit that randomly selects a point on the line segment, the interval into which the point falls being the determined operation instruction interval;
    an execution unit that applies the operation instruction corresponding to the determined operation instruction interval to the first game object in a weight-based ranking of the game objects.
  10. A computer-readable storage medium having a computer program stored thereon, characterized in that, when the computer program is executed by a processor, the steps according to any one of claims 1-7 are implemented.
PCT/CN2021/132258 2021-03-24 2021-11-23 Game interface interaction method, system, and computer-readable storage medium WO2022199082A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/283,382 US20240211128A1 (en) 2021-03-24 2021-11-23 Game interface interaction method, system, and computer readable storage medium
CN202180077409.1A 2021-03-24 2021-11-23 Game interface interaction method, system, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110312764.4A 2021-03-24 Game interface interaction method, system, and computer-readable storage medium
CN202110312764.4 2021-03-24

Publications (1)

Publication Number Publication Date
WO2022199082A1 true WO2022199082A1 (zh) 2022-09-29

Family

ID=76042121

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/132258 WO2022199082A1 (zh) 2021-03-24 2021-11-23 Game interface interaction method, system, and computer-readable storage medium

Country Status (3)

Country Link
US (1) US20240211128A1 (zh)
CN (2) CN112882628A (zh)
WO (1) WO2022199082A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112882628A (zh) * 2021-03-24 2021-06-01 上海莉莉丝计算机技术有限公司 游戏界面的交互方法、系统及计算机可读存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104801043A (zh) * 2014-01-23 2015-07-29 腾讯科技(深圳)有限公司 Method and apparatus for controlling scene sound effects
CN111494949A (zh) * 2020-04-17 2020-08-07 网易(杭州)网络有限公司 Display control method and apparatus for a game lobby, and electronic device
US20210026441A1 (en) * 2019-03-04 2021-01-28 Magical Technologies, Llc Virtual Object Control Of A Physical Device and/or Physical Device Control of A Virtual Object
CN112882628A (zh) * 2021-03-24 2021-06-01 上海莉莉丝计算机技术有限公司 Game interface interaction method, system, and computer-readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104090766B (zh) * 2014-07-17 2017-08-25 广东欧珀移动通信有限公司 Sound effect switching method and system for a mobile terminal
CN105930172A (zh) * 2016-05-13 2016-09-07 深圳市豹风网络股份有限公司 Method and system for dynamically displaying the user interface of a mobile terminal network game
CN108970116B (zh) * 2018-07-19 2020-05-08 腾讯科技(深圳)有限公司 Virtual character control method and apparatus
CN111111188B (zh) * 2019-12-24 2024-03-22 北京像素软件科技股份有限公司 Game sound effect control method and related apparatus
CN112402975B (zh) * 2020-11-24 2024-10-01 网易(杭州)网络有限公司 Game skill control method, apparatus, device, and storage medium
CN112492097B (zh) * 2020-11-26 2022-01-11 广州酷狗计算机科技有限公司 Audio playback method, apparatus, terminal, and computer-readable storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104801043A (zh) * 2014-01-23 2015-07-29 腾讯科技(深圳)有限公司 Method and apparatus for controlling scene sound effects
US20210026441A1 (en) * 2019-03-04 2021-01-28 Magical Technologies, Llc Virtual Object Control Of A Physical Device and/or Physical Device Control of A Virtual Object
CN111494949A (zh) * 2020-04-17 2020-08-07 网易(杭州)网络有限公司 Display control method and apparatus for a game lobby, and electronic device
CN112882628A (zh) * 2021-03-24 2021-06-01 上海莉莉丝计算机技术有限公司 Game interface interaction method, system, and computer-readable storage medium

Also Published As

Publication number Publication date
CN112882628A (zh) 2021-06-01
US20240211128A1 (en) 2024-06-27
CN116940921A (zh) 2023-10-24

Similar Documents

Publication Publication Date Title
US11298609B2 (en) Virtual object movement control method and apparatus, electronic apparatus, and storage medium
CN106774830B (zh) 虚拟现实系统、语音交互方法及装置
CN111527523B (zh) 用于共享虚拟现实环境的装置和方法
US10848894B2 (en) Controlling audio in multi-viewpoint omnidirectional content
CN107982918B (zh) 游戏对局结果的展示方法、装置及终端
US20170235462A1 (en) Interaction control method and electronic device for virtual reality
EP3478384A1 (en) Digital multimedia platform
US11330240B2 (en) Information processing system, information processing method and non-transitory computer readable medium
WO2022199082A1 (zh) 游戏界面的交互方法、系统及计算机可读存储介质
CN114443945A (zh) 一种虚拟用户界面中应用图标的展示方法及三维显示设备
KR20230120668A (ko) 영상통화 방법 및 장치
CN115475382A (zh) 画面补偿方法、终端设备、云服务器及存储介质
CN109350958B (zh) 终端及终端的控制方法、计算机可读存储介质
JP2022169530A (ja) プログラム、端末、及びゲームシステム
CN110769904A (zh) 输出内容处理方法、输出方法、电子设备及存储介质
JP5869176B1 (ja) ゲームプログラムおよびゲーム装置
US9999832B2 (en) Game system, computer-readable non-transitory storage medium, game processing method and game apparatus
CN110800308B (zh) 用于在可穿戴设备中呈现用户界面的方法、系统和介质
CN110134305A (zh) 一种语速调节方法、装置和用于语速调节的装置
CN114522416B (zh) 一种虚拟角色的移动控制方法和装置
US9092079B2 (en) Storage medium having information processing program stored thereon, information processing apparatus, information processing system, and instruction distinguishing method
WO2024146246A1 (zh) 虚拟场景的交互处理方法、装置、电子设备及计算机存储介质
JP6694106B1 (ja) 情報処理方法、情報処理端末、およびプログラム
US20240359106A1 (en) Virtual item processing method and apparatus, electronic device, storage medium, and program product
US20230342989A1 (en) Virtual Content

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21932690

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180077409.1

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21932690

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 22.02.2024)