WO2017181903A1 - A method and device for performing operations on a touch terminal - Google Patents

A method and device for performing operations on a touch terminal

Info

Publication number
WO2017181903A1
WO2017181903A1 (PCT/CN2017/080391)
Authority
WO
WIPO (PCT)
Prior art keywords
action area
target
target operation
operation object
application
Prior art date
Application number
PCT/CN2017/080391
Other languages
English (en)
French (fr)
Inventor
毛信良
周田伟
陈二喜
Original Assignee
上海逗屋网络科技有限公司
Priority date
Filing date
Publication date
Application filed by 上海逗屋网络科技有限公司 filed Critical 上海逗屋网络科技有限公司
Publication of WO2017181903A1 publication Critical patent/WO2017181903A1/zh


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present application relates to the field of computers, and in particular to a technique for performing operations on a touch terminal.
  • In existing smart devices, the correspondence between a user's operation instruction and the device's response is relatively simple. For example, when the user wants to carry out a series of operations in the current application environment of the smart device and obtain responses from it, the common practice is for the user to issue an instruction, such as a touch operation, wait for the smart device to respond to that operation, and then, based on the result of the response, decide on the next instruction.
  • Because the user decides on the next instruction only after the previous instruction has completed, the user is limited in response speed and accuracy by his or her own abilities.
  • Such limitations directly affect whether the next instruction is completed as intended, for example failing to achieve the execution effect the user expects. On the whole, therefore, the user's human-computer interaction experience is not satisfactory.
  • a method of performing an operation on a touch terminal comprising:
  • an apparatus for performing an operation on a touch terminal comprising:
  • a first device configured to adjust the relative position of a target operation object and an operation action area in the application based on the user's touch operation on the current application;
  • a second device configured to apply an operation corresponding to the operation action area to the target operation object if the target operation object overlaps the operation action area.
  • Compared with the prior art, the present application adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation on the current application; when the target operation object overlaps the operation action area, the device performs the operation corresponding to the operation action area on the target operation object.
  • The device automatically triggers the corresponding operation on the target operation object by judging the relative position of the target operation object and the operation action area. This automatic determination of the entire execution process is more efficient and accurate than execution decided by the user, and the degree of cooperation in the human-computer interaction is higher. Especially in application scenarios with demanding requirements on the judgment of execution conditions and on execution speed, the application can complete the user's expected operation on the target operation object more accurately and quickly, so the user experience is improved.
  • Further, during the adjustment of the relative position, if the distance between the target operation object and the operation action area is less than or equal to a distance threshold, the device superimposes the operation action area on the target operation object.
  • In this way the device may automatically superimpose the operation action area on the target operation object, thereby locking the target operation object. The device can thus intelligently assist and calibrate the user's touch operation, so that the operation instruction issued by the user through the touch operation is executed smoothly.
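The snap-to-target behaviour just described can be sketched as follows. This is a minimal illustration only, not the application's implementation: the names (`try_lock`, `LOCK_DISTANCE`), the point representation of positions, and the pixel unit of the threshold are all assumptions.

```python
import math

# Assumed threshold in screen pixels; the application leaves the unit open.
LOCK_DISTANCE = 50.0

def distance(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def try_lock(area_center, target_pos, threshold=LOCK_DISTANCE):
    """Snap the operation action area onto the target when close enough.

    Returns (new_area_center, locked): the area center moves onto the
    target when the distance is within the threshold; otherwise it stays
    where the user's touch operation left it.
    """
    if distance(area_center, target_pos) <= threshold:
        return target_pos, True   # superimpose the area, locking the target
    return area_center, False     # keep following the touch operation
```

For example, `try_lock((100, 100), (120, 130))` locks, since the two points are roughly 36 pixels apart, while a target 100 pixels away is left unlocked.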
  • FIG. 1 shows a schematic diagram of an apparatus that performs operations on a touch terminal in accordance with an aspect of the present application;
  • FIG. 2 shows a schematic diagram of an apparatus for performing an operation on a touch terminal in accordance with a preferred embodiment of the present application
  • FIG. 3 illustrates a flow chart of a method of performing operations on a touch terminal in accordance with another aspect of the present application
  • FIG. 5 is a diagram showing an example of performing an operation on a touch terminal according to another preferred embodiment of the present application.
  • FIG. 6 illustrates an example schematic diagram of performing operations on a touch terminal in accordance with yet another preferred embodiment of the present application.
  • the terminal, the device of the service network, and the trusted party each include one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media includes both permanent and non-persistent, removable and non-removable media.
  • Information storage can be implemented by any method or technology.
  • the information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage.
  • As defined herein, computer readable media does not include transitory computer readable media, such as modulated data signals and carrier waves.
  • FIG. 1 shows a schematic diagram of a device 1 that performs operations on a touch terminal in accordance with an aspect of the present application.
  • the device 1 includes a first device 11 and a second device 12.
  • Here, the first device 11 adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation on the current application; if the target operation object overlaps the operation action area, the second device 12 applies an operation corresponding to the operation action area to the target operation object.
  • Here, the touch terminal includes, but is not limited to, a smart terminal device with a touch screen, such as a personal computer with a touch screen, a smart phone, a tablet computer, or a game device; the touch terminal further includes devices controlled through an external touch screen.
  • The device 1 may be the touch terminal itself, or may be hardware, software, or a combination of hardware and software installed on the touch terminal and capable of carrying out the described operations.
  • the first device 11 adjusts the relative position of the target operation object and the operation action area in the application based on the touch operation of the current application by the user.
  • the current application includes, but is not limited to, an application loaded on the touch terminal, such as a game application, and further, the current application is displayed on a touch screen of the touch terminal.
  • The target operation object includes a target object that is presented in the current application of the touch terminal and is selected by the user as the object of a subsequent operation.
  • The target operation object may be an object that the user (player) wishes to lock onto in the scene of the current application, or an object to attack, for example various static or moving characters, buildings, logos, and the like.
  • The operation action area includes an area highlighted by a range identifier in the current application; alternatively, the area may not be highlighted but still actually exist.
  • The operation action area may be fixedly displayed in a corresponding area of the current application, for example fixedly displayed in the center area.
  • Alternatively, the area in which the operation action area is located may be flexibly adjusted in the current application based on the user's touch operation, for example moving the operation action area correspondingly based on the user's sliding operation in the current application.
  • A corresponding operation can be performed in the operation action area, and the area is represented by a distinguishable display area. Specifically, in a game application, the operation action area corresponds to a game skill action area, such as an auxiliary aiming frame.
  • The relative position of the target operation object and the operation action area in the application may be adjusted by adjusting the position of the target operation object in the current application, by adjusting the position of the operation action area, or by adjusting both positions simultaneously.
  • Preferably, the operation action area is fixedly displayed in the central area of the current application, specifically presented in the central area of the corresponding touch screen.
  • The touch operation includes, but is not limited to, a user-defined gesture operation on the area of the touch terminal's touch screen corresponding to the current application, for example, but not limited to, a click, multiple clicks, a long press, a tap, a press, or a slide on the touch screen. Based on the user's different touch operations, the device 1 produces different response effects.
  • For example, the scene of the current application is presented based on the user's view angle, and a sliding operation on the right half of the touch screen corresponds to a change of the scene angle of the current application: sliding to the left moves the scene angle to the left, and sliding to the right shifts the scene angle to the right.
  • When the user slides to the right on the right half of the current application's touch screen, the view angle of the scene shifts to the right, and the target operation object in the current scene moves correspondingly as the view angle changes; the touch operation thereby adjusts the relative position of the target operation object and the operation action area in the application.
  • In this way, the relative position of the target operation object and the operation action area changes with the user's specific touch operation; for example, the two may move farther apart, move closer, or overlap.
  • If the target operation object overlaps the operation action area, the second device 12 applies an operation corresponding to the operation action area to the target operation object.
  • That is, the device 1 performs on the target operation object superimposed with the operation action area the corresponding operation, namely the operation set in advance for the operation action area.
  • Here, overlapping may include: the target operation object is completely covered by the operation action area; or the overlapping part of the target operation object and the operation action area reaches a preset range, for example the maximum of the intersectable area.
  • For example, if the operation corresponding to the operation action area is set to attack any object inside the operation action area, then when the target operation object overlaps the operation action area, the second device 12 automatically triggers an attack on the target operation object that overlaps the operation action area.
  • Here, the operation corresponding to the operation action area can be flexibly customized according to the needs of different application scenarios.
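The overlap test and the auto-triggered operation can be illustrated with a short sketch. The rectangle representation and the 0.6 overlap ratio are assumptions for illustration only; the application leaves both the shapes and the preset overlap range open.

```python
def intersection_area(a, b):
    """Intersection area of two rectangles given as (x1, y1, x2, y2)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return w * h if w > 0 and h > 0 else 0

def rect_area(r):
    return (r[2] - r[0]) * (r[3] - r[1])

def overlaps(target, action_area, ratio=0.6):
    """True if the target is fully covered by the action area, or if the
    overlapped part of the target reaches the preset ratio of its area."""
    inter = intersection_area(target, action_area)
    return inter == rect_area(target) or inter >= ratio * rect_area(target)

def maybe_attack(target, action_area, attack):
    """Auto-trigger the operation bound to the action area (e.g. an attack)
    on a target that overlaps it; return whether the operation fired."""
    if overlaps(target, action_area):
        attack(target)
        return True
    return False
```

A target rectangle fully inside the action area triggers the callback; a distant target does not.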
  • Here, the present application adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation on the current application, and when the target operation object and the operation action area overlap, the device 1 performs the operation corresponding to the operation action area on the target operation object.
  • The device 1 automatically triggers the corresponding operation on the target operation object by judging the relative position of the target operation object and the operation action area. The device 1's automatic determination of the entire execution process is more efficient and accurate than execution decided by the user, and the degree of cooperation in the human-computer interaction is higher; especially in application scenarios with high requirements on execution-condition judgment and execution speed, the application can complete the user's expected operation on the target operation object more accurately and quickly, so that the user experience is optimized.
  • Preferably, the device 1 further includes a third device (not shown), which sets the execution state of the application to an automatic execution state according to an automatic execution operation instruction submitted by the user; if the target operation object overlaps the operation action area and the execution state of the application is the automatic execution state, the second device 12 applies the operation corresponding to the operation action area to the target operation object.
  • The execution state of the current application, for example the execution state of the operation corresponding to the operation action area in the current application, includes an automatic execution state, a manual execution state, and the like.
  • The execution state may have a default setting, or may change based on acquired user instructions.
  • the device 1 sets the execution state of the application to an automatic execution state based on an automatic execution operation instruction submitted by the user.
  • Submitting the automatic execution operation instruction may include the user submitting the instruction through a gesture operation on the current application, such as touching a button on the touch screen, or through a preset hardware function, such as operating a key or the volume key of the smart phone.
  • For example, if the operation corresponding to the operation action area is to shoot at the corresponding operation object, the user can trigger a switch button for the automatic execution state displayed on the touch screen: sliding it to on turns the automatic execution state on, and sliding it to off turns the automatic execution state off.
  • When the automatic execution state is on, the device 1 automatically executes the corresponding operation on the target operation object overlapping the operation action area.
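The switch-gated behaviour above can be sketched as a small state holder; the class and method names are illustrative assumptions, not the application's terminology.

```python
class AutoExecSwitch:
    """Execution state toggled by the user's on/off switch button."""

    def __init__(self):
        self.automatic = False   # manual execution state by default

    def slide(self, on):
        """Slide the switch: True turns the automatic execution state on."""
        self.automatic = bool(on)

    def should_fire(self, target_overlaps_area):
        """Fire only when the target overlaps the action area AND the
        execution state of the application is the automatic state."""
        return self.automatic and target_overlaps_area
```

With the switch off, even an overlapping target does not trigger the operation; sliding it on enables the automatic trigger.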
  • FIG. 2 shows a schematic diagram of a device 1 that performs operations on a touch terminal in accordance with a preferred embodiment of the present application.
  • the device 1 comprises a first device 11' and a second device 12', wherein the first device 11' comprises a first unit 111' and a second unit 112'.
  • The first unit 111' adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation on the current application. In the process of adjusting the relative position, if the distance between the target operation object and the operation action area is less than or equal to a distance threshold, the second unit 112' superimposes the operation action area on the target operation object; if the target operation object overlaps the operation action area, the second device 12' applies an operation corresponding to the operation action area to the target operation object.
  • the second device 12' is the same as or substantially the same as the second device 12 shown in FIG. 1, and is not described here, and is hereby incorporated by reference.
  • the first unit 111' adjusts a relative position of the target operation object and the operation action area in the application based on the touch operation of the current application by the user; and, in the process of adjusting the relative position, if The distance between the target operation object and the operation action area is less than or equal to the distance threshold, and the second unit 112' superimposes the operation action area on the target operation object.
  • Here, the device 1 adjusts the relative position of the target operation object and the operation action area based on the user's touch operation; preferably, during this adjustment, when the distance between the target operation object and the operation action area is less than or equal to the preset distance threshold, the operation action area is automatically superimposed on the target operation object.
  • The distance may be flexibly set as needed, for example as the distance between the target operation object and the center point of the operation action area; or, if the operation action area covers a certain area range, as the shortest distance, the farthest distance, or the like between the target operation object and the boundary of that area range.
  • If the distance is greater than the distance threshold, the operation action area is not automatically superimposed on the target operation object, and the relative position of the target operation object and the operation action area continues to be adjusted based on the user's touch operation.
  • If the distance is less than or equal to the distance threshold, the device 1 automatically superimposes the operation action area on the target operation object, even if the operation action area does not yet overlap or intersect the target operation object.
  • Moreover, if the operation action area is fixedly displayed in the central area of the current application, the scene of the current application is automatically adjusted so that the target operation object and the operation action area are automatically superimposed.
  • Here, in the process of adjusting the relative position, if the distance between the target operation object and the operation action area is less than or equal to the distance threshold, the device 1 superimposes the operation action area on the target operation object.
  • Even if the user's touch operation is not precise enough to make the two overlap exactly, the device 1 may automatically superimpose the operation action area on the target operation object, thereby locking the target operation object. The device 1 can therefore intelligently assist and calibrate the user's touch operation and improve the smooth execution of the operation instruction issued by the user through the touch operation.
  • Preferably, the device 1 further includes a fourth device (not shown) that adjusts the operation action area according to the movement information of the target operation object, so as to keep the operation action area superimposed on the target operation object.
  • each of the operation objects in the current application may be stationary or continuously moving.
  • If a target operation object that was automatically superimposed with the operation action area when the distance threshold was met subsequently moves, the operation action area is adjusted based on the object's movement information in the current application, specifically on the touch screen, to ensure that the superimposed state of the operation action area and the target operation object remains unchanged, that is, that the target operation object is kept continuously locked by the operation action area.
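The tracking behaviour above reduces to applying the locked target's movement delta to the action area each frame; a minimal sketch, with the function name and the (dx, dy) delta representation as assumptions:

```python
def follow(area_center, movement):
    """Shift the locked action area by the target's movement delta (dx, dy)
    so that the superimposed state stays unchanged."""
    return (area_center[0] + movement[0], area_center[1] + movement[1])

def track(area_center, movement_info):
    """Replay a sequence of per-frame movement deltas, keeping the action
    area locked onto the moving target; return the area's final position."""
    for delta in movement_info:
        area_center = follow(area_center, delta)
    return area_center
```

For example, replaying the deltas (1, 0) and (0, 2) moves a lock that started at the origin to (1, 2).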
  • Preferably, the fourth device further keeps the operation action area located in the central region of the application by calibrating the scene view angle of the application.
  • The current application on the touch terminal may be presented on the touch screen based on the user's view angle. Correspondingly, a preferred scenario is that the operation action area is fixedly displayed in the central area of the current application, for example displayed in the central area of the corresponding touch screen.
  • In this case, the scene of the application may be adjusted by calibrating the application's view angle, achieving the visual effect that the operation action area moves relative to the other elements of the scene.
  • The movement of the scene view angle ensures that the operation action area and the target operation object remain superimposed while the operation action area continues to be located in the central area of the application.
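The view-angle calibration can be sketched in 2D: with the action area fixed at the screen center, the camera offset is chosen so that the locked target's world position projects onto that center. The names and the simple translation-only camera model are assumptions for illustration.

```python
def calibrate_camera(target_world, screen_size):
    """Return the camera's top-left world offset that places the target at
    the center of the screen (and hence inside the central action area)."""
    cx, cy = screen_size[0] / 2, screen_size[1] / 2
    return (target_world[0] - cx, target_world[1] - cy)

def to_screen(world_pos, camera_offset):
    """Project a world position into screen coordinates under the offset."""
    return (world_pos[0] - camera_offset[0], world_pos[1] - camera_offset[1])
```

After calibration, the locked target projects exactly to the screen center while everything else in the scene appears to move around it.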
  • Preferably, in the process in which the second unit 112' adjusts the relative position, if the distance between each of a plurality of target operation objects and the operation action area is less than or equal to the distance threshold, the operation action area is superimposed on at least one of the plurality of target operation objects.
  • That is, there may be one or more target operation objects satisfying the condition that their distance from the operation action area is less than or equal to the distance threshold.
  • If there are several, the operation action area may be superimposed on one of them, or on several of them simultaneously; further, the operation corresponding to the operation action area may be applied correspondingly to one or more of the target operation objects.
  • For example, if the operation action area is a game skill action area, such as the aiming frame of a shooting skill, and the operation corresponding to the operation action area can be issued multiple times simultaneously to multiple objects within the aiming frame, the operation action area may be automatically superimposed on multiple target operation objects at the same time.
  • Preferably, if the distance between each of a plurality of target operation objects and the operation action area is less than or equal to the distance threshold, the second unit 112' superimposes the operation action area on the target operation object that is closest to the operation action area among the plurality of target operation objects.
  • That is, when more than one target operation object satisfies the distance condition, the superimposed object may be chosen on the principle of proximity, superimposing the operation action area on the closest target operation object.
  • Here, the distance may be flexibly set as the distance between the target operation object and the center point of the operation action area, or as the shortest distance, or the farthest distance, between the target operation object and the operation action area.
  • For example, if the operation action area is a game skill action area, such as the aiming frame of a shooting skill, then preferably the distance between an attackable target operation object and the center point of the aiming frame is used as the measurement standard, and the target operation object closest to the center point of the aiming frame is selected as the superimposed target operation object.
  • Preferably, if the distance between each of a plurality of target operation objects and the operation action area is less than or equal to the distance threshold, the second unit 112' superimposes the operation action area on the target operation object with the highest object level among the plurality of target operation objects.
  • That is, when more than one target operation object satisfies the distance condition, the finally superimposed target operation object may be determined based on the attribute characteristics of the respective target operation objects, for example by selecting the one with the highest object level.
  • In different application scenarios, the specific meaning of the object level differs.
  • For example, if the operation action area is a game skill action area, such as the aiming frame of a shooting skill, and the operation corresponding to the operation action area is to shoot scene objects within the aiming frame, the target operation objects include the characters, buildings, or other items being fired at.
  • In this case, the object level of the target operation objects may be set based on the difficulty level of the objects.
  • In addition, the finally superimposed target operation object may also be determined based on attributes such as the size, volume, color, or shape of the target operation object.
  • Here, the device 1 determines the target operation object actually superimposed with the operation action area based on different preset conditions, including determining the target operation object based on being closest to the operation action area, and determining it based on having the highest object level.
  • The various determination methods may be applied separately or in combination.
  • For example, the target operation object may first be determined based on proximity to the operation action area; if two or more objects satisfy the distance condition equally, the target operation object on which the operation action area is superimposed may further be determined based on the highest object level.
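The combined selection rule above (nearest first, ties broken by the highest object level) can be sketched as follows; the `Candidate` fields and function name are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    pos: tuple    # (x, y) position in the scene
    level: int    # object level, e.g. difficulty of the object

def pick_target(candidates, area_center, threshold):
    """Return the candidate to superimpose the action area on, or None.

    Candidates outside the distance threshold are excluded; among the rest,
    the nearest wins, and an exact distance tie goes to the higher level.
    """
    def dist(c):
        return math.hypot(c.pos[0] - area_center[0], c.pos[1] - area_center[1])
    in_range = [c for c in candidates if dist(c) <= threshold]
    if not in_range:
        return None   # no qualifying object: await the user's next operation
    return min(in_range, key=lambda c: (dist(c), -c.level))
```

Two candidates at the same distance are disambiguated by level, while candidates beyond the threshold are never selected.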
  • the first device 11' further includes a third unit (not shown), wherein the third unit is in the process of adjusting the relative position, if the target operating object The state is invisible, and the target operation object is re-determined according to the distance between the operation object and the operation action area in the current scene of the application.
  • the second unit 112' may superimpose the operation action area on the target operation object when the target operation object and the operation action area are less than or equal to a distance threshold, and then the target operation The object is locked by the action area, for example, in the game application, that is, the attacked object is locked by the aiming frame.
  • preferably, when the state of the target operation object superimposed with the operation action area is invisible, the device 1 automatically re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area.
  • the invisible state of the original target operation object includes the case where the target operation object can no longer be seen or recognized by the user in the current application; for example, the target operation object is occluded during its movement by props in the scene or by other operation objects, or, based on some trigger setting, the target operation object disappears from the current application, and the like. The device 1 will then abandon the previously determined target operation object superimposed with the operation action area, re-select a new target operation object based on the distance-threshold condition, and automatically superimpose the operation action area on the re-determined target operation object. If no operation object satisfies the distance-threshold condition, then, preferably, the current scene is maintained and the user's subsequent operations are awaited.
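The re-selection behavior described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the `pos`/`visible` field names, tuple coordinates, and Euclidean distance are all assumptions:

```python
import math

def reselect_target(objects, area_center, distance_threshold):
    """Pick a new target after the locked one becomes invisible.

    `objects` is a list of dicts with 'pos' (x, y) and 'visible' flags.
    Returns the nearest visible object within the threshold, or None,
    in which case the caller keeps the current scene and waits for
    further user input.
    """
    cx, cy = area_center
    best, best_dist = None, None
    for obj in objects:
        if not obj["visible"]:
            continue  # invisible objects cannot be re-selected
        d = math.hypot(obj["pos"][0] - cx, obj["pos"][1] - cy)
        if d <= distance_threshold and (best is None or d < best_dist):
            best, best_dist = obj, d
    return best
```

Returning `None` models the fallback described above: when nothing satisfies the distance-threshold condition, the current scene is kept unchanged.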
  • the device 1 further includes a fifth device (not shown); if the execution of the operation is completed, the fifth device re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area.
  • completion of the execution of the operation means that, after the target operation object overlapped the operation action area, the device 1 applied the operation corresponding to the operation action area to that target operation object.
  • the device 1 then discards the previously determined target operation object superimposed with the operation action area, and the fifth device re-selects a new target operation object based on the distance-threshold condition and automatically superimposes the operation action area on the re-determined target operation object.
  • here, the device 1 likewise re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area. If no operation object satisfies the distance-threshold condition, then, preferably, the current scene is maintained and the user's subsequent operations are awaited.
  • FIG. 3 illustrates a flow chart of a method of performing an operation on a touch terminal in accordance with another aspect of the present application, wherein the method comprises step S31 and step S32.
  • in step S31, the device 1 adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation in the current application; in step S32, if the target operation object overlaps the operation action area, the device 1 applies the operation corresponding to the operation action area to the target operation object.
  • the touch terminal includes, but is not limited to, smart terminal devices containing a touch screen, such as a personal computer with a touch screen, a smartphone, a tablet computer, a game device, etc.; the touch terminal further includes other smart terminal devices to which an external touch screen can be connected.
  • the device 1 may be the touch terminal itself, or may be hardware, software, or a combination of hardware and software that is installed on the touch terminal and can perform the operation.
  • the device 1 adjusts the relative position of the target operation object and the operation action area in the application based on the touch operation of the current application by the user.
  • the current application includes, but is not limited to, an application loaded on the touch terminal, such as a game application, and further, the current application is displayed on a touch screen of the touch terminal.
  • the target operation object includes the target object that is presented in the current application of the touch terminal and is selected by the user as the target of a subsequent operation.
  • in a game application, the target operation object may be an object that the user (player) wishes to lock onto, or an object to be attacked, in the scene of the current application, for example, various static or moving characters, buildings, signs, and the like.
  • the operation action area includes an area highlighted by a range identifier in the current application, or a corresponding area that actually exists even though it is not highlighted.
  • the operation action area may be fixedly displayed on a corresponding area of the current application, for example, fixedly displayed in the center area.
  • alternatively, based on an area-moving touch operation by the user, the area where the operation action area is located can be flexibly adjusted in the current application; for example, the operation action area is moved correspondingly based on the user's sliding operation in the current application.
  • a corresponding operation can be performed in the operation action area.
  • in the current application of the touch terminal, the operation action area is represented by a distinguishably displayed area; specifically, in a game application, the operation action area corresponds to a game skill action area, such as an aiming frame that assists the executor in attacking.
  • based on the user's touch operation, the adjustment of the relative position of the target operation object and the operation action area in the application can be implemented by adjusting the position of the target operation object in the current application, by adjusting the position of the operation action area, or by adjusting the positions of both simultaneously.
  • in a preferred scenario, the operation action area is fixedly displayed in the central area of the current application, specifically presented in the central area of the corresponding touch screen.
  • the touch operation includes, but is not limited to, a gesture operation by the user on the corresponding area of the touch screen of the current application, such as, but not limited to, a single click, multiple clicks, a long press, a light press, a heavy press, or a slide on the corresponding area of the touch screen. Based on the user's different touch operations, the device 1 will produce different response effects.
  • preferably, in a game application scenario, the scene of the current application is presented based on the user's viewing angle. If a sliding operation on the right touch screen corresponds to a change of the scene viewing angle of the current application, for example, sliding left shifts the scene view left and sliding right shifts it right, then when the user slides right on the right touch screen of the current application, the scene view shifts right, i.e., the target operation object in the current application scene also moves correspondingly as the viewing angle changes; the position of the target operation object in the application can thereby be adjusted based on the touch operation.
  • further, when the operation action area is fixedly displayed in the central area of the current application, the relative position of the target operation object and the operation action area also changes with the user's specific touch operation; for example, the target operation object and the operation action area become farther apart, or closer, or overlap.
  • in step S32, if the target operation object overlaps the operation action area, the device 1 applies the operation corresponding to the operation action area to the target operation object.
  • that is, the device 1 performs on the target operation object superimposed on the operation action area a corresponding operation, which includes a preset operation corresponding to the operation action area.
  • the overlap may include: the target operation object being completely covered by the operation action area; or the overlapping portion of the target operation object and the operation action area reaching a preset overlap range, for example, the overlapping portion reaching the maximum of the intersectable area.
  • taking a game application as an example, if the operation corresponding to the operation action area is set to launch an attack on objects within the area, the device 1 automatically triggers an attack on a target operation object that overlaps the operation action area.
  • the operation corresponding to the operation action area can be flexibly customized according to the needs of different application scenarios.
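The overlap criterion above (full coverage, or the overlapping portion reaching a preset range) can be sketched with axis-aligned bounding boxes. The rectangular model and the default overlap fraction are illustrative assumptions; the patent does not fix a particular geometry:

```python
def is_superimposed(target, area, overlap_fraction=0.6):
    """Decide whether a target counts as overlapping the action area.

    Boxes are (left, top, right, bottom) tuples.  Overlap holds if the
    target is fully covered by the area, or if the intersection reaches
    `overlap_fraction` of the target's own area.
    """
    tl, tt, tr, tb = target
    al, at, ar, ab = area
    # Full coverage: every target edge lies inside the action area.
    if al <= tl and at <= tt and tr <= ar and tb <= ab:
        return True
    # Intersection rectangle (zero-sized if the boxes do not meet).
    iw = max(0, min(tr, ar) - max(tl, al))
    ih = max(0, min(tb, ab) - max(tt, at))
    target_area = (tr - tl) * (tb - tt)
    return target_area > 0 and iw * ih >= overlap_fraction * target_area
```

The same two-branch test could be written for circles or sprites; only the "fully covered" and "intersection reaches a preset range" conditions come from the text above.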
  • the present application adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation in the current application, and when the target operation object overlaps the operation action area, the device 1 performs on the target operation object the operation corresponding to the operation action area.
  • the device 1 automatically triggers the corresponding operation on the target operation object by judging the relative position of the target operation object and the operation action area. Because the whole process of automatic judgment and execution by the device 1 is more efficient and accurate than autonomous judgment and execution by the user, the degree of cooperation in human-computer interaction is higher; especially in application scenarios with high requirements on the judgment of execution conditions and on execution speed, the present application can complete the user's expected operation on the target operation object more precisely and quickly, so that the user experience is optimized.
  • the method further includes a step S33 (not shown), in which the device 1 sets the execution state of the application to an automatic execution state according to an automatic-execution operation instruction submitted by the user; in step S32, if the target operation object overlaps the operation action area and the execution state of the application is the automatic execution state, the device 1 applies the operation corresponding to the operation action area to the target operation object.
  • the execution state of the current application, for example, the execution state of the operation corresponding to the operation action area in the current application, includes an automatic execution state, a manual execution state, and the like.
  • the execution state may be set by default or may change based on an acquired user instruction.
  • the device 1 sets the execution state of the application to an automatic execution state based on an automatic execution operation instruction submitted by the user.
  • submission of the automatic-execution operation instruction may include the user submitting the instruction through a gesture operation on the current application, such as on a touch button on the touch screen, or through operations on preset hardware functions, such as the preset functions of a smartphone's power key or volume keys.
  • taking a game application as an example, if the operation corresponding to the operation action area is to shoot the corresponding operation object, the user can trigger the switch button corresponding to the automatic execution state displayed on the touch screen: sliding it to "on" turns the automatic execution state on, and sliding it to "off" turns it off.
  • further, if, in the automatic execution state, a scene occurs in which the target operation object overlaps the operation action area, the device 1 will automatically perform the corresponding operation on the target operation object overlapping the operation action area.
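A minimal sketch of this automatic-execution switch follows; the class and method names are assumptions, and the operation bound to the action area is modeled as a callback:

```python
class OperationController:
    """Toggles between automatic and manual execution of the operation
    bound to the action area (a sketch, not the patent's implementation)."""

    def __init__(self, operation):
        self.operation = operation   # e.g. a shooting action
        self.auto_execute = False    # manual state by default
        self.log = []

    def set_auto(self, enabled):
        # Mirrors the user sliding the on-screen switch to "on"/"off".
        self.auto_execute = enabled

    def on_overlap(self, target):
        # Called whenever the target and the action area overlap.
        if self.auto_execute:
            self.log.append(self.operation(target))
            return True
        return False                 # manual state: wait for the user
```

In the manual state an overlap produces no action, matching the behavior described above where the operation is only triggered automatically once the state has been switched on.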
  • FIG. 4 illustrates a flow chart of a method of performing an operation on a touch terminal according to a preferred embodiment of the present application, wherein the method includes a step S41 and a step S42, and the step S41 further includes step S411 and step S412.
  • in step S411, the device 1 adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation in the current application; in step S412, during the adjustment of the relative position, if the distance between the target operation object and the operation action area is less than or equal to the distance threshold, the device 1 superimposes the operation action area on the target operation object; in step S42, if the target operation object overlaps the operation action area, the device 1 applies the operation corresponding to the operation action area to the target operation object.
  • the step S42 is the same as or substantially the same as the content of the step S32 shown in FIG. 3, and details are not described herein again, and are included herein by reference.
  • specifically, in step S411, the device 1 adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation in the current application; further, in step S412, during the adjustment, if the distance between the relative positions of the target operation object and the operation action area is less than or equal to the preset distance threshold, the operation action area is automatically superimposed on the target operation object.
  • the distance may be flexibly set as needed: as the distance between the target operation object and the center point of the operation action area; or, if the operation action area has a certain area range, as the shortest distance, or the farthest distance, between the target operation object and the boundary of that area range.
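The alternative distance definitions above can be sketched for a circular action area; the circular shape is an assumption, since the text only requires that the area have some boundary:

```python
import math

def center_distance(obj_pos, area_center):
    """Distance from the target to the center point of the action area."""
    return math.hypot(obj_pos[0] - area_center[0],
                      obj_pos[1] - area_center[1])

def boundary_distances(obj_pos, area_center, area_radius):
    """Shortest and farthest distances from the target to the boundary
    of a circular action area of the given radius."""
    d = center_distance(obj_pos, area_center)
    return abs(d - area_radius), d + area_radius
```

Whichever of the three values is chosen as "the distance" is then compared against the distance threshold in step S412.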
  • when the distance is greater than the distance threshold, the operation action area is not automatically superimposed on the target operation object; when the relative position of the target operation object and the operation action area, as adjusted based on the user's touch operation, reaches or falls below the distance threshold, the device 1 automatically superimposes the operation action area on the target operation object even if the operation action area and the target operation object do not yet overlap or intersect.
  • preferably, in the scenario where the operation action area is fixedly displayed in the central area of the current application, the scene of the current application is automatically adjusted so that the target operation object and the operation action area automatically overlap.
  • in this embodiment, during the adjustment of the relative position, if the distance between the target operation object and the operation action area is less than or equal to the distance threshold, the operation action area is superimposed on the target operation object.
  • when this threshold condition is satisfied, the device 1 can automatically superimpose the operation action area on the target operation object, thereby locking the target operation object. The device 1 can thus assist and calibrate the user's touch operation through intelligent judgment, improving the smooth execution of the operation instructions issued by the user based on the touch operation.
  • the method further includes a step S44 (not shown), in which the device 1 adjusts the operation action area according to the movement information of the target operation object, To maintain the operational action area superimposed on the target operational object.
  • each of the operation objects in the current application may be stationary or continuously moving.
  • for a target operation object that is automatically superimposed with the operation action area upon satisfying the distance threshold, when it moves, the operation action area is adjusted based on its movement information in the current application, specifically on the touch screen, so as to ensure that the superimposed state of the operation action area and the target operation object remains relatively unchanged, i.e., the target operation object remains continuously locked by the operation action area.
  • the device 1 further causes the operation action area to be located in a central area of the application by calibrating the scene view of the application.
  • to bring a better visual experience to the user, the current application corresponding to the touch terminal may be presented on the touch screen based on the user's viewing angle; a corresponding preferred scenario is that the operation action area is fixedly displayed in the central area of the current application, such as specifically displayed in the central area of the corresponding touch screen.
  • to ensure that the operation action area always remains fixed in the central area of the application, the scene of the application may be adjusted by calibrating the scene viewing angle of the application, so as to achieve the visual effect of the operation action area moving relative to the other scenery in the application.
  • further, when the target operation object superimposed with the operation action area in the application moves, the movement of the scene view ensures that the operation action area is always superimposed with the target operation object and that the operation action area remains in the central area of the application.
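The view-calibration idea can be sketched in 2D: instead of moving the aiming frame, the scene-view offset is recomputed from the locked target's world position so that the target stays under the action area fixed at the screen center. The coordinate model and function name are assumptions:

```python
def calibrate_view(target_world_pos, screen_center):
    """Return the scene-view offset that places the locked target at the
    screen center.

    With screen_pos = world_pos - view_offset, solving for the offset
    means the view follows the target's movement, so the centered
    action area stays superimposed on it.
    """
    return (target_world_pos[0] - screen_center[0],
            target_world_pos[1] - screen_center[1])
```

Recomputing this offset every time the locked target moves produces exactly the effect described above: the rest of the scene appears to move while the action area never leaves the central area.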
  • in step S412, during the adjustment of the relative position, if the distance between each of a plurality of target operation objects and the operation action area is less than or equal to the distance threshold, the device 1 superimposes the operation action area on at least one of the plurality of target operation objects.
  • the target operation objects satisfying the condition that the distance to the operation action area is less than or equal to the distance threshold may be one or more.
  • the operation action area may be superimposed on one, or simultaneously on several, of the target operation objects; further, the operation corresponding to the operation action area may be applied correspondingly to one or more of the target operation objects.
  • taking a game application as an example, the operation action area is a game skill action area, such as the aiming frame of a shooting skill, and the operation corresponding to the operation action area can fire multiple shots simultaneously at multiple objects within the aiming frame.
  • in that case, if multiple target operation objects simultaneously satisfy the distance condition, the operation action area may be automatically superimposed on the plurality of target operation objects at the same time.
  • in step S412, during the adjustment of the relative position, if the distance between each of a plurality of target operation objects and the operation action area is less than or equal to the distance threshold, the device 1 superimposes the operation action area on the target operation object that is closest to the operation action area among the plurality of target operation objects.
  • when one or more target operation objects satisfy the condition that the distance to the operation action area is less than or equal to the distance threshold, the operation action area may, on the principle of proximity, be superimposed on the target operation object closest to the operation action area among the plurality of target operation objects.
  • the distance may be flexibly set as a distance between the target operation object and a center point of the operation action area, or a shortest distance between the target operation object and the operation action area, Or the farthest distance.
  • in a game application scenario, the operation action area is a game skill action area, such as the aiming frame of a shooting skill; preferably, with the distance between the attacked target operation object and the center point of the aiming frame as the distance measure, the target operation object closest to the center point of the aiming frame is selected as the superimposed target operation object.
  • in step S412, during the adjustment of the relative position, if the distance between each of a plurality of target operation objects and the operation action area is less than or equal to the distance threshold, the device 1 superimposes the operation action area on the target operation object having the highest object level among the plurality of target operation objects.
  • when one or more target operation objects satisfy the condition that the distance to the operation action area is less than or equal to the distance threshold, the target operation object to be finally superimposed may be determined based on the attribute characteristics of the respective target operation objects.
  • for example, the target operation object having the highest object level among the plurality of target operation objects may be determined as the superimposed object.
  • in different application scenarios, the specific meaning of the object level differs.
  • for example, the operation action area is a game skill action area, such as the aiming frame of a shooting skill, and the operation corresponding to the operation action area is to shoot scene objects within the aiming frame; the target operation objects include the characters, buildings, or other items being shot at.
  • in this case, the object level may be set based on the difficulty of the object, and the target operation object with the highest object level is selected accordingly.
  • the target operation object to be finally superimposed may also be determined based on attributes of the target operation object such as size, volume, color, and shape.
  • the device 1 may determine the target operation object actually superimposed with the operation action area based on different preset conditions, including determining the target operation object based on the condition of being closest to the operation action area and determining it based on the highest object level.
  • the various determination methods may be applied separately or in combination.
  • for example, the target operation object may first be determined based on the condition of being closest to the operation action area; if two or more target operation objects satisfy the distance condition equally, the target operation object superimposed by the operation action area may be further determined based on the highest object level.
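The combined selection rule can be sketched as follows: filter candidates by the distance threshold, prefer the nearest, and break exact distance ties by the highest object level. The `pos`/`level` field names are illustrative assumptions:

```python
import math

def choose_target(candidates, area_center, distance_threshold):
    """Select the target to superimpose: nearest qualifying candidate,
    with the object level as a tie-breaker.  Returns None when no
    candidate is within the threshold."""
    qualified = []
    for obj in candidates:
        d = math.hypot(obj["pos"][0] - area_center[0],
                       obj["pos"][1] - area_center[1])
        if d <= distance_threshold:
            qualified.append((d, -obj["level"], obj))
    if not qualified:
        return None
    # Sort by distance ascending, then by level descending.
    qualified.sort(key=lambda t: (t[0], t[1]))
    return qualified[0][2]
```

Swapping the two sort keys would give the level-first variant; either ordering is one of the "separately or in combination" possibilities described above.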
  • the step S41 further includes a step S413 (not shown).
  • in step S413, during the adjustment of the relative position, if the state of the target operation object is invisible, the device 1 re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area.
  • in step S412, the device 1 superimposes the operation action area on the target operation object when the distance between the target operation object and the operation action area is less than or equal to the distance threshold, after which the target operation object is locked by the operation action area; taking a game application as an example, the attacked object is locked by the aiming frame.
  • preferably, when the state of the target operation object superimposed with the operation action area is invisible, the device 1 automatically re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area.
  • the invisible state of the original target operation object includes the case where the target operation object can no longer be seen or recognized by the user in the current application; for example, the target operation object is occluded during its movement by props in the scene or by other operation objects, or, based on some trigger setting, the target operation object disappears from the current application, and the like. The device 1 will then abandon the previously determined target operation object superimposed with the operation action area, re-select a new target operation object based on the distance-threshold condition, and automatically superimpose the operation action area on the re-determined target operation object. If no operation object satisfies the distance-threshold condition, then, preferably, the current scene is maintained and the user's subsequent operations are awaited.
  • the method further comprises a step S45 (not shown); in step S45, if the execution of the operation is completed, the device 1 re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area.
  • completion of the execution of the operation means that, after the target operation object overlapped the operation action area, the device 1 applied the operation corresponding to the operation action area to that target operation object.
  • the device 1 then abandons the previously determined target operation object superimposed with the operation action area, re-selects a new target operation object based on the distance-threshold condition, and automatically superimposes the operation action area on the re-determined target operation object.
  • here, the device 1 likewise re-determines the target operation object according to the distances between the operation objects in the current scene of the application and the operation action area. If no operation object satisfies the distance-threshold condition, then, preferably, the current scene is maintained and the user's subsequent operations are awaited.
  • FIG. 5 is a diagram showing an example of performing an operation on a touch terminal according to another preferred embodiment of the present application;
  • FIG. 6 is a diagram showing an example of performing an operation on a touch terminal according to still another preferred embodiment of the present application.
  • FIG. 5 shows a schematic diagram of the present application taking a game application as an example.
  • in FIG. 5, reference numeral 51 denotes the touch screen of the touch terminal of the device 1, such as the touch screen of a smartphone or tablet; the icon 52 corresponds to the target operation object, i.e., the attacked object in the game; the icon 53 corresponds to the operation action area, i.e., a circular aiming frame; and the icon 54 identifies the center point of the operation action area, i.e., the aiming center.
  • in FIG. 6, reference numeral 61 denotes the same touch screen of the touch terminal of the device 1 as 51, for example, of a smartphone or tablet; the icon 62 corresponds to the target operation object, i.e., the attacked object in the game; the icon 63 corresponds to the operation action area, i.e., a circular aiming frame; and the icon 64 identifies the center point of the operation action area, i.e., the aiming center.
  • when the attacked object 52 moves to the position shown in FIG. 5, if the distance between that position and the center point 54 is less than or equal to the distance threshold, the device 1 automatically moves the attacked object from the position shown at 52 to the position shown at 62, at which point the attacked object 62 overlaps the aiming frame 63 in the current application.
  • the shooting operation corresponding to the operation action area is then automatically applied to the attacked object 62.
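The FIG. 5 → FIG. 6 transition just described can be sketched as a single auto-lock step; the return structure and names are assumptions, with the object position and aiming center playing the roles of reference numerals 52/62 and 54/64:

```python
import math

def auto_lock_and_fire(object_pos, aim_center, distance_threshold):
    """If the attacked object comes within the threshold of the aiming
    center, snap it onto the center (the scene adjustment of FIG. 6)
    and trigger the shooting operation; otherwise leave it untouched."""
    d = math.hypot(object_pos[0] - aim_center[0],
                   object_pos[1] - aim_center[1])
    if d <= distance_threshold:
        return {"locked": True, "object_pos": aim_center, "fired": True}
    return {"locked": False, "object_pos": object_pos, "fired": False}
```

An object two units from the center with a threshold of five is locked and fired upon; one thirty units away is left for the user's subsequent touch operations.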

Abstract

The object of the present application is to provide a method and device for performing an operation on a touch terminal: based on a user's touch operation in a current application, the relative position of a target operation object and an operation action area in the application is adjusted; if the target operation object overlaps the operation action area, the operation corresponding to the operation action area is applied to the target operation object. Compared with the prior art, the present application automatically triggers the corresponding operation on the target operation object by judging the relative position of the target operation object and the operation action area. Because the whole process of automatic judgment and execution by the device is more efficient and accurate than autonomous judgment and execution by the user, the degree of cooperation in human-computer interaction is higher; especially in application scenarios with high requirements on the judgment of execution conditions and on execution speed, the present application can complete the user's expected operation on the target operation object more precisely and quickly, so that the user experience is optimized.

Description

Method and Device for Performing an Operation on a Touch Terminal

Technical Field
The present application relates to the field of computers, and in particular to a technique for performing an operation on a touch terminal.
Background Art
With the rapid development of computer technology, people have gradually become accustomed to interacting with smart devices by performing touch operations on touch terminals. In the prior art, however, the correspondence between user operation instructions and the responsive execution by the smart device is relatively simple. For example, if, in the current application environment of a smart device, the user has a series of operations to be executed and wishes the smart device to respond, the usual practice is that the user issues one instruction, for example a touch operation, the smart device responds to that touch operation, and the user then decides to issue the next instruction based on the result of the smart device's response. Here, when the user judges and decides to issue the next instruction based on the completion of the previous one, limitations of the user's own reaction speed and judgment accuracy directly affect how the next instruction is completed, e.g., the predetermined execution effect expected by the user may not be achieved. Overall, the user's human-computer interaction experience therefore cannot be better improved and satisfied.
Summary of the Invention
The object of the present application is to provide a method and device for performing an operation on a touch terminal.
According to one aspect of the present application, a method for performing an operation on a touch terminal is provided, comprising:
adjusting, based on a user's touch operation in a current application, the relative position of a target operation object and an operation action area in the application; and
if the target operation object overlaps the operation action area, applying the operation corresponding to the operation action area to the target operation object.
According to another aspect of the present application, a device for performing an operation on a touch terminal is further provided, comprising:
a first means for adjusting, based on a user's touch operation in a current application, the relative position of a target operation object and an operation action area in the application; and
a second means for applying, if the target operation object overlaps the operation action area, the operation corresponding to the operation action area to the target operation object.
Compared with the prior art, the present application adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation in the current application, and when the target operation object overlaps the operation action area, the device performs on the target operation object the operation corresponding to the operation action area. Here, by judging the relative position of the target operation object and the operation action area, the device automatically triggers the corresponding operation on the target operation object. Because the whole process of automatic judgment and execution by the device is more efficient and accurate than autonomous judgment and execution by the user, the degree of cooperation in human-computer interaction is higher; especially in application scenarios with high requirements on the judgment of execution conditions and on execution speed, the present application can complete the user's expected operation on the target operation object more precisely and quickly, so that the user experience is optimized.
Further, during the adjustment of the relative position, if the distance between the target operation object and the operation action area is less than or equal to a distance threshold, the operation action area is superimposed on the target operation object. Here, when a certain threshold condition is satisfied, the device can automatically superimpose the operation action area on the target operation object, thereby locking that target operation object. The device can thus assist and calibrate the user's touch operation through intelligent judgment, improving its smooth execution of the operation instructions issued by the user based on the touch operation.
Brief Description of the Drawings
Other features, objects and advantages of the present application will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:
FIG. 1 shows a schematic diagram of a device for performing an operation on a touch terminal according to one aspect of the present application;
FIG. 2 shows a schematic diagram of a device for performing an operation on a touch terminal according to a preferred embodiment of the present application;
FIG. 3 shows a flow chart of a method for performing an operation on a touch terminal according to another aspect of the present application;
FIG. 4 shows a flow chart of a method for performing an operation on a touch terminal according to a preferred embodiment of the present application;
FIG. 5 shows a schematic diagram of an example of performing an operation on a touch terminal according to another preferred embodiment of the present application;
FIG. 6 shows a schematic diagram of an example of performing an operation on a touch terminal according to yet another preferred embodiment of the present application.
The same or similar reference numerals in the drawings denote the same or similar components.
Detailed Description of the Embodiments
The present application is described in further detail below with reference to the accompanying drawings.
In a typical configuration of the present application, the terminal, the devices of the service network, and the trusted party each include one or more processors (CPUs), an input/output interface, a network interface, and memory.
The memory may include non-permanent storage in a computer-readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
FIG. 1 shows a schematic diagram of a device 1 for performing an operation on a touch terminal according to one aspect of the present application, wherein the device 1 includes a first means 11 and a second means 12.
The first means 11 adjusts, based on a user's touch operation in a current application, the relative position of a target operation object and an operation action area in the application; if the target operation object overlaps the operation action area, the second means 12 applies the operation corresponding to the operation action area to the target operation object.
Specifically, in the present application, the touch terminal includes, but is not limited to, smart terminal devices containing a touch screen, for example, a personal computer with a touch screen, a smartphone, a tablet computer, a game device, etc.; the touch terminal further includes other smart terminal devices to which an external touch-screen device can be connected. Here, the device 1 may be the touch terminal itself, or may be hardware, software, or a combination of hardware and software that is installed on the touch terminal and can implement the performing of the operation.
Here, the first means 11 adjusts, based on the user's touch operation in the current application, the relative position of the target operation object and the operation action area in the application. Preferably, the current application includes, but is not limited to, an application loaded on the touch terminal, such as a game application; further, the current application is displayed on the touch screen of the touch terminal. The target operation object includes the target object presented in the current application of the touch terminal and selected by the user as the target of a subsequent operation. For example, in a game application, the target operation object may be an object that the user (player) wishes to lock onto, or an object to be attacked, in the scene of the current application, for example, various static or moving characters, buildings, signs, etc. The operation action area includes an area highlighted by a range identifier in the current application, or a corresponding area that actually exists even though it is not highlighted. Here, the operation action area may be fixedly displayed in a corresponding area of the current application, for example, fixedly displayed in the central area. In addition, based on an area-moving touch operation by the user, the area where the operation action area is located may also be flexibly adjusted in the current application, e.g., the operation action area is moved correspondingly based on the user's sliding operation in the current application. Here, a corresponding operation can be performed in the operation action area. For example, in the current application of the touch terminal, a distinguishably displayed area represents the operation action area; specifically, in a game application, the operation action area corresponds to a game skill action area, such as an aiming frame that assists the executor in attacking. Here, based on the user's touch operation, the adjustment of the relative position of the target operation object and the operation action area in the application may be implemented by adjusting the position of the target operation object in the current application, by adjusting the position of the operation action area, or by adjusting the positions of both simultaneously. In a preferred scenario, the operation action area is fixedly displayed in the central area of the current application, specifically presented in the central area of the corresponding touch screen. The touch operation includes, but is not limited to, the user's custom operations in the current application; specifically, it may be a gesture operation on the corresponding area of the touch screen of the touch terminal corresponding to the current application, including but not limited to specific operation modes such as a single click, multiple clicks, a long press, a light press, a heavy press, or sliding on the corresponding area of the touch screen. Based on the user's different touch operations, the device 1 will produce different response effects. For example, preferably, in a game application scenario, the scene of the current application is presented based on the user's viewing angle; if a sliding operation on the right touch screen corresponds to a change of the scene viewing angle of the current application, e.g., sliding left shifts the scene view left and sliding right shifts it right, then when the user slides right on the right touch screen of the current application, the scene view shifts right, i.e., the target operation object in the current application scene also moves correspondingly as the viewing angle changes; the position of the target operation object in the application can thus be adjusted based on the touch operation. Further, in the scenario in which the operation action area is fixedly displayed in the central area of the current application, the relative position of the target operation object and the operation action area also changes with the user's specific touch operation; for example, the relative position of the target operation object and the operation action area becomes farther, or closer, or they overlap.
Next, if the target operation object overlaps the operation action area, the second means 12 applies the operation corresponding to the operation action area to the target operation object. Here, if the adjustment makes the relative positions of the target operation object and the operation action area overlap, the device 1 will perform on the target operation object superimposed on the operation action area a corresponding operation, which includes a preset operation corresponding to the operation action area. The overlap may include: the target operation object being completely covered by the operation action area; or the overlapping portion of the target operation object and the operation action area reaching a preset overlap range, e.g., the overlapping portion reaching the maximum of the intersectable area. Taking a game application as an example, if the operation corresponding to the operation action area is set to launch an attack on objects within the operation action area, then when the target operation object overlaps the operation action area, the second means 12 automatically triggers an attack on the target operation object overlapping the operation action area. Here, the operation corresponding to the operation action area can be flexibly customized according to the needs of different application scenarios.
Here, the present application adjusts the relative position of the target operation object and the operation action area in the application based on the user's touch operation in the current application, and when the target operation object overlaps the operation action area, the device 1 performs on the target operation object the operation corresponding to the operation action area. By judging the relative position of the target operation object and the operation action area, the device 1 automatically triggers the corresponding operation on the target operation object. Because the whole process of automatic judgment and execution by the device 1 is more efficient and accurate than autonomous judgment and execution by the user, the degree of cooperation in human-computer interaction is higher; especially in application scenarios with high requirements on the judgment of execution conditions and on execution speed, the present application can complete the user's expected operation on the target operation object more precisely and quickly, so that the user experience is optimized.
Preferably, the device 1 further includes a third means (not shown), which sets the execution state of the application to an automatic execution state according to an automatic-execution operation instruction submitted by the user; wherein, if the target operation object overlaps the operation action area and the execution state of the application is the automatic execution state, the second means 12 applies the operation corresponding to the operation action area to the target operation object.
Specifically, the execution state of the current application, for example the execution state of the operation corresponding to the operation action area in the current application, includes an automatic execution state, a manual execution state, and the like. The execution state may be set by default or may change based on an acquired user instruction. Here, preferably, the device 1 sets the execution state of the application to the automatic execution state based on an automatic-execution operation instruction submitted by the user. Submission of the automatic-execution operation instruction may include the user submitting the instruction through a gesture operation on the current application, e.g., on a touch button on the touch screen, or through operations on preset hardware functions, e.g., the preset functions of a smartphone's power key or volume keys. Taking a game application as an example, if the operation corresponding to the operation action area is to shoot the corresponding operation object, the user can trigger the switch button corresponding to the automatic execution state displayed on the touch screen: sliding it to "on" turns the automatic execution state on, and sliding it to "off" turns it off. Further, if, in the automatic execution state, a scene occurs in which the target operation object overlaps the operation action area, the device 1 will automatically perform the corresponding operation on the target operation object overlapping the operation action area.
图2示出根据本申请一个优选实施例的一种在触摸终端上执行操作的设备1的设备示意图。其中,所述设备1包括第一装置11’、第二装置12’,其中,所述第一装置11’包括第一单元111’和第二单元112’。
其中,所述第一单元111’基于用户在当前应用的触控操作,调整所述 应用中目标操作对象与操作作用区域的相对位置;在调整所述相对位置过程中,若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值,所述第二单元112’将所述操作作用区域叠加于所述目标操作对象;若所述目标操作对象与所述操作作用区域相叠,所述第二装置12’将所述操作作用区域对应的操作作用于所述目标操作对象。在此,所述第二装置12’与图1示出的第二装置12内容相同或基本相同,在此不再赘述,并以引用方式包含于此。
具体地，所述第一单元111’基于用户在当前应用的触控操作，调整所述应用中目标操作对象与操作作用区域的相对位置；进而，在调整所述相对位置过程中，若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值，所述第二单元112’将所述操作作用区域叠加于所述目标操作对象。在此，所述设备1基于用户的所述触控操作对所述目标操作对象与操作作用区域的相对位置进行调整，优选地，若是在所述调整过程中，所述目标操作对象与所述操作作用区域的相对位置的距离小于或等于距离阈值，则自动将所述操作作用区域叠加在所述目标操作对象上。在此，所述距离可以基于需要，灵活地设置为所述目标操作对象与所述操作作用区域的中心点之间的距离；或是，若所述操作作用区域有一定的区域范围，还可以基于所述区域范围的边界确定所述目标操作对象与所述操作作用区域的最短距离、或是最远距离等，以此作为所述距离。在此，当所述距离大于距离阈值时，设置所述操作作用区域并不自动叠加于所述目标操作对象上；而当基于所述用户的所述触控操作调整所述目标操作对象与所述操作作用区域的相对位置达到或小于所述距离阈值时，即使所述操作作用区域与所述目标操作对象不相叠或不相交，所述设备1也会自动将所述操作作用区域叠加于所述目标操作对象。在此，优选地，在所述操作作用区域固定显示在所述当前应用的中心区域的场景下，所述当前应用的场景会自动调整，使得所述目标操作对象与所述操作作用区域自动相叠。
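其中"距离小于或等于距离阈值则自动叠加"的判断，可按中心点之间的欧氏距离示意如下（距离的定义与阈值取值均可按需替换为最短距离、最远距离等，此处仅为说明性草图）：

```python
import math

def center_distance(target_center, area_center):
    """计算目标操作对象与操作作用区域中心点之间的欧氏距离。"""
    dx = target_center[0] - area_center[0]
    dy = target_center[1] - area_center[1]
    return math.hypot(dx, dy)

def snap_area(target_center, area_center, threshold):
    """若距离小于或等于距离阈值，则将操作作用区域移动（叠加）
    到目标操作对象上；否则保持操作作用区域原位。
    返回操作作用区域的新中心点。"""
    if center_distance(target_center, area_center) <= threshold:
        return target_center
    return area_center
```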
在本实施例中，在调整所述相对位置过程中，若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值，将所述操作作用区域叠加于所述目标操作对象。在此，当满足一定的阈值条件时，所述设备1可以自动将所述操作作用区域叠加于所述目标操作对象，从而锁定该目标操作对象。由此，所述设备1可以通过智能判断，辅助、校准用户的所述触控操作，提高所述设备顺利执行用户基于所述触控操作发出的操作指令的成功率。
在一个优选实施例(参考图2)中,所述设备1还包括第四装置(未示出),所述第四装置根据所述目标操作对象的移动信息调整所述操作作用区域,以保持所述操作作用区域叠加于所述目标操作对象。
具体地，所述当前应用中的各个操作对象可能是静止的、也可能是不断运动中的。对于满足所述距离阈值、自动与所述操作作用区域叠加的所述目标操作对象，当其移动时，基于其在当前应用中、具体到在所述触摸屏上的移动信息调整所述操作作用区域，以保证所述操作作用区域与所述目标操作对象的叠加状态相对不变，即保持所述目标操作对象被所述操作作用区域持续锁定。
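保持叠加（持续锁定）可理解为：随目标操作对象逐帧的移动信息同步更新操作作用区域的位置，如下草图所示（以逐帧位移增量表示移动信息，为本文示意性假设）：

```python
def track_target(area_center, target_moves):
    """根据目标操作对象逐帧的移动信息（位移增量序列）调整操作作用区域，
    使操作作用区域始终叠加于目标操作对象。
    返回操作作用区域每一帧的中心点位置序列。"""
    positions = [area_center]
    x, y = area_center
    for dx, dy in target_moves:
        x, y = x + dx, y + dy  # 区域随目标同步移动
        positions.append((x, y))
    return positions
```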
基于此,优选地,在调整所述操作作用区域的过程中,所述第四装置还通过校准所述应用的场景视角使得所述操作作用区域位于所述应用的中心区域。
具体地，在此，为了给所述当前应用的用户带来更好的视觉体验，所述触摸终端对应的当前应用可以是基于所述用户视角、并呈现在所述触摸屏上的，相对应的一个优选场景是，所述操作作用区域固定显示在所述当前应用的中心区域，例如具体显示在对应的触摸屏的中心区域。为了保证所述操作作用区域始终固定位于所述应用的中心区域，可以通过校准所述应用的场景视角的方法，调整所述应用的场景，以达到所述操作作用区域与所述应用中的其他场景相对移动的视觉效果。在此，进一步，当所述应用中与所述操作作用区域叠加的所述目标操作对象移动时，即可以通过所述场景视角的移动，确保所述操作作用区域与目标操作对象始终叠加，且所述操作作用区域持续位于所述应用的中心区域。
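"通过校准场景视角使操作作用区域位于中心区域"可理解为：移动的是场景摄像机而非屏幕上固定居中的作用区域。以下是一个不含缩放与旋转的简化坐标换算草图（函数与坐标约定均为本文假设）：

```python
def recenter_camera(camera, target_world_pos):
    """校准场景视角：将摄像机对准目标在世界坐标中的位置，
    使固定显示于屏幕中心的操作作用区域与目标保持叠加。"""
    return (target_world_pos[0], target_world_pos[1])

def world_to_screen(world_pos, camera, screen_center):
    """世界坐标到屏幕坐标的简化换算：
    摄像机所对准的世界坐标点映射到屏幕中心。"""
    return (world_pos[0] - camera[0] + screen_center[0],
            world_pos[1] - camera[1] + screen_center[1])
```

按此换算，摄像机校准后目标恰好出现在屏幕中心，即与固定居中的操作作用区域重合。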
在一个优选实施例(参考图2)中，所述第二单元112’在调整所述相对位置过程中，若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值，将所述操作作用区域叠加于所述多个目标操作对象中至少一个。
具体地,在一个优选场景下,同一时刻,在所述当前应用中,满足与所述操作作用区域的距离小于或等于距离阈值这一条件的所述目标操作对象可能是一个或是多个,在此,可以基于具体的应用场景,例如,基于所述操作作用区域对应的操作的特点,设置将所述操作作用区域叠加于一个、或是同时叠加于多个所述目标操作对象上。进一步,后续可以选择将所述操作作用区域对应的操作相应地作用于一个或是多个所述目标操作对象上。例如,以游戏应用为例,所述操作作用区域是游戏技能作用区域,如射击技能的瞄准框,所述操作作用区域对应的操作是可以向所述瞄准框中的多个对象同时发出多次射击,则若是同时有多个目标操作对象满足所述距离小于或等于所述距离阈值的条件,则可以将所述操作作用区域同时自动叠加于上述多个所述目标操作对象。
优选地,所述第二单元112’在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述多个目标操作对象中与所述操作作用区域距离最近的目标操作对象。
具体地，同一时刻，在所述当前应用中，满足与所述操作作用区域的距离小于或等于距离阈值这一条件的所述目标操作对象可能是一个或是多个，此时可以以就近原则，将所述操作作用区域叠加于所述多个目标操作对象中与所述操作作用区域距离最近的目标操作对象。在此，所述距离可以基于需要，灵活地设置为所述目标操作对象与所述操作作用区域的中心点之间的距离，或是所述目标操作对象与所述操作作用区域的最短距离、或是最远距离等。例如，在游戏应用场景中，所述操作作用区域是游戏技能作用区域，如射击技能的瞄准框，优选地，以所述被攻击的目标操作对象与所述瞄准框的中心点的距离为所述距离的测量标准，选择离该瞄准框中心点最近的目标操作对象为所叠加的目标操作对象。
优选地，所述第二单元112’在调整所述相对位置过程中，若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值，将所述操作作用区域叠加于所述多个目标操作对象中对象等级最高的目标操作对象。
具体地，同一时刻，在所述当前应用中，满足与所述操作作用区域的距离小于或等于距离阈值这一条件的所述目标操作对象可能是一个或是多个，此时，还可以基于各个目标操作对象的属性特征确定最终被叠加的所述目标操作对象。优选地，可以确定所述多个目标操作对象中对象等级最高的目标操作对象。在不同的应用场景中，所述对象等级最高的具体含义会有不同。以一个游戏应用为例，所述操作作用区域是游戏技能作用区域，如射击技能的瞄准框，所述操作作用区域对应的操作是可以向所述瞄准框中场景对象发出射击，所述目标操作对象包括被射击的游戏人物、建筑，或是其他道具等。具体地，此时所述对象等级最高的目标操作对象，即可以是基于对象的难攻克程度设定的。此外，在其他应用中，还可以基于所述目标操作对象的大小、体积、颜色、形状等属性确定最终被叠加的目标操作对象。
在本实施例中，所述设备1基于不同的预设条件来确定实际与所述操作作用区域所叠加的目标操作对象，其中，包括基于与所述操作作用区域距离最近的条件确定所述目标操作对象、和基于所述对象等级最高的条件确定目标操作对象。在此，各种确定方法可以单独适用，也可以配合适用，例如，优选地，先基于与所述操作作用区域距离最近的条件确定所述目标操作对象，若是存在两个以上距离条件一致的目标操作对象，则可以进一步基于所述对象等级最高的标准确定被所述操作作用区域叠加的所述目标操作对象。
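上述"就近优先、等级其次"的组合选择策略可示意如下（candidates 中对象的 pos、level 字段及其来源均为本文示意性假设）：

```python
import math

def pick_target(candidates, area_center, threshold):
    """在与操作作用区域的距离小于或等于距离阈值的多个目标操作对象中，
    先按距离最近选择；若距离相同，再按对象等级最高选择。
    无符合条件的对象时返回 None（维持当前场景）。"""
    eligible = []
    for obj in candidates:  # obj 形如 {"pos": (x, y), "level": int}
        d = math.hypot(obj["pos"][0] - area_center[0],
                       obj["pos"][1] - area_center[1])
        if d <= threshold:
            # 等级取负，使排序时"距离小、等级高"者排在最前
            eligible.append((d, -obj["level"], obj))
    if not eligible:
        return None
    eligible.sort(key=lambda t: (t[0], t[1]))
    return eligible[0][2]
```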
在此,本领域技术人员应该能理解上述各种确定与所述操作作用区域叠加的目标操作对象的方法仅为举例,现有或今后可能出现的其他确定被所述操作作用区域叠加的所述目标操作对象的方法如适用于本申请的,也应包含在本申请保护范围以内,并以引用方式包含于此。
在一个优选实施例(参考图2)中,所述第一装置11’还包括第三单元(未示出),所述第三单元在调整所述相对位置过程中,若所述目标操作对象的状态为不可见,根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
具体地，所述第二单元112’会在所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值时，将所述操作作用区域叠加于所述目标操作对象，继而，该目标操作对象被该操作作用区域锁定，以游戏应用为例，即被攻击对象被瞄准框锁定。此时，进一步，当与所述操作作用区域叠加的目标操作对象的状态为不可见时，优选地，所述设备1将自动根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。在此，原有的目标操作对象的不可见状态包括在所述当前应用中，所述目标操作对象无法被所述用户看见或识别，例如，所述目标操作对象在移动过程中被所述场景中的道具遮挡，或是被其他操作对象遮挡；或是基于某些触发设置，所述目标操作对象从所述当前应用中消失等。继而，所述设备1会放弃所确定的与所述操作作用区域叠加的所述目标操作对象，重新基于所述距离阈值的判断条件，选择新的目标操作对象，并自动将所述操作作用区域与重新确定的所述目标操作对象叠加。若是没有符合所述距离阈值的判断条件的操作对象，则优选地，会维持当前场景，并等待所述用户的后续操作。
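目标不可见时的重新确定可示意为：当前目标仍可见则维持锁定，否则剔除不可见对象后按距离阈值重新选择（visible 字段为本文示意性假设）：

```python
import math

def retarget(current, candidates, area_center, threshold):
    """若当前锁定的目标操作对象状态为不可见，
    则根据当前场景中各操作对象与操作作用区域的距离重新确定目标；
    若无符合距离阈值条件的对象，返回 None（维持当前场景）。"""
    if current is not None and current.get("visible", True):
        return current  # 当前目标仍可见，维持锁定
    best, best_d = None, None
    for obj in candidates:
        if not obj.get("visible", True):
            continue  # 不可见对象不参与重新确定
        d = math.hypot(obj["pos"][0] - area_center[0],
                       obj["pos"][1] - area_center[1])
        if d <= threshold and (best_d is None or d < best_d):
            best, best_d = obj, d
    return best
```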
在一个优选实施例(参考图1)中,所述设备1还包括第五装置(未示出),若所述操作执行完成,所述第五装置根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
具体地，在此，所述操作执行完成包括：在所述目标操作对象与所述操作作用区域相叠后，设备1已将所述操作作用区域对应的操作作用于所述目标操作对象。以游戏应用为例，即所述瞄准框中的被攻击对象被射击中。在其他应用中，则相应地理解为，需要对所述目标操作对象执行的任务操作已结束。此时，所述设备1会放弃所确定的与所述操作作用区域叠加的所述目标操作对象，所述第五装置会重新基于所述距离阈值的判断条件，选择新的目标操作对象，并自动将所述操作作用区域与重新确定的所述目标操作对象叠加。此外，若是所述目标操作对象对应的所述操作被所述用户以外的其他用户抢先执行完成，所述设备1也会根据所述应用的当前场景中操作对象与所述操作作用区域的距离为所述用户重新确定所述目标操作对象。若是没有符合所述距离阈值的判断条件的操作对象，则优选地，会维持当前场景，并等待所述用户的后续操作。
图3示出根据本申请另一个方面的一种在触摸终端上执行操作的方法流程图。其中,所述方法包括步骤S31和步骤S32。
其中,在步骤S31中,所述设备1基于用户在当前应用的触控操作,调整所述应用中目标操作对象与操作作用区域的相对位置;在步骤S32中,若所述目标操作对象与所述操作作用区域相叠,所述设备1将所述操作作用区域对应的操作作用于所述目标操作对象。
具体地，本申请中，所述触摸终端包括但不限于含有触摸屏的智能终端设备，例如，有触摸屏的个人计算机、智能手机、平板电脑、游戏设备等，所述触摸终端还包括可以外接触屏设备的其他智能终端设备。在此，所述设备1可以是触摸终端本身，还可以是安装在所述触摸终端上、并能实现所述执行操作的硬件、软件或软硬件的结合。
在此，在步骤S31中，所述设备1基于用户在当前应用的触控操作，调整所述应用中目标操作对象与操作作用区域的相对位置。在此，优选地，所述当前应用包括但不限于加载于所述触摸终端上的应用程序，如游戏应用，进一步，所述当前应用在所述触摸终端的触摸屏上显示。所述目标操作对象包括所述触摸终端的当前应用中所呈现的、并为所述用户选择的、作为后续执行操作对应的目标对象。例如，在游戏应用中，所述目标操作对象可以是当前应用的场景中所述用户玩家希望锁定的对象，或是希望攻击的对象，例如，各种静止或是移动的人物、建筑、标识等。所述操作作用区域包括在所述当前应用中通过范围标识突出显示的区域；或是虽不突出显示，但实际存在的对应区域。在此，所述操作作用区域可以固定显示于所述当前应用的相应区域，例如，固定显示于中心区域。此外，还可以基于用户的区域移动触控操作，在当前应用中灵活调整所述操作作用区域的所在区域，如基于用户在当前应用的滑动操作，相应地移动所述操作作用区域。在此，可以在所述操作作用区域执行对应的操作。例如，在所述触摸终端的当前应用中，以一个可区分显示的区域代表所述操作作用区域，具体地，如在一个游戏应用中，所述操作作用区域对应于游戏技能作用区域，如辅助执行者进行攻击的瞄准框等。在此，可以基于用户的触控操作，通过调整所述当前应用中目标操作对象的位置、或是通过调整所述操作作用区域的位置、或是同时调整所述目标操作对象和操作作用区域的位置，来实现所述应用中目标操作对象与操作作用区域的相对位置的调整。在此，在一个优选场景中，所述操作作用区域固定显示在所述当前应用的中心区域，具体呈现在对应的触摸屏中心区域。所述触控操作包括但不限于用户在所述当前应用的自定义操作，具体地，可以是用户在当前应用对应的触控终端的触摸屏上相应区域的手势操作，例如包括但不限于单击、多次点击、长按、轻按、重按或滑动触摸屏上对应区域等具体操作方式。基于用户的不同触控操作，所述设备1将对应不同的响应效果。例如，优选地，在游戏应用场景下，所述当前应用的场景是基于所述用户的视野角度呈现的，若在右触摸屏上的滑动操作对应的是所述当前应用的场景视角的改变，如向左滑动，场景视角左移；向右滑动，场景视角右移，则当所述用户在当前应用的右触摸屏上向右滑动时，所述场景视角右移，当前应用场景中的目标操作对象也会随着场景视角的改变而相应移动。由此，可以基于所述触控操作，调整所述应用中目标操作对象的位置。进一步，在所述操作作用区域固定显示在所述当前应用的中心区域的场景下，所述目标操作对象与所述操作作用区域的相对位置也会随着所述用户的具体触控操作而变化，例如，所述目标操作对象与所述操作作用区域的相对位置变远、变近，或是重叠。
接着，在步骤S32中，所述设备1若所述目标操作对象与所述操作作用区域相叠，将所述操作作用区域对应的操作作用于所述目标操作对象。在此，若是基于所述调整使得所述目标操作对象与所述操作作用区域的相对位置重叠，则所述设备1将对重叠在所述操作作用区域的目标操作对象执行相应的操作，该操作包括预先设置的、与所述操作作用区域对应的操作。在此，所述重叠可以包括：所述目标操作对象被所述操作作用区域完全覆盖；或是所述目标操作对象与所述操作作用区域的叠加部分达到预设的叠加范围，如叠加部分达到可相交区域的最大值。以游戏应用为例，若设置所述操作作用区域对应的操作是向所述操作作用区域内的对象发起攻击，当所述目标操作对象与所述操作作用区域相叠时，则所述设备1自动触发向与所述操作作用区域相叠的目标操作对象发起进攻。在此，所述操作作用区域对应的操作会基于不同应用场景的需要，而灵活自定义设置。
在此，本申请基于用户在当前应用的触控操作，调整所述应用中目标操作对象与操作作用区域的相对位置，当所述目标操作对象与操作作用区域相叠时，所述设备1将对所述目标操作对象执行所述操作作用区域对应的操作。在此，所述设备1通过对所述目标操作对象与操作作用区域相对位置的判断，自动触发执行对所述目标操作对象的相应操作。由于设备1自动判断执行的整个过程比用户自主判断执行更为高效和准确，使得人机交互的配合度更高，特别是在对所述操作的执行条件判断和执行速度有较高要求的应用场景中，本申请能够更精准和快速地完成用户对所述目标操作对象的预期操作，使得用户体验得到优化。
优选地,所述方法还包括步骤S33(未示出),在步骤S33中,所述设备1根据用户提交的自动执行操作指令设置所述应用的执行状态为自动执行状态;其中,在步骤S32中,若所述目标操作对象与所述操作作用区域相叠,且所述应用的执行状态为自动执行状态,所述设备1将所述操作作用区域对应的操作作用于所述目标操作对象。
具体地，所述当前应用的执行状态，例如，所述当前应用中所述操作作用区域对应的操作的执行状态包括自动执行状态、或是手动执行状态等。所述执行状态可以是设置为默认，或是基于获取的用户指令而变化。在此，优选地，所述设备1会基于用户提交的自动执行操作指令，将所述应用的执行状态设置为自动执行状态。所述自动执行操作指令的提交可以包括用户基于对当前应用中、例如触摸屏上的触控按钮的手势操作，实现指令的提交；或是基于对硬件预设功能，例如对智能手机的开关键、音量键的预置功能的操作实现所述指令的提交。以游戏应用为例，若所述操作作用区域对应的操作为射击相应的操作对象，则用户可以通过触发所述触摸屏上显示的自动执行状态对应的开关按钮：若是滑动到开，则对应自动执行状态开启；若是滑动到关，则对应自动执行状态被关闭。进一步，若是在所述自动执行状态下，出现所述目标操作对象与所述操作作用区域相叠的场景，则所述设备1将对与所述操作作用区域相叠的目标操作对象自动执行所述对应的操作。
图4示出根据本申请一个优选实施例的一种在触摸终端上执行操作的方法流程图。其中,所述方法包括步骤S41和步骤S42,其中,所述步骤S41还包括步骤S411和步骤S412。
其中，在步骤S411中，所述设备1基于用户在当前应用的触控操作，调整所述应用中目标操作对象与操作作用区域的相对位置；在步骤S412中，在调整所述相对位置过程中，若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值，所述设备1将所述操作作用区域叠加于所述目标操作对象；在步骤S42中，若所述目标操作对象与所述操作作用区域相叠，所述设备1将所述操作作用区域对应的操作作用于所述目标操作对象。在此，所述步骤S42与图3示出的步骤S32内容相同或基本相同，在此不再赘述，并以引用方式包含于此。
具体地，在步骤S411中，所述设备1基于用户在当前应用的触控操作，调整所述应用中目标操作对象与操作作用区域的相对位置；进而，在步骤S412中，在调整所述相对位置过程中，若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值，所述设备1将所述操作作用区域叠加于所述目标操作对象。在此，所述设备1基于用户的所述触控操作对所述目标操作对象与操作作用区域的相对位置进行调整，优选地，若是在所述调整过程中，所述目标操作对象与所述操作作用区域的相对位置的距离小于或等于距离阈值，则自动将所述操作作用区域叠加在所述目标操作对象上。在此，所述距离可以基于需要，灵活地设置为所述目标操作对象与所述操作作用区域的中心点之间的距离；或是，若所述操作作用区域有一定的区域范围，还可以基于所述区域范围的边界确定所述目标操作对象与所述操作作用区域的最短距离、或是最远距离等，以此作为所述距离。在此，当所述距离大于距离阈值时，设置所述操作作用区域并不自动叠加于所述目标操作对象上；而当基于所述用户的所述触控操作调整所述目标操作对象与所述操作作用区域的相对位置达到或小于所述距离阈值时，即使所述操作作用区域与所述目标操作对象不相叠或不相交，所述设备1也会自动将所述操作作用区域叠加于所述目标操作对象。在此，优选地，在所述操作作用区域固定显示在所述当前应用的中心区域的场景下，所述当前应用的场景会自动调整，使得所述目标操作对象与所述操作作用区域自动相叠。
在本实施例中，在调整所述相对位置过程中，若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值，将所述操作作用区域叠加于所述目标操作对象。在此，当满足一定的阈值条件时，所述设备1可以自动将所述操作作用区域叠加于所述目标操作对象，从而锁定该目标操作对象。由此，所述设备1可以通过智能判断，辅助、校准用户的所述触控操作，提高所述设备顺利执行用户基于所述触控操作发出的操作指令的成功率。
在一个优选实施例(参考图4)中,所述方法还包括步骤S44(未示出),在步骤S44中,所述设备1根据所述目标操作对象的移动信息调整所述操作作用区域,以保持所述操作作用区域叠加于所述目标操作对象。
具体地，所述当前应用中的各个操作对象可能是静止的、也可能是不断运动中的。对于满足所述距离阈值、自动与所述操作作用区域叠加的所述目标操作对象，当其移动时，基于其在当前应用中、具体到在所述触摸屏上的移动信息调整所述操作作用区域，以保证所述操作作用区域与所述目标操作对象的叠加状态相对不变，即保持所述目标操作对象被所述操作作用区域持续锁定。
基于此,优选地,在步骤S44中,在调整所述操作作用区域的过程中,所述设备1还通过校准所述应用的场景视角使得所述操作作用区域位于所述应用的中心区域。
具体地，在此，为了给所述当前应用的用户带来更好的视觉体验，所述触摸终端对应的当前应用可以是基于所述用户视角、并呈现在所述触摸屏上的，相对应的一个优选场景是，所述操作作用区域固定显示在所述当前应用的中心区域，例如具体显示在对应的触摸屏的中心区域。为了保证所述操作作用区域始终固定位于所述应用的中心区域，可以通过校准所述应用的场景视角的方法，调整所述应用的场景，以达到所述操作作用区域与所述应用中的其他场景相对移动的视觉效果。在此，进一步，当所述应用中与所述操作作用区域叠加的所述目标操作对象移动时，即可以通过所述场景视角的移动，确保所述操作作用区域与目标操作对象始终叠加，且所述操作作用区域持续位于所述应用的中心区域。
在一个优选实施例(参考图4)中,在步骤S412中,在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,所述设备1将所述操作作用区域叠加于所述多个目标操作对象中至少一个。
具体地,在一个优选场景下,同一时刻,在所述当前应用中,满足与所述操作作用区域的距离小于或等于距离阈值这一条件的所述目标操作对象可能是一个或是多个,在此,可以基于具体的应用场景,例如,基于所述操作作用区域对应的操作的特点,设置将所述操作作用区域叠加于一个、或是同时叠加于多个所述目标操作对象上。进一步,后续可以选择将所述操作作用区域对应的操作相应地作用于一个或是多个所述目标操作对象上。例如,以游戏应用为例,所述操作作用区域是游戏技能作用区域,如射击技能的瞄准框,所述操作作用区域对应的操作是可以向所述瞄准框中的多个对象同时发出多次射击,则若是同时有多个目标操作对象满足所述距离小于或等于所述距离阈值的条件,则可以将所述操作作用区域同时自动叠加于上述多个所述目标操作对象。
优选地,在步骤S412中,在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,所述设备1将所述操作作用区域叠加于所述多个目标操作对象中与所述操作作用区域距离最近的目标操作对象。
具体地，同一时刻，在所述当前应用中，满足与所述操作作用区域的距离小于或等于距离阈值这一条件的所述目标操作对象可能是一个或是多个，此时可以以就近原则，将所述操作作用区域叠加于所述多个目标操作对象中与所述操作作用区域距离最近的目标操作对象。在此，所述距离可以基于需要，灵活地设置为所述目标操作对象与所述操作作用区域的中心点之间的距离，或是所述目标操作对象与所述操作作用区域的最短距离、或是最远距离等。例如，在游戏应用场景中，所述操作作用区域是游戏技能作用区域，如射击技能的瞄准框，优选地，以所述被攻击的目标操作对象与所述瞄准框的中心点的距离为所述距离的测量标准，选择离该瞄准框中心点最近的目标操作对象为所叠加的目标操作对象。
优选地,在步骤S412中,在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,所述设备1将所述操作作用区域叠加于所述多个目标操作对象中对象等级最高的目标操作对象。
具体地，同一时刻，在所述当前应用中，满足与所述操作作用区域的距离小于或等于距离阈值这一条件的所述目标操作对象可能是一个或是多个，此时，还可以基于各个目标操作对象的属性特征确定最终被叠加的所述目标操作对象。优选地，可以确定所述多个目标操作对象中对象等级最高的目标操作对象。在不同的应用场景中，所述对象等级最高的具体含义会有不同。以一个游戏应用为例，所述操作作用区域是游戏技能作用区域，如射击技能的瞄准框，所述操作作用区域对应的操作是可以向所述瞄准框中场景对象发出射击，所述目标操作对象包括被射击的游戏人物、建筑，或是其他道具等。具体地，此时所述对象等级最高的目标操作对象，即可以是基于对象的难攻克程度设定的。此外，在其他应用中，还可以基于所述目标操作对象的大小、体积、颜色、形状等属性确定最终被叠加的目标操作对象。
在本实施例中，所述设备1基于不同的预设条件来确定实际与所述操作作用区域所叠加的目标操作对象，其中，包括基于与所述操作作用区域距离最近的条件确定所述目标操作对象、和基于所述对象等级最高的条件确定目标操作对象。在此，各种确定方法可以单独适用，也可以配合适用，例如，优选地，先基于与所述操作作用区域距离最近的条件确定所述目标操作对象，若是存在两个以上距离条件一致的目标操作对象，则可以进一步基于所述对象等级最高的标准确定被所述操作作用区域叠加的所述目标操作对象。
在此，本领域技术人员应该能理解上述各种确定与所述操作作用区域叠加的目标操作对象的方法仅为举例，现有或今后可能出现的其他确定被所述操作作用区域叠加的所述目标操作对象的方法如适用于本申请的，也应包含在本申请保护范围以内，并以引用方式包含于此。
在一个优选实施例(参考图4)中,所述步骤S41还包括步骤S413(未示出),在步骤S413中,在调整所述相对位置过程中,若所述目标操作对象的状态为不可见,所述设备1根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
具体地，在步骤S412中，所述设备1会在所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值时，将所述操作作用区域叠加于所述目标操作对象，继而，该目标操作对象被该操作作用区域锁定，以游戏应用为例，即被攻击对象被瞄准框锁定。此时，进一步，当与所述操作作用区域叠加的目标操作对象的状态为不可见时，优选地，所述设备1将自动根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。在此，原有的目标操作对象的不可见状态包括在所述当前应用中，所述目标操作对象无法被所述用户看见或识别，例如，所述目标操作对象在移动过程中被所述场景中的道具遮挡，或是被其他操作对象遮挡；或是基于某些触发设置，所述目标操作对象从所述当前应用中消失等。继而，所述设备1会放弃所确定的与所述操作作用区域叠加的所述目标操作对象，重新基于所述距离阈值的判断条件，选择新的目标操作对象，并自动将所述操作作用区域与重新确定的所述目标操作对象叠加。若是没有符合所述距离阈值的判断条件的操作对象，则优选地，会维持当前场景，并等待所述用户的后续操作。
在一个优选实施例(参考图3)中,所述方法还包括步骤S45(未示出),在步骤S45中,若所述操作执行完成,所述设备1根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
具体地，在此，所述操作执行完成包括：在所述目标操作对象与所述操作作用区域相叠后，设备1已将所述操作作用区域对应的操作作用于所述目标操作对象。以游戏应用为例，即所述瞄准框中的被攻击对象被射击中。在其他应用中，则相应地理解为，需要对所述目标操作对象执行的任务操作已结束。此时，所述设备1会放弃所确定的与所述操作作用区域叠加的所述目标操作对象，所述设备1会重新基于所述距离阈值的判断条件，选择新的目标操作对象，并自动将所述操作作用区域与重新确定的所述目标操作对象叠加。此外，若是所述目标操作对象对应的所述操作被所述用户以外的其他用户抢先执行完成，所述设备1也会根据所述应用的当前场景中操作对象与所述操作作用区域的距离为所述用户重新确定所述目标操作对象。若是没有符合所述距离阈值的判断条件的操作对象，则优选地，会维持当前场景，并等待所述用户的后续操作。
图5示出根据本申请另一个优选实施例的一种在触摸终端上执行操作的实例示意图,图6示出根据本申请又一个优选实施例的一种在触摸终端上执行操作的实例示意图。
具体地，图5示出了以游戏应用为实例的本申请的一个示意图。图示51示出了一个所述设备1的触控终端的触屏，例如智能手机或平板电脑的触摸屏；图示52对应于所述目标操作对象，即游戏中的被攻击对象；图示53对应为所述操作作用区域，即圆形的瞄准框；图示54标识出所述操作作用区域的中心点，即瞄准中心。图示61示出了与51相同的一个所述设备1的触控终端的触屏，例如智能手机或平板电脑的触摸屏；图示62对应于所述目标操作对象，即游戏中的被攻击对象；图示63对应为所述操作作用区域，即圆形的瞄准框；图示64标识出所述操作作用区域的中心点，即瞄准中心。当基于所述用户在当前应用的调整操作，所述被攻击对象52移动到图5所示的当前位置时，若此位置与所述中心点54之间的距离小于或等于所述距离阈值，则所述设备1将自动使所述被攻击对象由所述图示52示出的位置移动到所述图示62示出的位置，此时，所述被攻击对象62与当前应用的所述瞄准框63重叠。进而，所述操作作用区域对应的操作，此时为射击操作，即自动作用于所述被攻击对象62。
对于本领域技术人员而言，显然本申请不限于上述示范性实施例的细节，而且在不背离本申请的精神或基本特征的情况下，能够以其他的具体形式实现本申请。因此，无论从哪一点来看，均应将实施例看作是示范性的，而且是非限制性的，本申请的范围由所附权利要求而不是上述说明限定，因此旨在将落在权利要求的等同要件的含义和范围内的所有变化涵括在本申请内。不应将权利要求中的任何附图标记视为限制所涉及的权利要求。此外，显然"包括"一词不排除其他单元或步骤，单数不排除复数。装置权利要求中陈述的多个单元或装置也可以由一个单元或装置通过软件或者硬件来实现。第一，第二等词语用来表示名称，而并不表示任何特定的顺序。

Claims (20)

  1. 一种在触摸终端上执行操作的方法,其中,所述方法包括:
    基于用户在当前应用的触控操作,调整所述应用中目标操作对象与操作作用区域的相对位置;
    若所述目标操作对象与所述操作作用区域相叠,将所述操作作用区域对应的操作作用于所述目标操作对象。
  2. 根据权利要求1所述的方法,其中,所述方法还包括:
    根据用户提交的自动执行操作指令设置所述应用的执行状态为自动执行状态;
    其中,所述若所述目标操作对象与所述操作作用区域相叠,将所述操作作用区域对应的操作作用于所述目标操作对象包括:
    若所述目标操作对象与所述操作作用区域相叠,且所述应用的执行状态为自动执行状态,将所述操作作用区域对应的操作作用于所述目标操作对象。
  3. 根据权利要求1或2所述的方法,其中,所述基于用户在当前应用的触控操作,调整所述应用中目标操作对象与操作作用区域的相对位置包括:
    基于用户在当前应用的触控操作,调整所述应用中目标操作对象与操作作用区域的相对位置;
    在调整所述相对位置过程中,若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述目标操作对象。
  4. 根据权利要求3所述的方法,其中,所述方法还包括:
    根据所述目标操作对象的移动信息调整所述操作作用区域,以保持所述操作作用区域叠加于所述目标操作对象。
  5. 根据权利要求4所述的方法,其中,所述根据所述目标操作对象的移动信息调整所述操作作用区域,以保持所述操作作用区域叠加于所述目标操作对象还包括:
    在调整所述操作作用区域的过程中,通过校准所述应用的场景视角使得所述操作作用区域位于所述应用的中心区域。
  6. 根据权利要求3中所述的方法,其中,所述在调整所述相对位置过程中,若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述目标操作对象包括:
    在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述多个目标操作对象中至少一个。
  7. 根据权利要求6中所述的方法,其中,所述在调整所述相对位置过程中,若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述目标操作对象包括:
    在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述多个目标操作对象中与所述操作作用区域距离最近的目标操作对象。
  8. 根据权利要求6中所述的方法,其中,所述在调整所述相对位置过程中,若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述目标操作对象包括:
    在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述多个目标操作对象中对象等级最高的目标操作对象。
  9. 根据权利要求3所述的方法,其中,所述基于用户在当前应用的触控操作,调整所述应用中目标操作对象与操作作用区域的相对位置还包括:
    在调整所述相对位置过程中,若所述目标操作对象的状态为不可见,根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
  10. 根据权利要求1所述的方法,其中,所述方法还包括:
    若所述操作执行完成,根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
  11. 一种在触摸终端上执行操作的设备,其中,所述设备包括:
    第一装置，用于基于用户在当前应用的触控操作，调整所述应用中目标操作对象与操作作用区域的相对位置；
    第二装置,用于若所述目标操作对象与所述操作作用区域相叠,将所述操作作用区域对应的操作作用于所述目标操作对象。
  12. 根据权利要求11所述的设备,其中,所述设备还包括:
    第三装置,用于根据用户提交的自动执行操作指令设置所述应用的执行状态为自动执行状态;
    其中,所述第二装置用于:
    若所述目标操作对象与所述操作作用区域相叠,且所述应用的执行状态为自动执行状态,将所述操作作用区域对应的操作作用于所述目标操作对象。
  13. 根据权利要求11或12所述的设备,其中,所述第一装置包括:
    第一单元,用于基于用户在当前应用的触控操作,调整所述应用中目标操作对象与操作作用区域的相对位置;
    第二单元,用于在调整所述相对位置过程中,若所述目标操作对象与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述目标操作对象。
  14. 根据权利要求13所述的设备,其中,所述设备还包括:
    第四装置,用于根据所述目标操作对象的移动信息调整所述操作作用区域,以保持所述操作作用区域叠加于所述目标操作对象。
  15. 根据权利要求14所述的设备,其中,所述第四装置还用于:
    在调整所述操作作用区域的过程中,通过校准所述应用的场景视角使得所述操作作用区域位于所述应用的中心区域。
  16. 根据权利要求13中所述的设备,其中,所述第二单元用于:
    在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述多个目标操作对象中至少一个。
  17. 根据权利要求16中所述的设备,其中,所述第二单元用于:
    在调整所述相对位置过程中，若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值，将所述操作作用区域叠加于所述多个目标操作对象中与所述操作作用区域距离最近的目标操作对象。
  18. 根据权利要求16中所述的设备,其中,所述第二单元用于:
    在调整所述相对位置过程中,若多个所述目标操作对象中每一个与所述操作作用区域的距离小于或等于距离阈值,将所述操作作用区域叠加于所述多个目标操作对象中对象等级最高的目标操作对象。
  19. 根据权利要求13所述的设备,其中,所述第一装置还包括:
    第三单元,用于在调整所述相对位置过程中,若所述目标操作对象的状态为不可见,根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
  20. 根据权利要求11所述的设备,其中,所述设备还包括:
    第五装置,用于若所述操作执行完成,根据所述应用的当前场景中操作对象与所述操作作用区域的距离重新确定所述目标操作对象。
PCT/CN2017/080391 2016-04-19 2017-04-13 一种在触摸终端上执行操作的方法与设备 WO2017181903A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610243518.7A CN105930081A (zh) 2016-04-19 2016-04-19 一种在触摸终端上执行操作的方法与设备
CN201610243518.7 2016-04-19

Publications (1)

Publication Number Publication Date
WO2017181903A1 (zh) 2017-10-26

Family

ID=56839230

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/080391 WO2017181903A1 (zh) 2016-04-19 2017-04-13 一种在触摸终端上执行操作的方法与设备

Country Status (2)

Country Link
CN (1) CN105930081A (zh)
WO (1) WO2017181903A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105930081A (zh) * 2016-04-19 2016-09-07 上海逗屋网络科技有限公司 一种在触摸终端上执行操作的方法与设备
CN108920069B (zh) * 2018-06-13 2020-10-23 网易(杭州)网络有限公司 一种触控操作方法、装置、移动终端和存储介质
CN110170168B (zh) * 2019-05-30 2022-05-27 腾讯科技(深圳)有限公司 虚拟对象射击控制方法、装置、电子设备及存储介质
CN111176525B (zh) * 2019-12-25 2022-05-31 联想(北京)有限公司 一种操作区域提示方法、电子设备及存储介质
CN111298437A (zh) * 2020-02-11 2020-06-19 腾讯科技(深圳)有限公司 虚拟攻击道具的控制方法及装置
CN111672119B (zh) * 2020-06-05 2023-03-10 腾讯科技(深圳)有限公司 瞄准虚拟对象的方法、装置、设备及介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2383027A2 (en) * 2010-04-28 2011-11-02 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) User interface processing apparatus, method of processing user interface, and non-transitory computer-readable medium embodying computer program for processing user interface
CN103252087A (zh) * 2012-02-20 2013-08-21 富立业资讯有限公司 具有触控面板媒体的游戏控制方法及该游戏媒体
CN104750416A (zh) * 2015-03-13 2015-07-01 上海雪宝信息科技有限公司 一种用于在触摸终端上执行对象操作的方法与设备
CN104750419A (zh) * 2015-04-07 2015-07-01 上海雪宝信息科技有限公司 一种用于在触摸终端上操作对象的方法与设备
CN105148517A (zh) * 2015-09-29 2015-12-16 腾讯科技(深圳)有限公司 一种信息处理方法、终端及计算机存储介质
CN105930081A (zh) * 2016-04-19 2016-09-07 上海逗屋网络科技有限公司 一种在触摸终端上执行操作的方法与设备

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202270334U (zh) * 2011-10-19 2012-06-13 刘洋 射击游戏辅助装置


Also Published As

Publication number Publication date
CN105930081A (zh) 2016-09-07


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17785388

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06.03.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17785388

Country of ref document: EP

Kind code of ref document: A1