CN109857303B - Interaction control method and device

Interaction control method and device

Info

Publication number
CN109857303B
CN109857303B (Application No. CN201910105587.5A)
Authority
CN
China
Prior art keywords
control
area
sliding operation
game
end point
Prior art date
Legal status
Active
Application number
CN201910105587.5A
Other languages
Chinese (zh)
Other versions
CN109857303A (en)
Inventor
黄博宇
张明
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN201910105587.5A
Publication of CN109857303A
Application granted
Publication of CN109857303B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0483 Interaction with page-structured environments, e.g. book metaphor

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to an interaction control method and device. The method includes the following steps: providing a game interface, where the game interface includes a scene area and an interaction area, the interaction area includes at least one control, and each control is associated with an object; detecting whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determining whether the input start event is located on one of the controls; when the input start event is determined to be located on a control, selecting the object in the scene area associated with that control, and detecting whether there is a sliding operation continuous with the input start event; when such a sliding operation is detected, acquiring the end point of the sliding operation trajectory and determining whether the end point is located in the scene area; and when the end point of the trajectory is determined to be located in the scene area, controlling an action of the selected object according to the end point of the trajectory. The method and device can greatly improve the user's interaction experience.

Description

Interaction control method and device
This application is a divisional application of Chinese patent application No. 201610362985.1, filed on May 27, 2016 and entitled "Interaction control method and device".
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to an interaction control method and an interaction control apparatus.
Background
With the rapid development of communication technology, more and more game applications are appearing on various terminal devices. While a game application runs, the terminal device displays various game objects according to a certain layout to present a game interface to the player.
Referring to FIG. 1, in one type of gaming application, a gaming interface 10 includes a scene area 101 and an interaction area 102, where the scene area 101 primarily provides elements of the environment, buildings, machinery, props, etc. in the game; the interaction area 102 includes avatar controls A0-D0 as well as other controls for implementing interactive functionality. In this type of game application, it may involve creating objects in the scene area 101, and controlling the objects that have been created in the scene area 101 (e.g., object a1 and object B1, etc.).
In one solution, a user may first click an avatar control in the interaction area 102 and then click a specific location in the scene area 101; if the required conditions are met, an object associated with the avatar control is created at that location in the scene area 101. However, this solution is inefficient: the user must perform two separate click operations, which wastes operation time and increases the physical effort of the interaction, degrading the game experience.
In another solution, after an object is created in the scene area 101, a touch-sensitive area whose size is proportional to the sequence-frame size of the object may be registered with the scene manager. Each time the scene area 101 is clicked, the sensitive areas are traversed to select the object closest to the click position; if a sliding operation follows, the selected object can be moved from its current position to the end point of the sliding trajectory. However, in this solution the size of an object's response area may change when the user zooms the game scene, or the response area may move outside the range of the scene area 101 when the user slides the game scene, making the object unreachable or reducing operation efficiency.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide an interactive control method and an interactive control apparatus, thereby overcoming, at least to some extent, one or more of the problems due to the limitations and disadvantages of the related art.
According to a first aspect of the embodiments of the present disclosure, there is provided an interaction control method, including:
providing a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with an object;
detecting whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determining whether the input start event is located on one of the controls;
when the input start event is determined to be located on one of the controls, detecting whether there is a sliding operation continuous with the input start event;
when the sliding operation continuous with the input start event is detected, acquiring the end point of the sliding operation trajectory, and determining whether the end point of the trajectory is located in the scene area; and
when the end point of the trajectory is determined to be located in the scene area, creating, in the scene area, an object associated with the control on which the input start event is located.
In an exemplary embodiment of the present disclosure, the interaction control method further includes:
when the end point of the trajectory is determined to be located in the interaction area, canceling creation, in the scene area, of the object associated with the control on which the input start event is located.
In an exemplary embodiment of the present disclosure, the interaction control method further includes:
when the sliding operation continuous with the input start event is detected, acquiring the current position of the sliding operation, and determining whether the current position of the sliding operation is located in the scene area; and
when the current position of the sliding operation is determined to be located in the scene area, creating, at the current position of the sliding operation, a cursor object of the object associated with the control on which the input start event is located.
In an exemplary embodiment of the present disclosure, creating a cursor object of the object associated with the control on which the input start event is located includes:
removing the object resource from the control on which the input start event is located in the interaction area and transferring the object resource to the scene area, so that the object resource is reused to create the cursor object.
In an exemplary embodiment of the present disclosure, the interaction control method further includes:
acquiring the current position of the sliding operation, and determining whether the distance between the current position of the sliding operation and an edge of the game interface is smaller than a preset value; and
when the distance between the current position of the sliding operation and the edge of the game interface is determined to be smaller than the preset value, moving the game scene toward the edge closest to the current position of the sliding operation.
According to a second aspect of the embodiments of the present disclosure, there is provided an interaction control method, including:
providing a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with an object;
detecting whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determining whether the input start event is located on one of the controls;
when the input start event is determined to be located on one of the controls, selecting the object in the scene area associated with the control on which the input start event is located, and detecting whether there is a sliding operation continuous with the input start event;
when the sliding operation continuous with the input start event is detected, acquiring the end point of the sliding operation trajectory, and determining whether the end point of the trajectory is located in the scene area; and
when the end point of the trajectory is determined to be located in the scene area, controlling an action of the selected object according to the end point of the trajectory.
In an exemplary embodiment of the present disclosure, controlling the action of the selected object according to the end point of the trajectory includes:
determining whether an object of a preset type exists at a position corresponding to the end point of the trajectory in the scene area;
when an object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, controlling the selected object to perform a first action; and
when no object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, controlling the selected object to perform a second action.
According to a third aspect of the embodiments of the present disclosure, there is provided an interaction control apparatus including:
a presentation unit, configured to provide a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with an object;
a first detection unit, configured to detect whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determine whether the input start event is located on one of the controls;
a second detection unit, configured to detect, when the input start event is determined to be located on one of the controls, whether there is a sliding operation continuous with the input start event;
a first determination unit, configured to acquire, when the sliding operation continuous with the input start event is detected, the end point of the sliding operation trajectory, and determine whether the end point of the trajectory is located in the scene area; and
a first creating unit, configured to create, in the scene area, an object associated with the control on which the input start event is located when the end point of the trajectory is determined to be located in the scene area.
In an exemplary embodiment of the present disclosure, the first creating unit is further configured to:
when the end point of the trajectory is determined to be located in the interaction area, cancel creation, in the scene area, of the object associated with the control on which the input start event is located.
In an exemplary embodiment of the present disclosure, the interactive control device further includes:
a second determining unit, configured to, when a sliding operation that is continuous with the input start event is detected, obtain a current position of the sliding operation, and determine whether the current position of the sliding operation is located in the scene area;
and a second creating unit, configured to create, when it is determined that the current position of the sliding operation is located in the scene area, a cursor object of an object associated with the control where the input start event is located at the current position of the sliding operation.
In an exemplary embodiment of the present disclosure, creating a cursor object of the object associated with the control on which the input start event is located includes:
removing the object resource from the control on which the input start event is located in the interaction area and transferring the object resource to the scene area, so that the object resource is reused to create the cursor object.
In an exemplary embodiment of the present disclosure, the interactive control device further includes:
a third determination unit, configured to acquire the current position of the sliding operation and determine whether the distance between the current position of the sliding operation and an edge of the game interface is smaller than a preset value; and
a scene control unit, configured to move the game scene toward the edge closest to the current position of the sliding operation when the distance between the current position of the sliding operation and the edge of the game interface is determined to be smaller than the preset value.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an interaction control apparatus including:
a presentation unit, configured to provide a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with an object;
a first detection unit, configured to detect whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determine whether the input start event is located on one of the controls;
a second detection unit, configured to, when the input start event is determined to be located on one of the controls, select the object in the scene area associated with the control on which the input start event is located, and detect whether there is a sliding operation continuous with the input start event;
a determination unit, configured to acquire, when the sliding operation continuous with the input start event is detected, the end point of the sliding operation trajectory, and determine whether the end point of the trajectory is located in the scene area; and
a control unit, configured to control an action of the selected object according to the end point of the trajectory when the end point of the trajectory is determined to be located in the scene area.
In an exemplary embodiment of the present disclosure, controlling the action of the selected object according to the end point of the trajectory includes:
determining whether an object of a preset type exists at a position corresponding to the end point of the trajectory in the scene area;
when an object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, controlling the selected object to perform a first action; and
when no object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, controlling the selected object to perform a second action.
In the embodiments of the present disclosure, an interaction control method between a dynamic scene control and a static interface control is provided. Without affecting the existing interaction framework, two originally separate operations can be merged into a single smooth, complete operation, which saves the user's operation steps, shortens the user's reaction time, and reduces the physical effort of the interaction, so the user's interaction experience can be improved to a great extent.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically illustrates a gaming interface of a gaming application of the prior art.
Fig. 2 schematically illustrates a flow chart of an interaction control method in an exemplary embodiment of the present disclosure.
Fig. 3A, 3B, and 3C schematically illustrate a game interface of a game application in an exemplary embodiment of the disclosure.
Fig. 4 schematically illustrates a flow chart of an interaction control method in an exemplary embodiment of the present disclosure.
Fig. 5A, 5B, and 5C schematically illustrate a game interface of a game application in an exemplary embodiment of the disclosure.
Fig. 6 schematically illustrates a block diagram of an interactive control device in an exemplary embodiment of the present disclosure.
Fig. 7 schematically illustrates a block diagram of an interactive control device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The exemplary embodiment first provides an interaction control method, which may be applied to a terminal device. The terminal device may be, for example, a PC such as a notebook or desktop computer, or a touch terminal having a touch screen, such as a mobile phone, a tablet computer, a handheld game console, or a PDA. In this exemplary embodiment, a touch terminal having a touch screen is mainly used as an example; however, those skilled in the art will readily understand that the technical solutions in this exemplary embodiment can also be applied to a PC controlled by a mouse or a touch pad. Referring to fig. 2 and figs. 3A to 3C, the interaction control method in this exemplary embodiment may include the following steps:
s11, providing a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with an object.
Referring to fig. 3A, a game application controls a screen display game interface 10 of a terminal device through an Application Program Interface (API) of the terminal device, and the game interface 10 in the present exemplary embodiment may be the entire displayable area of the terminal device, i.e., full screen display; or may be a part of a displayable area, i.e. a window display, of the terminal device. The game interface 10 may include a scene area 101 and an interaction area 102. Wherein, the scene area 101 mainly provides elements such as environment, building, machinery, props, etc. in the game. The interaction area 102 may include avatar controls A0-D0 as well as other controls for implementing interactive functionality. In the present exemplary embodiment, the interaction area 102 is located below the game interface 10, but in other exemplary embodiments of the present disclosure, more layout manners may be adopted, and this is not particularly limited in the present exemplary embodiment.
Taking avatar controls as an example, in this exemplary embodiment the interaction area 102 may include a plurality of avatar controls, each associated with an object; for example, avatar control A0 is associated with object A1, avatar control B0 with object B1, avatar control C0 with object C1, avatar control D0 with object D1, and so on. Accordingly, operations related to object A1 can be performed through avatar control A0, operations related to object B1 through avatar control B0, and so on.
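For concreteness, this control-to-object association can be thought of as a simple lookup table. The following TypeScript sketch is only an illustration under assumed names (`avatarToObject`, `objectForControl`); the patent does not prescribe any particular data structure.

```typescript
// Illustrative mapping from avatar controls in the interaction area to the
// objects they are associated with in the scene area.
const avatarToObject = new Map<string, string>([
  ["A0", "A1"],
  ["B0", "B1"],
  ["C0", "C1"],
  ["D0", "D1"],
]);

// Operations issued through a control are routed to its associated object.
function objectForControl(controlId: string): string | undefined {
  return avatarToObject.get(controlId);
}

console.log(objectForControl("C0")); // -> "C1"
```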
S12, detecting whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determining whether the input start event is located on one of the controls.
Input events in the interaction area 102 are detected periodically. For a finger touch operation, for example, the input events may include a start (touch-down) action, a lift-off (hand-raising) action, and a sliding action; in any such operation, the finger must first touch the touch screen, i.e., a start action is generated. In the case of mouse control, the input events may include mouse press, release, and slide operations; the mouse must first be pressed, i.e., a start action is generated. Therefore, in this exemplary embodiment, the start action may serve as the input start event described herein.
After an input start event, such as a start action, is detected in the interaction area 102, the coordinates of the input start event may be acquired, and it may be determined whether the input start event is located on one of the controls. To give the user feedback, in this exemplary embodiment the control on which the input start event is located is highlighted or otherwise emphasized once this determination is made. Referring to fig. 3A, upon determining that the input start event is located on avatar control C0 and that the object C1 associated with avatar control C0 satisfies the conditions for being assigned commands (for example, it has not yet been dropped into the scene), avatar control C0 is highlighted. In addition, in this exemplary embodiment, other content such as the ID and attributes of the object C1 associated with avatar control C0 may be acquired in response to the input start event and attached to the avatar of the object C1.
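Determining whether the input start event falls on one of the controls amounts to a point-in-rectangle hit test over the controls of the interaction area. A minimal TypeScript sketch follows; the types and the `hitTestControls` helper are assumptions for illustration, not the patent's implementation.

```typescript
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }
interface Control { id: string; bounds: Rect; }

function contains(rect: Rect, p: Point): boolean {
  return p.x >= rect.x && p.x <= rect.x + rect.width &&
         p.y >= rect.y && p.y <= rect.y + rect.height;
}

// Returns the control on which the input start event is located, or null if
// the touch-down landed in the interaction area but outside every control.
function hitTestControls(startEvent: Point, controls: Control[]): Control | null {
  for (const control of controls) {
    if (contains(control.bounds, startEvent)) {
      return control; // e.g. avatar control C0 -> highlight it
    }
  }
  return null;
}
```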
S13, when the input start event is determined to be located on one of the controls, detecting whether there is a sliding operation continuous with the input start event.
In this exemplary embodiment, after it is determined that the input start event is located on one of the controls, if no lift-off action is detected and a sliding motion greater than a preset distance, for example greater than 15 pixels, is detected after the start action, it may be determined that there is a sliding operation continuous with the input start event. If, after the input start event is determined to be located on one of the controls, no sliding motion greater than the preset distance is detected and a lift-off action is detected, it may be determined that there is a click operation or a long-press operation continuous with the input start event. The specific parameters used to identify the sliding operation may be set as needed by the user, the terminal device manufacturer, or the game operator, which is not particularly limited in this exemplary embodiment.
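The classification just described — sliding operation versus click or long press — can be sketched as follows. This is a minimal illustration that assumes the 15-pixel threshold used in the example above; the event shape and names are not taken from the patent.

```typescript
interface Point { x: number; y: number; }

type Gesture = "slide" | "click-or-long-press" | "pending";

const SLIDE_THRESHOLD_PX = 15; // the preset distance from the example above

// Classifies what continues the input start event:
// movement beyond the threshold -> sliding operation;
// lift-off before exceeding it  -> click or long press;
// otherwise the gesture is still undecided.
function classifyGesture(start: Point, current: Point, lifted: boolean): Gesture {
  const moved = Math.hypot(current.x - start.x, current.y - start.y) > SLIDE_THRESHOLD_PX;
  if (moved) return "slide";
  if (lifted) return "click-or-long-press";
  return "pending";
}

console.log(classifyGesture({ x: 10, y: 10 }, { x: 40, y: 10 }, false)); // -> "slide"
```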
S14, when the sliding operation continuous with the input start event is detected, acquiring the end point of the sliding operation trajectory, and determining whether the end point of the trajectory is located in the scene area.
In this exemplary embodiment, after the sliding operation continuous with the input start event has been detected, a detected lift-off action indicates that the sliding operation has ended, and the position of the sliding operation at that moment is the end point of its trajectory. Based on the screen coordinates of the end point of the trajectory, it can be determined whether the end point is located in the scene area 101 or in the interaction area 102.
In addition, referring to fig. 3B, in the present exemplary embodiment, during the sliding process of the finger or the mouse, a cursor object of an object associated with the control where the input start event is located may also be created at the current position of the finger or the mouse. For example, in this example embodiment, the interaction control method may further include:
when a sliding operation continuous to the input start event is detected, the current position of the sliding operation is acquired, and whether the current position of the sliding operation is located in the scene area 101 is determined.
For example, in this exemplary embodiment, after the sliding operation continuous with the input start event is detected, if no lift-off action is detected, the sliding operation is still in progress, and its position at that moment is the current position. Based on the screen coordinates of the current position of the sliding operation, it can be determined whether the current position is located in the scene area 101 or in the interaction area 102.
When the current position of the sliding operation is judged to be located in the scene area 101, a cursor object of an object associated with the control where the input start event is located is created at the current position of the sliding operation.
For example, in the present exemplary embodiment, when it is determined that the current position of the sliding operation is located in the scene area 101, the coordinate position of the current position of the sliding operation in the screen is converted into coordinates in the scene area 101, and a cursor object of an object associated with the control at which the input start event is located is created at the coordinate position. As shown in fig. 3B, upon determining that the current position of the slide operation is located in the scene area 101, a cursor object C2 of an object C1 may be created at the current position of the slide operation; the cursor object will move with the movement of the current position of the sliding operation.
Further, in this exemplary embodiment, creating the cursor object of the object associated with the control on which the input start event is located may include removing the object resource from that control in the interaction area 102 and transferring it to the scene area 101, so that the resource is reused to create the cursor object. For example, as shown in fig. 3B, a script file (e.g., a ccb project file) corresponding to the object C1 of avatar control C0 may be removed from the interaction area 102 and transferred to the scene area 101, where it is reused to create the cursor object C2 at the coordinates corresponding to the current position of the sliding operation. This saves game resources and reduces stutter to a certain extent.
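The cursor-object handling during the slide can be sketched roughly as below. The region rectangle, the resource type, and the `onSlideMove` callback are assumptions for illustration; in particular, the resource handle here merely stands in for the reusable script file mentioned above.

```typescript
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }
interface ObjectResource { scriptId: string; }                 // stand-in for the object's script file
interface CursorObject { resource: ObjectResource; position: Point; }

const sceneAreaRect: Rect = { x: 0, y: 0, width: 1280, height: 560 }; // assumed layout

function inRect(r: Rect, p: Point): boolean {
  return p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;
}

let cursor: CursorObject | null = null;

// Called on every move event of the sliding operation.
function onSlideMove(current: Point, controlResource: ObjectResource): void {
  if (!inRect(sceneAreaRect, current)) return;   // still over the interaction area
  if (cursor === null) {
    // Reuse the resource taken off the avatar control instead of loading a new
    // one, so no extra asset load is needed to show the cursor object.
    cursor = { resource: controlResource, position: current };
  } else {
    cursor.position = current;                   // the cursor follows the finger
  }
}
```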
S15, when the end point of the trajectory is determined to be located in the scene area, creating, in the scene area, an object associated with the control on which the input start event is located.
For example, in this exemplary embodiment, when the end point of the trajectory is determined to be located in the scene area 101, the screen coordinates of the end point are converted into coordinates in the scene area 101, and an object associated with the control on which the input start event is located is created at that position. As shown in fig. 3C, when the end point of the trajectory is determined to be located in the scene area 101, the object C1 associated with avatar control C0 may be created at the end point of the trajectory, and an AI component or the like may be assigned to the object C1.
Correspondingly, when the end point of the trajectory is determined to be located in the interaction area 102, the creation, in the scene area 101, of the object associated with the control on which the input start event is located may be cancelled. For example, if the end point of the trajectory in fig. 3B were located in the interaction area 102, the cursor object C2 would be removed, the avatar of the object C1 would be displayed again on avatar control C0, and the creation of the object C1 associated with avatar control C0 in the scene area 101 would be cancelled.
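A compact sketch of this end-of-slide handling, covering both the create and the cancel branch, is given below. The screen-to-scene conversion is a plain offset and is purely illustrative; a real scene would also have to account for camera scroll and zoom.

```typescript
interface Point { x: number; y: number; }
interface Rect { x: number; y: number; width: number; height: number; }

const sceneAreaRect: Rect = { x: 0, y: 0, width: 1280, height: 560 }; // assumed layout
const cameraOffset: Point = { x: 0, y: 0 };                           // assumed scroll state

function inRect(r: Rect, p: Point): boolean {
  return p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;
}

// Screen coordinates -> scene coordinates (illustrative conversion only).
function screenToScene(p: Point): Point {
  return { x: p.x - sceneAreaRect.x + cameraOffset.x, y: p.y - sceneAreaRect.y + cameraOffset.y };
}

// Called when a lift-off ends the sliding operation at `endPoint`.
function onSlideEnd(endPoint: Point, associatedObjectId: string): void {
  if (inRect(sceneAreaRect, endPoint)) {
    const scenePos = screenToScene(endPoint);
    console.log(`create ${associatedObjectId} at (${scenePos.x}, ${scenePos.y})`); // e.g. create C1, assign AI
  } else {
    console.log(`cancel creation of ${associatedObjectId}`); // end point fell back inside the interaction area
  }
}
```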
In most gaming applications, the scene area 101 may only display a portion of the game scene. In this example embodiment, the interaction control method may further include:
acquiring the current position of the sliding operation and determining whether the distance between the current position and an edge of the game interface 10 is smaller than a preset value (for example, smaller than 20% of the screen size). When the distance is determined to be smaller than the preset value, a timer may be started, and in each frame the game scene is moved toward the edge closest to the current position of the sliding operation; for example, if the current position of the sliding operation is closest to the right edge of the game interface, the game scene is moved to the right. The closer the current position of the sliding operation is to the edge of the game interface 10, the larger each movement of the game scene. When the distance between the current position of the sliding operation and the edge of the game interface 10 becomes greater than the preset value, the timer may be stopped and the game scene stops moving.
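This edge-scrolling behaviour can be sketched as a timer-driven per-frame update. In the TypeScript sketch below, the 20% margin, the 16 ms tick, and the speed scaling are assumptions lifted from the example figures above, and only the left and right edges are handled for brevity.

```typescript
interface Point { x: number; y: number; }

const SCREEN_WIDTH = 1280;                      // assumed screen size
const EDGE_MARGIN = 0.2 * SCREEN_WIDTH;         // "smaller than 20% of the screen size"

let latest: Point = { x: 0, y: 0 };             // latest position of the sliding operation
let scrollTimer: ReturnType<typeof setInterval> | null = null;

function startEdgeScroll(moveScene: (dx: number) => void): void {
  if (scrollTimer !== null) return;
  scrollTimer = setInterval(() => {
    const distLeft = latest.x;
    const distRight = SCREEN_WIDTH - latest.x;
    if (Math.min(distLeft, distRight) >= EDGE_MARGIN) return; // nothing to do this frame
    // The closer the slide position is to the edge, the larger the per-frame movement.
    if (distRight < distLeft) moveScene(EDGE_MARGIN - distRight);  // scroll right
    else moveScene(-(EDGE_MARGIN - distLeft));                     // scroll left
  }, 16); // roughly once per frame
}

// Called on every move event of the sliding operation.
function onSlideMoveNearEdge(current: Point, moveScene: (dx: number) => void): void {
  latest = current;
  const nearEdge = Math.min(latest.x, SCREEN_WIDTH - latest.x) < EDGE_MARGIN;
  if (nearEdge) {
    startEdgeScroll(moveScene);
  } else if (scrollTimer !== null) {
    clearInterval(scrollTimer);                 // slide position left the edge margin
    scrollTimer = null;
  }
}
```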
Further, another interaction control method is provided in this exemplary embodiment to control the created object. Referring to fig. 4, the interaction control method in this example embodiment may include:
s21, providing a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with an object.
S22, detecting whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determining whether the input start event is located on one of the controls.
S23, when the input start event is determined to be located on one of the controls, selecting the object in the scene area associated with the control on which the input start event is located.
For example, referring to fig. 5C, when the input start event is determined to be located on avatar control C0, if the object C1 associated with avatar control C0 has already been created, the object C1 may be selected and a halo effect may be displayed as feedback to the user; if the object C1 associated with avatar control C0 has not yet been created, steps S13 through S15 described above may be performed.
S24, when the sliding operation continuous with the input start event is detected, acquiring the end point of the sliding operation trajectory, and determining whether the end point of the trajectory is located in the scene area.
S25, when the end point of the trajectory is determined to be located in the scene area, controlling an action of the selected object according to the end point of the trajectory.
For example, in this exemplary embodiment, when the end point of the trajectory is determined to be located in the scene area 101, the screen coordinates of the end point are converted into coordinates in the scene area 101, and it is determined whether an object of a preset type, for example an enemy object or a friendly object, exists within that coordinate range. Referring to fig. 5B, when an object of the preset type (e.g., the object E1) is determined to exist at the position corresponding to the end point of the trajectory in the scene area 101, the selected object C1 is controlled to perform a first action, such as an attack; if the preset type of object is a friendly object, the first action may instead be a gain (buff) action or the like. Referring to fig. 5C, when no object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area 101, the selected object C1 is controlled to perform a second action, for example moving to the position corresponding to the end point of the trajectory.
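The dispatch on the trajectory end point can be sketched as follows. The `findObjectAt` helper, the pick radius, and the object kinds are assumptions standing in for whatever spatial query and object model the game actually uses.

```typescript
interface Point { x: number; y: number; }
type ObjectKind = "enemy" | "friend" | "neutral";
interface SceneObject { id: string; kind: ObjectKind; position: Point; }

// Stand-in for the scene's spatial query: returns an object of a preset type
// at (or near) the given scene position, if any.
function findObjectAt(pos: Point, objects: SceneObject[], presetKinds: ObjectKind[]): SceneObject | null {
  const HIT_RADIUS = 32; // assumed pick radius in scene units
  return objects.find(o =>
    presetKinds.includes(o.kind) &&
    Math.hypot(o.position.x - pos.x, o.position.y - pos.y) <= HIT_RADIUS
  ) ?? null;
}

// Chooses the selected object's action based on what sits at the end point.
function dispatchAction(selectedId: string, endPoint: Point, objects: SceneObject[]): string {
  const target = findObjectAt(endPoint, objects, ["enemy", "friend"]);
  if (target?.kind === "enemy") return `${selectedId} attacks ${target.id}`;  // first action
  if (target?.kind === "friend") return `${selectedId} buffs ${target.id}`;   // first action (gain)
  return `${selectedId} moves to (${endPoint.x}, ${endPoint.y})`;             // second action
}
```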
The steps of the interaction control method in fig. 4 that are the same as those of the interaction control method in fig. 2 have already been described in detail in the corresponding exemplary embodiment and are therefore not repeated here.
With the two interaction control methods between a dynamic scene control and a static interface control in this exemplary embodiment, two originally separate operations can be merged into a single smooth, complete operation without affecting the existing interaction framework, which saves the user's operation steps, shortens the user's reaction time, and reduces the physical effort of the interaction, so the user's interaction experience can be improved to a great extent.
Further, the present exemplary embodiment also provides an interaction control apparatus, which is applied to a terminal device. Referring to fig. 6, the interactive control device 1 may include:
the presentation unit 11 may be configured to provide a game interface, where the game interface includes a scene area and an interaction area, where the interaction area includes at least one control, and each control is associated with an object.
The first detecting unit 12 may be configured to detect whether an input start event exists in the interactive area, and determine whether the input start event is located in one of the controls when the input start event exists in the interactive area.
The second detecting unit 13 may be configured to detect whether there is a sliding operation that is continuous with the input start event when the input start event is determined to be located on one of the controls.
The first determining unit 14 may be configured to, when a sliding operation that is continuous with the input start event is detected, obtain an end point of the sliding operation trajectory, and determine whether the end point of the trajectory is located in the scene area.
The first creating unit 15 may be configured to, when it is determined that the end point of the trajectory is located in the scene area, create an object associated with the control where the input start event is located in the scene area.
In an exemplary embodiment of the present disclosure, the first creating unit 15 may be further configured to:
when the end point of the trajectory is determined to be located in the interaction area, cancel the creation, in the scene area, of the object associated with the control on which the input start event is located.
In an exemplary embodiment of the present disclosure, the interaction control apparatus may further include:
the second determining unit may be configured to, when a sliding operation that is continuous with the input start event is detected, obtain a current position of the sliding operation, and determine whether the current position of the sliding operation is located in the scene area.
The second creating unit may be configured to create, when it is determined that the current position of the sliding operation is located in the scene area, a cursor object of an object associated with the control where the input start event is located at the current position of the sliding operation.
In an exemplary embodiment of the present disclosure, creating a cursor object of the object associated with the control on which the input start event is located includes:
removing the object resource from the control on which the input start event is located in the interaction area and transferring the object resource to the scene area, so that the object resource is reused to create the cursor object.
In an exemplary embodiment of the present disclosure, the interactive control device further includes:
the third determining unit may be configured to obtain a current position of the sliding operation, and determine whether a distance between the current position of the sliding operation and an edge of the game interface is smaller than a preset value.
And the scene control unit may be configured to, when it is determined that the distance between the current position of the sliding operation and the edge of the game interface is smaller than the preset value, move a game scene according to a direction in which the edge closest to the current position of the sliding operation is located.
Further, the present exemplary embodiment also provides an interaction control apparatus, which is applied to a terminal device. Referring to fig. 7, the interactive control device 2 may include:
the presentation unit 21 may be configured to provide a game interface, where the game interface includes a scene area and an interaction area, where the interaction area includes at least one control, and each control is associated with an object.
The first detecting unit 22 may be configured to detect whether an input start event exists in the interactive area, and determine whether the input start event is located in one of the controls when the input start event exists in the interactive area.
The second detecting unit 23 may be configured to, when it is determined that the input start event is located on one of the controls, select an object associated with the control where the input start event is located in the scene area, and detect whether there is a sliding operation continuous with the input start event.
The determining unit 24 may be configured to, when a sliding operation that is continuous with the input start event is detected, obtain an end point of the sliding operation track, and determine whether the end point of the track is located in the scene area.
The control unit 25 may be configured to control the motion of the selected object according to the end point of the trajectory when it is determined that the end point of the trajectory is located in the scene area.
In an exemplary embodiment of the present disclosure, the act of controlling the selected object according to the end point of the trajectory includes:
judging whether an object of a preset type exists at a position corresponding to the end point of the track in the scene area;
when the position corresponding to the end point of the track in the scene area is judged to have the object of the preset type, controlling the selected object to execute a first action;
and controlling the selected object to execute a second action when the position corresponding to the end point of the track in the scene area is judged to have no object of the preset type.
The details of each module in the interaction control device have been described in detail in the corresponding interaction control method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a touch terminal, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (4)

1. An interaction control method, comprising:
providing a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with a game object;
detecting whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determining whether the input start event is located on one of the controls;
when the input start event is determined to be located on one of the controls, selecting the game object in the scene area associated with the control on which the input start event is located, and detecting whether there is a sliding operation continuous with the input start event;
when the sliding operation continuous with the input start event is detected, acquiring the end point of the sliding operation trajectory, and determining whether the end point of the trajectory is located in the scene area;
when the current position of the sliding operation is determined to be located in the scene area, creating, at the current position of the sliding operation, a cursor object of the game object associated with the control on which the input start event is located, wherein the cursor object is created by removing the object resource from that control in the interaction area and transferring the object resource to the scene area so that the object resource is reused; and when the end point of the trajectory is determined to be located in the scene area,
determining whether a game object of a preset type exists at a position corresponding to the end point of the trajectory in the scene area;
when a game object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, controlling the selected game object to perform a first action, wherein the first action comprises an attack action and/or a gain action; and
when no game object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, controlling the selected game object to perform a second action.
2. The interaction control method according to claim 1, further comprising:
acquiring the current position of the sliding operation, and determining whether the distance between the current position of the sliding operation and an edge of the game interface is smaller than a preset value; and
when the distance between the current position of the sliding operation and the edge of the game interface is determined to be smaller than the preset value, moving a game scene toward the edge closest to the current position of the sliding operation.
3. An interactive control apparatus, comprising:
a presentation unit, configured to provide a game interface, wherein the game interface comprises a scene area and an interaction area, the interaction area comprises at least one control, and each control is associated with a game object;
a first detection unit, configured to detect whether an input start event exists in the interaction area, and when the input start event exists in the interaction area, determine whether the input start event is located on one of the controls;
a second detection unit, configured to, when the input start event is determined to be located on one of the controls, select the game object in the scene area associated with the control on which the input start event is located, and detect whether there is a sliding operation continuous with the input start event;
a determination unit, configured to, when the sliding operation continuous with the input start event is detected, acquire the end point of the sliding operation trajectory and determine whether the end point of the trajectory is located in the scene area;
wherein, when the current position of the sliding operation is determined to be located in the scene area, a cursor object of the game object associated with the control on which the input start event is located is created at the current position of the sliding operation, the cursor object being created by removing the object resource from that control in the interaction area and transferring the object resource to the scene area so that the object resource is reused; and
a control unit, configured to, when the end point of the trajectory is determined to be located in the scene area,
determine whether a game object of a preset type exists at a position corresponding to the end point of the trajectory in the scene area;
when a game object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, control the selected game object to perform a first action, wherein the first action comprises an attack action and/or a gain action; and
when no game object of the preset type is determined to exist at the position corresponding to the end point of the trajectory in the scene area, control the selected game object to perform a second action.
4. The interactive control device of claim 3, further comprising:
a third determination unit, configured to acquire the current position of the sliding operation and determine whether the distance between the current position of the sliding operation and an edge of the game interface is smaller than a preset value; and
a scene control unit, configured to move a game scene toward the edge closest to the current position of the sliding operation when the distance between the current position of the sliding operation and the edge of the game interface is determined to be smaller than the preset value.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910105587.5A CN109857303B (en) 2016-05-27 2016-05-27 Interaction control method and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910105587.5A CN109857303B (en) 2016-05-27 2016-05-27 Interaction control method and device
CN201610362985.1A CN106020633A (en) 2016-05-27 2016-05-27 Interaction control method and device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201610362985.1A Division CN106020633A (en) 2016-05-27 2016-05-27 Interaction control method and device

Publications (2)

Publication Number Publication Date
CN109857303A CN109857303A (en) 2019-06-07
CN109857303B true CN109857303B (en) 2021-04-02

Family

ID=57094518

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201610362985.1A Pending CN106020633A (en) 2016-05-27 2016-05-27 Interaction control method and device
CN201910105587.5A Active CN109857303B (en) 2016-05-27 2016-05-27 Interaction control method and device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201610362985.1A Pending CN106020633A (en) 2016-05-27 2016-05-27 Interaction control method and device

Country Status (1)

Country Link
CN (2) CN106020633A (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106422329A (en) * 2016-11-01 2017-02-22 网易(杭州)网络有限公司 Game control method and device
CN106453638B (en) * 2016-11-24 2018-07-06 腾讯科技(深圳)有限公司 Information interacting method and system in a kind of applied business
WO2018196552A1 (en) 2017-04-25 2018-11-01 腾讯科技(深圳)有限公司 Method and apparatus for hand-type display for use in virtual reality scene
CN107168530A (en) * 2017-04-26 2017-09-15 腾讯科技(深圳)有限公司 Object processing method and device in virtual scene
CN111279303B (en) * 2017-08-29 2024-03-19 深圳传音通讯有限公司 Control triggering method and terminal equipment
CN114237807B (en) * 2018-11-20 2024-06-11 创新先进技术有限公司 Associated control interaction method and device
CN110193190B (en) * 2019-06-03 2023-02-28 网易(杭州)网络有限公司 Game object creating method, touch terminal device, electronic device and medium
CN114237413A (en) * 2020-09-09 2022-03-25 华为技术有限公司 Method and device for processing interaction event
CN112631492A (en) * 2020-12-30 2021-04-09 北京达佳互联信息技术有限公司 Task creation method and device
CN113599825B (en) * 2021-08-10 2023-06-20 腾讯科技(深圳)有限公司 Method and related device for updating virtual resources in game
CN115826828B (en) * 2023-02-23 2023-07-14 天津联想协同科技有限公司 Network disk file operation method, device, terminal and storage medium


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019426A (en) * 2011-09-28 2013-04-03 腾讯科技(深圳)有限公司 Interacting method and interacting device in touch terminal
CN104182880B (en) * 2014-05-16 2015-10-28 孙锋 A kind of net purchase method and system based on true man and/or 3D model in kind

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103785170A (en) * 2012-10-26 2014-05-14 株式会社得那 Game providing device
CN105025061A (en) * 2014-04-29 2015-11-04 中国电信股份有限公司 Method and server for constructing cloud-end shared game scene
JP2016004500A (en) * 2014-06-18 2016-01-12 株式会社コロプラ User interface program
CN104436657A (en) * 2014-12-22 2015-03-25 青岛烈焰畅游网络技术有限公司 Method and device for controlling game and electronic equipment
CN105194871A (en) * 2015-09-14 2015-12-30 网易(杭州)网络有限公司 Method for controlling game role
CN105582670A (en) * 2015-12-17 2016-05-18 网易(杭州)网络有限公司 Aimed-firing control method and device
CN105597310A (en) * 2015-12-24 2016-05-25 网易(杭州)网络有限公司 Game control method and device

Also Published As

Publication number Publication date
CN106020633A (en) 2016-10-12
CN109857303A (en) 2019-06-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant