CN117224957A - Method, apparatus, device and storage medium for interactive control

Method, apparatus, device and storage medium for interactive control

Info

Publication number
CN117224957A
CN117224957A
Authority
CN
China
Prior art keywords
virtual object
target
interface
electronic device
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311191393.4A
Other languages
Chinese (zh)
Inventor
陈宗民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202311191393.4A
Publication of CN117224957A
Legal status: Pending

Abstract

Embodiments of the present disclosure provide methods, apparatuses, devices, and storage media for interactive control. The method comprises: providing a target interface in which a virtual object is presented; controlling the virtual object to move in the target interface based on a drag operation on the virtual object in the target interface; receiving a touch operation in the target interface for the duration of the drag operation; and, based on the touch operation, changing a target attribute of the movement and/or controlling a target action to be performed while the virtual object is moving. In this way, a composite operation can be added during the drag, the movement speed of the virtual object corresponding to the drag operation can be changed, and the user's interaction experience during dragging can be improved.

Description

Method, apparatus, device and storage medium for interactive control
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, to methods, apparatuses, devices, and computer-readable storage media for interactive control.
Background
With advances in computing, electronic devices of various forms can greatly enrich people's daily lives. For example, people may use electronic devices to perform various interactions in a virtual scene.
The electronic device may display the virtual scene and interactive effects by displaying an interface associated with the virtual scene. A plurality of virtual operation controls for controlling a virtual character may be presented in the interface. The user may adjust the layout of a virtual operation control in the interface by dragging it.
Disclosure of Invention
In a first aspect of the present disclosure, an interactive control method is provided. The method comprises: providing a target interface in which a virtual object is presented; controlling the virtual object to move in the target interface based on a drag operation on the virtual object in the target interface; receiving a touch operation in the target interface for the duration of the drag operation; and, based on the touch operation, changing a target attribute of the movement and/or controlling a target action to be performed while the virtual object is moving.
In a second aspect of the present disclosure, an apparatus for interactive control is provided. The apparatus comprises: an interface providing module configured to provide a target interface in which a virtual object is presented; a movement control module configured to control the virtual object to move in the target interface based on a drag operation on the virtual object in the target interface; an operation receiving module configured to receive a touch operation in the target interface for the duration of the drag operation; and an operation response module configured to, based on the touch operation, change a target attribute of the movement and/or control a target action to be performed while the virtual object is moving.
In a third aspect of the present disclosure, an electronic device is provided. The electronic device comprises at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the electronic device to perform the method according to the first aspect of the present disclosure.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. The computer readable storage medium has stored thereon a computer program executable by a processor to implement a method according to the first aspect of the present disclosure.
It should be understood that the content described in this section is not intended to identify key or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become readily apparent from the following description.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. In the drawings, like or similar reference numerals denote like or similar elements:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure may be implemented;
FIGS. 2A-2F illustrate example interfaces according to some embodiments of the present disclosure;
FIG. 3 illustrates an example interface according to further embodiments of the present disclosure;
FIG. 4 illustrates a flow chart of an interactive control process, according to some embodiments of the present disclosure;
FIG. 5 illustrates a schematic block diagram of an apparatus for interactive control, according to some embodiments of the present disclosure; and
FIG. 6 illustrates a block diagram of an electronic device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that any section/subsection headings provided herein are not limiting. Various embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, the embodiments described in any section/subsection may be combined in any manner with any other embodiment described in the same section/subsection and/or in a different section/subsection.
In describing embodiments of the present disclosure, the term "comprising" and its variants should be understood as open-ended inclusion, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". The terms "first", "second", and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
Embodiments of the present disclosure may involve user data and the acquisition and/or use of data, all of which comply with applicable laws and regulations. In embodiments of the present disclosure, all data collection, acquisition, processing, forwarding, use, and the like are performed with the user's knowledge and confirmation. Accordingly, in implementing the embodiments of the present disclosure, the user should be informed, in an appropriate manner and in accordance with the relevant laws and regulations, of the types of data or information that may be involved, the scope of use, the usage scenarios, and so on, and the user's authorization should be obtained. The particular manner of notification and/or authorization may vary with the actual situation and application scenario, and the scope of the present disclosure is not limited in this respect.
Where the schemes described in this specification and its embodiments involve the processing of personal information, the processing is performed only on a lawful basis (for example, with the consent of the personal-information subject, or where necessary for the performance of a contract) and only within the prescribed or agreed scope. If the user declines to allow processing of personal information other than that necessary for a basic function, the user's use of that basic function is not affected.
As briefly mentioned above, the electronic device may display the virtual scene and interactive effects by displaying an interface associated with the virtual scene. A plurality of virtual operation controls for controlling a virtual character may be presented in the interface. The user may adjust the layout of a virtual operation control in the interface by dragging it.
Conventionally, the drag speed is mapped to the finger movement speed in a fixed one-to-one proportion, so the drag speed of a control can only follow the movement speed of the finger exactly. When a user needs a more precise slow operation or a freer fast operation, this fixed mapping cannot satisfy the requirement. Especially in scenarios with particular precision requirements, or for users with less dexterous fingers, the degree of operational freedom is low, which greatly reduces actual operation efficiency.
Conventionally, it is also relatively difficult to perform a composite operation during a drag. Canceling during a drag, for example, requires restoring the control, or dragging it to a dedicated cancel interaction area, after the drag. Such a delayed interaction mode invisibly adds operation steps and time and reduces operational freedom. Moreover, the conventional in-drag functions are limited to cancel and confirm; the functions are few, a composite operation supports only one of them at a time rather than cancel and confirm simultaneously, and interaction needs in further operation scenarios, such as re-selection, cannot be met.
Therefore, the conventional drag operation on a control tends to be monotonous, offers poor playability, and can degrade the user's interaction experience.
Embodiments of the present disclosure provide a scheme for interactive control. According to the scheme, a target interface is provided in which a virtual object is presented; the virtual object is controlled to move in the target interface based on a drag operation on the virtual object in the target interface; a touch operation in the target interface is received for the duration of the drag operation; and, based on the touch operation, a target attribute of the movement is changed and/or a target action is controlled to be performed while the virtual object is moving. In this way, a composite operation can be added during the drag, the movement speed of the virtual object corresponding to the drag operation can be changed, and the user's interaction experience during dragging can be improved.
Various example implementations of the scheme are described in further detail below in conjunction with the accompanying drawings. To illustrate the principles and concepts of the embodiments of the disclosure, some of the following description refers to the field of gaming. It will be appreciated, however, that this is merely exemplary and is not intended to limit the scope of the present disclosure. Embodiments of the disclosure can be applied to various fields such as simulation, virtual reality, and augmented reality.
Example Environment
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented. As shown in FIG. 1, the example environment 100 may include an electronic device 110.
In this example environment 100, an electronic device 110 may run an application 120 that supports virtual scenes. The application 120 may be any suitable type of application for rendering a virtual scene, examples of which may include, but are not limited to: simulation applications, gaming applications, virtual reality applications, augmented reality applications, and the like; embodiments of the disclosure are not limited in this respect. Where the application 120 is a gaming application, it includes, but is not limited to, a battle arena game (e.g., a multiplayer online battle arena (MOBA) game), a first-person shooter (FPS) game, a simulation strategy game (SLG), a role-playing game (RPG), an action game (ACT), a business simulation game, and so forth. The user 140 may interact with the application 120 via the electronic device 110 and/or devices attached to it.
In the environment 100 of FIG. 1, if the application 120 is in an active state, the electronic device 110 may present an interface 150 associated with the virtual environment through the application 120. At least one screen associated with the virtual environment may be presented in the interface 150. The at least one screen may include a screen associated with a virtual object corresponding to the current user, a screen associated with virtual objects corresponding to other users, a screen corresponding to a non-player character, a screen associated with a place in the virtual environment, and the like. Illustratively, the interface 150 may be a game application interface presenting a corresponding game scene. Alternatively, the interface 150 may be another suitable type of interactive interface that supports the user in controlling virtual objects in the interface to perform corresponding actions in the virtual environment.
In some embodiments, the electronic device 110 communicates with the server 130 to enable provisioning of services for the application 120. The electronic device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, palmtop computer, portable gaming terminal, VR/AR device, personal communication system (Personal Communication System, PCS) device, personal navigation device, personal digital assistant (Personal Digital Assistant, PDA), audio/video player, digital camera/video camera, positioning device, television receiver, radio broadcast receiver, electronic book device, gaming device, or any combination of the preceding, including accessories and peripherals for these devices, or any combination thereof. In some embodiments, electronic device 110 is also capable of supporting any type of interface to the user (such as "wearable" circuitry, etc.).
The server 130 may be an independent physical server, a server cluster or distributed system formed of a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks, big data, and artificial intelligence platforms. The server 130 may include, for example, a computing system/server, such as a mainframe, an edge computing node, a computing device in a cloud environment, and so on. The server 130 may provide background services for the application 120 in the electronic device 110 that supports virtual scenes.
A communication connection may be established between the server 130 and the electronic device 110, by wired or wireless means. The communication connection may include, but is not limited to, a Bluetooth connection, a mobile network connection, a universal serial bus (USB) connection, a wireless fidelity (Wi-Fi) connection, and the like; embodiments of the disclosure are not limited in this respect. In embodiments of the present disclosure, the server 130 and the electronic device 110 may implement signaling interactions through the communication connection between them.
Some example embodiments of the present disclosure will be described below with continued reference to the accompanying drawings.
Example interface
To illustrate the display control mechanism of the interface more intuitively, a game scene is taken as an example in the description below; by presenting example interfaces, embodiments of the present disclosure enable a user to understand the corresponding display principles.
FIGS. 2A-2F and 3 illustrate schematic diagrams of example interfaces according to some embodiments of the present disclosure. In some embodiments, the electronic device 110 may present the interfaces shown in FIGS. 2A-2F and 3 upon receiving an interaction request for a virtual object. As introduced above, such interfaces may include, for example, graphical interfaces associated with virtual scenes. Such virtual scenes may include, for example, but are not limited to, various types of game scenes, simulated scenes, and so forth.
In the interfaces shown in FIGS. 2A-2F and 3, the electronic device 110 may present at least one virtual object. The at least one virtual object may include, for example, a virtual operation control in the interface. The at least one operation control may be located, for example, at particular locations of the interface (e.g., the left and right sides, the bottom, etc.). The electronic device 110 may adjust the layout of the at least one virtual object in the interface based on user operations. In some embodiments, the electronic device 110 may also adjust the respective roles, attributes, sensitivities, etc. of the at least one virtual object based on user operations.
In an embodiment of the present disclosure, the electronic device 110 provides a target interface that is presented with a virtual object. In some embodiments, the target interface may be, for example, an interface (also referred to as a first interface) for adjusting the layout of the interface. The interface layout may indicate a distribution of a set of operational controls (i.e., virtual operational controls) in the interface, and the virtual object is an operational control in the set of operational controls. Virtual operation controls and operation controls may be used interchangeably herein. In such a target interface, the electronic device 110 may adjust the position, size, color, etc. of at least one of the set of operation controls already present in the interface based on the user operation, thereby adjusting the interface layout of the target interface.
Alternatively or additionally, in some embodiments, the target interface may also be, for example, an interface (which may also be referred to as a second interface) for placing virtual objects into the virtual environment. In such a target interface, the electronic device 110 may place virtual objects in the interface based on user operations. The electronic device 110 may adjust the position, size, color, etc. of the virtual object in the interface based on the operation for the virtual object.
It will be appreciated that the target interface may be any other suitable interface, and the virtual object presented in the target interface may be any suitable virtual object, which is not limited by the present disclosure. By way of example, the target interface may also be an interface that presents a virtual scene, the virtual object may also be a virtual character in a virtual scene, and so on.
As shown in fig. 2A, the interface 200A may be, for example, an interface for adjusting an interface layout. A set of operational controls is presented in interface 200A, including at least operational control 210, operational control 220, operational control 230, and operational control 240. In some embodiments, the set of operational controls may also include any other suitable controls, which the present disclosure is not limited to. Different operational controls may include, for example, different functions. Illustratively, the functionality of the operational control 210 and the operational control 220 may be different.
The electronic device 110 controls the virtual object to move in the target interface based on a drag operation on the virtual object in the target interface. The electronic device 110 may determine that a drag operation on the virtual object is received, for example, in response to detecting a long-press operation plus a move operation by the user on the virtual object (it will be appreciated that the long-press operation is concurrent with the move operation). The electronic device 110 may likewise determine, for example in response to detecting the end of the long press, that the move operation, and thus the drag operation, has ended. The electronic device 110 then determines the current position as the position at which the drag operation ends, that is, the end position of the virtual object for the drag operation.
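By way of illustration, the long-press-plus-move recognition described above can be sketched as follows. This is a minimal TypeScript sketch against standard DOM pointer events, not the patented implementation; the threshold constant DRAG_HOLD_MS and the callback names are illustrative assumptions.

```typescript
// Minimal sketch of long-press-plus-move drag recognition (assumed
// constants and callbacks; not the patent's actual implementation).
const DRAG_HOLD_MS = 300; // assumed long-press threshold

function attachDragRecognizer(
  control: HTMLElement,
  onDragMove: (dx: number, dy: number) => void,
  onDragEnd: (x: number, y: number) => void,
): void {
  let dragging = false;
  let holdTimer: number | undefined;
  let lastX = 0;
  let lastY = 0;

  control.addEventListener("pointerdown", (e: PointerEvent) => {
    lastX = e.clientX;
    lastY = e.clientY;
    // The drag starts only once the press has been held long enough.
    holdTimer = window.setTimeout(() => { dragging = true; }, DRAG_HOLD_MS);
  });

  control.addEventListener("pointermove", (e: PointerEvent) => {
    if (!dragging) return;
    onDragMove(e.clientX - lastX, e.clientY - lastY);
    lastX = e.clientX;
    lastY = e.clientY;
  });

  control.addEventListener("pointerup", (e: PointerEvent) => {
    window.clearTimeout(holdTimer);
    if (dragging) {
      dragging = false;
      // The release position is treated as the end position of the drag.
      onDragEnd(e.clientX, e.clientY);
    }
  });
}
```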
For example, as shown in FIG. 2B, in response to receiving a drag operation on the operational control 230, the electronic device 110 may present a movement indication element 250 in association with the operational control 230. It is to be appreciated that the movement indication element 250 is presented in association with a dragged operational control, which may be any suitable operational control in the interface 200B. The movement indication element 250 may be used to indicate the operation control being dragged. Of course, in some embodiments, the electronic device 110 may not present the movement indication element 250. The electronic device 110 may prompt the dragged operation control in any other suitable manner. For example, the electronic device 110 may highlight the dragged operation control. In some embodiments, when presenting the movement indication element 250, the electronic device 110 may also render a semi-transparent mask layer, for example, on an area of the interface 200B other than the operation control 230 and the movement indication element 250 to highlight the dragged operation control 230.
The electronic device 110 may in turn control the movement of the operational control 230 in the interface 200B based on a drag operation in the interface 200B on the operational control 230. For example, in response to detecting a drag operation on the operational control 230, the electronic device 110 may present a movement indication element 250 and move the operational control 230 based on a drag route indicated by the drag operation.
During the duration of the drag operation, the electronic device 110 receives a touch operation in the target interface. The touch operation and the drag operation may be operations for different virtual objects. Illustratively, a set of operational controls are presented in a target interface provided by the electronic device 110, and the electronic device 110 may receive a touch operation for the operational control 210 for the duration of a drag operation for the operational control 230 in the set of operational controls. The touch operation for the operation control 210 may include, but is not limited to, a click operation, a long press operation, a double click operation, a slide operation, and the like.
Based on the received touch operation for the operation control 210, the electronic device 110 can change a target attribute of the movement of the operation control 230 and/or control a target action to be performed while moving the virtual object (e.g., the operation control 230). The target attribute of the movement may be, for example, the movement speed of the virtual object being dragged (e.g., the operation control 230). That is, based on the touch operation, the electronic device 110 may change the movement speed of the virtual object. In some embodiments, where the touch operation is a sliding operation, the electronic device 110 may change the movement speed of the virtual object based on the direction of the sliding operation.
For example, as shown in FIG. 2C, for the duration of a drag operation on the operation control 230, the electronic device 110 may also receive a sliding operation on another control (e.g., the operation control 210) in the interface 200C. In some embodiments, in response to receiving a sliding operation on the operation control 210, the electronic device 110 may present the operation control 210'. The operation control 210' may be used to present a sliding indication with respect to the operation control 210. In some embodiments, the operation control 210' may be used to indicate the direction of the slide indicated by the sliding operation, the position of the operation control 210 after the slide, and the like. For example, if the operation control 210' is located on the upper side of the operation control 210 (as shown in FIG. 2C), an upward sliding direction may be indicated.
In some embodiments, the electronic device 110 may also change the movement speed of the virtual object being dragged (i.e., the operation control 230) based on the direction of the sliding operation. Specifically, in response to the sliding operation corresponding to a first direction, the electronic device 110 may increase the movement speed of the operation control 230; in response to the sliding operation corresponding to a second direction, the electronic device 110 may reduce the movement speed of the operation control 230. The first direction and the second direction are two different directions, for example opposite directions. In some embodiments, in response to receiving a sliding operation on the operation control 210, the electronic device 110 may present the sliding hint element 202 along with the sliding hint element 204 in the interface 200C. The sliding hint element 202 may be used, for example, to indicate that the electronic device 110 can increase the movement speed of the operation control 230 if the sliding operation corresponds to the upward direction. The sliding hint element 204 may be used, for example, to indicate that the electronic device 110 can reduce the movement speed of the operation control 230 if the sliding operation corresponds to the downward direction. It is to be understood that the first direction and the second direction may be any suitable different directions, which is not limiting of the present disclosure.
In some embodiments, the specific magnitude of the change in the movement speed of the virtual object may be associated with the slide distance. That is, the distance the virtual object moves in the same time may be associated with the slide distance. The electronic device 110 may, for example, determine the specific magnitude of the movement speed based on the slide distance. Take as an example an initial mapping in which a 1-pixel finger movement moves the operation control 230 by 1 pixel. When the sliding operation on the operation control 210 indicates an upward slide (increasing the movement speed of the operation control 230), a drag that would initially move the operation control 230 by 1 pixel may instead move it by 10 pixels. When the sliding operation on the operation control 210 indicates a downward slide (reducing the movement speed of the operation control 230), a drag that indicates moving the operation control 230 by 1 pixel may instead move it by 0.1 pixel. It is to be understood that the specific numerical values here are exemplary only and do not limit the present disclosure.
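A hedged TypeScript sketch of this variable-proportion mapping follows. The exponential form and the SCALE_PX constant are illustrative assumptions; the disclosure only requires that an upward slide speed the drag up and a downward slide slow it down, with the magnitude tied to the slide distance.

```typescript
// Sketch of deriving a drag-speed multiplier from a concurrent slide on
// another control (assumed formula; the disclosure fixes no specific one).
const SCALE_PX = 100; // assumed slide distance yielding a 10x speed change

function speedMultiplier(slideDy: number): number {
  // Screen y grows downward, so a negative dy is an upward slide.
  return Math.pow(10, -slideDy / SCALE_PX);
}

// Apply the multiplier to the finger delta of the drag operation.
function draggedControlDelta(
  fingerDx: number,
  fingerDy: number,
  slideDy: number,
): [number, number] {
  const m = speedMultiplier(slideDy);
  return [fingerDx * m, fingerDy * m];
}

// A 100 px upward slide turns a 1 px finger move into a 10 px control
// move; a 100 px downward slide turns it into 0.1 px, as in the example.
console.log(draggedControlDelta(1, 0, -100)); // [10, 0]
console.log(draggedControlDelta(1, 0, 100));  // [0.1, 0]
```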
The target action performed may include, for example, changing the selected state of the virtual object currently being dragged (also referred to as the first virtual object) and/or of another virtual object (e.g., a second virtual object). For example, if the first virtual object corresponding to the drag operation is in the selected state and another, second virtual object is in the unselected state, the electronic device 110 may, in response to receiving a touch operation for the second virtual object, switch the first virtual object to the unselected state and switch the second virtual object to the selected state.
The target action performed may also include, for example, changing an appearance attribute of the virtual object currently being dragged. Appearance attributes may include, for example, but are not limited to, shape, size, color, and the like. For example, the electronic device 110 may change the color of the virtual object currently being dragged based on the touch operation.
The target action performed may also include, for example, changing the type of virtual object currently being dragged. For example, if the virtual object is a virtual object of a first type, the electronic device 110 may switch the virtual object from the first type to a second type, such as changing the type of the operation control, based on the touch operation.
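As one combined illustration of these three example target actions, consider the following TypeScript sketch; the field names and the two-type model are assumptions for illustration only.

```typescript
// Sketch of example target actions on the dragged (first) virtual object:
// selected-state swap, appearance change, and type change.
interface VirtualObject {
  selected: boolean;
  color: string;
  type: "first" | "second";
}

// Touching a second object while dragging the first transfers selection.
function swapSelection(dragged: VirtualObject, touched: VirtualObject): void {
  if (dragged.selected && !touched.selected) {
    dragged.selected = false;
    touched.selected = true;
  }
}

// Change an appearance attribute (here, the color) of the dragged object.
function changeAppearance(dragged: VirtualObject, color: string): void {
  dragged.color = color;
}

// Switch the dragged object from the first type to the second type.
function changeType(dragged: VirtualObject): void {
  dragged.type = dragged.type === "first" ? "second" : "first";
}
```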
Regarding the specific content of controlling the currently dragged virtual object to perform a target action, in some embodiments the electronic device 110 may control a target action corresponding to a target position associated with the touch operation to be performed while the virtual object is moving. In particular, the electronic device 110 may determine at least one functional area associated with the touched operation control, and then control the virtual object to perform the target action matching the functional area that the touch operation matches. For example, as shown in FIG. 2D, taking the dragged virtual object to be the operation control 230 and the touched virtual object to be the operation control 210, the electronic device 110 may determine at least one functional area 260 matching the operation control 210. The functional areas in the at least one functional area 260 may, for example, correspond to speed adjustment, color adjustment, size adjustment, transparency adjustment, corner radius adjustment, and the like. The electronic device 110 may also determine the functional area that matches the touch operation, such as functional area 8, and in turn control the operation control 230 to perform the target action matching functional area 8.
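A minimal TypeScript sketch of such position-based dispatch might look as follows; the hit-test geometry and the action type are assumptions, since the disclosure does not prescribe how the areas are laid out.

```typescript
// Sketch of matching a touch position to a functional area around the
// touched control and running that area's target action on the dragged
// control while the drag continues (illustrative structure).
type TargetAction = (dragged: HTMLElement) => void;

interface FunctionalArea {
  rect: DOMRect;        // region occupied by this functional area
  action: TargetAction; // target action the area corresponds to
}

function matchFunctionalArea(
  areas: FunctionalArea[],
  touchX: number,
  touchY: number,
): FunctionalArea | undefined {
  return areas.find((a) =>
    touchX >= a.rect.left && touchX <= a.rect.right &&
    touchY >= a.rect.top && touchY <= a.rect.bottom);
}

function onTouchDuringDrag(
  areas: FunctionalArea[],
  dragged: HTMLElement,
  touchX: number,
  touchY: number,
): void {
  const area = matchFunctionalArea(areas, touchX, touchY);
  if (area !== undefined) area.action(dragged); // movement is not interrupted
}
```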
In some embodiments, the electronic device 110 may further control, based on the operation type of the touch operation, a target action corresponding to that operation type to be performed while the virtual object is moving. The touch operation may be, for example, a pressing operation, of which at least a click type and a long-press type are distinguished. If the touch operation is a pressing operation of the click type, the electronic device 110 may control the virtual object to perform the target action corresponding to a click-type press. For example, as shown in FIG. 2E, taking the dragged virtual object to be the operation control 230 and the touched virtual object to be the operation control 210, the electronic device 110 may, in response to determining that the touch operation is a pressing operation of the click type, present a prompt element 270 and determine to perform the target operation named function 13 corresponding to that operation. The electronic device 110 then performs the target operation while controlling the movement of the operation control 230. If the touch operation is a long-press operation, the electronic device 110 may control the virtual object to perform the target action corresponding to the long press. For example, as shown in FIG. 2F, the electronic device 110 may, in response to determining that the touch operation is a pressing operation of the long-press type, present a prompt element 280 and determine to perform the target operation named function 14 corresponding to that operation. The electronic device 110 then performs the target operation while controlling the movement of the operation control 230.
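The type-based dispatch can be sketched like this in TypeScript; the LONG_PRESS_MS threshold is an assumed boundary, and the two handlers stand in for the "function 13" and "function 14" target operations of FIGS. 2E and 2F.

```typescript
// Sketch of dispatching on the press type of the concurrent touch
// (assumed threshold; the disclosure names no specific value).
const LONG_PRESS_MS = 500; // assumed click/long-press boundary

function classifyPress(pressDurationMs: number): "click" | "longPress" {
  return pressDurationMs < LONG_PRESS_MS ? "click" : "longPress";
}

function onPressDuringDrag(
  pressDurationMs: number,
  onClickAction: () => void,     // stands in for "function 13"
  onLongPressAction: () => void, // stands in for "function 14"
): void {
  // The matching target action runs while the drag keeps moving the control.
  if (classifyPress(pressDurationMs) === "click") onClickAction();
  else onLongPressAction();
}
```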
In some embodiments, the electronic device 110 supports other interfaces in addition to the interfaces shown in FIGS. 2A-2F described above. By way of example, as shown in FIG. 3, the interface 300 may be an interface for placing virtual objects into a virtual environment. A plurality of operation controls and a plurality of virtual items (including the virtual object 310 and the virtual object 320) are presented in the interface 300. The electronic device 110 may control any virtual item and/or operation control to move in the interface 300 in response to a drag operation on the corresponding virtual item and/or operation control. Further, while controlling a virtual item and/or operation control to move in the interface 300 (i.e., for the duration of the drag operation), the electronic device 110 may receive a touch operation in the interface 300 and, based on the touch operation, change a target attribute of the movement and/or control a target action to be performed while the virtual object is moving.
In summary, embodiments of the present disclosure may provide a target interface in which a virtual object is presented. The virtual object may be controlled to move in the target interface based on a drag operation on the virtual object in the target interface. For the duration of the drag operation, a touch operation in the target interface may be received, and based on the touch operation, a target attribute of the movement may be changed and/or a target action may be performed while the virtual object is moving. In this way, a composite operation can be added during the drag, the movement speed of the virtual object corresponding to the drag operation can be changed, and the user's interaction experience during dragging can be improved.
Example procedure
FIG. 4 illustrates a flowchart of a process 400 for interactive control according to some embodiments of the present disclosure. The process 400 may be implemented at the electronic device 110. The process 400 is described below with reference to FIG. 1.
At block 410, the electronic device 110 provides a target interface that presents a virtual object.
At block 420, the electronic device 110 controls the virtual object to move in the target interface based on the drag operation for the virtual object in the target interface.
At block 430, the electronic device 110 receives a touch operation in the target interface for the duration of the drag operation.
At block 440, based on the touch operation, the electronic device 110 changes a target attribute of the movement and/or controls a target action to be performed while the virtual object is moving.
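Blocks 410-440 can be tied together in a single hedged TypeScript sketch; the state shape and the two touch variants are illustrative assumptions drawn from the examples above.

```typescript
// End-to-end sketch of process 400 (blocks 410-440). Illustrative only;
// this is an assumed outline, not the claimed implementation.
interface TargetInterface {
  virtualObject: { x: number; y: number };
  speedMultiplier: number;
}

// Block 410: provide a target interface presenting a virtual object.
function provideTargetInterface(): TargetInterface {
  return { virtualObject: { x: 0, y: 0 }, speedMultiplier: 1 };
}

// Block 420: move the virtual object according to the drag operation.
function onDrag(ui: TargetInterface, dx: number, dy: number): void {
  ui.virtualObject.x += dx * ui.speedMultiplier;
  ui.virtualObject.y += dy * ui.speedMultiplier;
}

// Blocks 430 and 440: a touch received while the drag lasts either
// changes the movement attribute (here, the speed) or triggers a
// target action while the movement continues.
type TouchDuringDrag =
  | { kind: "slide"; up: boolean }
  | { kind: "press"; action: () => void };

function onTouchWhileDragging(ui: TargetInterface, touch: TouchDuringDrag): void {
  if (touch.kind === "slide") {
    ui.speedMultiplier *= touch.up ? 10 : 0.1; // values from the example above
  } else {
    touch.action();
  }
}
```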
In some embodiments, the touch operation is a sliding operation, and changing the target attribute of the movement includes: changing the movement speed of the virtual object based on the direction of the sliding operation.
In some embodiments, changing the speed of movement of the virtual object comprises: increasing a moving speed of the virtual object in response to the sliding operation corresponding to the first direction; and decreasing the moving speed of the virtual object in response to the sliding operation corresponding to the second direction.
In some embodiments, controlling the target action to be performed while the virtual object is moving includes: controlling, based on a target position associated with the touch operation, a target action corresponding to the target position to be performed while the virtual object is moving.
In some embodiments, controlling the target action to be performed while the virtual object is moving includes: controlling, based on the operation type of the touch operation, a target action corresponding to the operation type to be performed while the virtual object is moving.
In some embodiments, the touch operation is a press operation, and the operation type indicates that the press operation is a click operation and/or a long press operation.
In some embodiments, the target interface is a first interface for adjusting an interface layout, the interface layout indicating a distribution of a set of operational controls in the interface, and the virtual object is an operational control in the set of operational controls.
In some embodiments, the target interface is a second interface for placing virtual objects into the virtual environment.
In some embodiments, the virtual object is a first virtual object, and the target action includes at least one of: changing the selected state of the first virtual object and/or a second virtual object; changing an appearance attribute of the virtual object; and changing the type of the virtual object.
Example apparatus and apparatus
Embodiments of the present disclosure also provide corresponding apparatus for implementing the above-described methods or processes. Fig. 5 illustrates a schematic block diagram of an apparatus 500 for interactive control, according to some embodiments of the present disclosure. The apparatus 500 may be implemented as or included in the electronic device 110. The various modules/components in apparatus 500 may be implemented in hardware, software, firmware, or any combination thereof.
As shown in FIG. 5, the apparatus 500 includes an interface providing module 510 configured to provide a target interface in which a virtual object is presented. The apparatus 500 further comprises a movement control module 520 configured to control the virtual object to move in the target interface based on a drag operation on the virtual object in the target interface. The apparatus 500 further comprises an operation receiving module 530 configured to receive a touch operation in the target interface for the duration of the drag operation. The apparatus 500 further comprises an operation response module 540 configured to, based on the touch operation, change a target attribute of the movement and/or control a target action to be performed while the virtual object is moving.
In some embodiments, the touch operation is a sliding operation, and the operation response module 540 includes a speed changing module configured to change the movement speed of the virtual object based on the direction of the sliding operation.
In some embodiments, the speed changing module includes: a speed increasing module configured to increase the movement speed of the virtual object in response to the sliding operation corresponding to a first direction; and a speed reducing module configured to reduce the movement speed of the virtual object in response to the sliding operation corresponding to a second direction.
In some embodiments, the operation response module 540 includes a first action execution module configured to control, based on a target position associated with the touch operation, a target action corresponding to the target position to be performed while the virtual object is moving.
In some embodiments, the operation response module 540 includes a second action execution module configured to control, based on the operation type of the touch operation, a target action corresponding to the operation type to be performed while the virtual object is moving.
In some embodiments, the touch operation is a press operation, and the operation type indicates that the press operation is a click operation and/or a long press operation.
In some embodiments, the target interface is a first interface for adjusting an interface layout, the interface layout indicating a distribution of a set of operational controls in the interface, and the virtual object is an operational control in the set of operational controls.
In some embodiments, the target interface is a second interface for placing virtual objects into the virtual environment.
In some embodiments, the virtual object is a first virtual object, and the target action includes at least one of: changing the selected state of the first virtual object and/or a second virtual object; changing an appearance attribute of the virtual object; and changing the type of the virtual object.
FIG. 6 illustrates a block diagram of an electronic device 600 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device 600 illustrated in FIG. 6 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein. The electronic device 600 shown in FIG. 6 may be used to implement the electronic device 110 of FIG. 1.
As shown in FIG. 6, the electronic device 600 is in the form of a general-purpose electronic device. The components of the electronic device 600 may include, but are not limited to, one or more processors or processing units 610, a memory 620, a storage device 630, one or more communication units 640, one or more input devices 650, and one or more output devices 660. The processing unit 610 may be an actual or virtual processor and is capable of performing various processes according to programs stored in the memory 620. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capability of the electronic device 600.
The electronic device 600 typically includes a number of computer storage media. Such media may be any available media accessible to the electronic device 600, including, but not limited to, volatile and non-volatile media, and removable and non-removable media. The memory 620 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. The storage device 630 may be removable or non-removable media and may include machine-readable media, such as flash drives, magnetic disks, or any other media that can store information and/or data and that can be accessed within the electronic device 600.
The electronic device 600 may further include additional removable/non-removable, volatile/non-volatile storage media. Although not shown in FIG. 6, a magnetic disk drive for reading from or writing to a removable, non-volatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, non-volatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. The memory 620 may include a computer program product 625 having one or more program modules configured to perform the various methods or acts of the various embodiments of the present disclosure.
The communication unit 640 enables communication with other electronic devices through a communication medium. Additionally, the functionality of the components of the electronic device 600 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Thus, the electronic device 600 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 650 may be one or more input devices such as a mouse, keyboard, trackball, etc. The output device 660 may be one or more output devices such as a display, speakers, printer, etc. The electronic device 600 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., with one or more devices that enable a user to interact with the electronic device 600, or with any device (e.g., network card, modem, etc.) that enables the electronic device 600 to communicate with one or more other electronic devices, as desired, via the communication unit 640. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium having stored thereon computer-executable instructions, wherein the computer-executable instructions are executed by a processor to implement the method described above is provided. According to an exemplary implementation of the present disclosure, there is also provided a computer program product tangibly stored on a non-transitory computer-readable medium and comprising computer-executable instructions that are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices, and computer program products implemented according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.

Claims (12)

1. An interaction control method, comprising:
providing a target interface, wherein the target interface presents a virtual object;
controlling the virtual object to move in the target interface based on a drag operation on the virtual object in the target interface;
receiving a touch operation in the target interface for the duration of the drag operation; and
based on the touch operation, changing a target attribute of the movement and/or controlling a target action to be performed while the virtual object is moving.
2. The method of claim 1, wherein the touch operation is a sliding operation, and changing the target attribute of the movement comprises:
changing the moving speed of the virtual object based on the direction of the sliding operation.
3. The method of claim 2, wherein changing the speed of movement of the virtual object comprises:
increasing the moving speed of the virtual object in response to the sliding operation corresponding to a first direction; and
the moving speed of the virtual object is reduced in response to the sliding operation corresponding to a second direction.
4. The method of claim 1, wherein controlling the target action to be performed while moving the virtual object comprises:
controlling, based on a target position associated with the touch operation, a target action corresponding to the target position to be performed while the virtual object is moving.
5. The method of claim 1, wherein controlling the target action to be performed while moving the virtual object comprises:
controlling, based on an operation type of the touch operation, a target action corresponding to the operation type to be performed while the virtual object is moving.
6. The method of claim 5, wherein the touch operation is a press operation and the operation type indicates that the press operation is a click operation and/or a long press operation.
7. The method of claim 1, wherein the target interface is a first interface for adjusting an interface layout, the interface layout indicating a distribution of a set of operational controls in an interface, and the virtual object is an operational control in the set of operational controls.
8. The method of claim 1, wherein the target interface is a second interface for placing the virtual object into a virtual environment.
9. The method of claim 1, wherein the virtual object is a first virtual object and the target action comprises at least one of:
changing the selected state of the first virtual object and/or a second virtual object;
changing appearance attributes of the virtual object;
changing the type of the virtual object.
10. An apparatus for interactive control, comprising:
an interface providing module configured to provide a target interface, the target interface presenting a virtual object;
a movement control module configured to control the virtual object to move in the target interface based on a drag operation for the virtual object in the target interface;
an operation receiving module configured to receive a touch operation in the target interface during the duration of the drag operation; and
an operation response module configured to, based on the touch operation, change a target attribute of the movement and/or control a target action to be performed while the virtual object is moving.
11. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, which when executed by the at least one processing unit, cause the electronic device to perform the method of any one of claims 1 to 9.
12. A computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of claims 1 to 9.
CN202311191393.4A 2023-09-14 2023-09-14 Method, apparatus, device and storage medium for interactive control Pending CN117224957A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311191393.4A CN117224957A (en) 2023-09-14 2023-09-14 Method, apparatus, device and storage medium for interactive control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311191393.4A CN117224957A (en) 2023-09-14 2023-09-14 Method, apparatus, device and storage medium for interactive control

Publications (1)

Publication Number Publication Date
CN117224957A 2023-12-15

Family

ID=89092452

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311191393.4A Pending CN117224957A (en) 2023-09-14 2023-09-14 Method, apparatus, device and storage medium for interactive control

Country Status (1)

Country Link
CN (1) CN117224957A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination