CN117827068A - Interactive control method, device, equipment and storage medium

Interactive control method, device, equipment and storage medium

Info

Publication number
CN117827068A
Authority
CN
China
Prior art keywords
control
target
controls
touch input
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410022300.3A
Other languages
Chinese (zh)
Inventor
冉然 (Ran Ran)
朱敏 (Zhu Min)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202410022300.3A priority Critical patent/CN117827068A/en
Publication of CN117827068A publication Critical patent/CN117827068A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

According to embodiments of the present disclosure, an interaction control method, apparatus, device, and storage medium are provided. The method includes presenting a set of controls associated with a target object in a target interface; in response to a touch location of a received touch input being within a common control region of a plurality of controls in the set of controls, determining an activated target control of the plurality of controls based on direction information associated with the touch input, wherein a control region indicates a range for controlling the corresponding control; and executing a first operation corresponding to the target control on the target object based on the touch input. In this way, erroneous operations on controls caused by overlapping control regions can be effectively avoided, and picture editing efficiency is effectively improved.

Description

Interactive control method, device, equipment and storage medium
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, relate to interactive control methods, apparatuses, devices, and computer-readable storage media.
Background
With the development of image technology, users' demands on interactive control are increasing. For example, a user may wish to edit a picture on a portable device such as a smartphone or tablet computer, for instance by adding a particular layer to a picture or by cropping and/or zooming the picture. This places higher demands on the accuracy and flexibility of interactive control during picture editing.
Disclosure of Invention
In a first aspect of the present disclosure, an interactive control method is provided. The method includes presenting a set of controls associated with a target object in a target interface; in response to a touch location of a received touch input being within a common control region of a plurality of controls in the set of controls, determining an activated target control of the plurality of controls based on direction information associated with the touch input, wherein a control region indicates a range for controlling the corresponding control; and executing a first operation corresponding to the target control on the target object based on the touch input.
In a second aspect of the present disclosure, an interactive control device is provided. The device includes a presentation module configured to present a set of controls associated with a target object in a target interface; a target control determination module configured to determine, in response to a touch location of a received touch input being within a common control region of a plurality of controls in the set of controls, an activated target control of the plurality of controls based on direction information associated with the touch input, wherein a control region indicates a range for controlling the corresponding control; and a first operation execution module configured to execute a first operation corresponding to the target control on the target object based on the touch input.
In a third aspect of the present disclosure, an electronic device is provided. The device comprises at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the device to perform the method of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. The medium has stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
It should be understood that this Summary is not intended to identify key or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which like or similar reference numerals denote like or similar elements:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure may be implemented;
FIGS. 2A-2C illustrate schematic diagrams of layer editing according to some embodiments of the present disclosure;
FIGS. 3A and 3B illustrate schematic diagrams of a process of interactive control, according to some embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of layer editing according to some embodiments of the present disclosure;
FIG. 5 illustrates a schematic diagram of a process of interactive control according to another embodiment of the present disclosure;
FIGS. 6A and 6B illustrate schematic diagrams of layer editing according to some embodiments of the present disclosure;
FIG. 7 shows a schematic diagram of a process of interactive control according to yet another embodiment of the present disclosure;
FIG. 8 illustrates a flow chart of a process of interactive control according to some embodiments of the present disclosure;
FIG. 9 illustrates a block diagram of an interactive control device, according to some embodiments of the present disclosure; and
FIG. 10 illustrates a block diagram of an apparatus capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
In describing embodiments of the present disclosure, the term "comprising" and its variants should be read as open-ended, i.e., "including, but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". Other explicit and implicit definitions may also be given below.
It will be appreciated that the image data (including but not limited to the data itself, the acquisition, use, storage or deletion of the data) according to the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations.
It will be appreciated that prior to using the technical solutions disclosed in the embodiments of the present disclosure, the relevant users, which may include any type of rights subjects, such as individuals, enterprises, groups, etc., should be informed and authorized by appropriate means of the types, usage ranges, usage scenarios, etc. of the data involved in the present disclosure according to the relevant laws and regulations.
As described above, users wish to edit pictures by interacting with a portable device such as a smartphone or tablet, for example by adding a specific layer to a picture or by cropping and/or zooming the picture. When a user edits a picture on a portable device, for example when zooming a layer, the interaction area is small; once the layer is reduced beyond a certain extent, the controls for zooming the layer may become hidden, or the control regions of the controls may overlap. This makes it difficult for the user to perform the next layer editing operation and affects the accuracy and convenience of the interaction.
The present disclosure provides an interactive control scheme. According to the scheme of embodiments of the present disclosure, a set of controls associated with a target object is presented in a target interface. If the touch location of a received touch input is within a common control region of a plurality of controls in the set of controls, an activated target control among the plurality of controls is determined based on direction information associated with the touch input. A control region indicates a range for controlling the corresponding control. A first operation corresponding to the target control is then executed on the target object based on the touch input. In this way, erroneous operations on controls caused by overlapping control regions can be effectively avoided, and picture editing efficiency is effectively improved.
Example Environment
Referring first to FIG. 1, a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented is illustrated. In this example environment 100, a target application 120 is installed in an electronic device 110. The user 102 may interact with the target application 120 via the electronic device 110 and/or an attached device of the electronic device 110. The target application 120 may be an application capable of providing media-content-related services to the user 102, including authoring (e.g., shooting and/or editing), publishing, and browsing of media content. In this context, "media content" may take various forms, including video, audio, images, image sets, text, and so forth.
In the environment 100 of fig. 1, if the target application 120 is in an active state, the electronic device 110 may present the user 102 with an interface 140 of the target application 120. The interface 140 may be various types of pages that can be provided by the target application 120, such as presentation pages of media content, content authoring pages, content editing pages, and so forth.
In some embodiments, the electronic device 110 communicates with the server 130 to enable provisioning of services for the target application 120. The electronic device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, Personal Communication System (PCS) device, personal navigation device, Personal Digital Assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, or game device, including the accessories and peripherals of these devices, or any combination thereof. In some embodiments, the electronic device 110 is also capable of supporting any type of user interface (such as "wearable" circuitry). The server 130 may be any type of computing system/server capable of providing computing power, including, but not limited to, a mainframe, an edge computing node, a computing device in a cloud environment, and so forth.
It should be understood that the structure and function of environment 100 are described for illustrative purposes only and are not meant to suggest any limitation as to the scope of the disclosure.
Interactive control process
Figs. 2A-2C illustrate schematic diagrams of layer editing according to some embodiments of the present disclosure. The interface 200A shown in FIG. 2A, the interface 200B shown in FIG. 2B, and the interface 200C shown in FIG. 2C may be presented by the electronic device 110. For ease of discussion, the processes illustrated in FIGS. 2A-2C will be described with reference to the environment 100 of FIG. 1.
As shown in FIG. 2A, the electronic device 110 presents an interface 200A for picture editing. Upon detecting an operation by user 102 on control 213 (e.g., a control for adding graphics), interface 200A may present the initial state in which the added target object (e.g., graphic 211) is selected. The graphic 211 lies within an edit box 212. A plurality of controls 201, 202, 203, 204, 205, 206, 207, and 208 for zooming the graphic 211 are presented on the periphery of the edit box 212. These controls may be understood as a set of controls associated with the target object (e.g., graphic 211). By manipulating controls 201 through 208, the graphic 211 may be scaled equally or non-equally.
For example, if the user operates control 201, 202, 203, or 204, graphic 211 is scaled in equal proportion, while if user 102 operates control 205, 206, 207, or 208, graphic 211 is scaled non-equally. Operations performed on a control herein may include single-tap, double-tap, drag, and slide operations, among others; the scope of the disclosure is not limited in this respect.
In some embodiments, if user 102 scales graphic 211 non-equally by manipulating controls 205, 206, 207, and/or 208, some of the controls 201 through 208 for scaling graphic 211 may end up in very close proximity to one another.
For example, in interface 200B shown in FIG. 2B, controls 203, 208, and 204 are in very close proximity, and controls 201, 206, and 202 are also in very close proximity, due to non-equal scaling of graphic 211 by user 102 via controls 205 and/or 207.
As another example, in interface 200C shown in FIG. 2C, controls 203, 205, and 201 are in very close proximity, and controls 204, 207, and 202 are also in very close proximity, due to non-equal scaling of graphic 211 by user 102 via controls 206 and/or 208.
Each control corresponds to a respective control region that indicates the range within which the corresponding control can be operated. When multiple controls are in close proximity, their control regions may overlap. In some scenarios, a control region may also be understood as a hot zone.
While in the embodiments described above and below the plurality of controls 201 through 208 are used to scale a target object (e.g., graphic 211), it should be understood that the aspects described in the present disclosure are applicable to controls with other functionality, such as controls for cropping a target object. The scope of the present disclosure is not limited in this respect.
Figs. 3A and 3B illustrate schematic diagrams of a process of interactive control according to some embodiments of the present disclosure. The process of interactive control for a plurality of controls whose control regions overlap is described in detail below in connection with FIGS. 3A and 3B.
As shown in fig. 3A, control 301 and control 302 are in very close proximity. Control region 311 of control 301 and control region 312 of control 302 overlap, i.e., control 301 and control 302 have a common control region 313.
If the touch location of the touch input received by electronic device 110 is in the common control region 313, electronic device 110 determines an activated target control from controls 301 and 302 based on directional information associated with the touch input.
Electronic device 110 can obtain range information indicating a plurality of angular ranges corresponding to control 301 and control 302. For example, angles corresponding to control 301 range from 0 ° to 75 ° and 180 ° to 255 °, while angles corresponding to control 302 range from 75 ° to 180 ° and 255 ° to 0 ° (i.e., 360 °).
In some embodiments, the range information may be determined based on historical touch data. For example, electronic device 110 obtains historical touch information, which may be generated based on a set of historical touch inputs for a plurality of controls (e.g., control 301 and control 302). For example, the historical touch information may be collected from one or more touches performed by testers on a set of test controls. Based on the directions indicated by the historical touch information, electronic device 110 can determine the angular range corresponding to each of the plurality of controls (e.g., control 301 and control 302).
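As an illustration of this range-learning step, the following is a minimal Python sketch, assuming the touch history arrives as (control id, angle) pairs; the function name, the padding parameter, and the single-range-per-control simplification are illustrative assumptions, not taken from the patent:

```python
from collections import defaultdict

def angular_ranges_from_history(history, pad_deg=5.0):
    """Derive an angular range per control from logged test touches.

    `history` is a list of (control_id, angle_deg) pairs collected from
    a set of historical touch inputs, e.g. testers touching a set of
    test controls. Each control's range is the span of its observed
    angles, padded slightly on both sides; wrap-around at 0/360 degrees
    is ignored here for brevity.
    """
    samples = defaultdict(list)
    for control_id, angle in history:
        samples[control_id].append(angle % 360.0)
    return {
        cid: [(max(min(a) - pad_deg, 0.0), min(max(a) + pad_deg, 360.0))]
        for cid, a in samples.items()
    }
```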
It should be appreciated that in the example shown in FIG. 3A, control 301 is, for example, the equal-scale control at the upper-left corner of the edit box, and control 302 is, for example, the control at the upper boundary of the edit box. If instead control 301 were the equal-scale control at the lower-left corner of the edit box and control 302 the control at the lower boundary, the angular ranges corresponding to control 301 and control 302 would change accordingly. The example shown in FIG. 3A is intended to depict only one embodiment for determining a target control, and the present disclosure is not limited in terms of the angular ranges corresponding to the controls involved.
If the touch location of the touch input received by the electronic device 110 is in the common control region 313 and the direction information associated with the touch input is determined to match a particular angular range, the electronic device 110 determines the control corresponding to that angular range as the target control. For example, if electronic device 110 detects that the direction 314 associated with the touch input of user 102 falls within the range of 180° to 255°, electronic device 110 determines control 301 as the target control.
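As a concrete illustration of this matching step, here is a minimal Python sketch, assuming the swipe has already been reduced to an angle in degrees (how the angle is obtained is described next); the function names and the range-table layout are illustrative assumptions, not taken from the patent:

```python
def in_range(angle, start, end):
    """True if `angle` (degrees) lies in [start, end), with wrap past 0."""
    angle %= 360.0
    if start <= end:
        return start <= angle < end
    return angle >= start or angle < end  # range wraps through 0 degrees

def pick_target_control(angle, range_info):
    """Resolve the activated control from a swipe angle.

    `range_info` maps a control id to its list of angular ranges, e.g.
    {"control_301": [(0, 75), (180, 255)],
     "control_302": [(75, 180), (255, 360)]}.
    Returns the first matching control id, or None if nothing matches.
    """
    for control_id, ranges in range_info.items():
        if any(in_range(angle, s, e) for s, e in ranges):
            return control_id
    return None
```

With the example values above, a swipe whose direction is 200° would resolve to control 301, since 180° ≤ 200° < 255°.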
For example, the electronic device 110 may determine the direction information associated with the touch input of the user 102 based on the trajectory of the touch input within a predetermined distance range. In some embodiments, the electronic device 110 may detect the angle of the sliding track of user 102 within a predetermined number of pixels (e.g., 8 px); the angle may be determined from the line connecting the start point to the end point of the track, or from a direction fitted to the sliding track. In some other embodiments, the electronic device 110 may detect the sliding track within a particular time period (e.g., a few microseconds) after the start of a touch input (i.e., after the user 102 touches the interface) to determine the direction information associated with the touch input.
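The following minimal Python sketch illustrates one way to reduce the early trajectory to a direction, using the start-to-end line mentioned above; the sampling representation and the 8 px default threshold are assumptions for illustration:

```python
import math

def swipe_direction_deg(points, min_distance_px=8.0):
    """Estimate a swipe direction in [0, 360) from early touch samples.

    `points` is a time-ordered list of (x, y) screen coordinates for one
    touch. Returns None until the touch has travelled `min_distance_px`
    from its start point, mirroring the predetermined distance range.
    """
    if len(points) < 2:
        return None
    x0, y0 = points[0]
    x1, y1 = points[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_distance_px:
        return None  # not enough travel yet to judge a direction
    # Screen y grows downward; negate dy so angles follow the usual
    # counterclockwise-from-positive-x convention.
    return math.degrees(math.atan2(-dy, dx)) % 360.0
```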
After the electronic device 110 determines the target control, the electronic device 110 performs an operation corresponding to the determined target control on the target object based on the touch input of the user. For example, if control 301 is determined to be a target control, electronic device 110 performs an operation on the target object corresponding to control 301, i.e., performs an equal scale zoom on the target object, based on the touch input of user 102.
It should be appreciated that if the touch location of the touch input received by electronic device 110 is not within the common control region of the plurality of controls, e.g., in FIG. 3A, if the touch location is in the part of control region 311 outside common control region 313 or in the part of control region 312 outside common control region 313, then control 301 or control 302, respectively, may be determined directly as the target control.
In some other embodiments, the control regions of more than two controls may overlap. For example, as shown in FIG. 3B, control 331, control 332, and control 333 are in very close proximity. Control region 341 of control 331, control region 342 of control 332, and control region 343 of control 333 overlap, i.e., controls 331, 332, and 333 have a common control region 351. In addition to the common control region 351 of controls 331, 332, and 333, controls 331 and 333 also have a common control region 352, and controls 332 and 333 also have a common control region 353.
Electronic device 110 can obtain range information indicating a plurality of angular ranges corresponding to control 331, control 332, and control 333. For example, angles corresponding to control 331 range from 0 ° to 75 ° and 180 ° to 255 °, angles corresponding to control 332 range from 105 ° to 180 ° and 285 ° to 0 ° (i.e., 360 °), and angles corresponding to control 333 range from 75 ° to 105 ° and 255 ° to 285 °.
In some embodiments, the range information may be determined based on historical touch data. For example, electronic device 110 obtains historical touch information, which may be generated based on a set of historical touch inputs for a plurality of controls (e.g., controls 331, 332, and 333). For example, the historical touch information may be collected from one or more touches performed by testers on a set of test controls. Based on the directions indicated by the historical touch information, electronic device 110 can determine the angular range corresponding to each of the plurality of controls (e.g., controls 331, 332, and 333).
It should be appreciated that in the example shown in FIG. 3B, control 331 is, for example, the equal-scale control in the upper-left corner of the edit box, and control 332 is, for example, the equal-scale control in the upper-right corner of the edit box. If instead control 331 were the equal-scale control in the lower-left corner and control 332 the equal-scale control in the lower-right corner, the angular ranges corresponding to control 331 and control 332 would change accordingly. The example shown in FIG. 3B is intended to depict only one embodiment for determining a target control, and the present disclosure is not limited in terms of the angular ranges corresponding to the controls involved.
In fig. 3B, if the touch location of the touch input received by electronic device 110 is in a common control region 351 of control 331, control 332, and control 333, electronic device 110 determines an activated target control from among control 331, control 332, and control 333 based on whether the directional information associated with the touch input matches a particular angular range.
For example, if the touch location of a touch input received by electronic device 110 is in the common control region 351 and the direction 371 associated with the touch input is determined to fall within the range of 180° to 255°, electronic device 110 determines control 331 as the target control. The determination of the direction associated with the touch input has been described in connection with FIG. 3A and is not repeated here.
After the electronic device 110 determines the target control, the electronic device 110 performs an operation corresponding to the determined target control on the target object based on the touch input of the user. For example, if control 331 is determined to be a target control, electronic device 110 performs an operation on the target object corresponding to control 331, i.e., performs an equal scale zoom on the target object, based on the touch input of user 102.
It should be appreciated that if the touch location of the touch input received by electronic device 110 is in the common control region of two of the three controls, e.g., the touch location is in common control region 352 of control 331 and control 333 or common control region 353 of control 332 and control 333, then the target control may be determined according to the process described in connection with FIG. 3A.
It should be appreciated that if the touch location of the touch input received by electronic device 110 is not within any common control region of the plurality of controls, i.e., neither within common control region 351 of controls 331, 332, and 333, nor within common control region 352 of controls 331 and 333, nor within common control region 353 of controls 332 and 333, then control 331, 332, or 333 may be determined directly as the target control based on the control region in which the touch location lies.
Fig. 4 illustrates a schematic diagram of layer editing according to some embodiments of the present disclosure. The interface 400 illustrated in FIG. 4 may be presented by the electronic device 110. For ease of discussion, the process illustrated in FIG. 4 will be described with reference to the environment 100 of FIG. 1.
In the interface 400 shown in fig. 4, a plurality of controls 451, 452, 453, and 454 for zooming a target object (e.g., graphic 461) are presented. The plurality of controls 451, 452, 453, and 454 may be understood as a set of controls associated with a target object (e.g., graphic 461). By manipulating these controls 451, 452, 453, and 454, the graphic 461 may be scaled non-equally. In interface 400, controls 451, 452, 453, and 454 are in close proximity.
Similarly, each control corresponds to a respective control region that indicates the range within which the corresponding control can be operated. When multiple controls are in close proximity, their control regions may overlap.
Fig. 5 illustrates a schematic diagram of a process of interactive control, according to some embodiments of the present disclosure. The process of interactive control for the plurality of controls 451, 452, 453, and 454 in the example shown in FIG. 4, where the control regions of the controls overlap, is described in detail below in connection with FIG. 5.
As shown in FIG. 5, control region 461 of control 451, control region 462 of control 452, control region 463 of control 453, and control region 464 of control 454 have a common control region 531. Electronic device 110 can obtain range information indicating a plurality of angular ranges corresponding to controls 451, 452, 453, and 454. For example, the angular range corresponding to control 451 is 45° to 135°, that corresponding to control 452 is 135° to 225°, that corresponding to control 453 is 225° to 315°, and that corresponding to control 454 is 315° to 0° (i.e., 360°).
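Using the hypothetical pick_target_control sketch given earlier, the four-control configuration of FIG. 5 could be expressed as the following range table; the values are the example angles above, and the identifiers are illustrative:

```python
# Hypothetical range table for the four controls of FIG. 5.
fig5_ranges = {
    "control_451": [(45.0, 135.0)],
    "control_452": [(135.0, 225.0)],
    "control_453": [(225.0, 315.0)],
    "control_454": [(315.0, 360.0)],  # "315 deg to 0 deg (i.e., 360 deg)"
}

# A swipe in common region 531 whose direction 541 falls between
# 225 and 315 degrees resolves to control 453:
# pick_target_control(direction_541_deg, fig5_ranges) -> "control_453"
```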
The range information indicating the respective angular ranges of controls 451, 452, 453, and 454 may be similarly determined based on historical touch data and is not described in detail here.
In FIG. 5, if the touch location of the touch input received by electronic device 110 is in the common control region 531 of controls 451, 452, 453, and 454, electronic device 110 determines the activated target control from among controls 451, 452, 453, and 454 based on whether the direction information associated with the touch input matches a particular angular range.
For example, if the touch location of a touch input received by electronic device 110 is in the common control region 531 and it is determined that the direction 541 associated with the touch input falls within an angle of 225 ° to 315 °, then electronic device 110 determines control 453 as the target control. The determination of the direction associated with the touch input has been described in connection with fig. 3A, and is not described in detail in this embodiment.
After the electronic device 110 determines the target control, the electronic device 110 performs the operation corresponding to the determined target control on the target object based on the user's touch input. For example, if control 453 is determined to be the target control, electronic device 110 performs the operation corresponding to control 453 on the target object based on the touch input of user 102, i.e., performs a downward non-equal scaling of the target object.
It should be appreciated that if the touch location of the touch input received by electronic device 110 is within a common control region of only three or two of controls 451, 452, 453, and 454, the target control may likewise be determined based on the direction information of the touch input and the angular ranges corresponding to those controls as described above.
It should be appreciated that if the touch location of the touch input received by electronic device 110 is not within a common control region of the plurality of controls, then control 451, 452, 453, or 454 may be determined directly as the target control based on the control region to which the touch location corresponds.
In some other scenarios, the target object may be scaled down to a particular size, causing some controls to overlap. To ensure that the user can continue to perform subsequent operations on the target object, some overlapping controls are hidden. Figs. 6A and 6B illustrate schematic diagrams of layer editing according to some embodiments of the present disclosure. The case where some controls are hidden is described below in connection with FIGS. 6A and 6B.
In the interface 600A shown in FIG. 6A, for example, during a zoom-out operation on the target object 621, the controls 601 and 606 for scaling the target object equally may come to overlap, and control 606 is hidden. The other controls 602, 603, 604, and 605 for non-equal scaling of the target object remain displayed.
It should be appreciated that, although not shown, other equal-scale controls that overlap control 601 may also be hidden. Optionally, to make the user's operation less ambiguous, other equal-scale controls may be hidden even if they do not overlap control 601. That is, in this case only one control 601 for scaling the target object equally is retained.
For example, the control 601 may be selected from the plurality of controls corresponding to the equal-scale zoom function based on historical trigger information for the control 601. For example, the historical trigger information indicates the number of times the control 601 was used to trigger the equal-scale zoom function.
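A minimal Python sketch of this selection follows, assuming per-control trigger counts are available; the function name and data shape are illustrative assumptions:

```python
def control_to_retain(trigger_counts):
    """Pick which equal-scale control stays visible when controls overlap.

    `trigger_counts` maps a control id to the number of times that
    control was used to trigger the equal-scale zoom function. The most
    frequently used control is retained; the others are hidden.
    """
    return max(trigger_counts, key=trigger_counts.get)

# e.g. control_to_retain({"control_601": 42, "control_606": 3})
# returns "control_601", so control 606 would be hidden.
```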
It should be appreciated that retaining control 601 for equal scaling is for exemplary purposes only; it is equally possible to hide control 601 while retaining another control for equal scaling.
FIG. 6B illustrates an interface 600B showing yet another situation in which controls overlap. In interface 600B, all available controls for editing the target object that may appear on edit box 661 overlap control 631; in this case, all controls other than control 631 are hidden.
For the example shown in FIG. 6A, fig. 7 shows a schematic diagram illustrating a process of interactive control according to yet another embodiment of the present disclosure. As in the example shown in FIG. 6A, all equal-scale zoom controls except control 601 are hidden. In this case, if electronic device 110 detects that touch location 721 of the user's touch input is in control region 711 of control 601, control 601 is determined to be the target control, and electronic device 110 then performs the operation corresponding to control 601. That is, in this case control region 711 of control 601 is given top priority, meaning that control region 711 of control 601 sits above the control regions of controls 602, 603, 604, and 605 used for non-equal scaling of the target object. This ensures that the user can still scale the target object in equal proportion in this extreme case; such a design makes the interactive behavior of the interface better match the user's operational logic and improves the user experience.
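One way to realize this top-priority behavior is an ordinary hit test that honors a per-control priority, as in the minimal Python sketch below; the region and priority fields are illustrative assumptions, not the patent's data model:

```python
def hit_test(touch_xy, controls):
    """Return the control whose region contains the touch, honoring priority.

    `controls` is a list of dicts, each with a "region" (x, y, w, h) and
    a "priority" (higher wins), so a top-set equal-scale control such as
    control 601 shadows the non-equal-scale controls beneath it.
    """
    x, y = touch_xy
    hits = [
        c for c in controls
        if c["region"][0] <= x < c["region"][0] + c["region"][2]
        and c["region"][1] <= y < c["region"][1] + c["region"][3]
    ]
    return max(hits, key=lambda c: c["priority"]) if hits else None
```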
For the example shown in FIG. 6B, since only control 631 remains on the interface, electronic device 110 may interpret any touch input by the user as an operation on control 631, and thus performs the operation corresponding to control 631 on the target object.
From the detailed description of the embodiments above, it can be appreciated that the present disclosure provides a scheme for selecting a target control in response to a user's touch input. The scheme can effectively avoid erroneous operations on controls caused by overlapping control regions and/or overlapping controls, effectively improve the accuracy and convenience of picture editing interactions, and thus improve picture editing efficiency.
Example procedure
Fig. 8 illustrates a flow chart of a process 800 of interactive control according to some embodiments of the present disclosure. Process 800 may be implemented at electronic device 110. It should be appreciated that the process 800 may also be implemented at the server 130.
At block 810, the electronic device 110 presents a set of controls associated with the target object in the target interface.
If, at block 820, the touch location of the touch input received by the electronic device 110 is within a common control region of a plurality of controls in the set of controls, then, at block 830, the electronic device 110 determines a target control of the plurality of controls that is activated based on directional information associated with the touch input, wherein the control region indicates a range for controlling the corresponding control.
At block 840, the electronic device 110 performs a first operation on the target object corresponding to the target control based on the touch input.
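Tying blocks 810-840 together, a hedged end-to-end sketch might look as follows; it reuses the hypothetical swipe_direction_deg and pick_target_control helpers from the earlier sketches and assumes rectangular control regions:

```python
def handle_touch(touch_points, regions, range_info):
    """Resolve one touch input to a target control, per blocks 820-830.

    `regions` maps control ids to (x, y, w, h) rectangles, which may
    overlap; `range_info` maps control ids to angular ranges; and
    `touch_points` is the sampled trajectory of the touch input.
    """
    x, y = touch_points[0]
    hits = [cid for cid, (rx, ry, rw, rh) in regions.items()
            if rx <= x < rx + rw and ry <= y < ry + rh]
    if not hits:
        return None        # touch missed every control region
    if len(hits) == 1:
        return hits[0]     # not in a common region: resolve directly
    # Touch landed in a common control region: disambiguate by direction.
    angle = swipe_direction_deg(touch_points)
    if angle is None:
        return None        # not enough travel yet; keep sampling
    return pick_target_control(angle, {cid: range_info[cid] for cid in hits})
```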
In some embodiments, the electronic device 110 determines the target control from the plurality of controls that corresponds to a target angular range in response to the direction information matching the target angular range.
In some embodiments, the electronic device 110 obtains range information indicating a plurality of angular ranges corresponding to the plurality of controls, and determines, based on the range information, that the direction information matches the target angular range of the plurality of angular ranges.
In some embodiments, the range information is determined based on the following process: the electronic device 110 obtains historical touch information generated based on a set of historical touch inputs for the plurality of controls, and determines the plurality of angular ranges corresponding to the plurality of controls based on the directions indicated by the historical touch information.
In some embodiments, the electronic device 110 determines the directional information associated with the touch input based on a trajectory of the touch input within a predetermined distance range.
In some embodiments, the touch input is a first touch input and the touch location is a first touch location, and the electronic device 110 receives a zoom operation for the target interface; stops displaying a second control in response to a first control and the second control in the set of controls overlapping, wherein the first control and the second control have a corresponding target function; and, in response to a second touch location of a received second touch input being within a target control region of the first control, executes a second operation corresponding to the first control based on the second touch input.
In some embodiments, the first control is determined from a plurality of target controls corresponding to the target function based on historical trigger information for the first control, the historical trigger information indicating a number of times the target function was triggered using the first control.
In some embodiments, the target function includes equal-proportion scaling of the target object.
In some embodiments, the target interface comprises a media editing interface and the target object comprises a media element to be edited.
In some embodiments, the first operation comprises: a scaling operation on the media element; or a cropping operation on the media element.
Example apparatus and device
Embodiments of the present disclosure also provide corresponding apparatuses for implementing the above methods or processes. Fig. 9 illustrates a schematic block diagram of an apparatus 900 for interactive control according to some embodiments of the present disclosure.
As shown in fig. 9, apparatus 900 may include a presentation module 910 configured to present a set of controls associated with a target object in a target interface. The apparatus 900 may further include a target control determination module 920 configured to determine, in response to a touch location of a received touch input being within a common control region of a plurality of controls in the set of controls, an activated target control of the plurality of controls based on directional information associated with the touch input, wherein the control region indicates a range for controlling the corresponding control. The apparatus 900 may further include a first operation performing module 930 configured to perform a first operation corresponding to the target control on the target object based on the touch input.
In some embodiments, the target control determination module 920 is further configured to: and determining the target control corresponding to the target angle range from the plurality of controls in response to the direction information being matched with the target angle range.
In some embodiments, the apparatus 900 further comprises: a range information acquisition module configured to acquire range information, the range information indicating a plurality of angular ranges corresponding to the plurality of controls; and a range matching module configured to determine, based on the range information, that the direction information matches the target angular range of the plurality of angular ranges.
In some embodiments, the range information is determined based on the following process: obtaining historical touch information, the historical touch information generated based on a set of historical touch inputs for the plurality of controls; and determining the plurality of angle ranges corresponding to the plurality of controls based on the direction indicated by the historical touch information.
In some embodiments, the apparatus 900 further comprises a direction information determination module configured to: determine the direction information associated with the touch input based on a trajectory of the touch input within a predetermined distance range.
In some embodiments, the touch input is a first touch input and the touch location is a first touch location, and the apparatus 900 further includes: a zoom operation receiving module configured to receive a zoom operation for the target interface; a display stopping module configured to stop displaying a second control in response to a first control and the second control in the set of controls overlapping, wherein the first control and the second control have a corresponding target function; and a second operation execution module configured to, in response to a second touch location of a received second touch input being within a target control region of the first control, execute a second operation corresponding to the first control based on the second touch input.
In some embodiments, the first control is determined from a plurality of target controls corresponding to the target function based on historical trigger information for the first control, the historical trigger information indicating a number of times the target function was triggered using the first control.
In some embodiments, the target function includes equal-proportion scaling of the target object.
In some embodiments, the target interface comprises a media editing interface and the target object comprises a media element to be edited.
In some embodiments, the first operation comprises: a scaling operation on the media element; or a cropping operation on the media element.
The elements included in apparatus 900 may be implemented in various manners, including software, hardware, firmware, or any combination thereof. In some embodiments, one or more units may be implemented using software and/or firmware, such as machine-executable instructions stored on a storage medium. In addition to or in lieu of machine-executable instructions, some or all of the elements in apparatus 900 may be at least partially implemented by one or more hardware logic components. By way of example and not limitation, exemplary types of hardware logic components that can be used include Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
Fig. 10 illustrates a block diagram of a computing device/server 1000 in which one or more embodiments of the present disclosure may be implemented. It should be understood that the computing device/server 1000 illustrated in FIG. 10 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein.
As shown in FIG. 10, the computing device/server 1000 is in the form of a general-purpose computing device. Components of the computing device/server 1000 may include, but are not limited to, one or more processors or processing units 1010, a memory 1020, a storage device 1030, one or more communication units 1040, one or more input devices 1050, and one or more output devices 1060. The processing unit 1010 may be an actual or virtual processor and is capable of executing various processes according to programs stored in the memory 1020. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capability of the computing device/server 1000.
The computing device/server 1000 typically includes multiple computer storage media. Such media can be any available media that is accessible by the computing device/server 1000, including, but not limited to, volatile and non-volatile media, and removable and non-removable media. The memory 1020 may be volatile memory (e.g., registers, cache, Random Access Memory (RAM)), non-volatile memory (e.g., Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory), or some combination thereof. The storage device 1030 may be a removable or non-removable medium and may include a machine-readable medium such as a flash drive, a magnetic disk, or any other medium capable of storing information and/or data and accessible within the computing device/server 1000.
The computing device/server 1000 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 10, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 1020 may include a computer program product 1025 having one or more program modules configured to perform the various methods or acts of the various embodiments of the disclosure.
Communication unit 1040 enables communication with other computing devices via a communication medium. Additionally, the functionality of the components of the computing device/server 1000 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Accordingly, the computing device/server 1000 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 1050 may be one or more input devices such as a mouse, keyboard, trackball, etc. The output device 1060 may be one or more output devices such as a display, speakers, printer, etc. The computing device/server 1000 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., as needed through the communication unit 1040, with one or more devices that enable users to interact with the computing device/server 1000, or with any device (e.g., network card, modem, etc.) that enables the computing device/server 1000 to communicate with one or more other computing devices. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium is provided, on which one or more computer instructions are stored, wherein the one or more computer instructions are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.

Claims (13)

1. An interaction control method, comprising:
presenting a set of controls associated with a target object in a target interface;
in response to a touch location of a received touch input being within a common control region of a plurality of controls in the set of controls, determining an activated target control of the plurality of controls based on direction information associated with the touch input, wherein a control region indicates a range for controlling the corresponding control; and
executing a first operation corresponding to the target control on the target object based on the touch input.
2. The method of claim 1, wherein determining an activated target control of the plurality of controls based on directional information associated with the touch input comprises:
determining, in response to the direction information matching a target angle range, the target control corresponding to the target angle range from among the plurality of controls.
3. The method of claim 2, further comprising:
acquiring range information, wherein the range information indicates a plurality of angle ranges corresponding to the plurality of controls; and
determining, based on the range information, that the direction information matches the target angle range of the plurality of angle ranges.
4. A method according to claim 3, wherein the range information is determined based on the following procedure:
obtaining historical touch information, the historical touch information generated based on a set of historical touch inputs for the plurality of controls; and
determining the plurality of angle ranges corresponding to the plurality of controls based on the directions indicated by the historical touch information.
5. The method of claim 1, further comprising:
determining the direction information associated with the touch input based on a trajectory of the touch input within a predetermined distance range.
6. The method of claim 1, wherein the touch input is a first touch input, the touch location is a first touch location, the method further comprising:
receiving a zoom operation for the target interface;
stopping display of a second control in response to a first control and the second control in the set of controls overlapping, wherein the first control and the second control have a corresponding target function; and
in response to a second touch location of a received second touch input being within a target control region of the first control, executing a second operation corresponding to the first control based on the second touch input.
7. The method of claim 6, wherein the first control is determined from a plurality of target controls corresponding to the target function based on historical trigger information for the first control, the historical trigger information indicating a number of times the target function was triggered using the first control.
8. The method of claim 6 or 7, wherein the target function comprises equal-proportion scaling of the target object.
9. The method of claim 1, wherein the target interface comprises a media editing interface and the target object comprises a media element to be edited.
10. The method of claim 9, wherein the first operation comprises:
a scaling operation on the media element; or
a cropping operation on the media element.
11. An interactive control device, comprising:
a presentation module configured to present a set of controls associated with a target object in a target interface;
a target control determination module configured to determine, in response to a touch location of a received touch input being within a common control region of a plurality of controls in the set of controls, an activated target control of the plurality of controls based on direction information associated with the touch input, wherein a control region indicates a range for controlling the corresponding control; and
a first operation execution module configured to execute a first operation corresponding to the target control on the target object based on the touch input.
12. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, which when executed by the at least one processing unit, cause the electronic device to perform the method of any one of claims 1 to 10.
13. A computer readable storage medium having stored thereon a computer program which when executed by a processor implements the method according to any of claims 1 to 10.
CN202410022300.3A 2024-01-05 2024-01-05 Interactive control method, device, equipment and storage medium Pending CN117827068A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410022300.3A CN117827068A (en) 2024-01-05 2024-01-05 Interactive control method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117827068A (en) 2024-04-05



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination