CN117950557A - Picture interaction method, electronic device and storage medium - Google Patents

Picture interaction method, electronic device and storage medium

Publication number: CN117950557A
Authority: CN (China)
Prior art keywords: picture, primary, touch operation, list, item
Legal status: Pending (assumed; Google has not performed a legal analysis)
Application number: CN202410015192.7A
Other languages: Chinese (zh)
Inventor: 谭成龙
Current Assignee: Wuhan Xingji Meizu Technology Co., Ltd.
Application filed by Wuhan Xingji Meizu Technology Co., Ltd.
Priority to CN202410015192.7A
Publication of CN117950557A

Abstract

The application discloses a picture interaction method, an electronic device, and a storage medium, which belong to the field of terminal technologies. The method includes: detecting a touch operation on a picture; in response to the touch operation, displaying a primary list, where the primary list includes a plurality of primary operation items; adjusting the selected primary operation item in the primary list as the touch operation continues in a first direction; and, in response to the primary operation item selected when the touch operation ends, executing the operation corresponding to the selected primary operation item on the picture.

Description

Picture interaction method, electronic device and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a picture interaction method, an electronic device, and a storage medium.
Background
With the popularization of mobile terminals and the development of mobile terminal technologies, the functions offered by mobile terminal products keep increasing, and the user operations required to use those functions tend to become more complex.
For example, in a gallery application, a user needs to tap a picture to enter a picture detail interface, and then tap operation buttons at the bottom of the screen to execute the corresponding operations. When an operation button has subordinate operation items, the user also needs to tap a subordinate item to carry out the specific operation. The user therefore needs multiple taps to accomplish a single function, and this complex, tedious flow degrades the user experience.
Disclosure of Invention
In a first aspect, an embodiment of the present application provides a picture interaction method, including:
detecting a touch operation on a picture;
in response to the touch operation, displaying a primary list, where the primary list includes a plurality of primary operation items;
adjusting the selected primary operation item in the primary list as the touch operation continues in a first direction; and
in response to the primary operation item selected when the touch operation ends, executing the operation corresponding to the selected primary operation item on the picture.
In some embodiments, displaying a primary list in response to the touch operation includes:
dividing the picture into regions based on the position at which the touch operation starts, to obtain at least two sub-regions; and
displaying the primary list in one of the at least two sub-regions.
In some embodiments, dividing the picture into at least two sub-regions includes:
dividing the picture into an upper sub-region and a lower sub-region based on the position, dividing the picture into a left sub-region and a right sub-region based on the position, or dividing the picture into upper-left, upper-right, lower-left, and lower-right sub-regions based on the position.
In some embodiments, the method further includes:
displaying a secondary list formed by secondary operation items when the selected primary operation item has secondary operation items;
adjusting the selected secondary operation item in the secondary list as the touch operation continues in a second direction, where the second direction is orthogonal to the first direction; and
in response to the secondary operation item selected when the touch operation ends, executing the operation corresponding to the selected secondary operation item on the picture.
In some embodiments, adjusting the selected primary operation item in the primary list as the touch operation continues in the first direction includes:
when the speed of the touch operation is below a speed threshold, adjusting the selected primary operation item in the primary list as the touch operation continues in the first direction.
In some embodiments, the method further includes:
when the speed of the touch operation is greater than or equal to the speed threshold and the direction of the touch operation is the first direction, exiting the display interface of the picture.
In some embodiments, the method further includes:
switching the picture when the direction of the touch operation is the second direction, where the second direction is orthogonal to the first direction.
In some embodiments, displaying the primary list further includes:
taking a default operation item in the primary list as the initially selected primary operation item,
where the default operation item is determined based on historical operation information of each primary operation item in the primary list.
In some embodiments, determining the default operation item includes:
determining, for each primary operation item, the probability of that item being selected in the current touch operation, based on the execution count and/or the most recent execution time in the item's historical operation information; and
determining the primary operation item with the highest selection probability as the default operation item.
In some embodiments, the ordering of the primary operation items in the primary list is determined based on historical operation information for the primary operation items.
In a second aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements a method as described in any one of the above when executing the program.
In a third aspect, embodiments of the present application provide a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method as described in any of the above.
In a fourth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements a method as described in any of the above.
Drawings
To more clearly illustrate the technical solutions of the present application or of the prior art, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. The drawings described here show some embodiments of the application; other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a terminal according to an embodiment of the present application;
Fig. 2 is a first schematic flow chart of a picture interaction method according to an embodiment of the present application;
Fig. 3 is a first schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 4 is a second schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 5 is a third schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 6 is a fourth schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 7 is a fifth schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 8 is a sixth schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 9 is a seventh schematic diagram of a display effect of picture interaction according to an embodiment of the present application;
Fig. 10 is a second schematic flow chart of a picture interaction method according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are some, not all, of the embodiments of the application. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. Such terms are interchangeable where appropriate, so that the embodiments of the application can operate in orders other than those illustrated or described herein. Objects distinguished by "first" and "second" are generally of one type, and their number is not limited; for example, a first object may be one object or several. In the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
For current mobile terminals, especially those that interact through a touch screen, when a user opens a gallery or album application and taps a picture to view its details, there are generally two cases:
In one case, a row of commonly used operation buttons, such as send, edit, collect, delete, and more, is displayed immediately at a certain position on the picture (e.g., below it), and the user can tap these buttons directly to perform operations. This obviously detracts from the experience of viewing the picture.
In the other case, no operation buttons are shown at first; the user must tap the picture before the buttons appear, and then tap the required button to perform the operation, i.e., the user must lift the touching finger in order to perform the tap.
In either case, the user's operation takes two steps: first select the picture, then move to the corresponding position on the screen to tap the corresponding operation button, rather than operating at the position where the user's finger already is. Such a design clearly increases the operational complexity for the user.
The application therefore provides a picture interaction method that displays a primary list in response to a touch operation on a picture and adjusts the selected primary operation item while the touch operation continues, so that the selected primary operation item is executed once the touch operation ends. With this method, a user can go from triggering the primary list, to selecting a primary operation item, to executing it, all with a single touch slide. An execution flow that previously required multiple taps can be completed by sliding and lifting the finger, which greatly simplifies interaction with the picture and improves the user's interaction experience.
The picture interaction method provided by the embodiments of the application can be applied to terminals such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA), and can also be applied to databases, servers, and service response systems based on terminal artificial intelligence.
For example, the terminal may be a station (ST) in a WLAN, a cellular telephone, a cordless telephone, a Session Initiation Protocol (SIP) telephone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a computer, a laptop computer, a handheld communication device, a handheld computing device, and/or another device for communicating over a wireless system, as well as a mobile terminal in a next-generation communication system such as a 5G network, a future evolved public land mobile network (PLMN), or a future evolved non-terrestrial network (NTN).
By way of example and not limitation, when the terminal is a wearable device, the wearable device may also be a general term for everyday wearables intelligently designed and developed using wearable technology, such as gloves, watches, augmented reality (AR) head-mounted display devices, virtual reality (VR) head-mounted display devices, or mixed reality (MR) head-mounted display devices configured with far-field communication modules and/or near-field communication modules.
In some embodiments, the terminal may be a mobile phone 100 with the hardware structure shown in fig. 1. As shown in fig. 1, the mobile phone 100 may specifically include: radio frequency (RF) circuitry 110, a memory 120, an input unit 130, a display unit 140, a sensor 150, audio circuitry 160, a short-range wireless communication module 170, a processor 180, and a power supply 190. Those skilled in the art will appreciate that the configuration of the mobile phone 100 shown in fig. 1 does not limit the electronic device, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
The following describes the components of the mobile phone in detail with reference to fig. 1:
The RF circuit 110 may be used to receive and transmit signals during information transfer or a call; in particular, it delivers downlink information received from a base station to the processor 180 for processing, and sends uplink data to the base station. Typically, RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 110 may communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including the Global System for Mobile Communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time division-synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), new radio (NR), GNSS, FM, low-earth-orbit satellite connectivity, and/or IR techniques. The GNSS may include the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), the BeiDou Navigation Satellite System (BDS), the Quasi-Zenith Satellite System (QZSS), and/or a satellite-based augmentation system (SBAS).
The memory 120 may be used to store software programs and modules, and the processor 180 performs the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function or an image playing function); the data storage area may store data created during use of the phone (such as pictures, audio data, and phone books). In addition, the memory 120 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In particular, the memory 120 may store pictures taken by the electronic device or downloaded over a wireless network.
The input unit 130 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone 100. Specifically, the input unit 130 may include a touch panel 131 and other input devices 132. The touch panel 131, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 131 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 131 may include a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and passes the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 180; it can also receive and execute commands from the processor 180. The touch panel 131 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave panel, among other types. Besides the touch panel 131, the input unit 130 may include other input devices 132, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, and a joystick.
The display unit 140 may be used to display information input by the user, information provided to the user, and the various menus of the mobile phone. The display unit 140 may include a display panel 141, which may optionally be a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, or the like. Further, the touch panel 131 may cover the display panel 141. When the touch panel 131 detects a touch operation on or near it, it passes the event to the processor 180 to determine the type of touch event, and the processor 180 then provides the corresponding visual output on the display panel 141 according to that type. Although in fig. 1 the touch panel 131 and the display panel 141 are two independent components implementing the input and output functions of the phone, in some embodiments they may be integrated to implement both functions.
The mobile phone 100 may also include at least one sensor 150, such as a light sensor, a motion sensor, or another sensor. Specifically, the light sensor may include an ambient light sensor, which can adjust the brightness of the display panel 141 according to the ambient light, and a proximity sensor, which can turn off the display panel 141 and/or the backlight when the phone is moved to the ear. As one kind of motion sensor, an accelerometer can detect acceleration in all directions (generally along three axes) and, when stationary, the magnitude and direction of gravity; it can be used for applications that recognize the phone's posture (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may also be configured on the phone, such as a gyroscope, barometer, hygrometer, thermometer, and infrared sensor, are not described in detail here.
The audio circuit 160, speaker 161, and microphone 162 may provide an audio interface between the user and the phone. The audio circuit 160 may transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 160 receives and converts into audio data. The audio data is then output to the processor 180 for processing and, for example, sent to another electronic device via the RF circuit 110, or output to the memory 120 for further processing.
Wi-Fi, Bluetooth, near field communication (NFC), and similar technologies are short-range wireless transmission technologies. Through the short-range wireless module 170, the phone can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. The short-range wireless module 170 may include a Wi-Fi chip, a Bluetooth chip, and an NFC chip. Through the Wi-Fi chip, the mobile phone 100 can establish a Wi-Fi Direct connection with other electronic devices, operate in AP mode (access point mode, providing a wireless access service and allowing other wireless devices to connect), or operate in STA mode (station mode, connecting to an AP without accepting connections from other wireless devices), thereby establishing peer-to-peer communication between the mobile phone 100 and other Wi-Fi devices.
The processor 180 is the control center of the phone. It connects the various parts of the phone using various interfaces and lines, and performs the phone's functions and processes its data by running or executing the software programs and/or modules stored in the memory 120 and calling the data stored in the memory 120, thereby monitoring the phone as a whole. Optionally, the processor 180 may include one or more processing units; for example, it may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices or may be integrated in one or more processors.
The mobile phone 100 further includes a power supply 190 (e.g., a battery) for powering the various components. The power supply may be logically connected to the processor 180 via a power management system, which manages charging, discharging, and power consumption.
The handset 100 may also include a camera. Optionally, the position of the camera on the mobile phone may be front or rear, which is not limited by the embodiment of the present application. The mobile phone can acquire a scene image of a current scene through the camera, and determine scene information and scene types through analyzing the scene image.
Fig. 2 is a schematic flow chart of a picture interaction method according to an embodiment of the present application. As shown in fig. 2, there is provided a picture interaction method, including the steps of: step 210, step 220, step 230, step 240. The method flow steps are only one possible implementation of the application.
Step 210, detecting a touch operation on the picture.
The picture here is the picture to be interacted with, i.e., the picture presented on the terminal's touch screen. It can be understood that, while the picture is displayed, the touch screen can show the picture directly, without displaying operation options by default. This improves the viewing experience: no overlaid options interfere with the user's view of the picture, providing an immersive experience.
The user may touch the picture through the touch screen of the terminal, where the user's touch action on the picture may occur in the area on the touch screen where the picture is displayed.
Accordingly, the terminal can detect a touch operation of the user on the picture.
And 220, responding to the touch operation, and displaying a primary list, wherein the primary list comprises a plurality of primary operation items.
Specifically, after detecting a touch operation of the user on the picture, the terminal may respond to the touch operation. A specific way of responding may be to display a primary list.
The primary list here includes a plurality of primary operation items, that is, options that can be operated on the picture, and the primary operation items can be understood as conventional options for operating on the picture, for example, "edit", "delete", "collect", "send", "more", and the like.
That is, while a picture is displayed, a touch operation on the picture may be detected and the primary list displayed in response; the user's touch operation triggers the terminal to display the primary list.
Further, the primary list may be displayed when the touch operation stays at one position for a short time without sliding (i.e., a press-and-hold).
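As a rough illustration of this trigger, the following is a minimal Kotlin sketch in which a press that dwells within a small slop radius brings up the primary list; the class name, dwell time, and movement tolerance are illustrative assumptions, not values from the application.

```kotlin
import kotlin.math.hypot

// Minimal sketch: a touch that dwells at one position (within a small slop
// radius) for a short time, without sliding, triggers the primary list.
data class TouchPoint(val x: Float, val y: Float, val timeMs: Long)

class PrimaryListTrigger(
    private val dwellMs: Long = 300L,   // assumed press-and-hold threshold
    private val slopPx: Float = 16f     // assumed "no sliding" tolerance
) {
    private var down: TouchPoint? = null
    var listVisible = false
        private set

    fun onDown(p: TouchPoint) {
        down = p
        listVisible = false
    }

    // Call on every move event (or timer tick while the finger is held).
    fun onMove(p: TouchPoint) {
        val d = down ?: return
        val moved = hypot(p.x - d.x, p.y - d.y) > slopPx
        if (!listVisible && !moved && p.timeMs - d.timeMs >= dwellMs) {
            listVisible = true  // display the primary list near the finger
        }
    }
}
```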
Step 230, adjusting the selected primary operation item in the primary list based on the process of continuing the touch operation in the first direction.
Specifically, the user may continue the touch operation while the terminal displays the primary list. Accordingly, the terminal continuously monitors the touch, and as the continuing touch moves along the first direction, the terminal adjusts the selected primary operation item in the primary list. Here, the first direction may be the direction in which the primary operation items are arranged in the primary list.
It can be understood that the user's touch operation on the picture both triggers the terminal to display the primary list and conveys the user's selection intent for the primary operation items: through the continuing touch, the user controls which item in the primary list is selected. That is, the selected primary operation item may be adjusted as the touch operation continues.
Further, the selected and unselected primary operation items may be displayed differently in the primary list: for example, with different font or border colors, different font sizes, or different animation effects. The embodiment of the present application does not limit this.
For example, fig. 3 is a first schematic diagram of a display effect of picture interaction according to an embodiment of the present application. The circles drawn on the touch screen in fig. 3 represent the position of the touch operation: the solid circle is the current position, the dotted circles are historical positions, and the arrow between them indicates the direction of the touch operation, i.e., the first direction. Relative to the user's touch, the first direction here is vertical, so the corresponding touch movement is up and down. The list drawn on the touch screen is the primary list, whose primary operation items may include "edit", "delete", "collect", "more", and "send"; the selected item is shown dot-filled with a white font. This is, of course, only an example, and those skilled in the art may adjust it, for example by adding or removing controls as required.
As shown in fig. 3, as the touch operation continues in the first direction, the selected primary operation item in the primary list is also adjusted from "edit" to "collect" and finally to "send".
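The adjustment itself can be pictured as mapping the finger's travel along the first direction to an index in the primary list. A minimal Kotlin sketch follows, assuming a fixed item height and the item names from fig. 3:

```kotlin
// Sketch: map the finger's vertical travel since the list appeared to an
// index in the primary list; the item height is an assumption.
val primaryItems = listOf("edit", "delete", "collect", "more", "send")

fun selectedIndex(startY: Float, currentY: Float, itemHeightPx: Float,
                  initialIndex: Int = 0): Int {
    val steps = ((currentY - startY) / itemHeightPx).toInt()
    return (initialIndex + steps).coerceIn(0, primaryItems.lastIndex)
}
```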
And step 240, responding to the selected primary operation item when the touch operation is finished, and executing the operation corresponding to the selected primary operation item on the picture.
Specifically, when the selected primary operation item in the primary list is the one the user wants, the user can lift the finger to end the touch. Accordingly, the terminal detects that the touch operation has ended, takes the primary operation item selected at that moment as the operation to be executed on the picture, and, in response, executes the operation corresponding to that item.
For example, if the primary operation item selected at the end of the touch operation is "collect", the picture is added to the collection; if it is "send", the terminal enters the picture-sending interface so that a recipient can be chosen and the picture sent.
For another example, fig. 4 is a second schematic diagram of a display effect of picture interaction according to an embodiment of the present application. As shown in fig. 4, if the selected primary operation item is "edit" when the touch operation ends, i.e., when the user's finger lifts, editing is performed: the terminal jumps to the picture-editing interface. As can be seen from fig. 4, the picture-editing interface may offer editing operations such as "beautify", "crop", "filter", and "enhance". This is only an example, and those skilled in the art may adjust it, for example by adding or removing controls as required.
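The end-of-touch handling then reduces to a dispatch on the selected item. A minimal Kotlin sketch, with placeholder actions standing in for the real gallery operations:

```kotlin
// Sketch: when the touch ends, execute whatever primary item is selected.
// The branch bodies are placeholders for the operations described above.
fun onTouchEnd(selectedItem: String, picture: String) {
    when (selectedItem) {
        "edit"    -> println("jump to the editing interface for $picture")
        "collect" -> println("add $picture to the collection")
        "send"    -> println("enter the sending interface for $picture")
        "delete"  -> println("delete $picture (subject to confirmation)")
        else      -> println("show more options for $picture")
    }
}
```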
In the embodiment of the application, the primary list is displayed in response to a touch operation on the picture, and the selected primary operation item is adjusted while the touch continues, so that the selected item is executed when the touch ends. A series of operations, from triggering the primary list, to selecting a primary operation item, to executing it, can thus be completed with a single touch slide. An execution flow that previously required multiple taps is replaced by sliding and lifting the finger, which greatly simplifies interaction with the picture and provides an efficient, smooth interaction experience.
It should be noted that the embodiments of the present application may be freely combined, reordered, or executed separately, and do not depend on a fixed execution sequence.
In some embodiments, in step 220, displaying a primary list in response to the touch operation includes:
dividing the picture into regions based on the position at which the touch operation starts, to obtain at least two sub-regions; and
displaying the primary list in one of the at least two sub-regions.
Specifically, to make the operation more convenient, so that the user can easily see the primary list and the selected item during the touch operation, the starting position of the touch operation, i.e., where the user's finger first presses on the touch screen, can be determined when the touch operation is detected.
After the starting position is determined, the picture can be divided into at least two sub-regions using that position as the reference point for division. For example, the picture may be divided into left and right sub-regions based on the starting position, so that the primary list can be displayed to the left or right of the finger; or into upper and lower sub-regions based on the starting position; or into four sub-regions, upper-left, lower-left, upper-right, and lower-right. The embodiment of the present application does not specifically limit this.
After the division, one sub-region is selected from the resulting sub-regions to display the primary list. One criterion is that the chosen sub-region must be at least as large as the primary list, e.g., its length and width should exceed those of the list. For example, fig. 5 is a third schematic diagram of a display effect of picture interaction according to an embodiment of the present application. The circle on the touch screen in fig. 5 represents the starting position of the touch operation, and the broken line is the dividing line determined from that position, splitting the picture into left and right sub-regions. In fig. 5, the left sub-region is used to display the primary list.
Alternatively, the sub-region may be chosen by area, e.g., the sub-region with the largest area may display the primary list; the embodiment of the present application does not specifically limit this.
Alternatively, the sub-region may be chosen as the one containing the fewest salient objects, where a salient object is a prominent element of the picture, such as a person or an object (e.g., a car, a plant, an animal, rocks, a building).
In some examples, salient objects may be determined by binarizing the picture to obtain a binarized image, performing edge detection on the binarized image to extract edge regions, and extracting a contour from each edge region to locate each salient object in the picture.
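As a heavily simplified Kotlin sketch of this idea, the picture can be binarized and the half with fewer foreground pixels used to host the primary list. The threshold, the dark-is-foreground rule, and the left/right split are assumptions; the fuller pipeline described above would add edge detection and contour extraction.

```kotlin
// Sketch under simplifying assumptions: count "salient" (foreground) pixels
// per half of a binarized grayscale picture and pick the emptier half.
fun foregroundCount(gray: Array<IntArray>, x0: Int, x1: Int, threshold: Int = 128): Int {
    var count = 0
    for (row in gray)
        for (x in x0 until x1)
            if (row[x] < threshold) count++  // assume dark pixels are foreground
    return count
}

fun sideForPrimaryList(gray: Array<IntArray>): String {
    val width = gray[0].size
    val left = foregroundCount(gray, 0, width / 2)
    val right = foregroundCount(gray, width / 2, width)
    return if (left <= right) "left" else "right"
}
```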
In the embodiment of the application, the picture is divided into the areas based on the starting position of the touch operation to determine the subareas for displaying the primary list, so that the display position of the primary list can be changed along with the position of the touch operation, a user can select the primary operation item more intuitively through the touch operation, and the user interaction experience is further optimized.
In some embodiments, in step 220, dividing the picture into at least two sub-regions includes:
dividing the picture into an upper sub-region and a lower sub-region based on the position, dividing the picture into a left sub-region and a right sub-region based on the position, or dividing the picture into upper-left, upper-right, lower-left, and lower-right sub-regions based on the position.
Specifically, using the starting position of the touch operation as the reference point, the picture may be divided into two sub-regions: split vertically to obtain left and right sub-regions, or split horizontally to obtain upper and lower sub-regions.
Alternatively, the picture may be divided into four sub-regions by splitting both horizontally and vertically, yielding upper-left, upper-right, lower-left, and lower-right sub-regions.
It can be understood that the division mode may be a default, may be preset by the user, or may be chosen adaptively by the terminal according to the picture size. For example, a size threshold may be preset: when the picture is smaller than the threshold, a two-way division is used, splitting only horizontally or only vertically to obtain upper and lower, or left and right, sub-regions; when the picture is at least the threshold size, a four-way division is used, splitting both horizontally and vertically to obtain the four quadrant sub-regions.
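A minimal Kotlin sketch of this adaptive choice follows; the pixel threshold and the aspect-ratio rule used in the two-way case are assumptions.

```kotlin
// Sketch: pick a split mode from the picture size per the threshold rule above.
enum class SplitMode { TWO_TOP_BOTTOM, TWO_LEFT_RIGHT, FOUR_QUADRANTS }

fun chooseSplitMode(widthPx: Int, heightPx: Int, sizeThresholdPx: Int = 1200): SplitMode =
    when {
        maxOf(widthPx, heightPx) >= sizeThresholdPx -> SplitMode.FOUR_QUADRANTS
        widthPx >= heightPx -> SplitMode.TWO_LEFT_RIGHT   // wide picture: split vertically
        else -> SplitMode.TWO_TOP_BOTTOM                  // tall picture: split horizontally
    }
```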
In some embodiments, the method further includes:
displaying a secondary list formed by secondary operation items when the selected primary operation item has secondary operation items;
adjusting the selected secondary operation item in the secondary list as the touch operation continues in a second direction, where the second direction is orthogonal to the first direction; and
in response to the secondary operation item selected when the touch operation ends, executing the operation corresponding to the selected secondary operation item on the picture.
Specifically, among the operations for the picture, some primary operation items have corresponding secondary operation items, which may be confirmation items for the primary operation or refined operation items covered by it.
For example, the primary operation item "delete" is relatively sensitive: once performed, the picture may be difficult to recover, so the user is usually asked to confirm before deletion is executed. The item "delete" may therefore correspond to two secondary operation items, "yes" and "no". Fig. 6 is a fourth schematic diagram of a display effect of picture interaction according to an embodiment of the present application; as shown in fig. 6, when the selected primary operation item is "delete" during the touch operation, its two secondary operation items, "yes" and "no", may be displayed.
For another example, the primary operation item "more" may cover several refined operation items, and may therefore correspond to multiple secondary operation items, such as "extract text", "set as wallpaper", "rename", and "details".
During the continuing touch operation, if the selected primary operation item has secondary operation items, the corresponding secondary list is displayed; the secondary list consists of the secondary operation items corresponding to that primary operation item.
Here, to make it easy to distinguish whether the user's continuing touch is adjusting the selected primary operation item or the selected secondary operation item in the corresponding secondary list, the secondary operation items may be arranged along a second direction orthogonal to the first direction.
The direction of the touch movement then disambiguates the two cases: movement continuing in the first direction adjusts the selected primary operation item in the primary list, while movement continuing in the second direction adjusts the selected secondary operation item in the secondary list.
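A minimal Kotlin sketch of this routing, assuming the first direction is vertical and the second horizontal as in the figures:

```kotlin
import kotlin.math.abs

// Sketch: whichever axis dominates the latest movement delta decides which
// list the gesture is adjusting.
fun routeMove(dx: Float, dy: Float,
              adjustPrimary: (Float) -> Unit,
              adjustSecondary: (Float) -> Unit) {
    if (abs(dy) >= abs(dx)) adjustPrimary(dy) else adjustSecondary(dx)
}
```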
For example, fig. 7 is a fifth schematic diagram of a display effect of picture interaction according to an embodiment of the present application. The circles drawn on the touch screen in fig. 7 represent the position of the touch operation: the solid circle is the current position, the dotted circle is a historical position, and the arrow between them indicates the direction of the touch operation. In fig. 7, the secondary operation items are arranged horizontally, i.e., along the second direction; relative to the user's touch, the second direction is therefore left-right, and the corresponding touch movement is left and right.
As shown in fig. 7, as the touch operation continues in the second direction (e.g., rightward), the selected secondary operation item in the secondary list is also adjusted from "yes" to "no", and correspondingly, as the touch operation proceeds in the second direction (e.g., leftward), the selected secondary operation item in the secondary list is also adjusted from "no" to "yes".
Fig. 8 is a sixth schematic diagram of a display effect of picture interaction according to an embodiment of the present application. As shown in fig. 8, the initially selected primary operation item is "delete", which has secondary operation items, so the secondary items "yes" and "no" of "delete" are also displayed on the touch screen.
As the touch operation continues along the arrangement direction of the primary operation items, i.e., the first direction (e.g., downward), the selected primary operation item changes from "delete" to "collect", which has no secondary operation items, so the secondary list disappears from the touch screen. Then, as the touch continues in the first direction (e.g., upward), the selected primary operation item changes from "collect" to "edit"; "edit" also has no secondary operation items, so no secondary list is displayed.
Further, at the end of the touch operation, if the selected primary operation item has a corresponding secondary operation item, an operation corresponding to the selected secondary operation item is performed.
For example, fig. 9 is a seventh schematic diagram of a display effect of picture interaction according to an embodiment of the present application. As shown in fig. 9, if the selected primary operation item is "delete" and the selected secondary operation item under it is "yes", then when the touch operation ends, i.e., when the user's finger lifts, the deletion is performed and the terminal jumps to the deletion interface. As can be seen from fig. 9, the message "Picture has been moved to the recycle bin!" prompts the user that the picture deletion is complete.
In the embodiment of the application, in response to a touch operation on the picture, a series of operations, from triggering the primary list, to selecting a primary operation item, to selecting a secondary operation item under it, to executing it, can be completed with a single touch. An execution flow that previously required multiple taps is replaced by sliding and lifting the finger, which greatly simplifies interaction with the picture and provides an efficient, smooth interaction experience.
In some embodiments, in step 230, adjusting the selected primary operation item in the primary list as the touch operation continues in the first direction includes:
when the speed of the touch operation is below a speed threshold, adjusting the selected primary operation item in the primary list as the touch operation continues in the first direction.
Specifically, in the embodiment of the present application, the touch operation may carry several possible intents, and the speed of the touch operation may be used to tell them apart. A speed threshold may be preset: when the touch speed is below the threshold, the intent is judged to be operating on the picture; in response, the selected primary operation item in the primary list is adjusted during the touch, and when the touch ends the terminal executes the corresponding operation on the picture based on the selected item.
Here, the speed threshold may be a default value, or may be obtained by collecting statistics on the speed of the user's everyday touch operations. For example, the speeds of the user's daily touch operations may be collected, the speed interval in which the user most often operates identified, and the upper limit of that interval taken as the maximum speed of a normal-speed touch, i.e., the speed threshold. When the speed of a touch operation is below a threshold determined this way, the user can be assumed to be touching at normal speed, and the selected primary operation item is adjusted and executed.
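A minimal Kotlin sketch of both pieces, speed estimation and threshold derivation, follows. The 90th-percentile choice is an assumed concrete reading of "the upper limit of the most common speed interval", not a value from the application.

```kotlin
import kotlin.math.hypot

// Sketch: estimate the touch speed from two consecutive samples, and derive
// the threshold from recorded everyday speeds.
data class Sample(val x: Float, val y: Float, val timeMs: Long)

fun speedPxPerMs(a: Sample, b: Sample): Float {
    val dt = (b.timeMs - a.timeMs).coerceAtLeast(1L)
    return hypot(b.x - a.x, b.y - a.y) / dt
}

fun speedThreshold(dailySpeeds: List<Float>): Float {
    require(dailySpeeds.isNotEmpty())
    val sorted = dailySpeeds.sorted()
    return sorted[((sorted.size - 1) * 9) / 10]  // ~90th percentile (assumption)
}
```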
In some embodiments, the method further includes:
when the speed of the touch operation is greater than or equal to the speed threshold and the direction of the touch operation is the first direction, exiting the display interface of the picture.
Specifically, in the case where a touch operation is detected and the speed of the touch operation is equal to or greater than the above-described speed threshold, it may be determined that the intention of the touch operation at this time is not to operate on the picture.
In this case, the user's intention to perform the touch operation may be further resolved based on the direction of the touch operation.
At this time, assuming that the direction of the touch operation is the first direction, it may be determined that the intention of the touch operation is to exit, and thus the terminal may exit the display interface of the picture in response to the touch operation.
It can be understood that if the speed threshold is the maximum speed of a normal-speed touch and the touch speed meets or exceeds it, the touch is a quick slide. If, in addition, the first direction is vertical, the gesture is a quick up-or-down slide of the finger; that is, the user can exit the picture's display interface by flicking the finger quickly up or down.
In some embodiments, the method further includes:
switching the picture when the direction of the touch operation is the second direction, where the second direction is orthogonal to the first direction.
Specifically, in the case where a touch operation is detected and it is determined that the direction of the touch operation is the second direction, it may be determined that the intention of the touch operation at this time is not to operate with respect to the picture but to perform picture switching, whereby picture switching may be performed in response to the touch operation.
If the first direction is vertical, the second direction orthogonal to it is horizontal, and a touch in the second direction is a left-or-right slide of the finger; that is, the user can switch pictures by sliding left or right. Further, sliding from left to right switches to the previous picture, and sliding from right to left switches to the next picture.
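A minimal Kotlin sketch of the fast-gesture dispatch, with assumed callback names:

```kotlin
import kotlin.math.abs

// Sketch: a fast slide along the first direction (assumed vertical) exits the
// picture's display interface; a slide along the second direction (horizontal)
// switches pictures, left-to-right for the previous one, right-to-left for the next.
fun onFastGesture(dx: Float, dy: Float,
                  exitViewer: () -> Unit,
                  showPrevious: () -> Unit,
                  showNext: () -> Unit) {
    when {
        abs(dy) >= abs(dx) -> exitViewer()  // fast vertical slide
        dx > 0 -> showPrevious()            // left-to-right slide
        else -> showNext()                  // right-to-left slide
    }
}
```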
In some embodiments, in step 220, while displaying the primary list, the method further includes:
taking a default operation item in the primary list as the initially selected primary operation item,
where the default operation item is determined based on the historical operation information of each primary operation item in the primary list.
Specifically, while the primary list is displayed, the default operation item in the primary list may serve as the primary operation item initially selected by the touch operation. The subsequent touch movement in the first direction then adjusts the selection starting from this initially selected item.
Unlike related-art schemes that fix one particular primary operation item as the default, in the embodiment of the application the default operation item is chosen from the primary operation items based on their historical operation information. For any primary operation item, its historical operation information may include the number of times it was executed within a preset period and/or the time it was last executed.
From the historical operation information, it can be judged which primary operation item the user is most likely to select for the current image, and that item is used as the default, i.e., the initially selected item when responding to the touch operation. This raises the probability that the item the user wants is already selected when the primary list appears, so the user does not have to keep sliding to reach it, reducing the interaction burden and improving interaction efficiency.
In some embodiments, determining the default operation item includes:
determining, for each primary operation item, the probability of that item being selected in the current touch operation, based on the execution count and/or most recent execution time in its historical operation information; and
determining the primary operation item with the highest selection probability as the default operation item.
Specifically, the historical operation information of a primary operation item may include at least one of its execution count and its most recent execution time. The execution count reflects whether the user executes the item frequently, and the most recent execution time reflects whether the user has recently favoured it. The probability that the item will be selected and executed in the current touch operation, referred to here as the selection probability, can be determined from either or both.
It can be understood that a higher execution count implies a higher selection probability and a lower count a lower one; likewise, the more recent the last execution, the higher the selection probability, and the longer ago it was, the lower.
The selection probability of each primary operation item can therefore be computed, and the item with the highest probability taken as the default operation item.
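A minimal Kotlin sketch of such a scoring follows. The exponential recency decay with a one-day half-life is an assumed concrete form, since the text only requires that more executions and more recent use yield a higher probability.

```kotlin
// Sketch: score each primary item from its execution count weighted by
// recency, and pick the highest-scoring item as the default.
data class ItemHistory(val item: String, val execCount: Int, val lastExecMs: Long)

fun defaultItem(histories: List<ItemHistory>, nowMs: Long,
                halfLifeMs: Double = 86_400_000.0): String {
    require(histories.isNotEmpty())
    return histories.maxByOrNull { h ->
        val ageMs = (nowMs - h.lastExecMs).coerceAtLeast(0L).toDouble()
        h.execCount * Math.pow(0.5, ageMs / halfLifeMs)  // count x recency decay
    }!!.item
}
```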
In addition, in the related art, the order of the operation buttons for an image is generally fixed, and the user cannot adjust it to match personal habits, which limits the user's freedom and experience. To address this, in some embodiments, the ordering of the primary operation items in the primary list is determined based on the historical operation information of the primary operation items.
Specifically, in the embodiment of the present application, the order of the primary operation items in the primary list is not fixed, but is adjusted based on their historical operation information.
Here, for any primary operation item, its historical operation information may include the number of times the item was executed within a preset period, and may also include the time at which the item was last executed. From this information, the probability that the user selects each primary operation item when operating on the picture this time, i.e., the selected probability, can be judged. The primary operation items may then be sorted in descending order of selected probability; alternatively, they may be sorted from the most recently executed to the least recently executed, or from the most frequently executed to the least frequently executed.
Therefore, during touch sliding in the first direction, the user can reach the primary operation item they intend to execute within a shorter sliding distance, which further reduces the operation burden during interaction and improves interaction efficiency.
Further, the ordering of the primary operation items may differ in different scenarios. For example, historical operation information may be counted separately for the five primary operation items "edit", "delete", "collect", "send", and "more", with the following illustrative results:
Send (last executed 10 minutes ago, 100 times in total),
Edit (last executed 15 minutes ago, 300 times in total),
Collect (last executed 20 minutes ago, 150 times in total),
Delete (last executed 5 minutes ago, 200 times in total),
More (last executed 2 minutes ago, 120 times in total).
If ordered by the time of last execution, the ordering in the primary list is: more - delete - send - edit - collect;
if ordered by the number of executions, the ordering in the primary list is: edit - delete - collect - more - send.
In the embodiment of the application, the primary operation items are ordered according to their historical operation information, so that during touch sliding in the first direction the user can reach the primary operation item they intend to execute within a shorter sliding distance, further reducing the operation burden during interaction and improving interaction efficiency. A minimal sketch of these two orderings follows.
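The following Kotlin snippet reproduces the two example orderings above; the data class and field names are hypothetical and serve only to show the two sorting keys.

data class OpRecord(val name: String, val minutesSinceLastExec: Int, val totalExecs: Int)

fun main() {
    // The five primary operation items with the example statistics above.
    val items = listOf(
        OpRecord("send", 10, 100),
        OpRecord("edit", 15, 300),
        OpRecord("collect", 20, 150),
        OpRecord("delete", 5, 200),
        OpRecord("more", 2, 120),
    )
    // Ordered by time of last execution, nearest first:
    println(items.sortedBy { it.minutesSinceLastExec }.map { it.name })
    // -> [more, delete, send, edit, collect]
    // Ordered by number of executions, highest first:
    println(items.sortedByDescending { it.totalExecs }.map { it.name })
    // -> [edit, delete, collect, more, send]
}

Either key alone yields one of the orderings above; an implementation could also combine both signals into a single selected-probability score, as sketched earlier.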
In some embodiments, Fig. 10 is a second flowchart of a picture interaction method according to an embodiment of the present application. As shown in Fig. 10, the method includes the following steps:
S1, entering a display interface of a picture:
After entering the display interface of the picture, the picture and its details may be displayed in the display interface.
In addition, a primary list for the picture may be obtained, and the primary operation items in the primary list may be statistically sorted based on their respective historical operation information.
Subsequently, step S2 is performed.
S2, waiting for touch operation:
In this process, touch operations can be detected in real time. When a touch operation is detected, one of steps S2-1, S2-2, and S3 may be performed based on the touch manner of the operation.
S2-1, sliding the finger left and right to switch the picture:
In the case where the detected touch operation is left-right sliding after the finger is pressed, picture switching may be performed.
S2-2, pressing by a finger, rapidly sliding up and down, and exiting from the display interface:
In the case where the detected touch operation is a rapid up-and-down sliding after the finger is pressed, the display interface may be exited.
S3, the finger is pressed and does not slide for a short time; a primary list is displayed and a default operation item is selected:
When the detected touch operation is a finger press without sliding for a short time, a primary list is displayed based on the sorting of the primary operation items determined in step S1, and its default operation item is taken as the primary operation item initially selected by the touch operation. The default operation item here may be the first item in the primary list.
The display position of the primary list may be determined based on the press position; for example, the list may be displayed to the left or right of the position where the finger is pressed.
Subsequently, one of the steps S3-1, S3-2, S4 may be performed.
S3-1, directly lifting the finger, and executing the selected primary operation item:
When a finger lift is detected, i.e., the touch operation ends, the selected primary operation item may be executed.
S3-2, the finger slides up and down rapidly to exit the display interface, or slides left and right rapidly to switch the picture.
S4, sliding the finger up and down at normal speed, and adjusting the selected primary operation item:
When it is detected that the touch operation continues in the up-down direction at a normal speed, the selected primary operation item in the primary list is adjusted based on the direction in which the finger slides.
S5, if the selected primary operation item has a secondary list, the secondary list is displayed and a default secondary operation item is selected:
When the selected primary operation item has a secondary list, for example when the primary operation item is a sensitive operation item, the secondary list corresponding to that primary operation item is displayed, and a default secondary operation item in the secondary list is selected.
Subsequently, step S5-1 or S5-2 may be performed, or the flow may return to step S4.
S5-1, directly lifting the finger, and executing the selected secondary operation item:
When a finger lift is detected, i.e., the touch operation ends, the selected secondary operation item may be executed.
S5-2, sliding the finger left and right, and adjusting the selected secondary operation item:
When the touch operation is detected to continue in the left-right direction, the selected secondary operation item in the secondary list is adjusted based on the direction in which the finger slides.
S5-3, lifting the finger, and executing the selected secondary operation item:
After adjustment of the secondary operation item is completed, when a finger lift is detected, i.e., the touch operation ends, the selected secondary operation item may be executed.
According to the method provided by the embodiment of the application, a series of operations, from triggering the primary list, to selecting a primary operation item, to selecting a secondary operation item under that primary operation item, to executing the secondary operation item, is realized in response to a single touch operation on the picture. An execution flow that would otherwise require multiple clicks can thus be completed with one press, slide, and lift, which greatly simplifies the interaction flow on the picture and achieves an efficient, smooth interaction experience. A sketch of this flow follows.
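To make the flow of Fig. 10 concrete, here is a minimal Kotlin state-machine sketch of the gesture handling. Every class, callback, and parameter name, the speed-threshold convention, and the direction handling are assumptions introduced for illustration; the embodiment does not specify an implementation.

import kotlin.math.abs

// Hypothetical gesture controller mapping touch events to the steps of
// Fig. 10. Hold detection and speed computation are left to the host view.
enum class Phase { IDLE, PRIMARY_LIST, SECONDARY_LIST }

class PictureGestureController(
    private val speedThreshold: Float,                         // assumed fast/normal boundary
    private val showPrimaryList: (x: Float, y: Float) -> Unit,
    private val showSecondaryList: () -> Unit,
    private val selectedItemHasSecondaryList: () -> Boolean,
    private val movePrimarySelection: (steps: Int) -> Unit,
    private val moveSecondarySelection: (steps: Int) -> Unit,
    private val executeSelected: () -> Unit,
    private val switchPicture: (forward: Boolean) -> Unit,
    private val exitDisplay: () -> Unit
) {
    private var phase = Phase.IDLE

    // S3: the finger is pressed and has not slid for a short time.
    fun onHold(x: Float, y: Float) {
        showPrimaryList(x, y)       // default item pre-selected, e.g. first in the sorted list
        phase = Phase.PRIMARY_LIST
    }

    fun onSlide(dx: Float, dy: Float, speed: Float) {
        val vertical = abs(dy) > abs(dx)
        when {
            // S2-2 / S3-2: a fast vertical slide exits the display interface.
            speed >= speedThreshold && vertical -> exitDisplay()
            // S2-1 / S3-2: a horizontal slide (fast, or before any list is shown) switches the picture.
            (phase == Phase.IDLE || speed >= speedThreshold) && !vertical -> switchPicture(dx > 0)
            // S4: a normal-speed vertical slide adjusts the selected primary item.
            phase == Phase.PRIMARY_LIST && vertical -> {
                movePrimarySelection(if (dy > 0) 1 else -1)
                // S5: the newly selected primary item may own a secondary list.
                if (selectedItemHasSecondaryList()) {
                    showSecondaryList()
                    phase = Phase.SECONDARY_LIST
                }
            }
            // S5-2: a horizontal slide adjusts the selected secondary item.
            phase == Phase.SECONDARY_LIST && !vertical ->
                moveSecondarySelection(if (dx > 0) 1 else -1)
            // Return to S4: a vertical slide while the secondary list is shown.
            phase == Phase.SECONDARY_LIST && vertical ->
                movePrimarySelection(if (dy > 0) 1 else -1)
        }
    }

    // S3-1 / S5-1 / S5-3: lifting the finger executes the selected item.
    fun onLift() {
        if (phase != Phase.IDLE) executeSelected()
        phase = Phase.IDLE
    }
}

A host view would call onHold after its own long-press detection, feed move events into onSlide together with a computed slide speed, and call onLift when the finger leaves the screen; the comments map each branch to the corresponding step in Fig. 10.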
Fig. 11 illustrates a physical structure diagram of an electronic device. As shown in Fig. 11, the electronic device may include: a processor 1110, a communication interface 1120, a memory 1130, and a communication bus 1140, wherein the processor 1110, the communication interface 1120, and the memory 1130 communicate with each other through the communication bus 1140. The processor 1110 may invoke logic instructions in the memory 1130 to perform a picture interaction method, the method comprising:
Detecting a touch operation on the picture;
In response to the touch operation, displaying a primary list, wherein the primary list comprises a plurality of primary operation items;
based on the process of continuing the touch operation in the first direction, adjusting the selected primary operation item in the primary list;
And responding to the selected primary operation item when the touch operation is finished, and executing the operation corresponding to the selected primary operation item on the picture.
Further, the logic instructions in the memory 1130 may be implemented in the form of software functional units and, when sold or used as a stand-alone product, may be stored on a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In another aspect, the present application also provides a computer program product. The computer program product comprises a computer program, which may be stored on a non-transitory computer-readable storage medium; when the computer program is executed by a processor, the computer is capable of executing the method provided by the above method embodiments, the method comprising:
Detecting a touch operation on the picture;
In response to the touch operation, displaying a primary list, wherein the primary list comprises a plurality of primary operation items;
based on the process of continuing the touch operation in the first direction, adjusting the selected primary operation item in the primary list;
And responding to the selected primary operation item when the touch operation is finished, and executing the operation corresponding to the selected primary operation item on the picture.
In yet another aspect, the present application also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method provided by the above method embodiments, the method comprising:
Detecting a touch operation on the picture;
In response to the touch operation, displaying a primary list, wherein the primary list comprises a plurality of primary operation items;
based on the process of continuing the touch operation in the first direction, adjusting the selected primary operation item in the primary list;
And responding to the selected primary operation item when the touch operation is finished, and executing the operation corresponding to the selected primary operation item on the picture.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the solution without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware. Based on this understanding, the foregoing technical solution, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present application and are not limiting. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (12)

1. A picture interaction method, comprising:
Detecting a touch operation on the picture;
In response to the touch operation, displaying a primary list, wherein the primary list comprises a plurality of primary operation items;
based on the process of continuing the touch operation in the first direction, adjusting the selected primary operation item in the primary list;
And responding to the selected primary operation item when the touch operation is finished, and executing the operation corresponding to the selected primary operation item on the picture.
2. The picture interaction method according to claim 1, wherein the displaying a primary list in response to the touch operation comprises:
Responding to the position at which the touch operation starts, performing region division on the picture to obtain at least two sub-regions;
Displaying the primary list on one of the at least two sub-regions.
3. The picture interaction method according to claim 2, wherein the performing region division on the picture to obtain at least two sub-regions comprises:
Dividing the picture into an upper sub-region and a lower sub-region based on the position, or dividing the picture into a left sub-region and a right sub-region based on the position, or dividing the picture into an upper left sub-region, an upper right sub-region, a lower left sub-region and a lower right sub-region based on the position.
4. The picture interaction method according to claim 1, further comprising:
displaying a secondary list formed by the secondary operation items under the condition that the secondary operation items exist in the selected primary operation items;
adjusting the selected secondary operation item in the secondary list based on continuing the process of the touch operation in a second direction, wherein the second direction is orthogonal to the first direction;
And responding to the selected secondary operation item when the touch operation is finished, and executing the operation corresponding to the selected secondary operation item on the picture.
5. The picture interaction method according to claim 1, wherein the adjusting the selected primary operation item in the primary list based on the process of continuing the touch operation in the first direction comprises:
When the speed of the touch operation is less than a speed threshold, adjusting the selected primary operation item in the primary list based on the process of continuing the touch operation in the first direction.
6. The picture interaction method according to claim 1, further comprising:
and when the speed of the touch operation is greater than or equal to a speed threshold and the direction of the touch operation is the first direction, exiting the display interface of the picture.
7. The picture interaction method according to claim 1, further comprising:
Switching the picture when the direction of the touch operation is the second direction;
The second direction is orthogonal to the first direction.
8. The picture interaction method according to any one of claims 1 to 7, wherein, simultaneously with displaying the primary list, the method further comprises:
taking a default operation item in the primary list as an initially selected primary operation item;
the default operation item is determined based on the historical operation information of each primary operation item in the primary list.
9. The picture interaction method according to claim 8, wherein the determining of the default operation item includes:
determining, based on the number of executions and/or the latest execution time in the historical operation information of each primary operation item, the probability that each primary operation item is selected in the current touch operation;
and determining the primary operation item with the highest selected probability as the default operation item.
10. The picture interaction method according to any one of claims 1 to 7, wherein the ordering of the primary operation items in the primary list is determined based on historical operation information of the primary operation items.
11. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the picture interaction method according to any one of claims 1 to 10 when executing the program.
12. A non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the picture interaction method according to any one of claims 1 to 10.