WO2024045155A1 - Icon display control method, mobile terminal and storage medium - Google Patents

Icon display control method, mobile terminal and storage medium

Info

Publication number: WO2024045155A1
Authority: WIPO (PCT)
Prior art keywords: icon, target, area, touch area, display
Application number: PCT/CN2022/116724
Other languages: English (en), French (fr)
Inventor: 许可欣
Original Assignee: 深圳传音控股股份有限公司
Application filed by 深圳传音控股股份有限公司 filed Critical 深圳传音控股股份有限公司
Priority to PCT/CN2022/116724 priority Critical patent/WO2024045155A1/zh
Publication of WO2024045155A1 publication Critical patent/WO2024045155A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present application relates to the field of terminal technology, and specifically relates to an icon display control method, a mobile terminal and a storage medium.
  • a plurality of icons arranged in a matrix are displayed on the display interface of the mobile terminal, and the user clicks on any icon to start the application corresponding to the icon.
  • the arrangement form of existing desktop applications is mainly a matrix arrangement.
  • a user holding the mobile terminal with one hand cannot touch the entire area of the display screen, especially when an application icon is in an offset position.
  • with one-handed operation, the user cannot touch icons far from the holding hand. For example, when the user holds the left side of the terminal and the screen is too wide, the icons on the right side cannot be operated. Both hands are then often required, and the operation steps are cumbersome.
  • this application provides an icon display control method, a mobile terminal and a storage medium, so that users can operate any icon in the icon row based on the moved icon row, and the operation is simple and convenient.
  • this application provides an icon display control method, which is applied to mobile terminals, including:
  • the S20 also includes at least one of the following:
  • the target touch area is determined according to the second triggering action.
  • the first triggering action satisfies at least one of the following:
  • the first triggering action matches the preset action
  • the trigger duration is greater than or equal to the preset duration.
  • the S10 includes:
  • the icon row corresponding to the trigger position is determined as the target icon row.
  • the S10 also includes:
  • the icon display state of the target icon row is changed to prompt the user that the target icon row is in a movable state.
  • the method of determining the target touch area according to the second triggering action includes at least one of the following:
  • the method of determining the target touch area according to the trigger position of the first trigger action includes at least one of the following:
  • the target touch area is determined to be a preset area on the right side of the display interface.
  • the method of determining the target touch area according to the holding position includes at least one of the following:
  • the target touch area is determined to be a preset area on the right side of the display interface.
  • the S20 also includes:
  • the icons in the target icon row are arranged vertically in the target touch area, or the icons in the target icon row are arranged in an arc in the target touch area.
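The two arrangements named above (vertical and arc) can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the coordinate system, arc center, radius, and angle range are all assumed values (screen coordinates in pixels, angles in degrees).

```python
import math

def vertical_layout(n, x, top, spacing):
    """Place n icons in a vertical column at horizontal position x,
    starting at vertical position top, spaced by spacing."""
    return [(x, top + i * spacing) for i in range(n)]

def arc_layout(n, center, radius, start_deg=210.0, end_deg=330.0):
    """Place n icons along an arc around center (e.g. a point near the
    user's thumb), evenly spread between start_deg and end_deg."""
    if n == 1:
        angles = [math.radians((start_deg + end_deg) / 2)]
    else:
        step = (end_deg - start_deg) / (n - 1)
        angles = [math.radians(start_deg + i * step) for i in range(n)]
    return [
        (center[0] + radius * math.cos(a), center[1] + radius * math.sin(a))
        for a in angles
    ]
```

For a right-hand hold, an arc centered near the lower right corner roughly traces the reach of the thumb, which is the motivation for the arc arrangement.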
  • This application also provides a mobile terminal, including: a memory and a processor, wherein an icon display control program is stored on the memory, and when the icon display control program is executed by the processor, the steps of the above method are implemented.
  • the present application also provides a storage medium, on which a computer program is stored; when the computer program is executed, the steps of the above icon display control method are implemented.
  • the icon display control method of the present application includes the steps of: in response to a first trigger action, determining the target icon row corresponding to the first trigger action according to the first trigger action; and in response to a second trigger action, determining the target touch area and moving the icons in the target icon row to the target touch area.
  • Figure 1 is a schematic diagram of the hardware structure of a mobile terminal that implements various embodiments of the present application provided by an embodiment of the present application;
  • FIG. 2 is a communication network system architecture diagram provided by an embodiment of the present application.
  • Figure 3 is a schematic flowchart of an icon display control method according to the first embodiment
  • Figure 4 is a display interface diagram of an icon display control method according to the first embodiment
  • Figure 5 is a detailed flow chart of step S10 of the icon display control method according to the first embodiment
  • Figure 6 is a display interface diagram of an icon display control method according to the first embodiment
  • Figure 7 is a display interface diagram of an icon display control method according to the first embodiment
  • Figure 8 is a detailed flow chart of step S20 of the icon display control method according to the first embodiment
  • Figure 9 is a display interface diagram of the icon display control method according to the first embodiment.
  • Figure 10 is a display interface diagram of the icon display control method according to the first embodiment
  • Figure 11 is a display interface diagram of the icon display control method according to the first embodiment
  • Figure 12 is a display interface diagram of the icon display control method according to the first embodiment
  • Figure 13 is a schematic flowchart of an icon display control method according to the second embodiment
  • Figure 14 is a display interface diagram of an icon display control method according to the second embodiment
  • Figure 15 is a display interface diagram of an icon display control method according to the second embodiment
  • Figure 16 is a schematic flowchart of an icon display control method according to the third embodiment.
  • Figure 17 is a display interface diagram of an icon display control method according to the third embodiment.
  • Figure 18 is a schematic structural diagram of another mobile terminal provided by an embodiment of the present application.
  • Figure 19 is a schematic diagram of the hardware structure of a controller 140 provided by an embodiment of the present application.
  • Figure 20 is a schematic diagram of the hardware structure of a network node 150 provided by an embodiment of the present application.
  • first, second, third, etc. may be used herein to describe various information, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from each other.
  • first information may also be called second information, and similarly, the second information may also be called first information.
  • the word “if” as used herein may be interpreted as “when”, “upon”, or “in response to determining”.
  • singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context indicates otherwise.
  • “A, B, C” means “any of the following: A; B; C; A and B; A and C; B and C; A and B and C”; similarly, “A, B or C” or “A, B and/or C” means “any of the following: A; B; C; A and B; A and C; B and C; A and B and C”. Exceptions to this definition occur only when the combination of elements, functions, steps, or operations is inherently mutually exclusive in some manner.
  • although each step in the flow charts in the embodiments of the present application is displayed in the sequence indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, the execution order of these steps is not strictly limited, and they can be executed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages. These sub-steps or stages are not necessarily executed at the same time, but may be executed at different times and not necessarily sequentially; they may be performed in turn or alternately with other steps, or with sub-steps or at least part of the stages of other steps.
  • the words “if” or “in case” as used herein may be interpreted as “when”, “upon”, “in response to determining”, or “in response to detecting”.
  • the phrase “if determined” or “if (a stated condition or event) is detected” may be interpreted as “when determined”, “in response to determining”, “when (a stated condition or event) is detected”, or “in response to detecting (a stated condition or event)”.
  • step codes such as S301 and S302 are used for the purpose of describing the corresponding content more clearly and concisely, and do not constitute a substantial restriction on the sequence. Those skilled in the art may execute S302 first and then S301, etc., and this should still fall within the protection scope of this application.
  • Shooting equipment can be implemented in various forms.
  • the shooting equipment described in this application may include mobile terminals with cameras such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets and pedometers, as well as fixed terminals with cameras such as digital TVs and desktop computers.
  • a mobile terminal will be taken as an example.
  • the structure according to the embodiments of the present application can also be applied to fixed-type terminals.
  • FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal that implements various embodiments of the present application provided by an embodiment of the present application.
  • the mobile terminal 100 may include components such as a radio frequency (RF) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111.
  • the radio frequency unit 101 can be used to receive and send information or signals during a call. Specifically, downlink information received from the base station is processed by the processor 110, and uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, transceiver, coupler, low noise amplifier, duplexer, etc.
  • the radio frequency unit 101 can also communicate with the network and other devices through wireless communication.
  • the above wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communication), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), TDD-LTE (Time Division Duplexing-Long Term Evolution), etc.
  • WiFi is a short-distance wireless transmission technology.
  • through the WiFi module 102, the mobile terminal can help users send and receive emails, browse web pages, access streaming media, etc., providing users with wireless broadband Internet access.
  • although FIG. 1 shows the WiFi module 102, it can be understood that it is not a necessary component of the mobile terminal and can be omitted as needed without changing the essence of the application.
  • when the mobile terminal 100 is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, etc., the audio output unit 103 may convert the audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. Furthermore, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., call signal reception sound, message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, or the like.
  • the A/V input unit 104 is used to receive audio or video signals.
  • the A/V input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 can process image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frames may be displayed on the display unit 106.
  • the image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage media) or sent via the radio frequency unit 101 or WiFi module 102.
  • the microphone 1042 can receive sounds (audio data) in operating modes such as a phone call mode, a recording mode, and a voice recognition mode, and can process such sounds into audio data.
  • the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output in a phone call mode.
  • Microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
  • the mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 moves to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary.
  • It can be used for applications that identify mobile phone posture (such as horizontal and vertical screen switching, related games, magnetometer attitude calibration) and for vibration recognition related functions (such as pedometer, tapping); the mobile phone can also be configured with other sensors such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, and infrared sensor, which will not be described in detail here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
  • the user input unit 107 may be used to receive input numeric or character information, and generate key signal input related to user settings and function control of the mobile terminal.
  • the user input unit 107 may include a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user with a finger, stylus, or any suitable object or accessory on or near the touch panel 1071), and drive the corresponding connection device according to a preset program.
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact point coordinates, and sends them to the processor 110; it can also receive and execute commands sent by the processor 110.
  • the touch panel 1071 can be implemented using various types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, etc., which are not specifically limited here.
  • the touch panel 1071 can cover the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although in FIG. 1 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the mobile terminal; this is not limited here.
  • the interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100 .
  • external devices may include a wired or wireless headphone port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, etc.
  • the interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transfer data between the mobile terminal 100 and an external device.
  • Memory 109 may be used to store software programs as well as various data.
  • the memory 109 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), etc.;
  • the storage data area may store data created based on the use of the mobile phone (such as audio data, a phone book, etc.).
  • memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • the processor 110 is the control center of the mobile terminal; it uses various interfaces and lines to connect the various parts of the entire mobile terminal, and executes various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the mobile terminal as a whole.
  • the processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor.
  • the application processor mainly processes the operating system, user interface, application programs, etc., while the modem processor mainly handles wireless communications. It can be understood that the above modem processor may not be integrated into the processor 110.
  • the mobile terminal 100 may also include a power supply 111 (such as a battery) that supplies power to the various components.
  • the power supply 111 may be logically connected to the processor 110 through a power management system, thereby managing functions such as charging, discharging, and power consumption through the power management system.
  • the mobile terminal 100 may also include a Bluetooth module, etc., which will not be described again here.
  • FIG. 2 is an architecture diagram of a communication network system provided by an embodiment of the present application.
  • the communication network system is an LTE system of universal mobile communication technology.
  • the LTE system includes, connected in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and the operator's IP services 204.
  • UE201 may be the above-mentioned mobile terminal 100, which will not be described again here.
  • E-UTRAN202 includes eNodeB2021 and other eNodeB2022, etc.
  • eNodeB 2021 can be connected to other eNodeBs 2022 through backhaul (for example, the X2 interface); eNodeB 2021 is connected to the EPC 203 and can provide access from UE 201 to the EPC 203.
  • the EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, etc.
  • MME2031 is a control node that processes signaling between UE201 and EPC203, and provides bearer and connection management.
  • the HSS 2032 is used to provide registers to manage functions such as the home location register (not shown in the figure), and to save user-specific information about service characteristics, data rates, etc. All user data can be sent through the SGW 2034.
  • PGW2035 can provide IP address allocation and other functions for UE 201.
  • the PCRF 2036 is the policy and charging control policy decision point for service data flows and IP bearer resources; it selects and provides available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
  • the IP services 204 may include the Internet, an intranet, an IMS (IP Multimedia Subsystem), or other IP services.
  • Figure 3 shows a flow example diagram of a first embodiment of an icon display control method. The method includes the following steps:
  • S10: In response to a first trigger action, determine the target icon row corresponding to the first trigger action according to the first trigger action.
  • S20: In response to a second triggering action, move the icons in the target icon row to the target touch area.
  • the target touch area is determined based on the second trigger action.
  • the display interface of the mobile terminal displays at least one icon row arranged in sequence, and the icon row includes at least one application icon.
  • Figure 4 shows a schematic diagram of the display interface. It can be understood that when the user holds the mobile terminal in the right hand, if the icon that the user needs to operate is the icon in the upper left corner and that icon is too far from the user's hand, the user cannot operate it with one hand. Based on this, embodiments of the present application propose a method of controlling icon movement, which moves the icon that the user needs to operate into a target touch area that is convenient for the user to operate.
  • a target icon row of the first triggering action is selected from a plurality of icon rows in the current display interface according to the first triggering action; the target icon row is the icon row where the icon the user needs to operate is located, and the first triggering action is used to indicate the target icon row.
  • when receiving the first trigger action, determine whether the first trigger action satisfies a trigger condition.
  • the trigger condition is satisfied when the first trigger action satisfies at least one of the following items:
  • the first triggering action matches the preset action
  • the trigger duration is greater than or equal to the preset duration.
  • the preset action includes at least one of a long press action, a sliding action, a double-click action, and a single-click action; it can also be a long press followed by sliding. When the first trigger action is consistent with the preset action, it is determined that the first trigger action matches the preset action, and thus that the first trigger action satisfies the trigger condition.
  • the triggering duration is the maintenance duration of the first triggering action.
  • when the first triggering action is a long press action, the maintenance duration is the long press duration; when it is a sliding action, the maintenance duration is the sliding duration.
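The trigger-condition check described above (matching a preset action, or reaching a preset duration) can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the names `PRESET_ACTIONS` and `PRESET_DURATION` and the threshold value are assumptions.

```python
from dataclasses import dataclass

# Assumed preset values for illustration only.
PRESET_ACTIONS = {"long_press", "slide", "double_click", "click"}
PRESET_DURATION = 0.5  # seconds

@dataclass
class TriggerAction:
    kind: str        # e.g. "long_press" or "slide"
    duration: float  # maintenance duration of the action, in seconds

def satisfies_trigger_condition(action: TriggerAction) -> bool:
    """The condition holds if the action matches a preset action,
    or if its maintenance duration reaches the preset duration."""
    matches_preset = action.kind in PRESET_ACTIONS
    long_enough = action.duration >= PRESET_DURATION
    return matches_preset or long_enough
```

Because the two conditions are alternatives ("at least one of the following"), a short tap that matches a preset action qualifies even if its duration is below the threshold.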
  • the S10 includes:
  • determining the icon row corresponding to the triggering position as the target icon row includes: determining the display area of each icon row on the display interface, matching the trigger position with the display areas to determine the display area where the trigger position is located, and using the icon row corresponding to that display area as the target icon row.
  • Figure 6 shows a schematic diagram of determining the target icon row according to the trigger position. The icon display interface shown in Figure 6 displays the first icon row, the second icon row, and the third icon row in sequence from top to bottom; when the trigger position is located in the display area of the third icon row, the third icon row is used as the target icon row.
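The matching step above can be sketched as mapping the trigger position's vertical coordinate to the icon row whose display area contains it. This is an illustrative sketch only; rows are modeled as vertical bands stacked top-to-bottom, and the coordinates are assumed values.

```python
def target_icon_row(trigger_y: float, row_heights: list[float]) -> int:
    """Return the index of the icon row whose display area contains
    trigger_y, with rows stacked top-to-bottom starting at y = 0."""
    top = 0.0
    for index, height in enumerate(row_heights):
        if top <= trigger_y < top + height:
            return index
        top += height
    raise ValueError("trigger position outside all icon rows")
```

With three rows of height 100, a trigger at y = 250 falls in the third row's band (200 to 300), matching the Figure 6 example.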
  • alternatively, an associated trigger identifier can be preset for each icon row, and the trigger identifiers corresponding to the icon rows can be displayed in a preset area of the display interface, with different trigger identifiers corresponding to different display positions. When obtaining the trigger position of the first trigger action, the target display position matching the trigger position is determined, and the icon row associated with the trigger identifier corresponding to the target display position is used as the target icon row.
  • the trigger identifier associated with each icon row is displayed in a preset area of the display interface, so that the user can select the icon row in the preset area, improving the convenience of selecting the icon row.
  • the preset area includes at least one of the lower right corner area of the display interface, the lower left corner area of the display interface, and the bottom area of the display interface.
  • the preset area can also be determined in real time according to the handheld state.
  • the preset area is the lower right corner area of the icon display interface.
  • the preset area is the lower left corner area of the icon display interface.
  • the preset area is the bottom area of the icon display interface.
  • the method of obtaining the handheld state may be to determine the current handheld state based on the handheld data detected by a sensor of the mobile terminal, and use the current handheld state as the handheld state.
  • the sensors include but are not limited to gravity sensors, gyroscopes, and angular velocity sensors. The handheld state can also be determined according to the hand with which the user habitually holds the mobile terminal, or according to the triggering position of the first triggering action.
  • when the triggering position is on the left side of the center line of the display interface, it is determined that the holding state is left-hand holding; when the trigger position is located on the right side of the center line of the display interface, it is determined that the holding state is right-hand holding. The center line of the display interface is the center line in the vertical direction.
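The center-line rule above, together with the preset-area choice it drives, can be sketched as follows. This is a minimal illustrative sketch, assuming screen coordinates with x growing rightward; the `preset_area` mapping (identifiers placed near the holding hand) is an assumption consistent with the examples in this description.

```python
def holding_state(trigger_x: float, screen_width: float) -> str:
    """Left of the vertical center line -> left-hand hold;
    on or right of it -> right-hand hold."""
    center = screen_width / 2
    return "left" if trigger_x < center else "right"

def preset_area(state: str) -> str:
    # Assumed mapping: place the trigger identifiers near the holding hand.
    return "lower_left_corner" if state == "left" else "lower_right_corner"
```

A trigger at x = 100 on a 400-wide screen falls left of the center line (x = 200), so the holding state is inferred as a left-hand hold.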
  • the preset area can also be a user-customized setting based on his or her own needs.
  • the area within a preset range of the trigger position can also be determined as the preset area based on the triggering position of the first triggering action, so that the trigger identifier corresponding to each icon row is close to the user's hand. For example, when the trigger position is located in the lower right corner area of the display interface, the preset area is determined to be the lower right corner of the icon interface; when the trigger position is located in the middle position close to the right side, the corresponding area is used as the preset area; and when the trigger position is located in the upper right corner area of the display interface, the upper right corner area of the display interface is used as the preset area.
  • the trigger identifier of each icon row is displayed in the preset area according to the display position of each trigger identifier. The trigger position of the first trigger action is compared with the display positions of the trigger identifiers of the icon rows in the preset area to determine the target trigger identifier corresponding to the trigger position of the first trigger action, and the icon row associated with the target trigger identifier is used as the target icon row. By displaying the trigger identifier corresponding to each icon row in the preset area, the user can select the target icon row based on the preset area, preventing the user from being unable to select the target icon row with one hand because the screen is too large and the target icon row is too far from the user's hand.
  • FIG. 7 shows an example diagram of selecting the target icon row according to the trigger identification.
  • S10 further includes:
  • the icon display state of the target icon row is changed to prompt the user that the target icon row is in a movable state.
  • the icon display state includes an icon display mode, and the icon display mode includes at least one of icon display color, icon background color, icon display size, icon display brightness, and adding a preset mark within a preset range of the icon row, to prompt the user that the target icon row has been selected.
  • the icon display state further includes at least one of highlighted display and hidden display. After determining the target icon row, while the target icon row is highlighted, the other icon rows except the target icon row are hidden; or the target icon row is highlighted while the other icon rows are kept as they are.
  • The operation state of the target icon row is adjusted to an operable state. The operation state includes an operable state and a prohibited-operation state: the operable state indicates that at least one icon in the icon row may be moved to the target touch area, and the prohibited-operation state indicates that any operation on the icon row is refused.
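The operable/prohibited operation states can be illustrated with a small state sketch. The class and method names below are hypothetical, not from the application; the sketch only shows the rule that a row refuses moves until it has been selected as the target.

```python
from enum import Enum


class OperationState(Enum):
    OPERABLE = "operable"      # icons in the row may be moved to the target touch area
    PROHIBITED = "prohibited"  # any operation on the row is refused


class IconRow:
    def __init__(self, icons):
        self.icons = list(icons)
        # rows start out prohibited so other rows are not moved by mistake
        self.state = OperationState.PROHIBITED

    def select_as_target(self):
        # responding to the first triggering action makes the row operable
        self.state = OperationState.OPERABLE

    def move_to(self, touch_area):
        if self.state is not OperationState.OPERABLE:
            raise PermissionError("row is in the prohibited-operation state")
        touch_area.extend(self.icons)
```

Only the selected row can respond to the second triggering action; attempting to move any other row raises an error, mirroring the mis-trigger protection described above.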
  • The user can initiate the second triggering action from any position on the display interface, without being limited to the area where the target icon row is located. This improves the convenience of controlling the movement of the target icon row, solves the problem of easily triggering the movement of other icon rows by mistake when controlling the movement of the target icon row, and improves the accuracy of controlling the movement of the target icon row.
  • The target touch area is used to indicate the area reached by the target icon row after moving, that is, the display area of the target icon row after the move.
  • Optionally, the target touch area may be determined according to the second triggering action.
  • the S20 includes at least one of the following:
  • S22: In response to the second triggering action, determine the area to which the triggering position of the second triggering action belongs, determine the target touch area according to that area, and move the icons in the target icon row to the target touch area.
  • the triggering action includes but is not limited to at least one of a sliding action, a long press action, a single click action, and a double-click action.
  • When the second triggering action is a sliding action, the sliding trajectory of the second triggering action is obtained, the sliding direction is determined based on the sliding trajectory, and the touch area corresponding to the sliding direction is used as the target touch area.
  • Optionally, the memory of the mobile terminal stores a preset correspondence between sliding directions and touch areas. The preset correspondence between the sliding direction and the touch area includes at least one of the following:
  • when the sliding direction is toward the right, the preset area on the right side of the display interface is used as the touch area;
  • when the sliding direction is toward the left, the preset area on the left side of the display interface is used as the touch area.
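The stored correspondence between sliding directions and touch areas might be modeled as a lookup table keyed by the sliding direction. The following sketch is illustrative only; the direction test (the sign of the horizontal displacement of the trajectory) and the area names are assumptions, not details from the application.

```python
def sliding_direction(start, end):
    """Reduce a sliding trajectory to a left/right direction (sign of dx)."""
    dx = end[0] - start[0]
    return "right" if dx > 0 else "left"


# hypothetical preset correspondence stored in the terminal's memory
DIRECTION_TO_TOUCH_AREA = {
    "right": "right_preset_area",
    "left": "left_preset_area",
}


def target_touch_area(start, end):
    """Map the trajectory of the second triggering action to a touch area."""
    return DIRECTION_TO_TOUCH_AREA[sliding_direction(start, end)]
```

A swipe from (100, 500) to (600, 500) would thus select the right preset area as the target touch area.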
  • FIG. 9 shows an example diagram of the left preset area and the right preset area.
  • The right preset area may be the right sidebar area, the lower right corner area, the rightmost middle area, or the upper right corner area; the left preset area may be the left sidebar area, the lower left corner area, the leftmost middle area, or the upper left corner area. It can be understood that the left preset area and the right preset area are located on the display interface.
  • The user can operate any icon in the target icon row based on the left preset area or the right preset area.
  • This avoids the situation in which, when the user holds the mobile terminal with the left hand, the icons on the right side of the display interface are too far from the fingers of the user's left hand to be operated, or in which, when the user holds the mobile terminal with the right hand, the icons on the left side of the display interface are too far from the fingers of the user's right hand to be operated, thereby improving the convenience of operating icons with one hand.
  • Optionally, the target touch area may also be determined according to the second triggering action by obtaining the triggering position of the second triggering action, determining the area to which the triggering position belongs, and determining the target touch area according to that area.
  • When the second trigger action is a long-press action, a click action, or a double-click action, the long-press position of the long-press action, the click position of the click action, or the double-click position of the double-click action is used as the trigger position of the second trigger action.
  • Optionally, the area to which the triggering position belongs may be determined by obtaining the positional relationship between the triggering position and the center line of the display interface, and determining the area according to that positional relationship.
  • the area includes the left half area of the display interface and the right half area of the display interface.
  • the positional relationship includes the trigger position being located on the left side of the center line and the trigger position being located on the right side of the center line.
  • When the positional relationship is that the trigger position is located on the left side of the center line, it is determined that the area to which the trigger position belongs is the left half area of the display interface; when the positional relationship is that the trigger position is located on the right side of the center line, it is determined that the area to which the trigger position belongs is the right half area of the display interface.
  • the method of determining the target touch area based on the area includes at least one of the following:
  • when the area is the left half area of the display interface, the target touch area is the left preset area;
  • when the area is the right half area of the display interface, the target touch area is the right preset area;
  • the area within a preset range of the trigger position of the second trigger action, such as an area centered on the trigger position, is used as the target touch area. For example, when the trigger position is located in the lower right corner area, the lower right corner area is used as the target touch area.
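The "area within a preset range of the trigger position" option can be sketched as clamping a rectangular region centered on the trigger point to the screen bounds. This is an illustrative sketch; the function name and the radius value are assumptions.

```python
def area_around_trigger(trigger, radius, screen_w, screen_h):
    """Return a (left, top, right, bottom) rectangle centered on the
    trigger position and clamped to the screen bounds."""
    x, y = trigger
    return (
        max(0, x - radius),
        max(0, y - radius),
        min(screen_w, x + radius),
        min(screen_h, y + radius),
    )
```

A trigger near the lower right corner of a 1080 x 2340 screen thus yields a target touch area flush with the lower right corner, as in the example above.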
  • the S20 also includes:
  • Optionally, this embodiment of the present application arranges the icons in the target icon row vertically in the target touch area; refer to Figure 10, which shows a display interface diagram in which the icons in the target icon row are arranged vertically in the target touch area.
  • Optionally, the display areas of the other icon rows are compressed, and the icons in the other icon rows are displayed at a reduced size based on the compressed display area, to save display space.
  • each icon in the target icon row is sequentially displayed in the row gaps between other icon rows.
  • FIG. 11 shows a display interface diagram in which each icon in the target icon row is sequentially displayed in the row gaps between other icon rows.
  • Optionally, the icons in the target icon row can also be arranged in an arc in the target touch area; Figure 12 shows a display interface in which the icons in the target icon row are arranged in an arc within the target touch area.
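The arc arrangement can be illustrated by spacing icon positions evenly along a circular arc around a point near the user's thumb. The angle range, coordinate convention, and function name below are assumptions for illustration, not details from the application.

```python
import math


def arc_positions(n_icons, center, radius, start_deg=180, end_deg=270):
    """Place n icons evenly along a circular arc.

    center is the arc's pivot (e.g. the lower right screen corner near
    the thumb); angles are in degrees.
    """
    if n_icons == 1:
        angles = [(start_deg + end_deg) / 2]
    else:
        step = (end_deg - start_deg) / (n_icons - 1)
        angles = [start_deg + i * step for i in range(n_icons)]
    cx, cy = center
    return [
        (cx + radius * math.cos(math.radians(a)),
         cy + radius * math.sin(math.radians(a)))
        for a in angles
    ]
```

With the pivot at the lower right corner of a 1080 x 2340 screen, three icons fan out along a quarter circle within thumb reach.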
  • the user can operate the desired icon based on the icons of the target icon row displayed in the target touch area.
  • The left preset area may be the left sidebar area, the lower left corner area, the leftmost middle area, or the upper left corner area.
  • The first triggering action is used to indicate the icon row where the icon required by the user is located, and the second triggering action is used to indicate the target touch area to which that icon row needs to be moved. By responding to the first triggering action, the icon row where the icon required by the user is located is determined and used as the target icon row; by responding to the second triggering action, the target touch area is determined according to the second triggering action, and the icons in the target icon row are then moved to the target touch area.
  • The user can then perform the corresponding operations on the icons to be operated based on the target touch area, which improves the convenience of icon operation.
  • Figure 13 shows a schematic flow chart of the second embodiment of the icon display control method. The method includes the following steps:
  • S30: Determine the target touch area according to the triggering position of the first triggering action, and/or obtain the holding position and determine the target touch area according to the holding position, and move the icons in the target icon row to the target touch area.
  • In this embodiment, the target touch area is determined according to the first triggering action and/or the holding position, and the icons in the target icon row are moved to the target touch area.
  • the method of determining the target touch area according to the triggering position of the first triggering action includes at least one of the following:
  • When the trigger position is located on the left side of the center line of the display interface, the convenient operation area of the user's hand is on the left side of the center line, so the preset area on the left side of the display interface is used as the target touch area; the left preset area may be the left sidebar area, the lower left corner area, the leftmost middle area, or the upper left corner area.
  • Optionally, when the trigger position is located on the right side of the center line of the display interface, the convenient operation area of the user's hand is on the right side of the center line, so the preset area on the right side of the display interface is used as the target touch area; the right preset area may be the right sidebar area, the lower right corner area, the rightmost middle area, or the upper right corner area.
  • Figure 14 shows an example diagram of determining the target touch area according to the trigger position of the first trigger action: when the trigger position of the first trigger action is located on the right side of the center line of the display interface, the icons in the target icon row are moved to the preset area on the right.
  • the method of determining the target touch area according to the holding position includes at least one of the following:
  • The holding position is used to indicate the holding state in which the user holds the mobile terminal; the holding state includes at least one of left-hand holding, right-hand holding, and two-hand holding.
  • When the holding position is located on the left side of the center line of the display interface, the user's hand is on the left side of the center line; that is, the preset area on the left is closer to the user's hand than the preset area on the right, so the preset area on the left side of the display interface is used as the target touch area. Optionally, when the holding position is located on the right side of the center line of the display interface, the user's hand is on the right side of the center line, so the preset area on the right side of the display interface is used as the target touch area.
  • FIG. 15 shows an example diagram of determining the target touch area according to the holding position: when the holding position is located on the right side of the center line of the display interface, the icons in the target icon row are moved to the preset area on the right.
  • In this embodiment, the target icon row is determined based on the first trigger action, and the target touch area is determined based on the positional relationship between the trigger position of the first trigger action and the center line of the display interface, and/or based on the holding position, so that the area close to the user's hand is used as the target touch area and the icons in the target icon row are moved there. There is no need, after the target icon row is determined, to wait for a second trigger action initiated by the user before the target touch area can be determined, which improves the efficiency and accuracy of icon movement and enhances the user experience.
  • Figure 16 shows a schematic flow chart of the third embodiment of the icon display control method. The method includes the following steps:
  • S50: Determine the arrangement order of each icon in the target touch area according to the display order and/or the attribute information.
  • S60: Arrange the icons in the target icon row vertically in the target touch area according to the arrangement order, or arrange the icons in the target icon row in an arc in the target touch area.
  • the display order is used to indicate the display order of each icon in the target icon row.
  • the display order may be a display order from left to right or a display order from right to left.
  • the embodiment of the present application takes the display order from left to right as an example for analysis.
  • each icon in the target icon row includes "phone”, “text message”, and "browser”, and the display order is: “phone” - "text message” - "browser”.
  • The earlier an icon is in the display order, the closer its display position in the area where the target icon row is located is to the left side of the display interface; the later it is in the display order, the closer its display position is to the right side of the display interface.
  • When the triggering position of the first triggering action or of the second triggering action is located on the left side of the center line of the display interface, or when the holding position is located on the left side of the center line, the icons closer to the left side of the display interface are closer to the user's hand, and the icons closer to the right side of the display interface are further from the user's hand.
  • Therefore, this embodiment of the present application proposes determining the arrangement order of each icon in the target touch area based on the display order of the icons in the area where the target icon row is located, and arranging the icons of the target icon row vertically in the target touch area according to that arrangement order, or arranging the icons of the target icon row in an arc in the target touch area.
  • the arrangement order is used to indicate the distance between the moved display position corresponding to the icon and the user's hand.
  • The moved display position corresponding to an icon is the display position of that icon of the target icon row within the target touch area.
  • the method of determining the arrangement order of each icon in the target touch area according to the display order of the icons in the area where the target icon row is located includes at least one of the following:
  • The arrangement order of the icons may be determined from right to left according to the display order: the further to the right an icon is in the display order, the earlier it is in the arrangement order, and the further to the left, the later it is in the arrangement order.
  • Alternatively, the arrangement order of the icons may be determined from left to right according to the display order: the further to the left an icon is in the display order, the earlier it is in the arrangement order, and the further to the right, the later it is in the arrangement order.
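The two ordering rules above reduce to either keeping or reversing the left-to-right display order, depending on which side the target touch area is on. A minimal sketch (function name and the string-encoded side are illustrative assumptions):

```python
def arrange_for_target_side(display_order, target_side):
    """Order icons for the target touch area.

    display_order: icons left to right as shown in the original row.
    target_side: "left" or "right" preset area.
    """
    if target_side == "left":
        # icons furthest from the left hand (rightmost) come first
        return list(reversed(display_order))
    # icons furthest from the right hand (leftmost) come first
    return list(display_order)
```

With the "Phone" - "SMS" - "Browser" row from the example above, a left-hand hold yields "Browser" - "SMS" - "Phone", while a right-hand hold keeps "Phone" - "SMS" - "Browser".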
  • When the target touch area is located in the left preset area, the left preset area is closer to the user's hand than the right preset area, and the user's hand is located on the left side of the display interface. The icons closer to the right side of the display interface are further from the user's hand and have a greater operating probability, so those icons are arranged at the front. For example, as shown in Figure 6, each icon in the target icon row includes "Phone", "SMS", and "Browser", and the display order is: "Phone" - "SMS" - "Browser"; when the user holds the mobile terminal in the left hand, the arrangement order is "Browser" - "SMS" - "Phone".
  • When the target touch area is located in the right preset area, the right preset area is closer to the user's hand than the left preset area, and the user's hand is located on the right side of the display interface. The icons closer to the left side of the display interface are further from the user's hand and have a greater operating probability, so those icons are arranged at the front.
  • For example, each icon in the target icon row includes "Phone", "SMS", and "Browser", and the display order is: "Phone" - "SMS" - "Browser". When the user holds the mobile terminal in the right hand, the distance between "Phone" and the user's hand is greater than the distance between "SMS" and the user's hand, and the distance between "SMS" and the user's hand is greater than the distance between "Browser" and the user's hand, so the arrangement order is "Phone" - "SMS" - "Browser".
  • Optionally, the triggering position of the first triggering action or of the second triggering action is obtained, and that triggering position is used as the starting point from which the icons in the target icon row are arranged vertically in the target touch area, or arranged in an arc in the target touch area.
  • Figure 17 shows an example diagram in which the icons in the target icon row are arranged vertically in the target touch area according to the arrangement order.
  • the attribute information includes the usage frequency of the icon.
  • the higher the usage frequency of an icon, the higher its operation probability.
  • The arrangement order of the icons is determined according to the usage frequency: the higher the usage frequency, the higher the operation probability of the icon and the earlier it is in the arrangement order; the lower the usage frequency, the later it is in the arrangement order.
  • Optionally, the attribute information also includes the matching degree between an icon and the application scene. The current application scene is obtained, and the matching degree between each icon and the current application scene is obtained; the higher the matching degree, the higher the operation probability of the icon. The arrangement order of the icons is determined according to the matching degree: the higher the matching degree, the earlier the arrangement order; the lower the matching degree, the later the arrangement order.
  • The current application scenario is determined based on the current time and/or current location information. For example, when the current location information indicates a store, the current application scene is determined to be a shopping scene, and the icons of payment applications have a high matching degree.
  • The attribute information includes, but is not limited to, the usage frequency of the icon and the matching degree between the icon and the application scenario; it may also include the time interval between the last operation time of the icon and the current time. The shorter the time interval, the higher the operation probability of the icon; the longer the time interval, the lower the operation probability of the icon.
  • The arrangement order of the icons in the target touch area is determined according to the operation probability, and the icons in the target icon row are arranged vertically in the target touch area according to the arrangement order, or arranged in an arc in the target touch area, so that icons with a higher operation probability are closer to the user's hand and icons with a lower operation probability are further from the user's hand.
  • the arrangement order of each icon in the target touch area may also be determined based on the display order and the attribute information.
  • The operation probability of each icon is determined according to the display order and the attribute information, and the arrangement order of each icon is determined according to the operation probability. For example, when the user holds the mobile terminal in the right hand, the icon that is furthest to the left and has the highest usage frequency is placed first in the arrangement order.
  • In this embodiment, each icon in the target icon row is moved to the target touch area; the operation probability of each icon is determined according to the display order of the icons in the area where the target icon row is located or the attribute information of each icon, the arrangement order of the icons in the target touch area is determined according to the operation probability, and the icons in the target icon row are arranged vertically or in an arc in the target touch area according to that arrangement order, so that icons with a higher operation probability are closer to the user's hand, improving the accuracy and efficiency of the user's icon operations.
  • Figure 18 is a schematic structural diagram of another mobile terminal provided by an embodiment of the present application.
  • the present application also provides a mobile terminal.
  • The mobile terminal includes a memory 1201, a processor 1202, and an icon display control program stored in the memory 1201 and executable on the processor 1202; when the icon display control program is executed by the processor, the steps of the icon display control method in any of the above embodiments are implemented.
  • This application also provides a computer-readable storage medium.
  • An icon display control program is stored on the computer-readable storage medium.
  • When the icon display control program is executed by a processor, the steps of the icon display control method in any of the above embodiments are implemented.
  • The embodiments of the mobile terminal and the computer-readable storage medium provided by this application include all technical features of the embodiments of the above icon display control method; the expanded and explanatory content of the description is basically the same as that of the above method embodiments and will not be repeated here.
  • Embodiments of the present application also provide a computer program product.
  • The computer program product includes computer program code; when the computer program code is run on a computer, it causes the computer to execute the methods in the above various possible implementations.
  • Embodiments of the present application also provide a chip, which includes a memory and a processor.
  • the memory is used to store a computer program.
  • The processor is used to call and run the computer program from the memory, so that a device equipped with the chip executes the methods in the above various possible implementations.
  • the embodiment of the present application also provides a computer module for executing the methods in the above various possible implementations.
  • a computing device generally includes a processor and a memory.
  • The memory is used to store instructions; when the instructions are executed, the computing device executes each step or each program module of this application.
  • FIG 19 is a schematic diagram of the hardware structure of a controller 140 provided by this application.
  • the controller 140 includes: a memory 1401 and a processor 1402.
  • the memory 1401 is used to store program instructions.
  • The processor 1402 is used to call the program instructions in the memory 1401 to execute the steps performed by the controller in the above method embodiments; the implementation principle and beneficial effects are similar and will not be described again here.
  • the above-mentioned controller also includes a communication interface 1403, which can be connected to the processor 1402 through a bus 1404.
  • the processor 1402 can control the communication interface 1403 to implement the receiving and sending functions of the controller 140.
  • FIG 20 is a schematic diagram of the hardware structure of a network node 150 provided by this application.
  • the network node 150 includes: a memory 1501 and a processor 1502.
  • the memory 1501 is used to store program instructions.
  • The processor 1502 is used to call the program instructions in the memory 1501 to execute the steps performed by the first node in the above method embodiments; the implementation principle and beneficial effects are similar and will not be described again here.
  • the above-mentioned network node also includes a communication interface 1503, which can be connected to the processor 1502 through a bus 1504.
  • the processor 1502 can control the communication interface 1503 to implement the receiving and transmitting functions of the network node 150 .
  • the above integrated modules implemented in the form of software function modules can be stored in a computer-readable storage medium.
  • The above software function modules are stored in a storage medium and include a number of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to execute some of the steps of the methods of the various embodiments of this application.
  • a computer program product includes one or more computer instructions.
  • Computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means.
  • A computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. Available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid state disk (SSD)), etc.
  • The methods of the above embodiments can be implemented by means of software plus the necessary general hardware platform; they can, of course, also be implemented by hardware, but in many cases the former is the better implementation.
  • The technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product.
  • The computer software product is stored in one of the above storage media (such as ROM/RAM, magnetic disk, or optical disk) and includes several instructions to cause a terminal device (which may be a mobile phone, a computer, a server, a controlled terminal, or a network device, etc.) to execute the method of each embodiment of this application.

Abstract

An icon display control method, a mobile terminal, and a storage medium. The method is applied to a mobile terminal and includes the following steps: in response to a first triggering action, determining, according to the first triggering action, a target icon row corresponding to the first triggering action (S10); and in response to a second triggering action, moving the icons in the target icon row to a target touch area (S20). Upon receiving the first triggering action, the method first determines the target icon row corresponding to the first triggering action and then moves the icons in the target icon row to the target touch area, so that the icon row the user needs to operate is moved to an area convenient for the user to operate, improving the user experience.

Description

Icon display control method, mobile terminal, and storage medium
Technical Field
This application relates to the field of terminal technology, and in particular to an icon display control method, a mobile terminal, and a storage medium.
Background
Multiple icons arranged in a matrix are displayed on the display interface of a mobile terminal, and the user taps any icon to launch the application corresponding to that icon.
In conceiving and implementing this application, the inventor found at least the following problems: existing desktop applications are mainly arranged in a matrix. As mobile phone screens grow larger, a user holding the mobile terminal with one hand cannot reach the entire display area. In particular, when an application icon is in a remote position, a one-handed user cannot reach icons far from the hand; for example, when the user holds the left side of the terminal, the screen is too wide to operate icons on the right side, often requiring two-handed operation with cumbersome steps.
The foregoing description is intended to provide general background information and does not necessarily constitute prior art.
Summary of the Application
In view of the above technical problems, this application provides an icon display control method, a mobile terminal, and a storage medium, enabling the user to operate any icon in an icon row based on the moved icon row, with simple and convenient operation.
To solve the above technical problems, this application provides an icon display control method, applied to a mobile terminal, including:
S10: in response to a first triggering action, determining, according to the first triggering action, a target icon row corresponding to the first triggering action;
S20: in response to a second triggering action, moving the icons in the target icon row to a target touch area.
Optionally, S20 further includes at least one of the following:
determining the target touch area according to the triggering position of the first triggering action;
obtaining a holding position and determining the target touch area according to the holding position;
determining the target touch area according to the second triggering action.
Optionally, the first triggering action satisfies at least one of the following:
the first triggering action matches a preset action;
the triggering duration is greater than or equal to a preset duration.
Optionally, S10 includes:
obtaining the triggering position of the first triggering action;
determining the icon row corresponding to the triggering position as the target icon row.
Optionally, S10 further includes:
changing the icon display state of the target icon row to prompt the user that the target icon row is in a movable state.
Optionally, the manner of determining the target touch area according to the second triggering action includes at least one of the following:
determining the sliding direction of the second triggering action, and using the touch area corresponding to the sliding direction as the target touch area;
determining the area to which the triggering position of the second triggering action belongs, and determining the target touch area according to that area.
Optionally, the manner of determining the target touch area according to the triggering position of the first triggering action includes at least one of the following:
when the triggering position is on the left side of the center line of the display interface, determining the target touch area to be the left preset area of the display interface;
when the triggering position is on the right side of the center line of the display interface, determining the target touch area to be the right preset area of the display interface.
Optionally, the manner of determining the target touch area according to the holding position includes at least one of the following:
when the holding position is on the left side of the center line of the display interface, determining the target touch area to be the left preset area of the display interface;
when the holding position is on the right side of the center line of the display interface, determining the target touch area to be the right preset area of the display interface.
Optionally, S20 further includes:
arranging the icons in the target icon row vertically in the target touch area;
or arranging the icons in the target icon row in an arc in the target touch area.
Optionally, S20 further includes:
obtaining the display order of each icon in the area where the target icon row is located and/or the attribute information of each icon;
determining the arrangement order of each icon in the target touch area according to the display order and/or the attribute information;
arranging the icons of the target icon row vertically in the target touch area according to the arrangement order, or arranging the icons in the target icon row in an arc in the target touch area.
This application further provides a mobile terminal, including a memory and a processor, wherein an icon display control program is stored in the memory, and when the icon display control program is executed by the processor, the steps of the above method are implemented.
This application further provides a storage medium, on which a computer program is stored; when the computer program is executed by a processor, the steps of the above icon display control method are implemented.
As described above, the icon display control method of this application, applied to a mobile terminal, includes the steps of: in response to a first triggering action, determining, according to the first triggering action, a target icon row corresponding to the first triggering action; and in response to a second triggering action, determining a target touch area and moving the icons in the target icon row to the target touch area. Through the above technical solution, the function of moving the icon row the user needs to operate can be realized, solving the problem that the user cannot conveniently operate an icon when it is beyond the reach of the user's fingers, thereby improving the user experience.
Brief Description of the Drawings
The accompanying drawings are incorporated into and constitute a part of this specification; they illustrate embodiments consistent with this application and, together with the specification, serve to explain the principles of this application. To describe the technical solutions of the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Figure 1 is a schematic diagram of the hardware structure of a mobile terminal for implementing the various embodiments of this application;
Figure 2 is an architecture diagram of a communication network system provided by an embodiment of this application;
Figure 3 is a schematic flow chart of the icon display control method according to the first embodiment;
Figure 4 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 5 is a detailed flow chart of step S10 of the icon display control method according to the first embodiment;
Figure 6 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 7 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 8 is a detailed flow chart of step S20 of the icon display control method according to the first embodiment;
Figure 9 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 10 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 11 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 12 is a display interface diagram of the icon display control method according to the first embodiment;
Figure 13 is a schematic flow chart of the icon display control method according to the second embodiment;
Figure 14 is a display interface diagram of the icon display control method according to the second embodiment;
Figure 15 is a display interface diagram of the icon display control method according to the second embodiment;
Figure 16 is a schematic flow chart of the icon display control method according to the third embodiment;
Figure 17 is a display interface diagram of the icon display control method according to the third embodiment;
Figure 18 is a schematic structural diagram of another mobile terminal provided by an embodiment of this application;
Figure 19 is a schematic diagram of the hardware structure of a controller 140 provided by an embodiment of this application;
Figure 20 is a schematic diagram of the hardware structure of a network node 150 provided by an embodiment of this application.
The realization of the objectives, functional features, and advantages of this application will be further explained with reference to the accompanying drawings in conjunction with the embodiments. The above drawings show specific embodiments of this application, which will be described in more detail below. These drawings and textual descriptions are not intended to limit the scope of the concept of this application in any way, but to illustrate the concept of this application for those skilled in the art by reference to specific embodiments.
具体实施方式
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本申请相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本申请的一些方面相一致的装置和方法的例子。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素,此外,本申请不同实施例中具有同样命名的部件、特征、要素可能具有相同含义,也可能具有不同含义,其具体含义需以其在该具体实施例中的解释或者进一步结合该具体实施例中上下文进行确定。
应当理解,尽管在本文可能采用术语第一、第二、第三等来描述各种信息,但这些信息不应限于这些术语。这些术语仅用来将同一类型的信息彼此区分开。例如,在不脱离本文范围的情况下,第一信息也可以被称为第二信息,类似地,第二信息也可以被称为第一信息。取决于语境,如在此所使用的词语"如果"可以被解释成为"在……时"或"当……时"或"响应于确定"。再者,如同在本文中所使用的,单数形式“一”、“一个”和“该”旨在也包括复数形式,除非上下文中有相反的指示。应当进一步理解,术语“包含”、“包括”表明存在所述的特征、步骤、操作、元件、组件、项目、种类、和/或组,但不排除一个或多个其他特征、步骤、操作、元件、组件、项目、种类、和/或组的存在、出现或添加。本申请使用的术语“或”、“和/或”、“包括以下至少一个”等可被解释为包括性的,或意味着任一个或任何组合。例如,“包括以下至少一个:A、B、C”意味着“以下任一个:A;B;C;A和B;A和C;B和C;A和B和C”,再如,“A、B或C”或者“A、B和/或C”意味着“以下任一个:A;B;C;A和B;A和C;B和C;A和B和C”。 仅当元件、功能、步骤或操作的组合在某些方式下内在地互相排斥时,才会出现该定义的例外。
应该理解的是,虽然本申请实施例中的流程图中的各个步骤按照箭头的指示依次显示,但是这些步骤并不是必然按照箭头指示的顺序依次执行。除非本文中有明确的说明,这些步骤的执行并没有严格的顺序限制,其可以以其他的顺序执行。而且,图中的至少一部分步骤可以包括多个子步骤或者多个阶段,这些子步骤或者阶段并不必然是在同一时刻执行完成,而是可以在不同的时刻执行,其执行顺序也不必然是依次进行,而是可以与其他步骤或者其他步骤的子步骤或者阶段的至少一部分轮流或者交替地执行。
取决于语境,如在此所使用的词语“如果”、“若”可以被解释成为“在……时”或“当……时”或“响应于确定”或“响应于检测”。类似地,取决于语境,短语“如果确定”或“如果检测(陈述的条件或事件)”可以被解释成为“当确定时”或“响应于确定”或“当检测(陈述的条件或事件)时”或“响应于检测(陈述的条件或事件)”。
需要说明的是,在本文中,采用了诸如S301、S302等步骤代号,其目的是为了更清楚简要地表述相应内容,不构成顺序上的实质性限制,本领域技术人员在具体实施时,可能会先执行S302后执行S301等,但这些均应在本申请的保护范围之内。
应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。
在后续的描述中,使用用于表示元件的诸如“模块”、“部件”或者“单元”的后缀仅为了有利于本申请的说明,其本身没有特定的意义。因此,“模块”、“部件”或者“单元”可以混合地使用。
拍摄设备可以以各种形式来实施。例如,本申请中描述的拍摄设备可以包括诸如手机、平板电脑、笔记本电脑、掌上电脑、个人数字助理(Personal Digital Assistant,PDA)、便捷式媒体播放器(Portable Media Player,PMP)、导航装置、可穿戴设备、智能手环、计步器等具有摄像头的移动终端,以及诸如数字TV、台式计算机等具有摄像头的固定终端。
后续描述中将以移动终端为例进行说明,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本申请的实施方式的构造也能够应用于固定类型的终端。
请参阅图1,其是本申请实施例提供的一种实现本申请各个实施例的移动终端的硬件结构示意图,该移动终端100可以包括:RF(Radio Frequency,射频)单元101、WiFi模块102、音频输出单元103、A/V(音频/视频)输入单元104、传感器105、显示单元106、用户输入单元107、接口单元108、存储器109、处理器110、以及电源111等部件。本领域技术人员可以理解,图1中示出的移动终端结构并不构成对移动终端的限定,移动终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
下面结合图1对移动终端的各个部件进行具体的介绍:
射频单元101可用于收发信息或通话过程中,信号的接收和发送,具体的,将基站的下行信息接收后,给处理器110处理;另外,将上行的数据发送给基站。通常,射频单元101包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元101还可以通过无线通信与网络和其他设备通信。上述无线通信可以使用任一通信标准或协议,包括但不限于GSM(Global System of Mobile communication,全球移动通讯系统)、GPRS(General Packet Radio Service,通用分组无线服务)、CDMA2000(Code Division Multiple Access 2000,码分多址2000)、WCDMA(Wideband Code Division Multiple Access,宽带码分多址)、TD-SCDMA(Time Division-Synchronous Code Division Multiple  Access,时分同步码分多址)、FDD-LTE(Frequency Division Duplexing-Long Term Evolution,频分双工长期演进)和TDD-LTE(Time Division Duplexing-Long Term Evolution,分时双工长期演进)等。
WiFi属于短距离无线传输技术,移动终端通过WiFi模块102可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图1示出了WiFi模块102,但是可以理解的是,其并不属于移动终端的必须构成,完全可以根据需要在不改变申请的本质的范围内而省略。
音频输出单元103可以在移动终端100处于呼叫信号接收模式、通话模式、记录模式、语音识别模式、广播接收模式等等模式下时,将射频单元101或WiFi模块102接收的或者在存储器109中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元103还可以提供与移动终端100执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元103可以包括扬声器、蜂鸣器等等。
A/V输入单元104用于接收音频或视频信号。A/V输入单元104可以包括图形处理器(Graphics Processing Unit,GPU)1041和麦克风1042,图形处理器1041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元106上。经图形处理器1041处理后的图像帧可以存储在存储器109(或其它存储介质)中或者经由射频单元101或WiFi模块102进行发送。麦克风1042可以在电话通话模式、记录模式、语音识别模式等等运行模式中经由麦克风1042接收声音(音频数据),并且能够将这样的声音处理为音频数据。处理后的音频(语音)数据可以在电话通话模式的情况下转换为可经由射频单元101发送到移动通信基站的格式输出。麦克风1042可以实施各种类型的噪声消除(或抑制)算法以消除(或抑制)在接收和发送音频信号的过程中产生的噪声或者干扰。
移动终端100还包括至少一种传感器105,比如光传感器、运动传感器以及其他传感器。可选地,光传感器包括环境光传感器及接近传感器,可选地,环境光传感器可根据环境光线的明暗来调节显示面板1061的亮度,接近传感器可在移动终端100移动到耳边时,关闭显示面板1061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机还可配置的指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
显示单元106用于显示由用户输入的信息或提供给用户的信息。显示单元106可包括显示面板1061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板1061。
用户输入单元107可用于接收输入的数字或字符信息,以及产生与移动终端的用户设置以及功能控制有关的键信号输入。可选地,用户输入单元107可包括触控面板1071以及其他输入设备1072。触控面板1071,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板1071上或在触控面板1071附近的操作),并根据预先设定的程式驱动相应的连接装置。触控面板1071可包括触摸检测装置和触摸控制器两个部分。可选地,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器110,并能接收处理器110发来的命令并加以执行。此外,可以采用 电阻式、电容式、红外线以及表面声波等多种类型实现触控面板1071。除了触控面板1071,用户输入单元107还可以包括其他输入设备1072。可选地,其他输入设备1072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆等中的一种或多种,具体此处不做限定。
可选地,触控面板1071可覆盖显示面板1061,当触控面板1071检测到在其上或附近的触摸操作后,传送给处理器110以确定触摸事件的类型,随后处理器110根据触摸事件的类型在显示面板1061上提供相应的视觉输出。虽然在图1中,触控面板1071与显示面板1061是作为两个独立的部件来实现移动终端的输入和输出功能,但是在某些实施例中,可以将触控面板1071与显示面板1061集成而实现移动终端的输入和输出功能,具体此处不做限定。
接口单元108用作至少一个外部装置与移动终端100连接可以通过的接口。例如,外部装置可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的装置的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元108可以用于接收来自外部装置的输入(例如,数据信息、电力等等)并且将接收到的输入传输到移动终端100内的一个或多个元件或者可以用于在移动终端100和外部装置之间传输数据。
存储器109可用于存储软件程序以及各种数据。存储器109可主要包括存储程序区和存储数据区,可选地,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器109可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
处理器110是移动终端的控制中心,利用各种接口和线路连接整个移动终端的各个部分,通过运行或执行存储在存储器109内的软件程序和/或模块,以及调用存储在存储器109内的数据,执行移动终端的各种功能和处理数据,从而对移动终端进行整体监控。处理器110可包括一个或多个处理单元;优选的,处理器110可集成应用处理器和调制解调处理器,可选地,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器110中。
移动终端100还可以包括给各个部件供电的电源111(比如电池),优选的,电源111可以通过电源管理系统与处理器110逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。
尽管图1未示出,移动终端100还可以包括蓝牙模块等,在此不再赘述。
为了便于理解本申请实施例,下面对本申请的移动终端所基于的通信网络系统进行描述。
请参阅图2,图2为本申请实施例提供的一种通信网络系统架构图,该通信网络系统为通用移动通信技术的LTE系统,该LTE系统包括依次通讯连接的UE(User Equipment,用户设备)201,E-UTRAN(Evolved UMTS Terrestrial Radio Access Network,演进式UMTS陆地无线接入网)202,EPC(Evolved Packet Core,演进式分组核心网)203和运营商的IP业务204。
可选地,UE201可以是上述移动终端100,此处不再赘述。
E-UTRAN202包括eNodeB2021和其它eNodeB2022等。可选地,eNodeB2021可以通过回程(backhaul)(例如X2接口)与其它eNodeB2022连接,eNodeB2021连接到EPC203,eNodeB2021可以提供UE201到EPC203的接入。
EPC203可以包括MME(Mobility Management Entity,移动性管理实体)2031,HSS(Home Subscriber Server,归属用户服务器)2032,其它MME2033,SGW(Serving Gate Way,服务网关)2034,PGW(PDN Gate Way,分组数据网络网关)2035和PCRF(Policy and Charging Rules Function,政策和资费功能实体)2036等。可选地,MME2031是处理UE201和EPC203之间信令的控制节点,提供承载和连接管理。HSS2032用于提供一些寄存器来管理诸如归属位置寄存器(图中未示)之类的功能,并且保存有一些有关服务特征、数据速率等用户专用的信息。所有用户数据都可以通过SGW2034进行发送,PGW2035可以提供UE 201的IP地址分配以及其它功能,PCRF2036是业务数据流和IP承载资源的策略与计费控制策略决策点,它为策略与计费执行功能单元(图中未示)选择及提供可用的策略和计费控制决策。
IP业务204可以包括因特网、内联网、IMS(IP Multimedia Subsystem,IP多媒体子系统)或其它IP业务等。
虽然上述以LTE系统为例进行了介绍,但本领域技术人员应当知晓,本申请不仅仅适用于LTE系统,也可以适用于其他无线通信系统,例如GSM、CDMA2000、WCDMA、TD-SCDMA以及未来新的网络系统等,此处不做限定。
基于上述移动终端硬件结构以及通信网络系统,提出本申请各个实施例。
第一实施例
参照图3,图3示出了图标显示控制方法第一实施例的流程示例图,所述方法包括以下步骤:
S10:响应于第一触发动作,根据所述第一触发动作确定所述第一触发动作对应的目标图标行;
S20:响应于第二触发动作,将所述目标图标行内的图标移动到目标触控区域。本实施例中目标触控区域根据第二触发动作确定。
在本申请实施例中,应用于移动终端,所述移动终端的显示界面显示有至少一个依次排列的图标行,所述图标行包括至少一个应用的图标,示例性地,参照图4,图4示出了显示界面的示意图,可以理解的是,在用户右手手握所述移动终端时,若用户需操作的图标为左上角的图标,左上角的图标与用户手部的距离过远,用户单手无法操作左上角的图标,基于此,本申请实施例提出了一种控制图标移动,以将用户所需操作的图标移动至方便用户操作的目标触控区域的方法。可选地,在接收到第一触发动作时,根据所述第一触发动作从当前显示界面中的多个图标行中筛选出所述第一触发动作的目标图标行,所述目标图标行为用户所需操作的图标所在行,所述第一触发动作用于指示目标图标行。
可选地,为了提高图标显示控制的精准性,在接收到所述第一触发动作时,判定所述第一触发动作是否满足触发条件,可选地,所述第一触发动作满足以下至少一项:
所述第一触发动作与预设动作相匹配;
触发时长大于或等于预设时长。
可选地,所述预设动作包括长按动作、滑动动作、双击动作、单击动作的至少一项,还可以是长按后并滑动,在所述第一触发动作与预设动作一致时,则确定所述第一触发动作与预设动作相匹配,则确定所述第一触发动作满足触发条件。
可选地,所述触发时长为所述第一触发动作的维持时长,在所述第一触发动作为长按动作时,所述维持时长为长按时长,在所述第一触发动作为滑动动作时,所述维持时长为滑动时长,在所述触发时长大于或等于预设时长时,则确定所述第一触发动作满足触发条件。
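上述触发条件的判定逻辑可用如下示意性代码草图表达(其中动作类型集合与预设时长均为便于说明而假设的取值,并非对本申请实现方式的限定):

```python
# 示意性草图:判定第一触发动作是否满足触发条件
# PRESET_ACTIONS 与 PRESET_DURATION 均为假设的预设值
PRESET_ACTIONS = {"long_press", "swipe", "double_tap", "single_tap"}
PRESET_DURATION = 0.5  # 预设时长(秒),假设值


def is_trigger_valid(action_type, duration):
    """第一触发动作与预设动作相匹配,或触发时长大于或等于预设时长时,满足触发条件。"""
    return action_type in PRESET_ACTIONS or duration >= PRESET_DURATION
```

满足"以下至少一项"的表述对应上述逻辑或关系:任一条件成立即判定触发条件满足。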
可选地,在所述第一触发动作满足触发条件时,响应于所述第一触发动作,根据所述第一触发动作确定目标图标行。可选地,参照图5,所述S10包括:
S11:获取所述第一触发动作的触发位置;
S12:将所述触发位置对应的图标行确定为所述目标图标行。
可选地,获取所述第一触发动作的触发位置后,将所述触发位置对应的图标行确定为所述目标图标行的方式包括:根据显示界面确定各个图标行在所述显示界面的显示区域,将所述触发位置与所述显示区域进行匹配,以确定所述触发位置所处的显示区域,将所述触发位置所处的显示区域对应的图标行作为所述目标图标行。示例性地,参照图6,图6示出了根据触发位置确定目标图标行的示意图,图6示出的图标显示界面从上往下依次显示第一图标行、第二图标行以及第三图标行,所述触发位置位于第三图标行的显示区域时,则将第三图标行作为所述目标图标行。
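将触发位置与各图标行显示区域进行匹配以确定目标图标行的过程,可参考如下示意性草图(图标行显示区域的坐标均为假设数据):

```python
# 示意性草图:根据触发位置的纵坐标确定目标图标行
def find_target_row(trigger_y, rows):
    """rows 为各图标行的显示区域 (y_top, y_bottom) 列表,按从上到下排列;
    返回触发位置所处显示区域对应的图标行序号,未命中时返回 None。"""
    for index, (y_top, y_bottom) in enumerate(rows):
        if y_top <= trigger_y < y_bottom:
            return index
    return None


# 假设的三个图标行显示区域,对应图6中从上往下的第一、第二、第三图标行
rows = [(0, 100), (100, 200), (200, 300)]
```

例如触发位置纵坐标为 250 时,命中第三图标行的显示区域,即将第三图标行作为目标图标行。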
可选地,在又一实施例中,还可以通过为各个图标行分别预置关联的触发标识,于显示界面的预设区域内显示各个图标行对应的触发标识,不同的触发标识对应不同的显示位置,在获取所述第一触发动作的触发位置时,确定与所述触发位置匹配的目标显示位置,将所述目标显示位置对应的触发标识关联的图标行作为所述目标图标行。可选地,各个图标行关联的触发标识显示于所述显示界面的预设区域,以使得用户在预设区域即可选中图标行,提高选定图标行的便捷性,所述预设区域包括所述显示界面的右下角区域、所述显示界面的左下角区域、显示界面的底部区域的至少一种。
可选地,所述预设区域还可根据手持状态实时确定,在手持状态为右手持握时,所述预设区域为图标显示界面的右下角区域,在手持状态为左手持握时,所述预设区域为图标显示界面的左下角区域,在手持状态为双手持握时,所述预设区域为图标显示界面的底部区域。可选地,获取所述手持状态的方式可以是基于移动终端的传感器检测到的持握数据确定当前手持状态,将当前手持状态作为所述手持状态,所述传感器包括但不限于重力传感器、陀螺仪、角速度传感器;还可以根据用户持握所述移动终端的惯用手确定,还可以根据所述第一触发动作的触发位置确定,在所述触发位置处于显示界面的中心线的左侧时,确定所述手持状态为左手持握,在所述触发位置位于显示界面的中心线的右侧时,确定所述手持状态为右手持握,所述显示界面的中心线为竖直方向上的中心线。
可选地,所述预设区域还可以是用户基于自身需求自定义设置,还可以基于所述第一触发动作的触发位置确定,将所述触发位置预设范围内的区域作为所述预设区域,使得各个图标行对应的触发标识靠近用户手部,示例性地,在所述触发位置位于所述显示界面的右下角区域时,将所述预设区域确定为所述图标界面的右下角区域,在所述触发位置位于所述显示界面的中部靠近右侧位置时,将中部靠近右侧位置对应的区域作为所述预设区域,在所述触发位置位于所述显示界面的右上角区域时,将所述显示界面的右上角区域作为所述预设区域。
可选地,在接收到第一触发动作时,于预设区域依据各个触发标识的显示位置显示各个图标行的触发标识,将所述第一触发动作的触发位置与各个图标行的触发标识在所述预设区域中的显示位置进行比对,以确定第一触发动作的触发位置对应的目标触发标识,将所述目标触发标识关联的图标行作为所述目标图标行,于预设区域显示各个图标行对应的触发标识,用户基于所述预设区域即可实现对目标图标行的选定,防止因屏幕过大,目标图标行与用户手部距离过远,导致用户单手无法选定对应的目标图标行,示例性地,参照图7,图7示出了根据触发标识实现对目标图标行的选定的示例图。
可选地,在确定所述目标图标行后,为了提示用户被选中的图标行,所述S10还包括:
改变所述目标图标行的图标显示状态,以提示用户所述目标图标行处于可移动的状态。
可选地,所述图标显示状态包括图标显示方式,图标显示方式包括图标显示颜色、图标背景颜色、图标显示大小、图标显示亮度、在图标行预设范围内添加预设标记的至少一种,以提示用户所述目标图标行已被选中。可选地,所述图标显示状态还包括突出显示以及隐藏显示的至少一种,在确定所述目标图标行后,将所述目标图标行突出显示的同时,将除所述目标图标行以外的其它图标行隐藏显示;或将所述目标图标行突出显示的同时,将所述其它图标行维持原样。
可选地,在改变所述目标图标行的图标显示状态的同时,将所述目标图标行的操作状态调整为可操作状态,所述操作状态包括可操作状态以及禁止操作状态,所述可操作状态用于指示将图标行内的至少一个图标移动至目标触控区域,所述禁止操作状态用于指示拒绝执行针对图标行的任意操作。可选地,在确定所述目标图标行后,将所述目标图标行的操作状态调整为可操作状态,将除所述目标图标行以外的其它图标行的操作状态调整为禁止操作状态,基于将目标图标行的操作状态调整为可操作状态且将其它图标行的操作状态调整为禁止操作状态时,用户可基于显示界面的任意位置发起第二触发动作,而无需局限于目标图标行所处区域发起第二触发动作,提高控制目标图标行移动的便捷性,并解决了在控制目标图标行移动的时候,容易误触发其它图标行移动的问题,提高了控制目标图标行移动的准确性。
可选地,在确定目标图标行后,响应于第二触发动作,根据所述第二触发动作确定目标触控区域,所述目标触控区域用于指示目标图标行移动后到达的区域以及移动后的所述目标图标行的显示区域。
可选地,确定目标触控区域的方式可根据所述第二触发动作确定,参照图8,所述S20包括以下至少一项:
S21:响应于第二触发动作,确定所述第二触发动作的滑动方向,将所述滑动方向对应的触控区域作为所述目标触控区域,并将所述目标图标行内的图标移动到所述目标触控区域;
S22:响应于第二触发动作,确定所述第二触发动作的触发位置所属的区域,根据所述区域确定所述目标触控区域,并将所述目标图标行内的图标移动到所述目标触控区域。
可选地,所述第二触发动作包括滑动动作、长按动作、单击动作以及双击动作的至少一种。
可选地,在一实施例中,在所述第二触发动作为滑动动作时,获取第二触发动作的滑动轨迹,根据所述滑动轨迹确定滑动方向,进而根据所述滑动方向对应的触控区域作为所述目标触控区域,可选地,所述移动终端的存储器存储了预设的滑动方向与触控区域的对应关系,所述预设的滑动方向与触控区域的对应关系包括以下至少一项:
在所述滑动方向为向右滑动时,将显示界面的右侧预设区域作为所述触控区域;
在所述滑动方向为向左滑动时,将显示界面的左侧预设区域作为所述触控区域;
在所述滑动方向为向上滑动时,将显示界面的右侧预设区域作为所述触控区域;
在所述滑动方向为向下滑动时,将显示界面的左侧预设区域作为所述触控区域。
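上述预设的滑动方向与触控区域的对应关系,可用如下示意性映射草图表达(区域名称为便于说明而假设的标识):

```python
# 示意性草图:预设的滑动方向与触控区域的对应关系
DIRECTION_TO_AREA = {
    "right": "right_preset_area",  # 向右滑动 -> 右侧预设区域
    "up": "right_preset_area",     # 向上滑动 -> 右侧预设区域
    "left": "left_preset_area",    # 向左滑动 -> 左侧预设区域
    "down": "left_preset_area",    # 向下滑动 -> 左侧预设区域
}


def target_area_by_swipe(direction):
    """根据第二触发动作的滑动方向查表得到目标触控区域。"""
    return DIRECTION_TO_AREA[direction]
```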
可选地,参照图9,图9示出了左侧预设区域以及右侧预设区域的示例图。可选地,所述右侧预设区域可以是右侧边栏区域,还可以是右下角区域,还可以是最右侧中部区域,还可以是右上角区域,所述左侧预设区域可以是左侧边栏区域,还可以是左下角区域,还可以是最左侧中部区域,还可以是左上角区域;可以理解的是,所述左侧预设区域以及所述右侧预设区域位于显示界面一侧,在将目标图标行内的图标移动至左侧预设区域或右侧预设区域时,用户基于左侧预设区域或右侧预设区域即可对目标图标行内的任一图标进行操作,防止出现用户左手握持移动终端时,因处于显示界面右侧的图标与用户左手手指的距离过远,导致用户左手手指无法对处于显示界面右侧的图标进行操作的情况或用户右手握持移动终端时,因处于显示界面左侧的图标与用户右手手指的距离过远,导致用户右手手指无法对处于显示界面左侧的图标进行操作的情况,从而提高了单手操作图标的便捷性。
可选地,在又一实施例中,根据所述第二触发动作确定目标触控区域的方式还可以是获取所述第二触发动作的触发位置,确定所述第二触发动作的触发位置所属的区域,根据所述区域确定所述目标触控区域,可选地,在所述第二触发动作为长按动作、单击动作或双击动作时,将长按动作的长按位置、单击动作的单击位置或双击动作的双击位置作为所述第二触发动作的触发位置。
可选地,在确定所述第二触发动作的触发位置,确定所述触发位置所处的区域的方式可以是:获取所述触发位置与所述显示界面的中心线的位置关系,根据所述位置关系确定触发位置所属的区域,所述区域包括显示界面的左半边区域以及所述显示界面的右半边区域,所述位置关系包括触发位置位于中心线左侧以及触发位置位于中心线右侧。可选地,在所述位置关系为触发位置位于中心线左侧时,确定触发位置所属的区域为显示界面的左半边区域,在所述位置关系为触发位置位于中心线右侧时,确定触发位置所属的区域为显示界面的右半边区域。
可选地,在确定触发位置所属的区域后,根据所述区域确定所述目标触控区域的方式包括以下至少一种:
在所述区域为显示界面的左半边区域时,所述目标触控区域为左侧预设区域;
在所述区域为显示界面的右半边区域时,所述目标触控区域为右侧预设区域。
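根据触发位置与显示界面竖直中心线的位置关系确定目标触控区域的判定,可参考如下示意性草图(坐标与区域名称均为假设):

```python
# 示意性草图:根据触发位置与显示界面竖直中心线的位置关系确定目标触控区域
def target_area_by_position(trigger_x, screen_width):
    """触发位置位于中心线左侧时,目标触控区域为左侧预设区域;否则为右侧预设区域。"""
    center = screen_width / 2
    return "left_preset_area" if trigger_x < center else "right_preset_area"
```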
可选地,在又一实施例中,在接收到所述第二触发动作后,将所述第二触发动作的触发位置预设范围内的区域作为所述目标触控区域,可选地,将以所述触发位置为中心的区域作为所述目标触控区域,示例性地,在所述触发位置位于右下角区域时,将右下角区域作为所述目标触控区域。
可选地,在确定所述目标触控区域后,将所述目标图标行内的图标移动到所述目标触控区域,以在所述目标触控区域显示所述目标图标行内的图标。
可选地,所述S20还包括:
将目标图标行内的图标以纵向排列在目标触控区域内;
或,将目标图标行内的图标以弧形排列在目标触控区域内。
可选地,在将所述目标图标行的图标移动到所述目标触控区域,为了节省显示空间,本申请实施例将目标图标行内的图标以纵向排列在目标触控区域内,参照图10,图10示出了将目标图标行内的图标以纵向排列在目标触控区域内的显示界面图。
可选地,在将目标图标行内的图标以纵向排列在目标触控区域内的同时,将其他图标行的显示区域进行挤压,基于挤压后的显示区域缩小显示其它图标行内的图标,以节省显示空间。
可选地,在将目标图标行内的图标以纵向排列在目标触控区域内时,将所述目标图标行内的各个图标依次显示在其它图标行两两之间的行空隙处,示例性地,参照图11,图11示出了将目标图标行内的图标以纵向排列在目标触控区域内的显示界面图。
可选地,在又一实施例中,还可将目标图标行内的图标以弧形排列在目标触控区域内,示例性地,参照图12,图12示出了将目标图标行内的图标以弧形排列在目标触控区域内的显示界面。
可选地,在将所述目标图标行内的图标移动至所述目标触控区域后,用户可基于所述目标触控区域内显示的所述目标图标行内的图标对所需操作的图标进行操作。
在本申请实施例中,通过接收用户发起的第一触发动作以及第二触发动作,所述第一触发动作用于指示用户所需操作的图标所在的图标行,所述第二触发动作用于指示用户所需操作的图标所在的图标行需移动到的目标触控区域,通过响应于第一触发动作,确定用户所需操作的图标所在的图标行,将用户所需操作的图标所在的图标行作为所述目标图标行,响应于第二触发动作,根据所述第二触发动作确定目标触控区域,进而将目标图标行内的图标移动到所述目标触控区域,用户基于所述目标触控区域即可对所需操作的图标执行相应操作,提高了图标操作的便捷性。
第二实施例
参照图13,基于第一实施例,图13示出了图标显示控制方法第二实施例的流程示意图,所述方法包括以下步骤:
S10:响应于第一触发动作,根据所述第一触发动作确定所述第一触发动作对应的目标图标行;
S30:根据所述第一触发动作的触发位置确定所述目标触控区域,和/或获取握持位置,根据所述握持位置确定所述目标触控区域,并将所述目标图标行内的图标移动到所述目标触控区域。
在本申请实施例中,响应于第一触发动作,根据第一触发动作确定目标图标行后,根据所述第一触发动作和/或握持位置确定所述目标触控区域,并将所述目标图标行内的图标移动到所述目标触控区域。
可选地,根据所述第一触发动作的触发位置确定所述目标触控区域的方式包括以下至少一项:
在所述触发位置位于显示界面中心线的左侧时,确定所述目标触控区域为所述显示界面的左侧预设区域;
在所述触发位置位于显示界面中心线的右侧时,确定所述目标触控区域为所述显示界面的右侧预设区域。
可选地,在所述触发位置位于显示界面中心线的左侧时,代表用户手部方便的操作区域位于所述显示界面中心线的左侧,进而将所述显示界面的左侧预设区域作为所述目标触控区域,所述左侧预设区域可以是左侧边栏区域,还可以是左下角区域,还可以是最左侧中部区域,还可以是左上角区域;可选地,在所述触发位置位于显示界面中心线的右侧时,代表用户手部方便的操作区域位于所述显示界面中心线的右侧,进而将所述显示界面的右侧预设区域作为所述目标触控区域,所述右侧预设区域可以是右侧边栏区域,还可以是右下角区域,还可以是最右侧中部区域,还可以是右上角区域。示例性地,参照图14,图14示出了根据第一触发动作的触发位置确定目标触控区域的示例图,在所述第一触发动作的触发位置位于显示界面中心线的右侧时,将所述目标图标行内的图标移动至右侧预设区域。
可选地,根据所述握持位置确定所述目标触控区域的方式包括以下至少一项:
在所述握持位置位于显示界面中心线的左侧时,确定所述目标触控区域为所述显示界面的左侧预设区域;
在所述握持位置位于显示界面中心线的右侧时,确定所述目标触控区域为所述显示界面的右侧预设区域。
可选地,所述握持位置用于指示用户握持所述移动终端的手持状态,所述手持状态包括左手握持、右手握持以及双手握持的至少一种,在所述握持位置位于显示界面中心线的左侧时,代表用户手部位于所述显示界面中心线的左侧,即左侧预设区域相较于右侧预设区域而言,更靠近用户手部,进而将所述显示界面的左侧预设区域作为所述目标触控区域;可选地,在所述握持位置位于显示界面中心线的右侧时,代表用户手部位于所述显示界面中心线的右侧,进而将所述显示界面的右侧预设区域作为所述目标触控区域;可选地,在所述握持位置同时位于显示界面中心线的左右两侧时,将所述左侧预设区域和/或所述右侧预设区域作为所述目标触控区域。示例性地,参照图15,图15示出了根据握持位置确定目标触控区域的示例图,在所述握持位置位于显示界面中心线的右侧时,将所述目标图标行内的图标移动至右侧预设区域。
在本申请实施例中,在接收到第一触发动作时,根据所述第一触发动作确定目标图标行,并根据第一触发动作的触发位置与显示界面中心的位置关系确定目标触控区域,和/或根据握持位置将靠近用户手部的区域作为所述目标触控区域,将所述目标图标行内的图标移动至所述目标触控区域,而无需在确定目标图标行后,还需要等待接收用户发起的第二触发动作,才可确定目标触控区域,提高了图标移动的效率以及准确性,提升用户体验。
第三实施例
参照图16,基于第一实施例和第二实施例,图16示出了图标显示控制方法第三实施例的流程示意图,所述方法包括以下步骤:
S10:响应于第一触发动作,根据所述第一触发动作确定所述第一触发动作对应的目标图标行;
S40:获取各个图标在所述目标图标行所处区域的显示顺序和/或各个图标的属性信息;
S50:根据所述显示顺序和/或所述属性信息确定各个图标在所述目标触控区域的排列顺序;
S60:根据所述排列顺序将所述目标图标行的图标以纵向排列在目标触控区域内,或将目标图标行内的图标以弧形排列在目标触控区域内。
在本申请实施例中,所述显示顺序用于指示所述目标图标行内的各个图标的显示次序,所述显示次序可以是从左到右的显示次序,还可以是从右到左的显示次序,本申请实施例以所述显示次序为从左到右的显示次序举例分析。示例性地,如图6所示,所述目标图标行内的各个图标包括“电话”、“短信”、“浏览器”,显示顺序为:“电话”-“短信”-“浏览器”。
可选地,在显示顺序越靠前,则图标在目标图标行所处区域的显示位置越靠近显示界面左侧,在显示顺序越靠后,则图标在目标图标行所处区域的显示位置越靠近显示界面右侧,在第一触发动作的触发位置或第二触发动作的触发位置位于显示界面中心线的左侧时,或所述握持位置位于显示界面中心线的左侧时,越靠近显示界面左侧的图标距离用户手部越近,越靠近显示界面右侧的图标距离用户手部越远,在用户发起了第一触发动作时,可能是因为用户所需操作的图标距离用户较远,导致用户无法操作该图标,因此,越远离用户手部的图标的操作概率大于越靠近用户手部的图标的操作概率,为了提高图标的操作便捷性,本申请实施例提出了一种根据图标在所述目标图标行所处区域的显示顺序确定各个图标在所述目标触控区域的排列顺序的方式,根据所述排列顺序将所述目标图标行的图标以纵向排列在目标触控区域内,或将目标图标行内的图标以弧形排列在目标触控区域内。
可选地,所述排列顺序用于指示图标对应的移动后的显示位置与用户手部的距离的远近,排列顺序越靠前,则图标对应的移动后的显示位置与用户手部的距离越近,排列顺序越靠后,则图标对应的移动后的显示位置与用户手部的距离越远,所述图标对应的移动后的显示位置为所述目标图标行内的图标在所述目标触控区域内的显示位置。
可选地,根据图标在所述目标图标行所处区域的显示顺序确定各个图标在所述目标触控区域的排列顺序的方式包括以下至少一项:
在所述目标触控区域位于左侧预设区域时,根据所述显示顺序依据从右到左的顺序依次确定各个图标的排列顺序,可选地,显示顺序越靠右的图标对应的排列顺序越靠前,显示顺序越靠左的图标对应的排列顺序越靠后;
在所述目标触控区域位于右侧预设区域时,根据所述显示顺序依据从左到右的顺序依次确定各个图标的排列顺序,可选地,显示顺序越靠左的图标对应的排列顺序越靠前,显示顺序越靠右的图标对应的排列顺序越靠后。
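上述根据显示顺序确定排列顺序的规则可用如下示意性草图表达(函数名与区域标识均为假设):

```python
# 示意性草图:根据图标在目标图标行中的显示顺序确定其在目标触控区域的排列顺序
def arrange_icons(icons, target_area):
    """icons 按原图标行中从左到右的显示顺序给出。
    目标触控区域位于左侧预设区域时,越靠右(离用户手部越远)的图标排列顺序越靠前;
    位于右侧预设区域时,越靠左的图标排列顺序越靠前。"""
    if target_area == "left_preset_area":
        return list(reversed(icons))
    return list(icons)
```

例如图6中"电话"-"短信"-"浏览器"的显示顺序,在目标触控区域位于左侧预设区域(用户左手持握)时,排列顺序变为"浏览器"-"短信"-"电话"。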
可选地,在所述目标触控区域位于左侧预设区域时,代表左侧预设区域相较于右侧预设区域更靠近用户手部,则用户手部位于显示界面左侧,在用户手部位于显示界面左侧时,则越靠近显示界面右侧的图标距离用户手部的距离越远,则越靠近显示界面右侧的图标的操作概率越大,则越靠近显示界面右侧的图标的排列顺序越靠前,示例性地,如图6所示,所述目标图标行内的各个图标包括“电话”、“短信”、“浏览器”,显示顺序为:“电话”-“短信”-“浏览器”,在用户左手持握所述移动终端时,“浏览器”与用户手部的距离大于“短信”与用户手部的距离,“短信”与用户手部的距离大于“电话”与用户手部的距离,则所述排列顺序依次为“浏览器”-“短信”-“电话”。
可选地,在所述目标触控区域位于右侧预设区域时,代表右侧预设区域相较于左侧预设区域更靠近用户手部,则用户手部位于显示界面右侧,在用户手部位于显示界面右侧时,则越靠近显示界面左侧的图标距离用户手部的距离越远,则越靠近显示界面左侧的图标的操作概率越大,则越靠近显示界面左侧的图标的排列顺序越靠前,示例性地,如图6所示,所述目标图标行内的各个图标包括“电话”、“短信”、“浏览器”,显示顺序为:“电话”-“短信”-“浏览器”,在用户右手持握所述移动终端时,“电话”与用户手部的距离大于“短信”与用户手部的距离,“短信”与用户手部的距离大于“浏览器”与用户手部的距离,则所述排列顺序依次为“电话”-“短信”-“浏览器”。
可选地,在根据所述显示顺序确定所述排列顺序后,获取第一触发动作的触发位置或第二触发动作的触发位置,以所述第一触发动作的触发位置为起点或以所述第二触发动作的触发位置为起点,依据所述排列顺序将所述目标图标行内的图标以纵向排列在目标触控区域内,或将目标图标行内的图标以弧形排列在目标触控区域内,示例性地,参照图17,图17示出了根据排列顺序将所述目标图标行内的图标以纵向排列在目标触控区域内的示例图。
可选地,所述属性信息包括所述图标的使用频次,使用频次越高,则所述图标的操作概率越高,根据所述使用频次确定各个图标的排列顺序,使用频次越高,则排列顺序越靠前,使用频次越低,则排列顺序越靠后。可选地,所述属性信息还包括图标与应用场景的匹配度,获取当前应用场景,获取各个图标与所述当前应用场景的匹配度,匹配度越高,则图标的操作概率越高,根据所述匹配度确定各个图标的排列顺序,匹配度越高,则排列顺序越靠前,匹配度越低,则排列顺序越靠后,所述当前应用场景根据当前时间和/或当前位置信息确定,示例性地,在当前位置信息处于商铺时,确定当前应用场景为购物场景,用于支付的应用的图标的匹配度高。
可选地,所述属性信息包括但不限于图标的使用频次,所述图标与应用场景的匹配度,还可以包括图标的上一次操作时间与当前时间的时间间隔,时间间隔越短,则所述图标的操作概率越高,时间间隔越长,则所述图标的操作概率越低。
可选地,根据所述属性信息确定各个图标的操作概率后,根据所述操作概率确定各个图标在所述目标触控区域的排列顺序,根据所述排列顺序将所述目标图标行内的图标以纵向排列在目标触控区域内,或将目标图标行内的图标以弧形排列在目标触控区域内,以使得操作概率越高的图标与用户手部的距离越近,操作概率越低的图标与用户手部的距离越远。
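根据使用频次等属性信息确定排列顺序的一种示意性草图如下(仅以使用频次这一项属性举例,函数名与数据均为假设):

```python
# 示意性草图:根据使用频次确定图标在目标触控区域的排列顺序
def arrange_by_frequency(usage_counts):
    """usage_counts 为 图标名 -> 使用频次 的映射;
    使用频次越高,操作概率越高,排列顺序越靠前。"""
    return sorted(usage_counts, key=lambda name: usage_counts[name], reverse=True)
```

排列靠前的图标移动后距离用户手部更近,从而使操作概率越高的图标越易触及。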
可选地,在又一实施例中,还可以根据所述显示顺序以及所述属性信息确定各个图标在所述目标触控区域的排列顺序。可选地,根据所述显示顺序以及所述属性信息确定各个图标的操作概率,根据所述操作概率确定各个图标的排列顺序,示例性地,在用户右手持握所述移动终端时,将显示顺序最靠左且使用频次最高的图标作为排列顺序最前的图标。
在本申请实施例中,在确定目标图标行以及目标触控区域后,将所述目标图标行内的各个图标移动到所述目标触控区域,并根据各个图标在所述目标图标行所处区域的显示顺序或各个图标的属性信息确定各个图标的操作概率,根据所述操作概率确定各个图标在所述目标触控区域的排列顺序,根据所述排列顺序将所述目标图标行内的图标以纵向排列在目标触控区域内,或将目标图标行内的图标以弧形排列在目标触控区域内,使得操作概率越高的图标与用户手部的距离越近,提高用户操作图标的准确性以及效率。
请参见图18,图18是本申请实施例提供的另一种移动终端的结构示意图。本申请还提供一种移动终端,移动终端包括存储器1201、处理器1202以及存储在存储器1201里并可在处理器1202上运行的图标显示控制程序,图标显示控制程序被处理器执行时实现上述任一实施例中的图标显示控制方法的步骤。
本申请还提供一种计算机可读存储介质,计算机可读存储介质上存储有图标显示控制程序,图标显示控制程序被处理器执行时实现上述任一实施例中的图标显示控制方法的步骤。
在本申请提供的移动终端和计算机可读存储介质的实施例中,包含了上述图标显示控制方法各实施例的全部技术特征,说明书拓展和解释内容与上述图标显示控制方法的各实施例基本相同,在此不再赘述。
本申请实施例还提供一种计算机程序产品,计算机程序产品包括计算机程序代码,当计算机程序代码在计算机上运行时,使得计算机执行如上各种可能的实施方式中的方法。
本申请实施例还提供一种芯片,包括存储器和处理器,存储器用于存储计算机程序,处理器用于从存储器中调用并运行计算机程序,使得安装有芯片的设备执行如上各种可能的实施方式中的方法。
本申请实施例还提供一种计算机模块,用于执行上述各种可能的实施方式中的方法。
计算装置一般包含处理器和存储器,存储器用于存储指令,当指令被处理器执行时,使得该计算装置执行本申请的各步骤或各程序模块。
图19为本申请提供的一种控制器140的硬件结构示意图。该控制器140包括:存储器1401和处理器1402,存储器1401用于存储程序指令,处理器1402用于调用存储器1401中的程序指令执行上述方法实施例中控制器所执行的步骤,其实现原理以及有益效果类似,此处不再进行赘述。
可选地,上述控制器还包括通信接口1403,该通信接口1403可以通过总线1404与处理器1402连接。处理器1402可以控制通信接口1403来实现控制器140的接收和发送的功能。
图20为本申请提供的一种网络节点150的硬件结构示意图。该网络节点150包括:存储器1501和处理器1502,存储器1501用于存储程序指令,处理器1502用于调用存储器1501中的程序指令执行上述方法实施例中首节点所执行的步骤,其实现原理以及有益效果类似,此处不再进行赘述。
可选地,上述网络节点还包括通信接口1503,该通信接口1503可以通过总线1504与处理器1502连接。处理器1502可以控制通信接口1503来实现网络节点150的接收和发送的功能。
上述以软件功能模块的形式实现的集成的模块,可以存储在一个计算机可读取存储介质中。上述软件功能模块存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(英文:processor)执行本申请各个实施例方法的部分步骤。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘solid state disk,SSD)等。
上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
在本申请中,对于相同或相似的术语概念、技术方案和/或应用场景描述,一般只在第一次出现时进行详细描述,后面再重复出现时,为了简洁,一般未再重复阐述,在理解本申请技术方案等内容时,对于在后未详细描述的相同或相似的术语概念、技术方案和/或应用场景描述等,可以参考其之前的相关详细描述。
在本申请中,对各个实施例的描述都各有侧重,某个实施例中没有详述或记载的部分,可以参见其它实施例的相关描述。
本申请技术方案的各技术特征可以进行任意的组合,为使描述简洁,未对上述实施例中的各个技术特征所有可能的组合都进行描述,然而,只要这些技术特征的组合不存在矛盾,都应当认为是本申请记载的范围。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在如上的一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,被控终端,或者网络设备等)执行本申请每个实施例的方法。
以上仅为本申请的优选实施例,并非因此限制本申请的专利范围,凡是利用本申请说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本申请的专利保护范围内。

Claims (10)

  1. 一种图标显示控制方法,其特征在于,所述方法包括以下步骤:
    S10:响应于第一触发动作,根据所述第一触发动作确定所述第一触发动作对应的目标图标行;
    S20:响应于第二触发动作,将所述目标图标行内的图标移动到目标触控区域。
  2. 如权利要求1所述的方法,其特征在于,所述S10包括:
    获取所述第一触发动作的触发位置;
    将所述触发位置对应的图标行确定为所述目标图标行。
  3. 如权利要求1所述的方法,其特征在于,所述S20还包括以下至少一项:
    根据所述第一触发动作的触发位置确定所述目标触控区域;
    获取握持位置,根据所述握持位置确定所述目标触控区域;
    根据所述第二触发动作确定所述目标触控区域。
  4. 如权利要求3所述的方法,其特征在于,所述根据所述第二触发动作确定目标触控区域的方式包括以下至少一项:
    确定所述第二触发动作的滑动方向,将所述滑动方向对应的触控区域作为所述目标触控区域;
    确定所述第二触发动作的触发位置所属的区域,根据所述区域确定所述目标触控区域。
  5. 如权利要求3所述的方法,其特征在于,所述根据所述第一触发动作的触发位置确定所述目标触控区域的方式包括以下至少一项:
    在所述触发位置位于显示界面中心线的左侧时,确定所述目标触控区域为所述显示界面的左侧预设区域;
    在所述触发位置位于显示界面中心线的右侧时,确定所述目标触控区域为所述显示界面的右侧预设区域。
  6. 如权利要求3所述的方法,其特征在于,所述根据所述握持位置确定所述目标触控区域的方式包括以下至少一项:
    在所述握持位置位于显示界面中心线的左侧时,确定所述目标触控区域为所述显示界面的左侧预设区域;
    在所述握持位置位于显示界面中心线的右侧时,确定所述目标触控区域为所述显示界面的右侧预设区域。
  7. 如权利要求1所述的方法,其特征在于,所述S20还包括:
    将目标图标行内的图标以纵向排列在目标触控区域内;
    或,将目标图标行内的图标以弧形排列在目标触控区域内。
  8. 如权利要求1-7中任一所述的方法,其特征在于,所述S20还包括:
    获取各个图标在所述目标图标行所处区域的显示顺序和/或各个图标的属性信息;
    根据所述显示顺序和/或所述属性信息确定各个图标在所述目标触控区域的排列顺序。
  9. 一种移动终端,其特征在于,所述移动终端包括:存储器、处理器,其中,所述存储器上存储有图标显示控制程序,所述图标显示控制程序被所述处理器执行时实现如权利要求1至8中任一项所述的图标显示控制方法的步骤。
  10. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质上存储有计算机程序,所述计算机程序被处理器执行时实现如权利要求1至8中任一项所述的图标显示控制方法的步骤。
PCT/CN2022/116724 2022-09-02 2022-09-02 图标显示控制方法、移动终端及存储介质 WO2024045155A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/116724 WO2024045155A1 (zh) 2022-09-02 2022-09-02 图标显示控制方法、移动终端及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/116724 WO2024045155A1 (zh) 2022-09-02 2022-09-02 图标显示控制方法、移动终端及存储介质

Publications (1)

Publication Number Publication Date
WO2024045155A1 true WO2024045155A1 (zh) 2024-03-07

Family

ID=90100097

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/116724 WO2024045155A1 (zh) 2022-09-02 2022-09-02 图标显示控制方法、移动终端及存储介质

Country Status (1)

Country Link
WO (1) WO2024045155A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019562A (zh) * 2012-12-07 2013-04-03 东莞宇龙通信科技有限公司 终端和控制托盘配置方法
CN103324414A (zh) * 2013-06-20 2013-09-25 广东欧珀移动通信有限公司 一种调整图标位置的方法及移动终端
JP2014013507A (ja) * 2012-07-04 2014-01-23 Panasonic Corp ポータブル機器、表示画面操作方法、プログラム
KR20140087731A (ko) * 2012-12-31 2014-07-09 엘지전자 주식회사 포터블 디바이스 및 사용자 인터페이스 제어 방법
CN104503660A (zh) * 2014-12-18 2015-04-08 厦门美图移动科技有限公司 一种图标整理方法、设备及移动终端
CN105446641A (zh) * 2015-11-06 2016-03-30 广东欧珀移动通信有限公司 单手操作触摸屏图标的方法、系统和移动终端


Similar Documents

Publication Publication Date Title
CN108037893B (zh) 一种柔性屏的显示控制方法、装置及计算机可读存储介质
CN107943400B (zh) 双面屏切换方法、移动终端及可读存储介质
CN108196922B (zh) 一种打开应用的方法、终端及计算机可读存储介质
CN107809534B (zh) 一种控制方法、终端及计算机存储介质
CN112068744A (zh) 交互方法、移动终端及存储介质
CN109408187B (zh) 头像设置方法、装置、移动终端及可读存储介质
CN108810262B (zh) 一种应用的配置方法、终端和计算机可读存储介质
WO2022166772A1 (zh) 一种应用程序切换方法及装置
WO2022267430A1 (zh) 截屏交互的方法、移动终端及存储介质
CN115914719A (zh) 投屏显示方法、智能终端及存储介质
WO2023015774A1 (zh) 切换方法、移动终端及存储介质
WO2024045155A1 (zh) 图标显示控制方法、移动终端及存储介质
CN113824176A (zh) 充电方法、耳机、终端及存储介质
CN113253892A (zh) 数据分享方法、终端及存储介质
CN108008877B (zh) 一种tab的移动方法、终端设备及计算机存储介质
CN113342246A (zh) 操作方法、移动终端及存储介质
CN112230825A (zh) 分享方法、移动终端及存储介质
WO2024045145A1 (zh) 图标显示控制方法、移动终端及存储介质
WO2024045184A1 (zh) 处理方法、移动终端及存储介质
WO2023092343A1 (zh) 图标区域管理方法、智能终端及存储介质
WO2022261897A1 (zh) 处理方法、移动终端及存储介质
WO2022252031A1 (zh) 应用的显示方法、移动终端及存储介质
WO2023050910A1 (zh) 图标的显示方法、智能终端及存储介质
CN107479747B (zh) 一种触控显示方法、设备及计算机存储介质
WO2022241695A1 (zh) 处理方法、移动终端及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22956979

Country of ref document: EP

Kind code of ref document: A1