WO2018166023A1 - An icon display method and terminal device (一种图标显示方法和终端设备) - Google Patents


Info

Publication number
WO2018166023A1
Authority
WO
WIPO (PCT)
Prior art keywords
icon
terminal device
mobile phone
operation instruction
area
Prior art date
Application number
PCT/CN2017/080298
Other languages
English (en)
French (fr)
Inventor
彭军
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to US 16/493,403, now US11086478B2
Priority to CN201780052538.9A, published as CN109643216A
Publication of WO2018166023A1

Classifications

    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object, using icons
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0488: Interaction techniques based on GUI using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 1/1694: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F 2203/0338: Fingerprint track pad, i.e. fingerprint sensor used as pointing device tracking the fingertip image
    • G06F 2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • H04M 1/72448: User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454: User interfaces adapting the functionality of the device according to context-related or environment-related conditions
    • H04M 1/72469: User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M 2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • Embodiments of the present invention relate to the field of communications, and in particular, to an icon display method and a terminal device.
  • In the prior art, the interface displayed on the touch screen of a mobile phone (such as the main interface) can be reduced as a whole, so that the original interface is divided into two parts: one part displays the icons of the original interface (that is, the scaled interface), and the other part is a free area in which no icons of the original interface are displayed.
  • The free area is generally a black screen area, in which the user can click to cause the mobile phone to perform the process of reducing the interface.
  • The problem is that, in the prior art, after the interface displayed on the touch screen of the mobile phone is reduced as a whole, the reduced interface may still exceed the range of the user's one-hand operation because the touch screen is too large, so that the user cannot operate part of the icons on the reduced interface with one hand.
  • In addition, when the touch screen of the mobile phone displays the above-mentioned scaled interface, the above-mentioned free area is also displayed, which may cause the user to accidentally touch the free area during one-hand operation.
  • the present application provides an icon display method and a terminal device, which can cause part or all of the icons displayed by the terminal device to fall within an area that the user can operate with one hand.
  • A first aspect provides an icon display method, including: the terminal device determines a grip position, where the grip position is one of preset grip positions of the terminal device; the terminal device determines a relative position of each icon in at least one icon displayed by the terminal device, where the relative position of each icon is the position of that icon relative to the grip position; the terminal device determines a current operation area according to the grip position, where the current operation area is the area of the terminal device's display that the user can operate with one hand; the terminal device determines a sensing area of each icon according to the current operation area, the grip position, and the relative position of each icon in the at least one icon, so that the sensing area of each icon falls partially or completely within the current operation area, where the sensing area of an icon is the area in which that icon can be operated; and the terminal device displays the at least one icon according to the sensing area of each icon.
  • In the foregoing method, the terminal device can scale the sensing areas of the icons it displays according to the grip position, so that the sensing area of each displayed icon falls partially or completely within the current operation area and the user can operate all of the displayed icons with one hand. Moreover, in the method provided by the present application, the terminal device does not display a free area, so the user is less likely to accidentally touch the area displayed by the terminal device.
  • In addition, the terminal device scales the icons it displays (that is, their sensing areas) rather than the entire displayed interface, so after the foregoing method is executed, the icons remain clearly displayed.
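The method of the first aspect can be illustrated with a small sketch. The Python code below assumes a simple geometric model in which the current operation area is a circle of thumb-reach radius around the grip position; the function name, coordinates, and reach value are illustrative assumptions, not details from the patent.

```python
import math

THUMB_REACH = 400  # assumed one-hand reach radius, in pixels

def sensing_areas(grip, icons, reach=THUMB_REACH):
    """Map each icon centre to a sensing-area centre within reach of the grip.

    `grip` is the grip position (x, y); `icons` maps icon names to centres.
    Icons already within reach keep their position; others are pulled toward
    the grip point until their sensing area falls inside the operation area.
    """
    gx, gy = grip
    result = {}
    for name, (cx, cy) in icons.items():
        dx, dy = cx - gx, cy - gy          # relative position w.r.t. the grip
        dist = math.hypot(dx, dy)
        if dist <= reach:
            result[name] = (cx, cy)        # already one-hand operable
        else:
            scale = reach / dist           # shrink the offset onto the circle
            result[name] = (gx + dx * scale, gy + dy * scale)
    return result

# Bottom-right grip on an assumed 1080x1920 screen:
areas = sensing_areas((950, 1800), {"camera": (100, 200), "gallery": (900, 100)})
for x, y in areas.values():
    assert math.hypot(x - 950, y - 1800) <= THUMB_REACH + 1e-9
```

A real implementation would scale rectangles rather than points and resolve overlaps, but the sketch captures the claimed relation between the grip position, each icon's relative position, and the current operation area.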
  • the icon display method provided by the present application may be triggered by an operation of a user, and the corresponding terminal device may perform the foregoing method after receiving an operation instruction corresponding to the operation.
  • Before the terminal device determines the grip position, the method may further include: the terminal device receives a first operation instruction; and the terminal device matches the first operation instruction against a first preset instruction, so that the terminal device enters a dynamic display mode, in which the sensing areas of the icons in the at least one icon may be the same as or different from one another.
  • The mode in which the terminal device can zoom the sensing areas of its displayed icons is called the dynamic display mode.
  • Before the terminal device enters the dynamic display mode, the sensing areas of the displayed icons may be the same; after the terminal device enters the dynamic display mode, the sensing areas of the displayed icons may differ.
  • the above first preset instruction can be preset by a person skilled in the art.
  • the terminal device may further receive an operation instruction from the user, so that the terminal device determines the actual grip position.
  • the icon display method may further include: the terminal device receiving the second operation instruction, where the second operation instruction is used to indicate the grip position.
  • the method for determining the position of the grip point by the terminal device may further include: the terminal device matching the second operation instruction with the second preset instruction to determine the grip position.
  • The second preset instruction may be preset by a person skilled in the art. For example, before the icon display method provided by the present application is performed, a person skilled in the art may preset four second preset instructions that correspond to four different grip positions. Then, after receiving the second operation instruction, the terminal device may check whether the second operation instruction matches any of the four second preset instructions, to determine the grip position.
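The matching step above can be sketched as a simple lookup. The four gesture names and grip-position labels below are assumptions for the example, not values from the patent.

```python
# Four assumed "second preset instructions", each mapped to a grip position.
PRESET_GRIPS = {
    "swipe_from_bottom_left":  "bottom-left",
    "swipe_from_bottom_right": "bottom-right",
    "swipe_from_top_left":     "top-left",
    "swipe_from_top_right":    "top-right",
}

def determine_grip(second_instruction):
    """Return the grip position if the instruction matches a preset, else None."""
    return PRESET_GRIPS.get(second_instruction)

assert determine_grip("swipe_from_bottom_right") == "bottom-right"
assert determine_grip("unknown_gesture") is None
```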
  • the interface state of the terminal device may be different, such as a portrait state or a landscape state.
  • In different interface states, the position of the same icon displayed on the touch screen may differ, and the position of each icon may be preset.
  • Because the position of an icon on the touch screen affects how the icon is dynamically displayed in the above method, the terminal device can perform the above method in combination with its interface state.
  • Specifically, the icon display method may further include: the terminal device determines the interface state.
  • Correspondingly, the method for determining the relative position of each of the at least one icon may include: the terminal device determines, according to the interface state, the relative position of each of the at least one icon displayed by the terminal device.
  • The current operation area may be preset in the terminal device. Specifically, before the terminal device determines the grip position, the icon display method may further include: the terminal device receives a third operation instruction; and the terminal device sets the current operation area according to the third operation instruction.
  • the current operation area and the grip point position are in one-to-one correspondence.
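The one-to-one correspondence can be sketched as a table keyed by grip position. The rectangles below (left, top, right, bottom, on an assumed 1080x1920 screen) are illustrative values only, and `set_operation_area` is a hypothetical handler for the third operation instruction.

```python
# One current operation area per preset grip position (illustrative values).
OPERATION_AREAS = {
    "bottom-left":  (0, 1200, 700, 1920),
    "bottom-right": (380, 1200, 1080, 1920),
}

def set_operation_area(grip, rect):
    """Handle a 'third operation instruction' by (re)defining a grip's area."""
    OPERATION_AREAS[grip] = rect

set_operation_area("bottom-right", (300, 1100, 1080, 1920))
assert OPERATION_AREAS["bottom-right"] == (300, 1100, 1080, 1920)
```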
  • Optionally, the icon display method may further include: the terminal device receives a fourth operation instruction, where the fourth operation instruction is used to instruct adjustment of the sensing area of each icon of the at least one icon.
  • The terminal device obtains a ratio according to the fourth operation instruction; the terminal device adjusts the sensing area of each icon according to the ratio to obtain an adjusted sensing area for each icon; and the terminal device displays the at least one icon according to the adjusted sensing area of each icon.
  • The icon display method provided by the present application can adjust the sensing area of each icon according to the user's needs, so that the user can operate, with one hand, all icons whose sensing areas have been adjusted, and a better user experience can be obtained.
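The ratio-based adjustment can be sketched as scaling each sensing rectangle about its centre. The rectangle format and function name below are assumptions for illustration, not details from the patent.

```python
def adjust_sensing_area(rect, ratio):
    """Scale a (left, top, right, bottom) sensing rectangle about its centre.

    `ratio` is the value obtained from the fourth operation instruction,
    e.g. 1.2 enlarges the sensing area by 20%, 0.5 halves it.
    """
    left, top, right, bottom = rect
    cx, cy = (left + right) / 2, (top + bottom) / 2
    half_w = (right - left) / 2 * ratio
    half_h = (bottom - top) / 2 * ratio
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)

# Enlarge a 100x100 sensing area by 20%:
assert adjust_sensing_area((0, 0, 100, 100), 1.2) == (-10.0, -10.0, 110.0, 110.0)
```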
  • a terminal device comprising: a determining module and a display module.
  • The determining module is configured to: determine a grip position, where the grip position is one of preset grip positions of the terminal device; determine a relative position of each icon in at least one icon displayed on the display interface of the terminal device,
  • where the relative position of each icon is the position of that icon relative to the grip position; determine, according to the grip position, a current operation area, where the current operation area is the area of the display interface that the user can operate with one hand; and determine, according to the current operation area, the grip position, and the relative position of each icon in the at least one icon, a sensing area of each icon, so that the sensing area of each icon falls partially or completely within the current operation area, where the sensing area of an icon is the area in which that icon can be operated.
  • a display module configured to display each icon of the at least one icon according to the sensing area of each icon determined by the determining module.
  • the foregoing terminal device may further include: a receiving module.
  • the receiving module is configured to receive the first operation instruction before the determining module determines the grip position.
  • the determining module is further configured to match the first operation instruction and the first preset instruction received by the receiving module, so that the terminal device enters a dynamic display mode, where the sensing area of each icon in the at least one icon is the same or different.
  • The receiving module is further configured to receive, before the determining module determines the grip position, a second operation instruction, where the second operation instruction is used to indicate the grip position.
  • the determining module is specifically configured to match the second operating instruction with the second preset command to determine the grip position.
  • The determining module may be further configured to: after the receiving module receives the first operation instruction, determine an interface state, where the interface state is used to indicate whether the interface of the terminal device is in a portrait state or a landscape state.
  • The determining module may be specifically configured to determine, according to the interface state, the relative position of each of the at least one icon displayed by the terminal device.
  • the receiving module may be further configured to receive a third operation instruction before the determining module determines the grip position.
  • the terminal device may further include: a setting module.
  • the setting module is configured to set the current operation area according to the third operation instruction.
  • The receiving module may be further configured to: after the display module displays each of the at least one icon, receive a fourth operation instruction, where the fourth operation instruction may be used to instruct adjustment of the sensing area of each icon of the at least one icon.
  • The terminal device may further include: an adjustment module.
  • The adjustment module may be configured to obtain a ratio according to the fourth operation instruction received by the receiving module, and adjust the sensing area of each icon according to the ratio to obtain the adjusted sensing area of each icon.
  • the display module may be further configured to display each of the at least one icon according to the adjusted sensing area of each icon obtained by the adjusting module.
  • A terminal device, which may include a processor, a memory, a display, an input device, and a bus.
  • the memory is configured to store computer instructions, the processor, the memory, the display, and the input device are connected by the bus.
  • The processor executes the computer instructions stored in the memory to enable the terminal device to execute the icon display method according to the first aspect and the various alternatives of the first aspect.
  • A computer storage medium stores computer instructions, and when the processor of the terminal device in the third aspect executes the computer instructions, the processor performs the icon display method according to the first aspect and the various alternatives of the first aspect.
  • The processor in the third aspect of the present application may be an integration of the function modules such as the determining module, the setting module, and the adjustment module in the second aspect, and the processor may implement the functions of the foregoing function modules in the second aspect.
  • FIG. 1 is a schematic structural diagram of the hardware of a terminal device according to an embodiment of the present invention;
  • FIG. 2 is a schematic flowchart of an icon display method according to an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of icons displayed on the touch screen of a mobile phone according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of icons displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of icons displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 6 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
  • FIG. 7 is a schematic diagram of interface changes displayed on the touch screen of a mobile phone according to an embodiment of the present invention;
  • FIG. 8 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
  • FIG. 9 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 10 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 11 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 12 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 13 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
  • FIG. 14 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 15 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
  • FIG. 16 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 17 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 18 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
  • FIG. 19 is a schematic diagram of interface changes displayed on the touch screen of another mobile phone according to an embodiment of the present invention;
  • FIG. 20 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
  • FIG. 21 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
  • FIG. 22 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
  • FIG. 23 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
  • FIG. 24 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention.
  • the embodiment of the present invention provides an icon display method and a terminal device, which can be applied to a process in which a terminal device displays an icon, and is specifically applied to a process in which a terminal device scales a sensing area of each icon and dynamically displays each icon.
  • The icon display method provided by the embodiment of the present invention is applicable to a terminal device provided with a touch screen, where the terminal device may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or a similar terminal device.
  • The embodiment of the present invention takes a mobile phone as an example of the terminal device to introduce the icon display method provided by the present invention.
  • the components of the mobile phone 10 will be specifically described below with reference to the accompanying drawings:
  • The mobile phone 10 may include: a touch screen 11, a processor 12, a memory 13, a power source 14, a radio frequency (RF) circuit 15, a sensor 16, an audio circuit 17, a speaker 18, a microphone 19, and the like. These components can be connected via a bus or connected directly. It will be understood by those skilled in the art that the structure of the handset shown in FIG. 1 does not constitute a limitation: the handset may include more components than those illustrated, combine some components, or use a different component arrangement.
  • The touch screen 11, which may be referred to as a touch display panel, implements the input and output functions of the mobile phone 10. It can collect the user's operation instructions on or near it (for example, operations the user performs on or near the touch screen 11 using a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connecting device according to a preset program. It can also be used to display information entered by the user or provided to the user (such as the "Gallery" and "Camera" application icons) as well as various menus of the mobile phone (such as the "Delete" button in "Gallery").
  • The touch screen 11 can be implemented in various types, such as resistive, capacitive, infrared, and ultrasonic.
  • This embodiment of the present invention is not limited thereto.
  • An operation performed by the user in the vicinity of the touch screen 11 may be referred to as a floating touch, and a touch screen capable of detecting a floating touch can be implemented using the capacitive, infrared, or ultrasonic types.
  • Specifically, the touch screen 11 can receive the user's operation instructions, such as an operation instruction on an icon displayed on the touch screen 11 or an operation instruction on a specific area of the touch screen 11, and, after receiving an operation instruction, transmit it to the processor 12, so that the processor 12 performs the operation indicated by the instruction.
  • For example, the touch screen 11 transmits the operation instruction to the processor 12 so that the processor 12 scales the sensing areas of the icons, and the touch screen 11 can then display each icon in its scaled sensing area.
  • the touch screen of the mobile phone provided by the embodiment of the present invention may be divided into multiple different sensing areas, such as a heat sensing area or a pressure sensing area.
  • each icon displayed on the touch screen of the mobile phone can fall within a sensing area, which is an area in which the icon can be operated by the user.
  • the sensing area of an icon may be larger than the area where the icon is located, or the sensing area of the icon may be the same as the area where the icon is located.
  • The icon display method provided by the embodiment of the present invention is described below by taking the case where the sensing area of each icon is the same as the area occupied by the icon as an example.
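The relation between an icon's drawn area and its sensing area can be sketched as follows; `margin` is a hypothetical parameter, and a margin of 0 reproduces the "same area" convention used in this description.

```python
def sensing_rect(icon_rect, margin=0):
    """Return the sensing rectangle containing an icon's (l, t, r, b) rect.

    The sensing area always contains the icon's own area; a positive margin
    makes it larger, and margin=0 makes the two areas identical.
    """
    left, top, right, bottom = icon_rect
    return (left - margin, top - margin, right + margin, bottom + margin)

icon = (100, 100, 196, 196)
assert sensing_rect(icon) == icon                            # same as icon area
assert sensing_rect(icon, margin=12) == (88, 88, 208, 208)   # larger than icon
```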
  • The processor 12 is the control center of the mobile phone 10. It connects the various parts of the entire phone using various interfaces and lines, and performs the various functions of the mobile phone 10 and processes data by running or executing software programs and/or modules stored in the memory 13 and recalling data stored in the memory 13, thereby monitoring the mobile phone 10 as a whole.
  • processor 12 may include one or more processing units; processor 12 may integrate an application processor and a modem processor.
  • the application processor mainly processes an operating system, a main interface, an application, and the like, and the modem processor mainly processes wireless communication. It will be appreciated that the above described modem processor may also not be integrated into the processor 12.
  • In this embodiment of the present invention, the processor 12 can obtain an operation instruction received by the touch screen 11 and perform the operation indicated by that instruction. For example, after the touch screen 11 receives an operation instruction for determining the current operation area, the processor 12 may determine the current operation area.
  • the icon related to the embodiment of the present invention is an operable element displayed on the touch screen of the mobile phone.
  • When an icon is operated, the processor of the mobile phone can respond to the corresponding operation instruction, for example, by changing the interface displayed on the touch screen.
  • For example, when the touch screen of the mobile phone is in the main interface, the icon may be an application icon displayed on the touch screen, such as the "Camera" icon, the "Gallery" icon, and the "SMS" icon.
  • When the user's operation instruction on the "Gallery" icon is received through the touch screen, the processor of the mobile phone can cause the touch screen to display the photos in "Gallery".
  • When the touch screen of the mobile phone is in an application interface, the above icon may be a button icon displayed on the touch screen, such as the "Edit" and "Delete" button icons in "Gallery", or an input box displayed on the touch screen, such as the message content input box in "SMS". When the touch screen of the mobile phone is in the drop-down interface, the icon may be a button icon displayed on the touch screen, such as the "WiFi", "Mute", "Bluetooth", or "Direction Lock" icon.
  • the following describes an icon display method provided by an embodiment of the present invention by taking an application icon displayed when the touch screen of the mobile phone is in the main interface as an example.
  • The memory 13 can be used to store data, software programs, and modules, and may be a volatile memory such as a random-access memory (RAM); a non-volatile memory such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the above types of memories.
  • Specifically, the memory 13 can store computer instructions, and the processor 12 executes the computer instructions to perform the icon display method provided by the embodiment of the present invention.
  • The RF circuit 15 can be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, the RF circuit 15 delivers received information to the processor 12 for processing, and transmits signals generated by the processor 12.
  • The RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
  • the RF circuit 15 can also communicate with the network and other devices through wireless communication.
  • the sensor 16 may include a gravity sensor (Gravity Sensor) 161, a fingerprint recognizer 162, and the like.
  • The gravity sensor 161 can detect the magnitude of the mobile phone's acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when the mobile phone is stationary; it can be used for applications that identify the attitude of the mobile phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer, tapping, and shaking-direction detection). Specifically, the gravity sensor 161 can receive a gravity operation instruction among the operation instructions, so that the processor 12 performs the operation indicated by the operation instruction.
  • The fingerprint identifier 162 has a fingerprint recognition function. The fingerprint identifier 162 may be installed on the back of the mobile phone 10, installed on the Home button of the mobile phone 10, or integrated with the touch screen 11 under the control of a logic circuit.
  • The fingerprint identifier 162 can receive a fingerprint operation instruction among the operation instructions, so that the processor 12 performs the operation indicated by the operation instruction.
  • the mobile phone 10 may further include other sensors, such as a pressure sensor, a light sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, and details are not described herein.
  • the audio circuit 17, speaker 18, and microphone 19 provide an audio interface between the user and the handset 10.
  • On one hand, the audio circuit 17 can convert received audio data into an electrical signal and transmit it to the speaker 18, and the speaker 18 converts the electrical signal into a sound signal for output; on the other hand, the microphone 19 converts a collected sound signal into an electrical signal, which the audio circuit 17 receives and converts into audio data; the audio data is then output to the RF circuit 15 for transmission to, for example, another mobile phone, or output to the processor 12 for further processing.
  • the mobile phone 10 may further include a wireless Fidelity (WiFi) module, a Bluetooth module, a camera, and the like, and details are not described herein.
  • the above-mentioned devices such as the touch screen 11, the sensor 16, and the microphone 19 may be collectively referred to as an input device.
  • The input device can receive a user operation, a change in an environmental parameter, or the like, so that the processor of the mobile phone receives the corresponding operation instruction.
  • the above touch screen can be used as a display in addition to being an input device.
  • the operation instruction according to the embodiment of the present invention may be a touch screen operation instruction, a fingerprint operation instruction, a gravity operation instruction, a key operation instruction, or the like.
  • The touch screen operation instruction corresponds to user operations on the touch screen of the mobile phone, such as a press operation, a long press operation, a slide operation, a click operation, and a floating operation (an operation performed by the user in the vicinity of the touch screen).
  • The fingerprint operation instruction corresponds to the user's operations on the fingerprint identifier of the mobile phone, such as sliding a finger across it, long-pressing with a finger, clicking with a finger, or double-clicking with a finger.
  • The gravity operation instruction corresponds to a user operation such as shaking the mobile phone in a specific direction or a specific number of times.
  • the key operation instruction corresponds to a user's click operation on a power key, a volume key, a Home key, and the like of the mobile phone, a double-click operation, a long press operation, a combined key operation, and the like.
  • The solution provided by the embodiment of the present invention can zoom the sensing area of each icon displayed on the touch screen of the mobile phone, and dynamically display each icon such that the sensing area of each icon falls partially or completely within the current operating area.
  • the current operation area is an area on the touch screen of the mobile phone that can be operated by the user when the user operates the mobile phone with one hand.
  • In the dynamic display mode of the mobile phone, the touch screen of the mobile phone displays each icon with its zoomed sensing area.
  • The icon display method provided by the embodiment of the present invention is described in detail below with reference to the flowchart of the icon display method shown in FIG. 2 and the specific components of the mobile phone 10 shown in FIG. 1; the steps shown may also be performed by any terminal device other than the mobile phone shown in FIG. 1.
  • Although the logical sequence of the icon display method provided by the embodiments of the present invention is shown in the method flowchart, in some cases the steps shown or described may be performed in an order different from that herein:
  • S201. The processor of the mobile phone determines the grip position.
  • The grip position is one of the preset grippable positions of the mobile phone; the grippable positions of the mobile phone may be the positions of the four vertices of the touch screen of the mobile phone.
  • Exemplarily, FIG. 3 is a schematic diagram of icons displayed on the touch screen of a mobile phone according to an embodiment of the present invention.
  • The grip position of the mobile phone shown in FIG. 3 may be any one of grip point 1, grip point 2, grip point 3, or grip point 4.
  • Nine application icons (Applications 1-9) are displayed on the touch screen of the mobile phone shown in FIG. 3; for example, Application 1 may be a "Gallery" icon.
  • the interface displayed by the touch screen of the mobile phone in FIG. 3 is the main interface.
  • S202. The processor of the mobile phone determines a relative position of each icon in at least one icon displayed on the touch screen.
  • the relative position of each of the icons is the position of each icon relative to the grip position.
  • The processor of the mobile phone may display each icon on the touch screen of the mobile phone according to the preset position of each icon.
  • For example, the locations of the application icons displayed on the touch screen of the mobile phone shown in FIG. 3 may be preset by a person skilled in the art during the production process of the mobile phone.
  • When the processor of the mobile phone determines that the user holds the mobile phone at grip point 1 shown in FIG. 3, the processor can determine the relative position of each application icon among Applications 1-9 in FIG. 3 with respect to grip point 1, such as the distance from each application icon to grip point 1.
  • An icon that the user can operate with one hand has a small relative position, that is, it is close to the grip position; an icon that the user cannot operate with one hand has a large relative position, that is, it is far from the grip position.
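As a minimal sketch of S202, the relative position of each icon can be modeled as its Euclidean distance to the grip point. The screen size, grip-point coordinates, and icon names below are hypothetical illustrations, not values given by the embodiment:

```python
import math

# Hypothetical grip points at the four corners of a 1080x1920 portrait screen
# (grip point 1 is the lower-left corner, matching FIG. 3).
GRIP_POINTS = {1: (0, 1920), 2: (0, 0), 3: (1080, 0), 4: (1080, 1920)}

def relative_positions(icon_centers, grip_point):
    """Return each icon's distance to the grip point (its 'relative position')."""
    gx, gy = GRIP_POINTS[grip_point]
    return {name: math.hypot(x - gx, y - gy)
            for name, (x, y) in icon_centers.items()}

icons = {"app1": (180, 1700), "app5": (540, 960), "app9": (900, 220)}
rel = relative_positions(icons, grip_point=1)
# Icons near the grip point have a small relative position; far icons a large one.
assert rel["app1"] < rel["app5"] < rel["app9"]
```

An icon's relative position then directly feeds the sensing-area scaling of S204.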
  • S203. The processor of the mobile phone determines the current operating area according to the grip position.
  • the current operation area may be an area on the touch screen of the mobile phone operated by the user with one hand. Specifically, when the user operates the mobile phone with one hand, the current operation area may be a maximum area on the touch screen of the mobile phone that the user can operate. For different grip positions, the current operating area on the touch screen of the mobile phone is different, that is, the position of the current operating area on the touch screen of the mobile phone is different. Illustratively, as shown in FIG. 4, it is a schematic diagram of a touch screen display icon of another mobile phone according to an embodiment of the present invention. In FIG. 4, the user's holding position on the mobile phone is the grip point 1, and the current operating area may be a shadow area on the touch screen of the mobile phone.
  • The "hand icon" in FIG. 4 is used to schematically represent a human hand and indicate the position of the user's finger on the mobile phone; the "hand icon" is not actually displayed on the mobile phone screen.
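One way to model the current operating area of S203 is as the part of the screen within the user's one-hand thumb reach around the grip point; the reach radius and coordinates below are assumed values for illustration, not parameters from the embodiment:

```python
def in_current_operating_area(point, grip_point, reach=900):
    """True if a screen point lies within the assumed one-hand reach of the grip point."""
    gx, gy = grip_point
    x, y = point
    # The operating area is modeled as a disc of radius `reach` around the grip point,
    # clipped implicitly by the screen bounds.
    return (x - gx) ** 2 + (y - gy) ** 2 <= reach ** 2

grip1 = (0, 1920)  # hypothetical lower-left grip point on a 1080x1920 screen
assert in_current_operating_area((180, 1700), grip1)      # near icon: reachable
assert not in_current_operating_area((900, 220), grip1)   # far icon: outside the area
```

Different grip points yield differently positioned operating areas, which matches the observation above that the shaded area in FIG. 4 depends on where the user holds the phone.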
  • S204. The processor of the mobile phone determines a sensing area of each icon according to the current operating area, the grip position, and the relative position of each icon in the at least one icon, so that the sensing area of each icon falls partially or completely within the current operating area.
  • the processor of the mobile phone can zoom the sensing area of each icon displayed on the touch screen of the mobile phone toward the position of the grip point.
  • Specifically, the processor of the mobile phone can reduce the sensing area of an icon whose relative position is small (that is, shrink the icon) and enlarge the sensing area of an icon whose relative position is large (that is, enlarge the icon), thereby causing the sensing area of each icon to fall partially or completely within the current operating area.
  • Before the above "zooming of the sensing area of each icon displayed on the touch screen of the mobile phone toward the grip position", the sensing areas of the icons displayed on the touch screen of the mobile phone may all be the same.
  • In this way, the touch screen of the mobile phone can display each icon with its zoomed sensing area, so that the user can operate, with one hand, all the icons displayed on the touch screen.
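The zooming of S204 can be sketched as pulling each icon's sensing-area center toward the grip point until it lies within the operating area. The contraction rule below (leaving already-reachable icons in place and moving far icons onto the reach boundary) is only one assumed realization of "zooming toward the grip position", with made-up coordinates:

```python
import math

def zoom_sensing_areas(icon_centers, grip_point, reach=900):
    """Pull each icon's sensing-area center toward the grip point so that
    every sensing area falls within the one-hand operating area."""
    gx, gy = grip_point
    zoomed = {}
    for name, (x, y) in icon_centers.items():
        d = math.hypot(x - gx, y - gy)
        if d <= reach:
            zoomed[name] = (x, y)          # near icon: already reachable, keep in place
        else:
            t = reach / d                  # far icon: move its center onto the reach boundary
            zoomed[name] = (gx + (x - gx) * t, gy + (y - gy) * t)
    return zoomed

grip1 = (0, 1920)  # hypothetical lower-left grip point
icons = {"app1": (180, 1700), "app9": (900, 220)}
zoomed = zoom_sensing_areas(icons, grip1)
# Every zoomed center now lies within the assumed one-hand reach.
assert all(math.hypot(x - grip1[0], y - grip1[1]) <= 900 + 1e-9
           for x, y in zoomed.values())
```

The per-icon shrink/enlarge ratios described above could be derived from the same relative positions; this sketch only shows the repositioning step.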
  • the icon display method provided by the embodiment of the present invention may further include S205:
  • S205. The processor of the mobile phone causes, according to the sensing area of each icon, the touch screen of the mobile phone to display each icon in the at least one icon.
  • Exemplarily, FIG. 5 is a schematic diagram of icons displayed on the touch screen of another mobile phone according to an embodiment of the present invention. After the processor of the mobile phone zooms the sensing areas of the icons displayed on the touch screen of the mobile phone shown in FIG. 4, the icons displayed on the touch screen of the mobile phone shown in FIG. 5 can be obtained.
  • Applications 1-9 displayed on the touch screen of the mobile phone shown in FIG. 5 all intersect the shaded area; that is, the sensing area of each application icon displayed on the touch screen of the mobile phone falls partially or completely within the current operating area.
  • In the icon display method provided by the embodiment of the present invention, the processor of the mobile phone may zoom the sensing areas of the icons displayed on the touch screen of the mobile phone according to the grip position, so that the sensing area of each icon displayed on the touch screen falls partially or completely within the current operating area, and the user can operate all the icons displayed on the touch screen of the mobile phone with one hand.
  • In addition, the touch screen of the mobile phone does not display an idle (blank) area, so the user is less likely to touch a non-functional area of the touch screen of the mobile phone.
  • Moreover, the processor of the mobile phone zooms the icons displayed on the touch screen of the mobile phone (that is, the sensing areas of the icons) instead of the entire interface displayed on the touch screen, so after the above method is executed, the icons displayed on the touch screen of the mobile phone have a better display effect.
  • The "interface displayed by the touch screen of the mobile phone" and the "icons displayed by the touch screen of the mobile phone" described in the embodiments of the present invention both refer to content displayed on the touch screen of the mobile phone, but different descriptions are used in different situations for convenience of explanation.
  • FIG. 6 is a schematic flowchart of another icon display method according to an embodiment of the present invention.
  • the method may further include S601-S602 before S201:
  • S601. The touch screen or a sensor of the mobile phone receives a first operation instruction.
  • The first operation instruction is used to indicate that the touch screen of the mobile phone should enter the dynamic display mode.
  • The sensing areas of the icons in the at least one icon may be the same or different. Before the terminal device enters the dynamic display mode, the sensing areas of the displayed icons may be the same; after the terminal device enters the dynamic display mode, the sensing areas of the displayed icons may be different.
  • When the operation instruction in the embodiment of the present invention is a touch screen operation instruction, the user may operate the touch screen of the mobile phone by using different gestures; the gestures may include a pressure recognition gesture, a long press gesture, an area change gesture, a multi-touch gesture, a swipe gesture, a double-press gesture, a click gesture, a double-click gesture, and a tangential gesture.
  • For the pressure recognition gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's pressing operation on the touch screen and obtaining the pressure value with which the touch screen is pressed.
  • For the long press gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's pressing operation on the touch screen and obtaining the time for which the touch screen is pressed.
  • For the area change gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's pressing operation on the touch screen and obtaining the touch area with which the touch screen is pressed.
  • For the multi-touch gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's pressing operation on the touch screen and obtaining the number of contact points.
  • For the swipe gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's sliding operation on the touch screen and obtaining the sliding distance or the sliding track.
  • For the double-press gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's operation of pressing the touch screen twice without the finger leaving the touch screen, and obtaining the pressure values of the two presses.
  • For the click gesture or double-click gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's click operation on the touch screen and obtaining the number of clicks at the same position on the touch screen.
  • For the tangential gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's touch operation on the touch screen in which the user's finger has at least one of the following trends: upward, downward, leftward, rightward, or rotational.
  • When the operation instruction is a fingerprint operation instruction, the user may operate the fingerprint identifier of the mobile phone by using different gestures; the gestures may include a long press gesture, a click gesture, a double-click gesture, a tangential gesture, and the like.
  • For the long press gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's long press operation on the fingerprint identifier, and obtaining the user's fingerprint and the time for which the fingerprint identifier is pressed.
  • For the click gesture or double-click gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's click operation on the fingerprint identifier, and obtaining the user's fingerprint and the number of clicks on the fingerprint identifier.
  • For the tangential gesture, the processor of the mobile phone acquires the operation instruction by acquiring the user's touch operation on the fingerprint identifier, and obtaining the user's fingerprint and the trend of the user's finger, which is at least one of: upward, downward, leftward, or rightward.
  • When the operation instruction is a gravity operation instruction, the user may operate the mobile phone by using different gestures; the gesture may be a shaking gesture.
  • For the shaking gesture, the processor of the mobile phone acquires the operation instruction by obtaining the user's number of shakes, shaking direction, and shaking angle; the shaking direction may include upward, downward, leftward, rightward, and rotational.
  • When the operation instruction is a key operation instruction, the processor of the mobile phone acquires the operation instruction by obtaining the number of times and the order in which the user presses keys such as the power key, the volume key, and the Home key of the mobile phone.
  • When the first operation instruction provided by the embodiment of the present invention is a touch screen operation instruction, the first operation instruction may correspond to the user operating a specific position (such as a specific area or a specific icon) on the touch screen of the mobile phone.
  • Exemplarily, FIG. 7 is a schematic diagram of interface changes of the touch screen display of a mobile phone according to an embodiment of the present invention.
  • FIG. 7a shows that the position where the user holds the mobile phone is the grip point 1, and the touch screen of the mobile phone is at the main interface;
  • FIG. 7b shows the user's downward sliding operation on the touch screen of the mobile phone;
  • in response, the processor of the mobile phone causes the touch screen of the mobile phone to display the pull-down menu interface shown in FIG. 7c.
  • A "navigation bar" is displayed in the pull-down interface shown in FIG. 7c; the "navigation bar" includes a "WiFi" icon, a "Data" icon, a "Mute" icon, a "Bluetooth" icon, a "Direction Lock" icon, and a "Dynamic Display" icon, which may be added by a person skilled in the art.
  • The user operates the "Dynamic Display" icon in the "navigation bar" shown in FIG. 7c.
  • A button icon can easily be added to the "navigation bar" displayed on the touch screen of the mobile phone by a person skilled in the art, and this is not described in detail herein.
  • S602. The processor of the mobile phone matches the first operation instruction with a first preset instruction, so that the touch screen of the mobile phone enters the dynamic display mode.
  • The first preset instruction may be preset by a person skilled in the art; the first preset instruction corresponds to a preset user operation, and this operation can trigger the processor of the mobile phone to perform the method provided by the embodiment of the present invention.
  • For example, the preset user operation may be that, when the touch screen of the mobile phone displays the pull-down interface shown in FIG. 7c, the user uses a click gesture to operate the "Dynamic Display" icon in the "navigation bar".
  • The matching of the first operation instruction with the first preset instruction by the processor of the mobile phone may include: the processor of the mobile phone receives the first operation instruction of the user, where the first operation instruction corresponds to the user's operation in actual use; the processor determines whether the first operation instruction and the first preset instruction are the same; if they are the same, the processor of the mobile phone causes the touch screen of the mobile phone to enter the dynamic display mode and zooms the sensing areas of the icons displayed on the touch screen of the mobile phone.
  • the first preset instruction may be stored in a memory of the mobile phone.
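The matching step of S602 can be sketched as a direct comparison of the received instruction against the stored preset; the instruction encoding below (a gesture name paired with a target icon) is a hypothetical illustration, not the encoding used by the embodiment:

```python
# Hypothetical encoding of a touch screen operation instruction as
# (gesture, target), with the first preset instruction held in "memory".
FIRST_PRESET_INSTRUCTION = ("click", "dynamic_display_icon")

class Phone:
    def __init__(self):
        self.dynamic_display_mode = False

    def on_first_operation(self, instruction):
        # Enter dynamic display mode only when the received first operation
        # instruction is identical to the first preset instruction.
        if instruction == FIRST_PRESET_INSTRUCTION:
            self.dynamic_display_mode = True

phone = Phone()
phone.on_first_operation(("long_press", "wifi_icon"))  # no match: mode unchanged
assert not phone.dynamic_display_mode
phone.on_first_operation(("click", "dynamic_display_icon"))
assert phone.dynamic_display_mode
```

Only after this exact match succeeds would the processor go on to zoom the icon sensing areas.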
  • In the embodiment of the present invention, the operation instruction received by the processor of the mobile phone from the user is not limited; that is, the specific form of the user operation corresponding to the operation instruction is not limited.
  • The specific user operation may be pre-configured in the mobile phone or notified to the user in advance through prompt information, so that when the mobile phone needs to obtain an operation instruction from the user, the user can perform the corresponding operation on the touch screen according to the configuration or the prompt information.
  • In this way, the user can operate the mobile phone according to the user operation corresponding to the first preset instruction, so that the processor of the mobile phone receives a first operation instruction identical to the first preset instruction, which triggers the processor of the mobile phone to execute the icon display method provided by the embodiment of the present invention.
  • The touch screen or the fingerprint identifier of the mobile phone can receive an operation instruction from the user, so that the processor of the mobile phone determines the actual grip position.
  • the icon display method provided by the embodiment of the present invention may further include S801 before S201; correspondingly, S201 may be replaced by S802.
  • the method shown in FIG. 6 may further include S801 between S602 and S201, and S201 in FIG. 6 may be replaced by S802:
  • S801. The touch screen or a sensor of the mobile phone receives a second operation instruction, where the second operation instruction is used to indicate the grip position.
  • When the second operation instruction provided by the embodiment of the present invention is a touch screen operation instruction, the second operation instruction may correspond to the user operating a specific position (such as a specific area or a specific icon) on the touch screen of the mobile phone.
  • Exemplarily, FIG. 9 is a schematic diagram of interface changes of the touch screen display of another mobile phone according to an embodiment of the present invention.
  • FIG. 9a shows that the touch screen of the mobile phone displays the shaded areas Area 1 - Area 4 together with the prompt information "Please long-press any shaded area on the interface"; Areas 1-4 in FIG. 9a correspond to grip points 1-4, respectively.
  • FIGS. 9b-9c show the user operating Area 1 of the touch screen display of the mobile phone with a long press gesture.
  • The second operation instruction received by the processor of the mobile phone may correspond to the user pressing Area 1 displayed on the touch screen of the mobile phone with a long press gesture, with the duration of the pressing reaching a preset duration; the preset duration may be set in advance.
  • Before the processor of the mobile phone receives the second operation instruction, the touch screen of the mobile phone may display the prompt information and the shaded areas shown in FIG. 9a; alternatively, the prompt information and the shaded areas may not be displayed.
  • When the second operation instruction provided by the embodiment of the present invention is a fingerprint operation instruction, the second operation instruction may correspond to the user operating the fingerprint identifier of the mobile phone with different gestures.
  • Exemplarily, FIG. 10 is a schematic diagram of interface changes of the touch screen display of another mobile phone according to an embodiment of the present invention.
  • FIG. 10a shows that the fingerprint identifier of the mobile phone is the Home button of the mobile phone; FIGS. 10b-10d show the user operating the fingerprint identifier with a double-click gesture using the left thumb.
  • The second operation instruction received by the processor of the mobile phone may correspond to the user's left-thumb fingerprint and the user clicking the fingerprint identifier with the left thumb, with the number of click operations being two.
  • Exemplarily, when the user holds the mobile phone with the left hand at grip point 1, the left thumb can operate the fingerprint identifier with a double-click gesture; when the user holds the mobile phone with the left hand at grip point 2, the left thumb can operate the fingerprint identifier with a click gesture; when the user holds the mobile phone with the right hand at grip point 3, the right thumb can operate the fingerprint identifier with a click gesture; when the user holds the mobile phone with the right hand at grip point 4, the right thumb can operate the fingerprint identifier with a double-click gesture.
  • FIG. 11 is a schematic diagram of interface changes of a touch screen display of another mobile phone according to an embodiment of the present invention.
  • FIG. 11a shows that the fingerprint identifier of the mobile phone is the Home button of the mobile phone; FIGS. 11b-11c show the user operating the fingerprint identifier of the mobile phone with a downward-trending tangential gesture using the left thumb.
  • The second operation instruction received by the processor of the mobile phone may correspond to the user's left-thumb fingerprint and a tangential gesture of the left thumb on the fingerprint identifier with a downward trend.
  • Exemplarily, when the user holds the mobile phone with the left hand at grip point 1, the left thumb can operate the fingerprint identifier with a downward-trending tangential gesture; when the user holds the mobile phone with the left hand at grip point 2, the left thumb can operate the fingerprint identifier with an upward-trending tangential gesture; when the user holds the mobile phone with the right hand at grip point 3, the right thumb can operate the fingerprint identifier with an upward-trending tangential gesture; when the user holds the mobile phone with the right hand at grip point 4, the right thumb can operate the fingerprint identifier with a downward-trending tangential gesture.
  • The specific form of the user operation on the fingerprint identifier corresponding to the second operation instruction is not limited herein.
  • When the second operation instruction provided by the embodiment of the present invention is a gravity operation instruction, the second operation instruction may correspond to the user operating the mobile phone by using a shaking gesture.
  • Exemplarily, FIG. 12 is a schematic diagram of interface changes of the touch screen display of another mobile phone according to an embodiment of the present invention.
  • FIG. 12a shows the user holding the mobile phone with the left hand before operating it with a shaking gesture; FIGS. 12a-12b show the user operating the mobile phone with gestures of shaking to the left and shaking to the right.
  • The second operation instruction received by the processor may correspond to the shaking angles of the positions of grip points 1-4; the grip point at which the user holds the mobile phone has the smallest shaking angle.
  • If the user holds the mobile phone at grip point 1, the shaking angle of the position of grip point 1 is the smallest; if the user holds the mobile phone at grip point 2, the shaking angle of the position of grip point 2 is the smallest; if the user holds the mobile phone at grip point 3, the shaking angle of the position of grip point 3 is the smallest; if the user holds the mobile phone at grip point 4, the shaking angle of the position of grip point 4 is the smallest.
  • Generally, the position at which the user holds the mobile phone with the left hand may be the lower left corner of the mobile phone, that is, grip point 1 shown in FIG. 3; the position at which the user holds the mobile phone with the right hand may be the lower right corner of the mobile phone, that is, grip point 4 shown in FIG. 3.
  • The second preset instruction may be preset in the mobile phone (such as in the memory of the mobile phone) to support the mobile phone in executing S802 of the method in the embodiment of the present invention:
  • S802. The processor of the mobile phone matches the second operation instruction with the second preset instruction to determine the grip position.
  • A person skilled in the art may preset four second preset instructions, which correspond to four different grip points respectively.
  • The foregoing "the processor of the mobile phone matches the second operation instruction with the second preset instruction to determine the grip position" may include: the processor of the mobile phone receives the second operation instruction of the user, where the second operation instruction corresponds to the user's actual operation; the processor of the mobile phone determines whether the second operation instruction is the same as any one of the second preset instructions.
  • If the second operation instruction is the same as any one of the second preset instructions, the processor of the mobile phone determines the grip position corresponding to that second preset instruction.
  • Exemplarily, any one of the four second preset instructions may correspond to the user operating Area 1 of the touch screen display shown in FIG. 9 with a long press gesture for a certain time; that is, this second preset instruction corresponds to grip point 1. Then, if the second operation instruction corresponds to the user operating Area 1 of the touch screen display with a long press gesture, and the operation time reaches the above certain time, the processor of the mobile phone determines that the second operation instruction is the same as this second preset instruction and that the actual grip position is grip point 1.
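The grip-position matching of S802 can likewise be sketched as a lookup from four preset instructions to their grip points; the long-press encoding, area names, and minimum duration below are assumptions for illustration, not values from the embodiment:

```python
# Four hypothetical second preset instructions, one per grip point:
# a long press on the matching shaded area, held for at least min_duration.
SECOND_PRESETS = {
    ("long_press", "area1"): 1,
    ("long_press", "area2"): 2,
    ("long_press", "area3"): 3,
    ("long_press", "area4"): 4,
}

def determine_grip_point(gesture, area, duration, min_duration=1.0):
    """Return the grip point whose preset matches the second operation
    instruction, or None if no preset matches."""
    if duration < min_duration:
        return None                      # press too short: no preset matches
    return SECOND_PRESETS.get((gesture, area))

assert determine_grip_point("long_press", "area1", duration=1.5) == 1
assert determine_grip_point("long_press", "area1", duration=0.3) is None
```

The grip point returned here is the one the processor would then use in S202-S204 to compute relative positions and the current operating area.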
  • In different cases, the interface state of the mobile phone may be different, such as a portrait state or a landscape state.
  • In different interface states, the position of the same icon displayed on the touch screen may be different; the position of the icon may be preset.
  • The position of the icon displayed on the touch screen affects the dynamic display of the icon in the icon display method provided by the embodiment of the present invention; thus, the processor of the mobile phone can perform the above method in combination with the interface state of the mobile phone.
  • the icon display method provided by the embodiment of the present invention may further include S1301 after S601; correspondingly, S202 may be replaced with S1302.
  • the method shown in FIG. 6 may further include S1301 between S602 and S201, and S202 in FIG. 6 may be replaced by S1302:
  • S1301. The processor of the mobile phone determines the interface state.
  • The user's holding posture of the mobile phone may be a left-hand grip or a right-hand grip.
  • When the user holds the mobile phone with the left hand, the grip position may be the upper left corner or the lower left corner of the mobile phone;
  • when the user holds the mobile phone with the right hand, the grip position may be the upper right corner or the lower right corner of the mobile phone.
  • In different holding postures and interface states, the user may thus hold the mobile phone at different grip positions.
  • Exemplarily, FIG. 14 is a schematic diagram of interface changes of the touch screen display of another mobile phone according to an embodiment of the present invention. The mobile phone shown in FIG. 14a is in a portrait state: the lower left corner of the mobile phone is grip point 1, the upper left corner is grip point 2, the upper right corner is grip point 3, and the lower right corner is grip point 4; and the processor of the mobile phone causes the touch screen of the mobile phone to display the application icons (Applications 1-8) according to the preset position of each icon (recorded as the portrait position of the icon).
  • The mobile phone shown in FIG. 14b is in a landscape state: the lower-left corner of the phone is grip point 2, the upper-left corner is grip point 3, the upper-right corner is grip point 4,
  • and the lower-right corner is grip point 1; and the phone's processor causes the touchscreen to display the application icons (applications 1-8) according to the preset position of each icon (recorded as the landscape position of the icon).
  • The landscape position and the portrait position of the same icon displayed on the phone's touchscreen differ; for example, the position of application 1 displayed on the touchscreen in FIG. 14a differs from the position of application 1 displayed on the touchscreen in FIG. 14b.
  • S1302: The processor of the mobile phone determines, according to the interface state, the relative position of each icon among the at least one icon displayed on the touchscreen.
  • In different interface states, the relative position of the same icon may differ.
  • For example, the position of application 1 displayed on the touchscreen in FIG. 14a relative to grip point 1 differs from the position of application 1 displayed on the touchscreen in FIG. 14b relative to grip point 1.
  • In the dynamic display mode, the phone's processor can determine the relative position of each icon displayed on the touchscreen according to both the grip point position and the interface state of the phone, making each icon's relative position more accurate; the sensing area the processor then determines for each icon from its relative position is accordingly more accurate, which helps the sensing area of each icon displayed on the touchscreen fall partially or completely within the current operation area.
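The interplay of grip point, interface state, and relative icon position described above can be sketched in code. This is an illustrative model only: the screen dimensions, coordinate conventions, and grip-point layout below are assumptions, not values from the patent.

```python
import math

# Assumed grip-point coordinates for a 1080x1920 portrait screen and the
# same device rotated to a 1920x1080 landscape screen (illustrative only;
# the patent numbers grip points 1-4 at the four corners).
GRIP_POINTS = {
    "portrait":  {1: (0, 1920), 2: (0, 0), 3: (1080, 0), 4: (1080, 1920)},
    "landscape": {1: (1920, 1080), 2: (0, 1080), 3: (0, 0), 4: (1920, 0)},
}

def relative_position(icon_xy, grip_point, interface_state):
    """Distance from an icon's position to the active grip point.

    A smaller value means the icon is easier to reach with one hand.
    The same icon generally yields a different value in portrait and
    landscape states, because the grip point's screen coordinates move.
    """
    gx, gy = GRIP_POINTS[interface_state][grip_point]
    ix, iy = icon_xy
    return math.hypot(ix - gx, iy - gy)

# The same icon position measured against grip point 1 differs by state:
d_portrait = relative_position((300, 400), 1, "portrait")
d_landscape = relative_position((300, 400), 1, "landscape")
```

Here the relative position is reduced to a single distance; a real implementation could keep the full offset vector so the scaling direction toward the grip point is preserved.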
  • Further, the current operation area may be preset in the phone. Therefore, the foregoing method may further include S1501-S1502 before S202 or S1302, and specifically before S601. For example, as shown in FIG. 15, the method shown in FIG. 6 further includes S1501-S1502 before S601:
  • S1501: The touchscreen or a sensor of the mobile phone receives the third operation instruction.
  • The third operation instruction is used to instruct the phone's processor to set the current operation area.
  • For example, FIG. 16 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention.
  • FIGS. 16a-16c show that, while gripping the phone at grip point 1, the user operates the touchscreen with a slide gesture, the sliding trajectory being the arc from position 1 to position 2 shown in FIG. 16c.
  • The phone's processor can then receive the third operation instruction corresponding to the user's slide gesture on the touchscreen and acquire the sliding trajectory.
  • S1502: The processor of the mobile phone sets the current operation area according to the third operation instruction.
  • The current operation areas correspond one-to-one to the grip point positions.
  • For example, the phone's processor may set the shaded area on the touchscreen shown in FIG. 16d as the current operation area corresponding to grip point 1.
  • Similarly, the processor may set the shaded area on the touchscreen shown in FIG. 17a as the current operation area corresponding to grip point 2;
  • the shaded area on the touchscreen shown in FIG. 17b as the current operation area corresponding to grip point 3;
  • and the shaded area on the touchscreen shown in FIG. 17c as the current operation area corresponding to grip point 4.
  • The third operation instruction may correspond to an operation by the user, or to an operation performed by a person skilled in the art during the production of the phone.
  • After setting the current operation area, the phone's processor may save it in the phone's memory. In this way, the processor can subsequently determine the current operation area corresponding to the determined grip point position.
  • Further, the above method may further include S1801-S1804 after S205.
  • For example, the method shown in FIG. 6 may further include S1801-S1804 after S205:
  • S1801: The touchscreen of the mobile phone receives the fourth operation instruction.
  • The fourth operation instruction is used to instruct adjustment of the sensing area of each icon among the at least one icon.
  • For example, FIG. 19 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention.
  • Part or all of each icon displayed on the touchscreen shown in FIG. 19a falls within the current operation area indicated by shaded area 1, the current operation area being preset; in this case, the user may be gripping the phone at grip point 1 with the left hand.
  • FIGS. 19b-19c show the user operating the phone's fingerprint recognizer with the left thumb using a leftward slide gesture; the fourth operation instruction then received by the phone's processor may correspond to the user's leftward slide operation.
  • If the fourth operation instruction corresponds to the user operating the fingerprint recognizer with the left thumb using a leftward slide gesture, it is used to instruct the processor to shrink the sensing area of each icon among the at least one icon; if it corresponds to the user operating the fingerprint recognizer with the right thumb using a rightward slide gesture, it is used to instruct the processor to enlarge the sensing area of each icon among the at least one icon.
  • The ratio is the ratio by which the phone's processor scales the sensing areas of the at least one icon, and it may correspond to the sliding distance in the fourth operation instruction.
  • The larger the sliding distance in the fourth operation instruction, the larger the ratio; the smaller the sliding distance, the smaller the ratio.
  • The ratio may be a number greater than 0 and less than 1, for example 0.95; if the user can operate all icons displayed on the touchscreen with one hand and wants to enlarge all icons, the ratio may be a number greater than 1, for example 1.1.
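The mapping from swipe distance to ratio could look like the following sketch. The `full_scale` calibration constant and the 0.10 adjustment range are invented for illustration; the patent only fixes the direction of the relationship (a longer swipe moves the ratio further from 1) and gives 0.95 and 1.1 as example values.

```python
def ratio_from_fourth_instruction(direction, slide_distance, full_scale=300):
    """Map a fingerprint-reader swipe to a scaling ratio (illustrative).

    A leftward swipe shrinks the sensing areas (ratio < 1); a rightward
    swipe enlarges them (ratio > 1). The longer the swipe, the further
    the ratio moves from 1. full_scale is an assumed calibration
    constant: the swipe length (in pixels) producing the maximum change.
    """
    fraction = min(max(slide_distance, 0), full_scale) / full_scale
    if direction == "left":     # shrink, down to 0.90 at a full swipe
        return 1.0 - 0.10 * fraction
    elif direction == "right":  # enlarge, up to 1.10 at a full swipe
        return 1.0 + 0.10 * fraction
    raise ValueError("unsupported swipe direction")
```

With these assumed constants, a half-length leftward swipe (150 of 300 pixels) yields exactly the 0.95 ratio used in the patent's example.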
  • S1803: The processor of the mobile phone modifies the sensing area of each icon according to the ratio to obtain the modified sensing area of each icon.
  • For example, if the phone's processor obtains a ratio of 0.95 from the fourth operation instruction corresponding to the user operation illustrated in FIGS. 19b-19c,
  • the processor may shrink the sensing area of each icon among the at least one icon displayed on the touchscreen to 0.95 times its size.
  • S1804: The processor of the mobile phone adjusts the sensing area of each icon so that the touchscreen displays each icon among the at least one icon according to its adjusted sensing area.
  • After the processor adjusts the sensing area of each icon and the touchscreen displays each icon among the at least one icon, the icons displayed on the touchscreen may partially or completely fall within a new operation area. If each icon's adjusted sensing area is a shrunken sensing area, the new operation area is smaller than the current operation area; if each icon's adjusted sensing area is an enlarged sensing area, the new operation area is larger than the current operation area. For example, as shown in FIG. 19d, the processor may cause part or all of each icon's sensing area, after shrinking by a factor of 0.95, to fall within shaded area 2 shown in FIG. 19d; shaded area 2 may be the new operation area and is smaller than shaded area 1 shown in FIG. 19d.
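S1803-S1804 can be sketched as scaling each icon's sensing area about the grip point, which is what makes the adjusted layout fit a proportionally smaller (ratio < 1) or larger (ratio > 1) operation area. Modelling each sensing area as an axis-aligned rectangle in screen coordinates is an assumption for illustration.

```python
def adjust_sensing_areas(icons, grip_xy, ratio):
    """Scale every icon's sensing area about the grip point.

    icons maps a name to a rectangle (x, y, width, height). Scaling
    both the rectangle's offset from the grip point and its size by
    `ratio` keeps the relative layout while shrinking or enlarging the
    region the icons jointly occupy.
    """
    gx, gy = grip_xy
    adjusted = {}
    for name, (x, y, w, h) in icons.items():
        adjusted[name] = (gx + (x - gx) * ratio,
                          gy + (y - gy) * ratio,
                          w * ratio, h * ratio)
    return adjusted
```

Since every offset from the grip point is multiplied by the same ratio, the bounding region of the adjusted sensing areas is exactly the old region scaled by that ratio, matching the relationship between shaded areas 1 and 2 described above.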
  • Subsequently, the phone's processor may save the new operation area to the phone's memory to update the current operation area saved there; alternatively, the processor may not save the new operation area.
  • The current operation area is preset in the phone so that the sensing areas of all icons displayed on the touchscreen partially or completely fall within it. This allows the user to operate all icons displayed on the touchscreen with one hand.
  • Moreover, the phone's processor can also adjust the sensing area of each icon according to the user's needs, so that the adjusted sensing areas of all icons displayed on the touchscreen partially or completely fall within the new operation area, allowing the user to better operate all displayed icons with one hand and obtain a better user experience.
  • The terminal device includes corresponding hardware structures and/or software modules for performing the respective functions.
  • The present invention can be implemented in hardware, or in a combination of hardware and computer software, in conjunction with the algorithm steps of the examples described in the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present invention.
  • The embodiment of the present invention may divide the terminal device into function modules according to the foregoing method examples.
  • For example, each function module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • The above integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiment of the present invention is schematic and is merely a logical function division; there may be another division manner in actual implementation.
  • FIG. 20 is a schematic diagram of a possible composition of the terminal device provided in the foregoing embodiments.
  • As shown in FIG. 20, the terminal device 20 may include a determining module 201 and a display module 202.
  • The determining module 201 is configured to support the terminal device 20 in performing S201, S202, S203, S204, S602, S802, S1301, S1302, and S1502 in the above embodiments, and/or other processes for the techniques described herein.
  • the display module 202 is configured to support the terminal device 20 to perform S205 and S1804 in the above embodiments, and/or other processes for the techniques described herein.
  • the terminal device 20 may further include: a receiving module 203.
  • the receiving module 203 is configured to support the terminal device 20 to execute S601, S801, S1501, and S1801 in the foregoing embodiments, and/or other processes for the techniques described herein.
  • the terminal device 20 may further include: a setting module 204.
  • the setting module 204 is configured to support the terminal device 20 to execute S1502 in the above embodiment, and/or other processes for the techniques described herein.
  • the terminal device 20 may further include: an adjustment module 205.
  • the adjustment module 205 is configured to support the terminal device 20 to perform S1802 and S1803 in the above embodiments, and/or other processes for the techniques described herein.
  • the terminal device provided by the embodiment of the present invention is configured to execute the above icon display method, so that the same effect as the icon display method described above can be achieved.
  • The processing module may be a processor or a controller, for example a CPU, a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
  • The processing unit may also be a combination implementing computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
  • the storage module can be a memory.
  • The above receiving module 203 may be implemented by an input device.
  • The above display module 202 may be implemented by a display.
  • The embodiment of the present invention further provides a terminal device 24, as shown in FIG. 24.
  • The terminal device 24 includes a processor 241, a memory 242, a display 243, an input device 244, and a bus 245.
  • The processor 241, the memory 242, the display 243, and the input device 244 are connected to one another via the bus 245.
  • The bus 245 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like.
  • The bus 245 may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is shown in FIG. 24, but this does not mean that there is only one bus or one type of bus.
  • The input device 244 may include a mouse, a physical keyboard, a trackball, a touch panel, or a joystick, and a sensor such as a gravity sensor or a fingerprint recognizer.
  • The input device of the above mobile phone may include a touchscreen and sensors such as a gravity sensor and a fingerprint recognizer.
  • The display 243 may be a stand-alone device, or may be integrated with the input device 244 as one device.
  • In the above mobile phone, the touchscreen can also serve as the display.
  • The embodiment of the present invention further provides a computer storage medium storing computer instructions.
  • When the processor 241 of the terminal device 24 executes the computer instructions, the terminal device 24 performs the related method steps in the foregoing embodiments.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division.
  • There may be another division manner in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • The computer-readable storage medium includes a number of instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.


Abstract

Embodiments of the present invention disclose an icon display method and a terminal device, relating to the field of communications, which can cause some or all of the icons displayed by a terminal device to fall within the area a user can operate with one hand. The specific solution is: the terminal device determines a grip point position, the grip point position being one of preset grippable parts of the terminal device; the terminal device determines the relative position of each icon among at least one icon displayed by the terminal device; the terminal device determines a current operation area according to the grip point position; the terminal device determines the sensing area of each icon according to the current operation area, the grip point position, and the relative position of each icon among the at least one icon, so that the sensing area of each icon partially or completely falls within the current operation area; and the terminal device displays each icon among the at least one icon according to the sensing area of each icon. The embodiments of the present invention are used in the process of a terminal device displaying icons.

Description

Icon display method and terminal device
This application claims priority to Chinese Patent Application No. 201710147081.1, filed with the Chinese Patent Office on March 13, 2017 and entitled "Icon Display Method and Device", which is incorporated herein by reference in its entirety.
Technical Field
Embodiments of the present invention relate to the field of communications, and in particular, to an icon display method and a terminal device.
Background
With the continuous development of technologies in the communications field, mobile phones have become increasingly intelligent and their touchscreens increasingly large. In general, a user may choose to operate a mobile phone with both hands or with one hand, depending on operating habits and the operating environment; in certain environments, however, such as when riding a bus or subway, the user usually needs one hand to hold a handrail and can only operate the phone with one hand. When a user operates a phone with a large touchscreen using one hand, part of the interface displayed on the touchscreen lies beyond the reach of one-handed operation, so the user cannot operate the icons in that area with one hand.
In the prior art, when a user operates a phone with a large touchscreen using one hand, the interface displayed on the touchscreen (such as the home screen) can be shrunk as a whole, dividing the original interface into two parts: one part displays the icons of the original interface (the scaled interface), and the other is an idle area that does not display those icons. The idle area is generally a black-screen area which, when tapped by the user, causes the phone to exit the interface-shrinking process.
The problem is that, after the interface displayed on the touchscreen is shrunk as a whole, the shrunken interface may still exceed the reach of one-handed operation because the touchscreen is too large, so the user still cannot operate some icons on the shrunken interface with one hand. Moreover, because the touchscreen displays the idle area in addition to the scaled interface, the user may accidentally touch the idle area during one-handed operation.
Summary
This application provides an icon display method and a terminal device, which can cause some or all of the icons displayed by the terminal device to fall within the area the user can operate with one hand.
To achieve the above objective, this application adopts the following technical solutions:
According to a first aspect, an icon display method is provided, including: the terminal device determines a grip point position, the grip point position being one of preset grippable parts of the terminal device; the terminal device determines the relative position of each icon among at least one icon displayed by the terminal device, the relative position of each icon being the position of the icon relative to the grip point position; the terminal device determines a current operation area according to the grip point position, the current operation area being the area of the display that the user operates with one hand; the terminal device determines the sensing area of each icon according to the current operation area, the grip point position, and the relative position of each icon among the at least one icon, so that the sensing area of each icon partially or completely falls within the current operation area, the sensing area of each icon being the area in which the icon is operated; and the terminal device displays each icon among the at least one icon according to the sensing area of each icon.
It should be noted that, in the icon display method provided by this application, the terminal device can scale the sensing areas of the displayed icons according to the position at which the user grips the terminal device, so that the sensing area of every displayed icon partially or completely falls within the current operation area, enabling the user to operate all displayed icons with one hand. Moreover, in the method provided by this application, the terminal device displays no idle area, so the user is less likely to touch the displayed area by mistake. In addition, because the method scales the icons displayed by the terminal device (that is, the icons' sensing areas) rather than the entire displayed interface, the icons remain well displayed after the method is performed.
In a possible implementation, the icon display method provided by this application may be triggered by a user operation, and the terminal device performs the method after receiving the operation instruction corresponding to that operation. Specifically, before the terminal device determines the grip point position, the method may further include: the terminal device receives a first operation instruction; the terminal device matches the first operation instruction against a first preset instruction, so that the terminal device enters a dynamic display mode in which the sensing areas of the at least one icon are the same or different. The mode in which the terminal device can scale the sensing areas of its displayed icons is called the dynamic display mode. In general, before the terminal device enters the dynamic display mode, the sensing areas of the displayed icons may be the same; after it enters the dynamic display mode, they may be different. The first preset instruction may be set in advance by a person skilled in the art.
In a possible implementation, before performing the icon display method provided by this application, the terminal device may further receive an operation instruction from the user so as to determine the actual grip point position. Specifically, before the terminal device determines the grip point position, the method may further include: the terminal device receives a second operation instruction, the second operation instruction being used to indicate the grip point position. The method by which the terminal device determines the grip point position may include: the terminal device matches the second operation instruction against second preset instructions to determine the grip point position. The second preset instructions may be set in advance by a person skilled in the art. For example, before the icon display method is performed, a person skilled in the art may preset four second preset instructions corresponding to four different grip point positions. After receiving the second operation instruction, the terminal device may check whether it is the same as any of the four second preset instructions to determine the grip point position.
In a possible implementation, when the user operates the terminal device with one hand, the interface state of the terminal device (for example, of its touchscreen) may differ, being either a portrait state or a landscape state. In the portrait and landscape states, the position of the same icon displayed on the touchscreen may differ, and those positions may be preset. Since the positions of the displayed icons affect the dynamic display of icons in the above method, the terminal device may perform the method in combination with its interface state. Specifically, after the terminal device receives the first operation instruction, the method may further include: the terminal device determines the interface state; and the step of determining the relative position of each icon may include: the terminal device determines, according to the interface state, the relative position of each icon among the at least one icon displayed by the terminal device.
In a possible implementation, before the terminal device determines the current operation area according to the grip point position, the current operation area may be preset in the terminal device. Specifically, before the terminal device determines the grip point position, the method may further include: the terminal device receives a third operation instruction; and the terminal device sets the current operation area according to the third operation instruction, the current operation areas corresponding one-to-one to the grip point positions.
In a possible implementation, because the users operating the terminal device with one hand may differ, and the maximum one-hand reach on the terminal device may differ between users, the current user may be unable to operate icons displayed outside the current operation area. Therefore, after the terminal device displays each icon among the at least one icon, the method may further include: the terminal device receives a fourth operation instruction, the fourth operation instruction being used to indicate adjustment of the sensing area of each icon among the at least one icon; the terminal device obtains a ratio according to the fourth operation instruction; the terminal device adjusts the sensing area of each icon according to the ratio to obtain the adjusted sensing area of each icon; and the terminal device displays each icon among the at least one icon according to the adjusted sensing area of each icon.
It should be noted that the icon display method provided by this application can adjust the sensing area of each icon according to the user's needs, so that the user can operate with one hand all icons whose sensing areas have been adjusted, and obtain a better user experience.
According to a second aspect, a terminal device is provided, including a determining module and a display module. The determining module is configured to: determine a grip point position, the grip point position being one of preset grippable parts of the terminal device; determine the relative position of each icon among at least one icon displayed on the display interface of the terminal device, the relative position of each icon being the position of the icon relative to the grip point position; determine a current operation area according to the grip point position, the current operation area being the area of the display interface operated by the user with one hand; and determine the sensing area of each icon according to the current operation area, the grip point position, and the relative position of each icon among the at least one icon, so that the sensing area of each icon partially or completely falls within the current operation area, the area indicated by the sensing area of each icon being the area in which the icon is operated. The display module is configured to display each icon among the at least one icon according to the sensing area of each icon determined by the determining module.
In a possible implementation, the terminal device may further include a receiving module. The receiving module is configured to receive a first operation instruction before the determining module determines the grip point position. The determining module is further configured to match the first operation instruction received by the receiving module against a first preset instruction, so that the terminal device enters a dynamic display mode in which the sensing areas of the at least one icon are the same or different.
In a possible implementation, the receiving module is further configured to receive a second operation instruction before the determining module determines the grip point position, the second operation instruction being used to indicate the grip point position. The determining module is specifically configured to match the second operation instruction against second preset instructions to determine the grip point position.
In a possible implementation, the determining module may be further configured to determine the interface state after the receiving module receives the first operation instruction, the interface state indicating that the interface of the terminal device is in a portrait state or in a landscape state. The determining module may be specifically configured to determine, according to the interface state, the relative position of each icon among the at least one icon displayed by the terminal device.
In a possible implementation, the receiving module may be further configured to receive a third operation instruction before the determining module determines the grip point position. The terminal device may further include a setting module, configured to set the current operation area according to the third operation instruction.
In a possible implementation, the receiving module may be further configured to receive a fourth operation instruction after the display module displays each icon among the at least one icon, the fourth operation instruction being usable to indicate adjustment of the sensing area of each icon among the at least one icon. The terminal device may further include an adjustment module, configured to obtain a ratio according to the fourth operation instruction received by the receiving module, and to adjust the sensing area of each icon according to the ratio to obtain the adjusted sensing area of each icon. The display module may be further configured to display each icon among the at least one icon according to the adjusted sensing area of each icon obtained by the adjustment module.
According to a third aspect, a terminal device is provided, which may include a processor, a memory, a display, an inputter, and a bus. The memory is configured to store computer instructions; the processor, the memory, the display, and the inputter are connected via the bus; and when the terminal device runs, the processor executes the computer instructions stored in the memory, so that the terminal device performs the icon display method of the first aspect and its various optional implementations.
According to a fourth aspect, a computer storage medium is provided, storing computer instructions. When the processor of the terminal device of the third aspect executes the computer instructions, the processor performs the icon display method of the first aspect and its various optional implementations.
It should be noted that the processor in the third aspect of this application may be an integration of functional modules such as the determining module, the setting module, and the adjustment module in the second aspect, and can implement the functions of the above functional modules of the second aspect. For detailed descriptions and analyses of the beneficial effects of the modules in the second and third aspects, refer to the corresponding descriptions and technical effects in the first aspect and its various possible implementations; details are not repeated here.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an icon display method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of icons displayed on the touchscreen of a mobile phone according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of icons displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of icons displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 6 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of interface changes displayed on the touchscreen of a mobile phone according to an embodiment of the present invention;
FIG. 8 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 13 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 15 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
FIG. 16 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 17 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 18 is a schematic flowchart of another icon display method according to an embodiment of the present invention;
FIG. 19 is a schematic diagram of interface changes displayed on the touchscreen of another mobile phone according to an embodiment of the present invention;
FIG. 20 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
FIG. 21 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
FIG. 22 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
FIG. 23 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention;
FIG. 24 is a schematic diagram of a possible composition of a terminal device according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention provide an icon display method and a terminal device, applicable to the process in which a terminal device displays icons, and specifically to the process in which the terminal device scales the sensing area of each icon and dynamically displays each icon.
The technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
The icon display method provided by the embodiments of the present invention is applicable to a terminal device provided with a touchscreen, where the terminal device may be a mobile phone, a tablet computer, a notebook computer, an Ultra-mobile Personal Computer (UMPC), a netbook, a Personal Digital Assistant (PDA), or another terminal device.
Specifically, the embodiments of the present invention introduce the icon display method provided by the present invention by taking a mobile phone as the terminal device. The components of the mobile phone 10 are described below with reference to the accompanying drawings:
As shown in FIG. 1, the mobile phone 10 may include components such as a touchscreen 11, a processor 12, a memory 13, a power supply 14, a radio frequency (RF) circuit 15, a sensor 16, an audio circuit 17, a loudspeaker 18, and a microphone 19; these components may be connected by a bus or directly. Those skilled in the art will appreciate that the phone structure shown in FIG. 1 does not constitute a limitation on the mobile phone, which may include more components than shown, combine certain components, or arrange components differently.
The touchscreen 11, which may be called a touch display panel, implements the input and output functions of the mobile phone 10. It can collect the user's operation instructions on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object or accessory on or near the touchscreen 11) and drive the corresponding connection apparatus according to a preset program. It can also display information entered by the user or provided to the user (such as the application icons of "Gallery" and "Camera") and the phone's various menus (such as the "Delete" button in "Gallery"). For example, the touchscreen 11 may be implemented using resistive, capacitive, infrared, or ultrasonic technology, which is not limited in the embodiments of the present invention. An operation near the touchscreen 11 may be called hover touch; a touchscreen capable of hover touch may be implemented using capacitive, infrared, or ultrasonic technology.
Specifically, in the embodiments of the present invention, the touchscreen 11 may receive the user's operation instructions, such as instructions on icons displayed on the touchscreen 11 or on a specific region of the touchscreen 11; after receiving an operation instruction, the touchscreen 11 may transmit it to the processor 12 so that the processor 12 performs the indicated operation. For example, after receiving an operation instruction instructing the terminal device to modify an icon's sensing area, the touchscreen 11 transmits it to the processor 12 so that the processor 12 scales the icon's sensing area, and the touchscreen 11 can display the icon according to the scaled sensing area.
It should be noted that the touchscreen of the mobile phone provided by the embodiments of the present invention may be divided into multiple different sensing regions, such as thermal sensing regions or pressure sensing regions. Each icon displayed on the touchscreen may thus fall within a sensing area, the sensing area being the region in which the icon can be operated by the user. The sensing area of an icon may be larger than the region occupied by the icon, or may coincide with it. In the following, the icon display method provided by the embodiments of the present invention is described taking the case where each icon's sensing area coincides with the region the icon occupies.
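As a small illustration of the sensing-area notion above (here taking the simplest case where the sensing area coincides with the icon's own rectangle), hit-testing a touch against a sensing area might look like the following; the rectangle representation is an assumption for illustration.

```python
def icon_hit(sensing_area, touch_xy):
    """Return True if a touch falls inside an icon's sensing area.

    The sensing area is the region in which the icon can be operated;
    it may coincide with the icon's own rectangle or be somewhat
    larger. Rectangles are (x, y, width, height) in screen pixels.
    """
    x, y, w, h = sensing_area
    tx, ty = touch_xy
    return x <= tx <= x + w and y <= ty <= y + h
```

Scaling a sensing area, as done later in the method, simply changes the rectangle this test is run against, without moving the icon's artwork and its touch target out of step with each other.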
The processor 12 is the control center of the mobile phone 10; it connects the parts of the whole phone through various interfaces and lines, and performs the phone's functions and processes data by running or executing software programs and/or modules stored in the memory 13 and invoking data stored in the memory 13, thereby monitoring the phone 10 as a whole. In a specific implementation, as an embodiment, the processor 12 may include one or more processing units; the processor 12 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, home screen, and applications, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 12.
Specifically, in the embodiments of the present invention, the processor 12 may obtain an operation instruction received by the touchscreen 11 and perform the operation it indicates. For example, after obtaining an operation instruction received by the touchscreen 11 instructing determination of the current operation area, the processor 12 determines the current operation area.
It should be noted that the icons involved in the embodiments of the present invention are operable elements displayed on the phone's touchscreen. Specifically, after receiving the user's operation instruction on an operable element, the phone's processor may respond to it, for example by causing the interface displayed on the touchscreen to change.
For example, when the phone's touchscreen is on the home screen, the icons may be application icons displayed on the touchscreen, such as the "Camera", "Gallery", and "Messages" icons; in this case, if the phone receives through the touchscreen the user's operation instruction on "Gallery", the processor may cause the touchscreen to display the photos in "Gallery". In addition, when the touchscreen is on an application interface while an application is running, the icons may be button icons displayed on the touchscreen, such as the "Edit" and "Delete" button icons in "Gallery"; or they may be input boxes displayed on the touchscreen, such as the message input box in "Messages". When the touchscreen is on a pull-down interface, the icons may be button icons displayed on the touchscreen, such as the "WiFi", "Mute", "Bluetooth", and "Orientation Lock" icons. In the following, the icon display method provided by the embodiments of the present invention is described taking the application icons displayed when the phone's touchscreen is on the home screen as an example.
The memory 13 may be used to store data, software programs, and modules, and may be a volatile memory such as a random-access memory (RAM); a non-volatile memory such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the above kinds of memory. Specifically, the memory 13 may store computer instructions that cause the processor 12, by executing them, to perform the icon display method provided by the embodiments of the present invention.
The power supply 14 may be a battery logically connected to the processor 12 through a power management system, thereby implementing functions such as charging, discharging, and power-consumption management.
The RF circuit 15 may be used to receive and send signals during information transmission and reception or during a call; in particular, it passes received information to the processor 12 for processing and sends out signals generated by the processor 12. Generally, the RF circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier (LNA), and a duplexer. In addition, the RF circuit 15 may communicate with networks and other devices through wireless communication.
The sensor 16 may include a gravity sensor 161, a fingerprint recognizer 162, and the like.
The gravity sensor 161 can detect the magnitude of acceleration of the phone in all directions (generally three axes), can detect the magnitude and direction of gravity when the phone is stationary, and may be used in applications that recognize the phone's posture (such as portrait/landscape switching, related games, and magnetometer posture calibration) and in vibration-recognition functions (such as pedometers, tapping, and shake direction). Specifically, the gravity sensor 161 may receive gravity operation instructions so that the processor 12 performs the operations they indicate. The fingerprint recognizer 162 has a fingerprint recognition function; it may be mounted on the back of the phone 10, on the Home key of the phone 10, or implemented by controlling the touchscreen 11 through a logic circuit. Specifically, the fingerprint recognizer 162 may receive fingerprint operation instructions so that the processor 12 performs the operations they indicate. It should be noted that the phone 10 may also include other sensors, such as a pressure sensor, a light sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here.
The audio circuit 17, the loudspeaker 18, and the microphone 19 may provide an audio interface between the user and the phone 10. The audio circuit 17 may transmit the electrical signal converted from received audio data to the loudspeaker 18, which converts it into a sound signal for output; conversely, the microphone 19 converts collected sound signals into electrical signals, which the audio circuit 17 receives and converts into audio data, and then outputs the audio data to the RF circuit 15 to be sent, for example, to another phone, or to the processor 12 for further processing.
Although not shown, the phone 10 may further include functional modules such as a wireless fidelity (WiFi) module, a Bluetooth module, and a camera, which are not described one by one here.
It should be noted that components such as the touchscreen 11, the sensor 16, and the microphone 19 may be collectively referred to as inputters. An inputter may receive the user's operations, changing environmental parameters, and the like, so that the phone's processor can receive the corresponding operation instructions. Moreover, the touchscreen can serve not only as an inputter but also as a display.
For example, the operation instructions involved in the embodiments of the present invention may be touchscreen operation instructions, fingerprint operation instructions, gravity operation instructions, key operation instructions, and the like. A touchscreen operation instruction corresponds to the user's press, long-press, slide, tap, or hover (an operation near the touchscreen) operation on the phone's touchscreen. A fingerprint operation instruction corresponds to the user's fingerprint slide, long-press, single-tap, or double-tap operation on the phone's fingerprint recognizer. A gravity operation instruction corresponds to the user shaking the phone in a specific direction or a specific number of times. A key operation instruction corresponds to the user's single-press, double-press, long-press, or combined-key operation on keys such as the power key, the volume key, and the Home key.
To solve the problem that the user cannot operate all icons displayed on a large touchscreen with one hand, the solution provided by the embodiments of the present invention can scale the sensing area of each icon displayed on the phone's touchscreen and dynamically display each icon, so that the sensing area of each icon partially or completely falls within the current operation area. The current operation area is the region of the phone's touchscreen that can be operated by the user when operating the phone with one hand. The phone's dynamic display mode is a mode in which the touchscreen can display icons whose sensing areas have been scaled.
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the icon display method provided by the embodiments of the present invention is described in detail below with reference to the specific components of the mobile phone 10 shown in FIG. 1 and the flowchart of the icon display method shown in FIG. 2; the steps shown may also be performed in any terminal device other than the phone shown in FIG. 1. In addition, although the flowchart shows a logical order for the icon display method provided by the embodiments of the present invention, in some cases the steps shown or described may be performed in an order different from that given here:
S201: The processor of the mobile phone determines the grip point position.
The grip point position is one of preset grippable parts of the phone; the grippable parts may be the positions of the four corners of the phone's touchscreen. For example, FIG. 3 is a schematic diagram of icons displayed on the touchscreen of a mobile phone according to an embodiment of the present invention. The grip point position of the phone shown in FIG. 3 may be any of grip points 1, 2, 3, and 4. The touchscreen of the phone shown in FIG. 3 displays nine application icons (applications 1-9); application 1 may be, for example, the "Gallery" icon. The interface displayed on the touchscreen in FIG. 3 is the home screen.
S202: The processor of the mobile phone determines the relative position of each icon among the at least one icon displayed on the touchscreen.
The relative position of each icon is the position of the icon relative to the grip point position.
It should be noted that, before the method provided by the embodiments of the present invention is performed, the phone's processor may display each icon on the touchscreen according to each icon's preset position. For example, the positions of the application icons displayed on the touchscreen in FIG. 3 may be preset by a person skilled in the art during the phone's production. When the user grips the phone at different parts, that is, at different grip points, the relative position of the same icon differs; and when the touchscreen displays an icon, the icon's relative position affects whether the user can operate it. For example, if the processor determines that the user grips the phone at grip point 1 shown in FIG. 3, it may determine the relative position of any of the application icons 1-9 in FIG. 3 as the position relative to grip point 1, such as the distance from that icon to grip point 1. In general, an icon the user can operate with one hand has a small relative position, that is, it is close to the grip point position; an icon that cannot be operated with one hand has a large relative position, that is, it is far from the grip point position.
S203: The processor of the mobile phone determines the current operation area according to the grip point position.
The current operation area may be the region of the phone's touchscreen operated by the user with one hand. Specifically, when the user operates the phone with one hand, the current operation area may be the largest region of the touchscreen the user can operate. For different grip point positions, the current operation area on the touchscreen differs, that is, its position on the touchscreen differs. For example, FIG. 4 is a schematic diagram of icons displayed on the touchscreen of another mobile phone according to an embodiment of the present invention. In FIG. 4 the user grips the phone at grip point 1, and the current operation area may be the shaded region on the touchscreen. It should be noted that, in the drawings provided by the embodiments of the present invention, the "hand icon" in FIG. 4 schematically represents a human hand and indicates the position of the user's finger on the phone; in practice the phone's screen does not display this "hand icon".
S204: The processor of the mobile phone determines the sensing area of each icon according to the current operation area, the grip point position, and the relative position of each icon among the at least one icon, so that the sensing area of each icon partially or completely falls within the current operation area.
Specifically, the phone's processor may scale the sensing area of each icon displayed on the touchscreen toward the grip point position. The processor may shrink the sensing areas of icons with small relative positions, that is, shrink those icons, and enlarge the sensing areas of icons with large relative positions, that is, enlarge those icons, so that the sensing area of each icon partially or completely falls within the current operation area. In general, before this scaling of each displayed icon's sensing area toward the grip point position, the sensing areas of the icons displayed on the touchscreen may all be the same.
It should be noted that, because the user operates the icons displayed on the touchscreen with one hand within the current operation area, having each displayed icon partially or completely fall within the current operation area enables the user to operate all icons displayed on the phone's touchscreen with one hand.
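The core of S204, scaling the displayed icons toward the grip point until each at least partially falls within the current operation area, can be sketched as follows. Modelling the operation area as a quarter-disc around the grip point and using a single common scale factor are simplifying assumptions for illustration; the method as described may scale icons individually.

```python
import math

def fit_icons_to_operation_area(icons, grip_xy, operation_radius):
    """Scale icon sensing areas toward the grip point so that every
    icon falls inside the current operation area, modelled here as a
    quarter-disc of operation_radius around the grip point.

    icons maps a name to a rectangle (x, y, width, height); a common
    scale factor maps the farthest icon centre onto the boundary.
    """
    gx, gy = grip_xy

    def centre_dist(rect):
        x, y, w, h = rect
        return math.hypot(x + w / 2 - gx, y + h / 2 - gy)

    farthest = max(centre_dist(r) for r in icons.values())
    scale = min(1.0, operation_radius / farthest)  # never enlarge here
    return {
        name: (gx + (x - gx) * scale, gy + (y - gy) * scale,
               w * scale, h * scale)
        for name, (x, y, w, h) in icons.items()
    }
```

After the transform, the farthest icon centre sits exactly on the operation-area boundary and every other icon lies strictly inside it, which is the "partially or completely within the current operation area" condition above.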
After the sensing area of each icon displayed on the touchscreen has been scaled toward the grip point position, the touchscreen can display the icons with scaled sensing areas, so that the user can operate all icons displayed on the touchscreen with one hand. Specifically, the icon display method provided by the embodiments of the present invention may further include S205:
S205: The processor of the mobile phone causes the touchscreen to display each icon among the at least one icon according to each icon's sensing area.
For example, FIG. 5 is a schematic diagram of icons displayed on the touchscreen of another mobile phone according to an embodiment of the present invention. After the phone's processor scales the sensing areas of the icons displayed on the touchscreen shown in FIG. 4, the icons displayed on the touchscreen shown in FIG. 5 are obtained. Applications 1-9 displayed in FIG. 5 all intersect the shaded region, that is, the sensing area of each displayed application icon partially or completely falls within the current operation area.
It should be noted that, in the icon display method provided by the embodiments of the present invention, the phone's processor can scale the sensing areas of the icons displayed on the touchscreen according to the user's grip point position on the phone, so that the sensing area of each displayed icon partially or completely falls within the current operation area, enabling the user to operate all icons displayed on the touchscreen with one hand. Moreover, in this method the touchscreen displays no idle area, so the user is less likely to touch the displayed region by mistake. In addition, because the processor scales the icons displayed on the touchscreen (that is, their sensing areas) rather than the entire displayed interface, the icons remain well displayed after the method is performed.
The phrases "the interface displayed on the phone's touchscreen" and "the icons displayed on the phone's touchscreen" used in the embodiments of the present invention both refer to the content displayed on the phone's touchscreen; they are merely different descriptions used for convenience in different contexts.
Further, before the icon display method provided by the embodiments of the present invention is performed, a sensor such as the phone's touchscreen or fingerprint recognizer may receive the user's operation instruction, causing the phone's processor to start performing the method. FIG. 6 is a schematic flowchart of another icon display method according to an embodiment of the present invention; before S201 the method may further include S601-S602:
S601: The touchscreen or a sensor of the mobile phone receives the first operation instruction.
The first operation instruction is used to instruct the phone's touchscreen to enter the dynamic display mode, in which the sensing areas of the at least one icon are the same or different. In general, before the terminal device enters the dynamic display mode, the sensing areas of the displayed icons may be the same; after it enters the dynamic display mode, they may be different.
The operation instructions involved in the embodiments of the present invention are described in detail below for different application scenarios:
In the first application scenario, the operation instruction is a touchscreen operation instruction, and the user may operate the phone's touchscreen with different gestures, including pressure-recognition gestures, long-press gestures, area-change gestures, multi-touch gestures, slide gestures, double-press gestures, single-tap gestures, double-tap gestures, and tangential gestures. Specifically: when the user uses a pressure-recognition gesture, the processor obtains that the user pressed the touchscreen and obtains the pressure value of the press. When the user uses a long-press gesture, the processor obtains that the user long-pressed the touchscreen and obtains the press duration. When the user uses an area-change gesture, the processor obtains that the user touched the touchscreen and obtains the touch area of the press. When the user uses a multi-touch gesture, the processor obtains that the user pressed the touchscreen and obtains the number of touch points. When the user uses a slide gesture, the processor obtains that the user performed a slide on the touchscreen and obtains the slide distance or slide trajectory. When the user uses a double-press gesture, the processor obtains that the user pressed the touchscreen twice without leaving it and obtains the pressure values of the two presses. When the user uses a single-tap or double-tap gesture, the processor obtains that the user tapped the touchscreen and obtains the number of taps at the same position on the touchscreen. When the user uses a tangential gesture, the processor obtains that the user touched the touchscreen and obtains at least one of the following trends of the user's finger: up, down, left, right, rotation.
In the second application scenario, the operation instruction is a fingerprint operation instruction, and the user may operate the phone's fingerprint recognizer with different gestures, including long-press gestures, single-tap gestures, double-tap gestures, and tangential gestures. Specifically: when the user uses a long-press gesture, the processor obtains that the user long-pressed the fingerprint recognizer and obtains the user's fingerprint and the press duration. When the user uses a single-tap or double-tap gesture, the processor obtains that the user tapped the fingerprint recognizer and obtains the user's fingerprint and the number of taps. When the user uses a tangential gesture, the processor obtains that the user touched the fingerprint recognizer and obtains the user's fingerprint and at least one of the following trends of the user's finger: up, down, left, right.
In the third application scenario, the operation instruction is a gravity operation instruction, and the user may operate the phone with a shake gesture. Specifically, when the user uses a shake gesture, the processor obtains the number of shakes, the shake direction, and the shake angle; for example, the shake direction may include up, down, left, right, and rotation.
In the fourth application scenario, the operation instruction is a key operation instruction; the processor obtains the number and order of the user's presses of keys such as the power key, the volume key, and the Home key.
In a specific implementation, when the first operation instruction provided by the embodiments of the present invention is a touchscreen operation instruction, it may correspond to the user operating a specific position on the touchscreen (such as a specific region or a specific icon). For example, FIG. 7 is a schematic diagram of interface changes displayed on the touchscreen of a mobile phone according to an embodiment of the present invention. In FIG. 7, FIG. 7a shows the user gripping the phone at grip point 1 with the touchscreen on the home screen; FIG. 7b shows the user sliding downward on the touchscreen; after receiving the slide operation, the processor causes the touchscreen to display the pull-down menu interface shown in FIG. 7c. The pull-down interface in FIG. 7c displays a navigation bar containing the "WiFi", "Data", "Mute", "Bluetooth", "Orientation Lock", and "Dynamic Display" icons, where the "Dynamic Display" icon may be newly added by a person skilled in the art. Then, as shown in FIG. 7d, the user taps the "Dynamic Display" icon in the navigation bar shown in FIG. 7c to trigger the phone's processor to perform the icon display method provided by the embodiments of the present invention. It should be noted that those skilled in the art can easily add a button icon to the navigation bar displayed on the phone's touchscreen; this is not described in detail here.
S602: The processor of the mobile phone matches the first operation instruction against the first preset instruction, causing the phone's touchscreen to enter the dynamic display mode.
Before the icon display method provided by the embodiments of the present invention is performed, a first preset instruction may be set in advance by a person skilled in the art; the first preset instruction corresponds to a preset user operation that can trigger the phone's processor to perform the method provided by the embodiments of the present invention, for example the user tapping the "Dynamic Display" icon in the navigation bar when the touchscreen is on the pull-down interface shown in FIG. 7c. Specifically, "the processor matches the first operation instruction against the first preset instruction" may include: the processor receives the user's first operation instruction, which corresponds to the user's actual operation, and determines whether the first operation instruction and the first preset instruction are the same. If they are the same, the processor causes the touchscreen to enter the dynamic display mode and scales the sensing areas of the icons displayed on the touchscreen. The first preset instruction may be stored in the phone's memory.
It should be noted that the embodiments of the present invention do not limit which operation instruction the processor receives from the user, that is, the specific form of the user operation corresponding to the instruction is not limited. Which operation the user should use may be preconfigured in the phone or communicated to the user in advance through prompt information, so that when the phone needs to obtain an operation instruction from the user, the user can operate the touchscreen accordingly. The user may operate the phone according to the user operation corresponding to the first preset instruction preconfigured in the phone, so that the processor receives a first operation instruction identical to the first preset instruction and is triggered to perform the icon display method provided by the embodiments of the present invention.
进一步的,在手机的处理器执行本发明实施例提供的图标显示方法之前,手机的触控屏或指纹识别器等传感器还可以接收来自用户的操作指令,以使手机的处理器确定出实际的握点位置。具体的,本发明实施例提供的图标显示方法,在S201之前,还可以包括S801;相应的,S201可以替换为S802。示例的,如图8所示,图6所示方法在S602和S201之间,还可以包括S801,图6中的S201可以替换为S802:
S801、手机的触控屏或传感器接收第二操作指令,第二操作指令用于指示握点位置。
When the second operation instruction provided in this embodiment is a touchscreen operation instruction, it may correspond to the user operating a specific position on the touchscreen (for example, a specific area or a specific icon). For example, FIG. 9 is another schematic diagram of interface changes displayed on the touchscreen. In FIG. 9, FIG. 9a shows shaded areas 1-4 displayed on the touchscreen together with the prompt "Please long-press any shaded area on the screen"; areas 1-4 in FIG. 9a correspond to grip points 1-4, respectively. FIGS. 9b-9c show the user long-pressing area 1 displayed on the touchscreen. The second operation instruction received by the processor may then be that the user has long-pressed area 1 on the touchscreen and that the press has lasted a preset duration, which may be set in advance. It should be noted that, in practice, before the processor receives the second operation instruction, the touchscreen may or may not display the prompt and the shaded areas shown in FIG. 9a; this is not limited in this embodiment.
When the second operation instruction provided in this embodiment is a fingerprint operation instruction, it may correspond to the user operating the fingerprint reader of the phone with different gestures. For example, FIG. 10 is another schematic diagram of interface changes displayed on the touchscreen. In FIG. 10, FIG. 10a shows that the fingerprint reader of the phone is its Home key; FIGS. 10b-10d show the user double-tapping the fingerprint reader with the left thumb. The second operation instruction received by the processor may then be the user's left-thumb fingerprint together with the fact that the user has tapped the fingerprint reader twice with the left thumb. In this embodiment, when the user holds the phone with the left hand at grip point 1, the left thumb may double-tap the fingerprint reader; when the user holds the phone with the left hand at grip point 2, the left thumb may single-tap it; when the user holds the phone with the right hand at grip point 3, the right thumb may single-tap it; and when the user holds the phone with the right hand at grip point 4, the right thumb may double-tap it.
Alternatively, FIG. 11 is another schematic diagram of interface changes displayed on the touchscreen. In FIG. 11, FIG. 11a shows that the fingerprint reader of the phone is its Home key; FIGS. 11b-11c show the user operating the fingerprint reader with a downward tangential gesture of the left thumb. The second operation instruction received by the processor may then be the user's left-thumb fingerprint together with a downward tendency of the left thumb's tangential gesture on the fingerprint reader. In this embodiment, when the user holds the phone with the left hand at grip point 1, the left thumb may apply a downward tangential gesture to the fingerprint reader; when the user holds the phone with the left hand at grip point 2, the left thumb may apply an upward tangential gesture; when the user holds the phone with the right hand at grip point 3, the right thumb may apply an upward tangential gesture; and when the user holds the phone with the right hand at grip point 4, the right thumb may apply a downward tangential gesture. The specific form in which the second operation instruction corresponds to the user's operation of the fingerprint reader is not limited in this embodiment.
When the second operation instruction provided in this embodiment is a gravity operation instruction, it may correspond to the user shaking the phone. For example, FIG. 12 is another schematic diagram of interface changes displayed on the touchscreen. In FIG. 12, FIG. 12a shows the user holding the phone with the left hand before any shake gesture is applied; FIGS. 12a-12b show the user shaking the phone leftward and rightward. The second operation instruction received by the processor may then be the shake angles at the positions of grip points 1-4, with the shake angle at grip point 1 being the smallest. When the user shakes the phone leftward and rightward: if the user grips the phone at grip point 1, the shake angle at grip point 1 is the smallest; if at grip point 2, the shake angle at grip point 2 is the smallest; if at grip point 3, the shake angle at grip point 3 is the smallest; and if at grip point 4, the shake angle at grip point 4 is the smallest.
Generally, when operating the phone with one hand, the user's left hand grips the phone at its lower-left corner, such as grip point 1 shown in FIG. 3, while the right hand grips it at its lower-right corner, such as grip point 4 shown in FIG. 3.
Before the icon display method of this embodiment is performed, second preset instructions may be configured in advance in the phone (for example, in its memory), so that the method of this embodiment can perform S802:
S802. The processor of the mobile phone matches the second operation instruction against the second preset instructions to determine the grip point position.
For example, before the icon display method of this embodiment is performed, a person skilled in the art may configure four second preset instructions in advance, corresponding respectively to four different grip point positions, such as grip points 1-4, and respectively to four preset user operations. Specifically, "the processor matches the second operation instruction against the second preset instructions to determine the grip point position" may include: the processor receives the user's second operation instruction, which corresponds to the user's actual operation; the processor then determines whether the second operation instruction is the same as any one of the multiple second preset instructions. If the second operation instruction is the same as one of them, the processor determines the grip point position corresponding to that second preset instruction. For example, one of the four second preset instructions may be the user long-pressing, for a certain duration, area 1 displayed on the touchscreen as shown in FIG. 9; that is, this second preset instruction corresponds to grip point 1. If the second operation instruction then corresponds to the user long-pressing area 1 on the touchscreen for that duration, the processor determines that the second operation instruction is the same as that preset instruction and that the actual grip point is grip point 1.
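The grip-point determination of S802 can be sketched as a lookup over the four preset instructions. The instruction encoding below (sensor, hand, gesture) follows the fingerprint example of FIG. 10 but is an illustrative assumption:

```python
# Sketch of S802: matching the second operation instruction against
# four preset instructions, one per grip point. Encoding is assumed.

SECOND_PRESETS = {
    ("fingerprint", "left", "double-tap"): 1,   # grip point 1
    ("fingerprint", "left", "single-tap"): 2,   # grip point 2
    ("fingerprint", "right", "single-tap"): 3,  # grip point 3
    ("fingerprint", "right", "double-tap"): 4,  # grip point 4
}

def determine_grip_point(instruction):
    """Return the grip point (1-4) whose preset matches, else None."""
    return SECOND_PRESETS.get(instruction)
```

A non-matching instruction yields no grip point, in which case the method would not proceed with dynamic display for that input.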
Further, when the user operates the phone with one hand, the interface state of the phone (of its touchscreen) may differ, for example a portrait state or a landscape state. The position of the same icon displayed on the touchscreen may differ between the portrait and landscape states, and the icon positions may be preset. Because the positions of the displayed icons affect the dynamic display of icons in the icon display method of this embodiment, the processor may take the interface state into account when performing the method. The icon display method of this embodiment may further include S1301 after S601, and S202 may be replaced by S1302 accordingly. For example, as shown in FIG. 13, the method of FIG. 6 may further include S1301 between S602 and S201, and S202 in FIG. 6 may be replaced by S1302:
S1301. The processor of the mobile phone determines the interface state.
When operating the phone with one hand, the user may hold it with the left hand or the right hand. When holding with the left hand, the grip may be at the phone's upper-left or lower-left corner; when holding with the right hand, at its upper-right or lower-right corner. It should be noted that if the phone is in a different interface state, the same holding posture may correspond to a different grip location on the phone. For example, FIG. 14 is another schematic diagram of interface changes displayed on the touchscreen. FIG. 14a shows the phone in the portrait state, where its lower-left corner is grip point 1, its upper-left corner grip point 2, its upper-right corner grip point 3, and its lower-right corner grip point 4; the processor causes the touchscreen to display the application icons (apps 1-8) according to the preset position of each icon (denoted the icon's portrait position). FIG. 14b shows the phone in the landscape state, where its lower-left corner is grip point 2, its upper-left corner grip point 3, its upper-right corner grip point 4, and its lower-right corner grip point 1; the processor causes the touchscreen to display the application icons (apps 1-8) according to the preset position of each icon (denoted the icon's landscape position). The landscape and portrait positions of the same icon on the touchscreen differ; for example, the position of app 1 displayed in FIG. 14a differs from the position of app 1 displayed in FIG. 14b.
S1302. The processor of the mobile phone determines, according to the interface state, the relative position of each of the at least one icon displayed on the touchscreen.
Because the landscape and portrait positions of the same icon on the touchscreen differ, the relative position of the icon may differ across interface states even when the user grips the phone at the same part. For example, the position of app 1 relative to grip point 1 in FIG. 14a differs from the position of app 1 relative to grip point 1 in FIG. 14b.
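S1302 amounts to expressing each icon's screen coordinates relative to the active grip point, with the grip-point coordinates chosen according to the interface state. A minimal sketch; the pixel dimensions and grip-point coordinate table are illustrative assumptions:

```python
# Sketch of S1302: icon position relative to the grip point, per
# interface state. Screen sizes and coordinates are assumed values
# (1080x1920 portrait, origin at the upper-left corner).

GRIP_POINTS = {
    "portrait":  {1: (0, 1920), 2: (0, 0), 3: (1080, 0), 4: (1080, 1920)},
    "landscape": {2: (0, 1080), 3: (0, 0), 4: (1920, 0), 1: (1920, 1080)},
}

def relative_position(icon_xy, grip_point, state):
    """Vector from the grip point to the icon in the given state."""
    gx, gy = GRIP_POINTS[state][grip_point]
    return (icon_xy[0] - gx, icon_xy[1] - gy)
```

The same icon coordinates yield different relative positions once the state or grip point changes, which is exactly why the processor recomputes them per state.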
In the icon display method provided here, in the dynamic display mode, the processor may determine the relative position of each icon displayed on the touchscreen according to both the grip point position and the interface state, making each icon's relative position more accurate; in turn, the sensing region of each icon determined by the processor from its relative position is more accurate, which helps the sensing region of each displayed icon fall partly or entirely within the current operation area.
Further, in the icon display method of this embodiment, before the processor determines the current operation area according to the grip point position, the current operation area may be preset in the phone. The method may therefore further include S1501-S1502 before S202 or S1302, and specifically before S601. For example, as shown in FIG. 15, the method of FIG. 6 further includes S1501-S1502 before S601:
S1501. The touchscreen or a sensor of the mobile phone receives a third operation instruction.
The third operation instruction is used to instruct the processor to set the current operation area. For example, FIG. 16 is another schematic diagram of interface changes displayed on the touchscreen. In FIG. 16, FIGS. 16a-16c show that, while gripping the phone at grip point 1, the user slides on the touchscreen along the arc from position 1 to position 2 shown in FIG. 16c. The processor may then receive the slide operation performed on the touchscreen and obtain the slide trajectory.
It should be noted that when the user grips the phone at grip points 2-4, the user operation corresponding to the third operation instruction is similar to the operation shown in FIG. 16, and details are not repeated here.
S1502. The processor of the mobile phone sets the current operation area according to the third operation instruction.
The current operation area corresponds one-to-one with the grip point position.
For example, if the third operation instruction received by the processor corresponds to the user operation shown in FIG. 16, the processor may set the shaded area on the touchscreen shown in FIG. 16d as the current operation area corresponding to grip point 1. Similarly, the processor may set the shaded area shown in FIG. 17a as the current operation area corresponding to grip point 2, the shaded area shown in FIG. 17b as the current operation area corresponding to grip point 3, and the shaded area shown in FIG. 17c as the current operation area corresponding to grip point 4. The third operation instruction may correspond to an operation by the user, or to an operation by a person skilled in the art during manufacture of the phone.
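One way S1502 could derive the operation area from the slide trajectory is to take the region of the screen within thumb reach of the grip point, bounded by the farthest point of the traced arc. This is purely a modeling assumption; the embodiment does not prescribe the area's shape:

```python
# Sketch of S1502: derive the current operation area from the slide
# trajectory as a disc sector centered on the grip point, with radius
# equal to the farthest trajectory point. Shape is an assumption.

import math

def set_operation_area(grip_xy, trajectory):
    """Return (center, radius) describing the reachable region."""
    radius = max(math.dist(grip_xy, p) for p in trajectory)
    return grip_xy, radius

def in_operation_area(point, area):
    """True when the point lies within the stored operation area."""
    center, radius = area
    return math.dist(center, point) <= radius
```

The stored `(center, radius)` pair can then serve as the per-grip-point current operation area that later steps test icon sensing regions against.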
It should be noted that, after receiving the third operation instruction and setting the corresponding current operation area, the processor may save the current operation area in the memory of the phone. In this way, the processor can subsequently determine the current operation area according to the grip point position corresponding to the first operation instruction.
Further, because different users may operate the phone with one hand, and the maximum area of the touchscreen reachable in one-handed operation may differ between users, the current user may be unable to operate icons outside the current operation area on the touchscreen. To avoid this, the method may further include S1801-S1804 after S205. For example, as shown in FIG. 18, the method of FIG. 6 may further include S1801-S1804 after S205:
S1801. The touchscreen of the mobile phone receives a fourth operation instruction.
The fourth operation instruction is used to instruct adjustment of the sensing region of each of the at least one icon.
For example, FIG. 19 is another schematic diagram of interface changes displayed on the touchscreen. In FIG. 19, FIG. 19a shows each icon displayed on the touchscreen falling partly or entirely within the current operation area indicated by shaded area 1, which is preset; at this time, the user may be holding the phone with the left hand at grip point 1. FIGS. 19b-19c show the user sliding leftward on the fingerprint reader with the left thumb; the fourth operation instruction received by the processor may then be that the user has slid leftward on the fingerprint reader, together with the slide distance and the user's left-thumb fingerprint. If the fourth operation instruction corresponds to the user sliding leftward on the fingerprint reader with the left thumb, it instructs the processor to reduce the sensing region of each of the at least one icon; if it corresponds to the user sliding rightward on the fingerprint reader with the right thumb, it instructs the processor to enlarge the sensing region of each of the at least one icon.
S1802. The processor of the mobile phone obtains a ratio according to the fourth operation instruction.
For example, the ratio is the ratio by which the processor scales the sensing regions of the at least one icon, and it may correspond to the slide distance in the fourth operation instruction: the larger the slide distance, the larger the ratio; the smaller the slide distance, the smaller the ratio. When the icons displayed on the touchscreen fall partly or entirely within the preset current operation area: if the user cannot operate all displayed icons with one hand, the ratio may be a number greater than 0 and less than 1, for example 0.95; if the user can operate all displayed icons with one hand and wishes to enlarge all icons, the ratio may be a number greater than 1, for example 1.1.
S1803. The processor of the mobile phone modifies the sensing region of each icon according to the ratio, obtaining the modified sensing region of each icon.
For example, if the ratio obtained by the processor from the fourth operation instruction corresponding to the user operation shown in FIGS. 19b-19c is 0.95, the processor may reduce the sensing region of each of the at least one icon displayed on the touchscreen to 0.95 times its size.
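S1802-S1803 can be sketched as a two-step computation: map the slide distance to a ratio, then scale each sensing region about its own center. The rectangle representation and the distance-to-ratio mapping below are illustrative assumptions; the embodiment only requires that a larger slide distance yield a larger adjustment:

```python
# Sketch of S1802-S1803: slide distance -> ratio, then scale each
# icon's sensing region (x, y, w, h) about its center. Mapping and
# region representation are assumed.

def ratio_from_slide(distance_px, shrink=True):
    """Map a slide distance to a scaling ratio (assumed linear,
    capped at a 10% change per gesture)."""
    step = min(distance_px, 200) / 200 * 0.1
    return 1 - step if shrink else 1 + step

def scale_region(region, ratio):
    """Scale a sensing region about its center, keeping the icon's
    position while shrinking or growing its touchable area."""
    x, y, w, h = region
    cx, cy = x + w / 2, y + h / 2
    nw, nh = w * ratio, h * ratio
    return (cx - nw / 2, cy - nh / 2, nw, nh)
```

Scaling about the center (rather than the origin) keeps each icon visually anchored while only its touchable area changes, matching the 0.95x example in the text.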
S1804. The processor of the mobile phone displays each of the at least one icon on the touchscreen according to the adjusted sensing region of each icon.
Specifically, after the processor causes the touchscreen to display each of the at least one icon according to its adjusted sensing region, the displayed icons may fall partly or entirely within a new operation area. If the adjusted sensing region of each icon is a reduced sensing region, the new operation area is smaller than the current operation area; if the adjusted sensing region is an enlarged sensing region, the new operation area is larger than the current operation area. For example, as shown in FIG. 19d, the processor may cause each icon's sensing region, reduced to 0.95 times, to fall partly or entirely within shaded area 2 shown in FIG. 19d; shaded area 2 may be the new operation area and is smaller than shaded area 1 shown in FIG. 19d.
Optionally, the processor may save the new operation area in the memory of the phone to update the current operation area stored there; alternatively, the processor may choose not to save the new operation area.
It should be noted that, in the icon display method of this embodiment, by presetting the current operation area in the phone, the sensing regions of all icons displayed on the touchscreen fall partly or entirely within the current operation area, so that the user can operate all displayed icons with one hand. Moreover, the processor may further adjust each icon's sensing region according to the user's needs so that the adjusted sensing regions of all displayed icons fall partly or entirely within the new operation area, allowing the user to better operate all displayed icons with one hand and obtain a better user experience.
The foregoing mainly describes the solutions of the embodiments of the present invention from the perspective of the terminal device. It can be understood that, to implement the foregoing functions, the terminal device includes corresponding hardware structures and/or software modules for performing each function. A person skilled in the art should readily appreciate that, in combination with the algorithm steps of the examples described in the embodiments disclosed herein, the present invention can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the present invention.
In the embodiments of the present invention, the terminal device may be divided into functional modules according to the foregoing method examples; for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present invention is illustrative and is merely a logical functional division; there may be other division manners in actual implementation.
When each functional module is divided according to each function, FIG. 20 is a schematic diagram of a possible composition of the terminal device provided in the foregoing embodiments. As shown in FIG. 20, the terminal device 20 may include a determining module 201 and a display module 202. The determining module 201 is configured to support the terminal device 20 in performing S201, S202, S203, S204, S602, S802, S1301, S1302, and S1502 in the foregoing embodiments, and/or other processes of the techniques described herein. The display module 202 is configured to support the terminal device 20 in performing S205 and S1804 in the foregoing embodiments, and/or other processes of the techniques described herein.
Further, FIG. 21 is a schematic diagram of another possible composition of the terminal device provided in the embodiments of the present invention. In FIG. 21, the terminal device 20 may further include a receiving module 203, configured to support the terminal device 20 in performing S601, S801, S1501, and S1801 in the foregoing embodiments, and/or other processes of the techniques described herein.
Further, FIG. 22 is a schematic diagram of another possible composition of the terminal device provided in the embodiments of the present invention. In FIG. 22, the terminal device 20 may further include a setting module 204, configured to support the terminal device 20 in performing S1502 in the foregoing embodiments, and/or other processes of the techniques described herein.
Further, FIG. 23 is a schematic diagram of another possible composition of the terminal device provided in the embodiments of the present invention. In FIG. 23, the terminal device 20 may further include an adjusting module 205, configured to support the terminal device 20 in performing S1802 and S1803 in the foregoing embodiments, and/or other processes of the techniques described herein.
It should be noted that all relevant content of the steps in the foregoing method embodiments may be cited in the functional descriptions of the corresponding functional modules, and details are not repeated here.
The terminal device provided in the embodiments of the present invention is configured to perform the foregoing icon display method, and can therefore achieve the same effects as the icon display method.
When an integrated unit is used, the determining module 201, the setting module 204, the adjusting module 205, and the like may be integrated into one processing module for implementation. The processing module may be a processor or controller, for example a CPU, a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various example logical blocks, modules, and circuits described in connection with the disclosure of the present invention. The processing unit may also be a combination implementing a computing function, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The storage module may be a memory. The receiving module 203 may be implemented by an input device. The display module 202 may be implemented by a display.
When the processing module is a processor and the storage module is a memory, an embodiment of the present invention provides a terminal device 24 as shown in FIG. 24. As shown in FIG. 24, the terminal device 24 includes a processor 241, a memory 242, a display 243, an input device 244, and a bus 245; the processor 241, the memory 242, the display 243, and the input device 244 are interconnected through the bus 245. The bus 245 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in FIG. 24, but this does not mean that there is only one bus or one type of bus.
For example, the input device 244 may include devices such as a mouse, a physical keyboard, a trackball, a touch panel, and a joystick, as well as sensors such as a gravity sensor and a fingerprint reader. For example, the input device in the foregoing mobile phone may include a touchscreen together with sensors such as a gravity sensor and a fingerprint reader. The display 243 may be an independent device, or may be integrated with the input device 244 into one device. For example, the touchscreen may also serve as the display in the mobile phone.
An embodiment of the present invention further provides a computer storage medium storing computer instructions; when the processor 241 of the terminal device 24 executes the computer instructions, the terminal device 24 performs the relevant method steps in the foregoing embodiments.
For detailed descriptions of the modules in the terminal device 24 provided in the embodiments of the present invention, and for the technical effects brought about by the modules performing the relevant method steps in the foregoing embodiments, reference may be made to the relevant descriptions in the method embodiments of the present invention; details are not repeated here.
From the foregoing description of the implementations, a person skilled in the art will clearly understand that, for convenience and brevity of description, only the division of the foregoing functional modules is used as an example; in practical applications, the foregoing functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or some of the functions described above. For the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments; details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for example, the division into modules or units is merely a logical functional division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, and the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or some of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (14)

  1. An icon display method, comprising:
    determining, by a terminal device, a grip point position, wherein the grip point position is one of preset grippable parts of the terminal device;
    determining, by the terminal device, a relative position of each of at least one icon displayed by the terminal device, wherein the relative position of each icon is the position of the icon relative to the grip point position;
    determining, by the terminal device, a current operation area according to the grip point position, wherein the current operation area is an area displayed by the terminal device that a user operates with one hand;
    determining, by the terminal device, a sensing region of each icon according to the current operation area, the grip point position, and the relative position of each of the at least one icon, so that the sensing region of each icon falls partly or entirely within the current operation area, wherein the sensing region of each icon is the area in which the icon is operated; and
    displaying, by the terminal device, each of the at least one icon according to the sensing region of each icon.
  2. The method according to claim 1, wherein before the determining, by the terminal device, of the grip point position, the method further comprises:
    receiving, by the terminal device, a first operation instruction; and
    matching, by the terminal device, the first operation instruction against a first preset instruction, so that the terminal device enters a dynamic display mode, wherein in the dynamic display mode the sensing regions of the at least one icon are the same or different.
  3. The method according to claim 2, wherein before the determining, by the terminal device, of the grip point position, the method further comprises:
    receiving, by the terminal device, a second operation instruction, wherein the second operation instruction is used to indicate the grip point position; and
    the determining, by the terminal device, of the grip point position comprises:
    matching, by the terminal device, the second operation instruction against a second preset instruction to determine the grip point position.
  4. The method according to claim 1 or 3, wherein after the receiving, by the terminal device, of the first operation instruction, the method further comprises:
    determining, by the terminal device, an interface state; and
    the determining, by the terminal device, of the relative position of each of the at least one icon displayed by the terminal device comprises:
    determining, by the terminal device according to the interface state, the relative position of each of the at least one icon displayed by the terminal device.
  5. The method according to claim 4, wherein before the determining, by the terminal device, of the grip point position, the method further comprises:
    receiving, by the terminal device, a third operation instruction; and
    setting, by the terminal device, the current operation area according to the third operation instruction.
  6. The method according to claim 5, wherein after the displaying, by the terminal device, of each of the at least one icon, the method further comprises:
    receiving, by the terminal device, a fourth operation instruction, wherein the fourth operation instruction is used to instruct adjustment of the sensing region of each of the at least one icon;
    obtaining, by the terminal device, a ratio according to the fourth operation instruction;
    adjusting, by the terminal device, the sensing region of each icon according to the ratio, to obtain an adjusted sensing region of each icon; and
    displaying, by the terminal device, each of the at least one icon according to the adjusted sensing region of each icon.
  7. A terminal device, comprising:
    a determining module, configured to: determine a grip point position, wherein the grip point position is one of preset grippable parts of the terminal device; determine a relative position of each of at least one icon displayed on a display interface of the terminal device, wherein the relative position of each icon is the position of the icon relative to the grip point position; determine a current operation area according to the grip point position, wherein the current operation area is an area on the display interface that a user operates with one hand; and determine a sensing region of each icon according to the current operation area, the grip point position, and the relative position of each of the at least one icon, so that the sensing region of each icon falls within the current operation area, wherein the area indicated by the sensing region of each icon is the area in which the icon is operated; and
    a display module, configured to display each of the at least one icon according to the sensing region of each icon determined by the determining module.
  8. The terminal device according to claim 7, further comprising:
    a receiving module, configured to receive a first operation instruction before the determining module determines the grip point position;
    wherein the determining module is further configured to match the first operation instruction received by the receiving module against a first preset instruction, so that the terminal device enters a dynamic display mode, wherein in the dynamic display mode the sensing regions of the at least one icon are the same or different.
  9. The terminal device according to claim 8, wherein the receiving module is further configured to receive a second operation instruction before the determining module determines the grip point position, wherein the second operation instruction is used to indicate the grip point position; and
    the determining module is specifically configured to match the second operation instruction against a second preset instruction to determine the grip point position.
  10. The terminal device according to claim 7 or 9, wherein the determining module is further configured to determine an interface state after the receiving module receives the first operation instruction, wherein the interface state is used to indicate that the interface state of the terminal device is a portrait state, or that the interface state of the terminal device is a landscape state; and
    the determining module is specifically configured to determine, according to the interface state, the relative position of each of the at least one icon displayed by the terminal device.
  11. The terminal device according to claim 10, wherein the receiving module is further configured to receive a third operation instruction before the determining module determines the grip point position; and
    the terminal device further comprises:
    a setting module, configured to set the current operation area according to the third operation instruction.
  12. The terminal device according to claim 11, wherein the receiving module is further configured to receive a fourth operation instruction after the display module displays each of the at least one icon, wherein the fourth operation instruction is used to instruct adjustment of the sensing region of each of the at least one icon;
    the terminal device further comprises:
    an adjusting module, configured to obtain a ratio according to the fourth operation instruction received by the receiving module, and adjust the sensing region of each icon according to the ratio to obtain an adjusted sensing region of each icon; and
    the display module is further configured to display each of the at least one icon according to the adjusted sensing region of each icon obtained by the adjusting module.
  13. A terminal device, comprising: a processor, a memory, a display, an input device, and a bus;
    wherein the memory is configured to store computer instructions; the processor, the memory, the display, and the input device are connected through the bus; and when the terminal device runs, the processor executes the computer instructions stored in the memory, so that the terminal device performs the icon display method according to any one of claims 1-6.
  14. A computer storage medium, comprising computer instructions;
    wherein when the computer instructions run on a terminal device, the terminal device is caused to perform the icon display method according to any one of claims 1-6.
PCT/CN2017/080298 2017-03-13 2017-04-12 Icon display method and terminal device WO2018166023A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/493,403 US11086478B2 (en) 2017-03-13 2017-04-12 Icon display method and terminal device
CN201780052538.9A CN109643216A (zh) Icon display method and terminal device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710147081 2017-03-13
CN201710147081.1 2017-03-13

Publications (1)

Publication Number Publication Date
WO2018166023A1 (zh)

Family

ID=63521620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/080298 WO2018166023A1 (zh) 2017-03-13 2017-04-12 一种图标显示方法和终端设备

Country Status (3)

Country Link
US (1) US11086478B2 (zh)
CN (1) CN109643216A (zh)
WO (1) WO2018166023A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080141149A1 (en) * 2006-12-07 2008-06-12 Microsoft Corporation Finger-based user interface for handheld devices
CN102609164A * 2011-01-20 2012-07-25 Shenzhen Futaihong Precision Industry Co., Ltd. System and method for adjusting directory icons
CN102799356A * 2012-06-19 2012-11-28 ZTE Corporation System and method for optimizing one-handed operation of a large touchscreen of a mobile terminal, and mobile terminal
CN103207750A * 2012-01-17 2013-07-17 Tencent Technology (Shenzhen) Co., Ltd. Icon scaling method and apparatus
CN103309604A * 2012-11-16 2013-09-18 ZTE Corporation Terminal and method for controlling information displayed on a terminal screen
CN104808936A * 2014-01-28 2015-07-29 Acer Inc. Interface operation method and portable electronic device applying the method
US20160349985A1 (en) * 2015-05-27 2016-12-01 Kyocera Corporation Mobile terminal

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101113906B1 * 2009-09-04 2012-02-29 노상기 Method for configuring a user interface screen of an electronic device terminal, and electronic device terminal for performing the same
US8593418B2 * 2010-08-08 2013-11-26 Qualcomm Incorporated Method and system for adjusting display content
CN109101165A * 2012-06-28 2018-12-28 Industry-University Cooperation Foundation Hanyang University User interface adjustment method
JP5798532B2 * 2012-08-23 2015-10-21 NTT Docomo, Inc. User interface device, user interface method, and program
US20160132139A1 * 2014-11-11 2016-05-12 Qualcomm Incorporated System and Methods for Controlling a Cursor Based on Finger Pressure and Direction
CN104866188A * 2015-03-02 2015-08-26 Shenzhen Gionee Communication Equipment Co., Ltd. Display interface adjustment method
CN104731514B * 2015-04-09 2017-02-15 Nubia Technology Co., Ltd. Method and apparatus for recognizing single-grip touch operation in a touch operation area
CN106293307A 2015-05-18 2017-01-04 ZTE Corporation Icon setting method and apparatus

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080141149A1 (en) * 2006-12-07 2008-06-12 Microsoft Corporation Finger-based user interface for handheld devices
CN102609164A * 2011-01-20 2012-07-25 Shenzhen Futaihong Precision Industry Co., Ltd. System and method for adjusting directory icons
CN103207750A * 2012-01-17 2013-07-17 Tencent Technology (Shenzhen) Co., Ltd. Icon scaling method and apparatus
CN102799356A * 2012-06-19 2012-11-28 ZTE Corporation System and method for optimizing one-handed operation of a large touchscreen of a mobile terminal, and mobile terminal
CN103309604A * 2012-11-16 2013-09-18 ZTE Corporation Terminal and method for controlling information displayed on a terminal screen
CN104808936A * 2014-01-28 2015-07-29 Acer Inc. Interface operation method and portable electronic device applying the method
US20160349985A1 (en) * 2015-05-27 2016-12-01 Kyocera Corporation Mobile terminal

Also Published As

Publication number Publication date
US20200012410A1 (en) 2020-01-09
US11086478B2 (en) 2021-08-10
CN109643216A (zh) 2019-04-16

Similar Documents

Publication Publication Date Title
AU2020201096B2 (en) Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium
JP5759660B2 Portable information terminal provided with a touch screen, and input method
US20150185953A1 (en) Optimization operation method and apparatus for terminal interface
EP2752754B1 (en) Remote mouse function method and terminals
WO2020258929A1 Folder interface switching method and terminal device
WO2019014859A1 Multi-task operation method and electronic device
WO2021057337A1 Operation method and electronic device
WO2020134744A1 Icon moving method and mobile terminal
WO2020181955A1 Interface control method and terminal device
WO2016191938A1 Method for adjusting the photographing focal length of a mobile terminal through a touchpad, and mobile terminal
WO2013135169A1 Method for adjusting an input-method keyboard, and mobile terminal thereof
WO2021203815A1 Page operation method and apparatus, terminal, and storage medium
US20150160731A1 (en) Method of recognizing gesture through electronic device, electronic device, and computer readable recording medium
EP2713606B1 (en) Method for controlling camera and mobile device
WO2013177901A1 Touch unlocking method and apparatus, and electronic device
WO2020134743A1 Icon moving method and terminal
WO2017035818A1 Method and apparatus for controlling an electronic device, and electronic device
WO2021047062A1 Key mode setting method and apparatus, and storage medium
WO2019047129A1 Method for moving an application icon, and terminal
CN108984099B Human-computer interaction method and terminal
US20140258923A1 (en) Apparatus and method for displaying screen image
KR102396736B1 Coupling of an apparatus to a computing device
WO2018039914A1 Data copying method and user terminal
WO2022063034A1 Display method for an input interface, and terminal
WO2018166023A1 Icon display method and terminal device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900934

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900934

Country of ref document: EP

Kind code of ref document: A1