WO2022089060A1 - Interface display method and electronic device - Google Patents

Interface display method and electronic device

Info

Publication number
WO2022089060A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinates
electronic device
finger
display
processor
Prior art date
Application number
PCT/CN2021/118075
Other languages
French (fr)
Chinese (zh)
Inventor
殷代宗
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022089060A1 publication Critical patent/WO2022089060A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0485 Scrolling or panning
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to an interface display method and an electronic device.
  • The functions of electronic devices are gradually improving.
  • The screen of an electronic device is usually made larger.
  • However, the user experience of one-handed operation is poor because of the limited length of the user's fingers.
  • Some electronic devices provide a one-handed operation mode, but because of the limitations of the screen width and the length of the user's fingers, the user still needs to use multiple fingers together (for example, if the user wants to click the application icon on the far right of the mobile phone screen with the left hand, the finger needs to span the entire width of the screen), so the problem of a poor user experience remains.
  • The present application provides an interface display method and an electronic device, which are used to solve the problem of inconvenient user operation in the existing one-handed operation mode.
  • In a first aspect, an interface display method is provided, which can be executed by an electronic device. The method includes: receiving a first operation; in response to the first operation, starting a one-handed operation mode; in the one-handed operation mode, acquiring first viewpoint coordinates and finger coordinates; obtaining the first display content in the first display area corresponding to the first viewpoint coordinates; and displaying the first display content in the second display area corresponding to the finger coordinates.
  • In this way, when the electronic device receives the first operation, it starts the one-handed operation mode in response to the first operation; in the one-handed operation mode, the electronic device can obtain the first viewpoint coordinates and the finger coordinates, obtain the first display content in the first display area corresponding to the first viewpoint coordinates, and display the first display content in the second display area corresponding to the finger coordinates.
  • As a result, the user can operate any content in the display interface with one hand, which effectively improves the user experience. A minimal sketch of this flow is given below.
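The following Kotlin sketch illustrates the flow described in the first aspect. It is only a simplified illustration under assumptions: the gaze and finger providers, the Region type, and the mirror callback are hypothetical stand-ins, not APIs disclosed by this application.

```kotlin
// Simplified sketch of the first-aspect flow (not the claimed implementation).
data class Point(val x: Float, val y: Float)
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float)

class OneHandedModeController(
    private val currentViewpoint: () -> Point,                  // first viewpoint coordinates
    private val currentFinger: () -> Point,                     // finger coordinates
    private val regionAround: (Point) -> Region,                // first display area around the viewpoint
    private val mirror: (source: Region, target: Point) -> Unit // show its content near the finger
) {
    private var enabled = false

    // The first operation (lift, shake, gesture, voice command, hardware key) starts the mode.
    fun onFirstOperation() { enabled = true }

    // In the one-handed operation mode, the content the user looks at is
    // mirrored into the second display area reachable by the finger.
    fun refresh() {
        if (!enabled) return
        mirror(regionAround(currentViewpoint()), currentFinger())
    }
}
```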
  • In a possible design, the first operation may be that the electronic device is lifted or shaken; or a click operation or a sliding operation on the display screen of the electronic device; or a voice command; or an operation on a hardware key of the electronic device.
  • The above are only several specific implementations of the first operation.
  • The first operation may also be implemented in other manners in the embodiments of the present application, which is not limited in the embodiments of the present application.
  • In a possible design, after the electronic device displays the first display content in the second display area corresponding to the finger coordinates, the electronic device can also receive a first sliding operation in the second display area, determine the second display content according to the sliding operation, and switch the display content in the second display area from the first display content to the second display content.
  • In this way, after the electronic device displays the first display content in the second display area corresponding to the finger coordinates, the display content of the second display area can still be updated, which effectively meets the user's operation needs and makes the display of the user interface more intelligent. A small sketch of such an update follows.
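A minimal sketch of how the first sliding operation could update the mirrored source region, assuming the source region is modelled as a vertical offset into the original interface; the parameter names are illustrative, not disclosed values.

```kotlin
// Illustrative only: scroll the source region of the mirrored content by the
// slide distance, clamped so it stays inside the original page
// (assumes areaHeight <= pageHeight).
fun scrolledSourceTop(
    currentTop: Float,   // current top of the first display area
    slideDelta: Float,   // vertical distance of the first sliding operation
    pageHeight: Float,   // total height of the original interface
    areaHeight: Float    // height of the mirrored area
): Float = (currentTop + slideDelta).coerceIn(0f, pageHeight - areaHeight)
```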
  • In a possible design, the electronic device may also obtain the coordinates of a second viewpoint and the third display content in the third display area corresponding to the second viewpoint coordinates, and switch the display content in the second display area from the first display content to the third display content.
  • In this way, after the electronic device displays the first display content in the second display area corresponding to the finger coordinates, the electronic device can also monitor changes of the viewpoint coordinates in real time. For example, when the electronic device detects that the first viewpoint coordinates change to the second viewpoint coordinates, it can acquire the third display content in the third display area corresponding to the second viewpoint coordinates and switch the display content in the second display area from the first display content to the third display content. In this way, the user's operation requirements are effectively met, thereby effectively improving the intelligence of the user operation interface.
  • In a possible design, before displaying the first display content in the second display area corresponding to the finger coordinates, the electronic device also needs to determine whether there is an application icon or a control icon at the first position corresponding to the finger coordinates in the second display area; if there is, the first display content is displayed at a second position in the second display area, where the second position is separated from the first position by a first preset value; if there is not, the first display content is displayed at the first position.
  • In this way, the electronic device judges whether there is an application icon or a control icon at the first position corresponding to the finger coordinates; if there is, the first display content is displayed at the second position in the second display area, which is away from the first position by the first preset value; if there is not, the first display content is displayed at the first position.
  • This avoids the situation in which the first display content covers other icons in the second display area and makes it inconvenient for the user to operate, thereby effectively improving the user experience. A small sketch of this check follows.
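A minimal sketch of the occlusion check described above, assuming a hypothetical iconAt lookup and treating the first preset value as a vertical offset; neither is specified by the application.

```kotlin
// Illustrative sketch of choosing where to place the mirrored content.
data class ScreenPoint(val x: Float, val y: Float)

fun placementFor(
    firstPosition: ScreenPoint,       // position corresponding to the finger coordinates
    firstPresetValue: Float,          // separation between the first and second positions
    iconAt: (ScreenPoint) -> Boolean  // whether an application/control icon sits at a position
): ScreenPoint =
    if (iconAt(firstPosition)) {
        // An icon would be covered: shift to the second position instead.
        ScreenPoint(firstPosition.x, firstPosition.y - firstPresetValue)
    } else {
        firstPosition
    }
```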
  • In a possible design, the electronic device may further determine that the duration for which the user's line of sight stays on the first viewpoint coordinates exceeds a second preset threshold.
  • In this way, the electronic device displays the first display content in the second display area corresponding to the finger coordinates only after it is determined that the user's line of sight stays on the first viewpoint coordinates for longer than the second preset threshold. In this way, misoperations are effectively avoided, thereby effectively improving the display accuracy of the user operation interface and effectively improving the user experience.
  • In a possible design, the electronic device may also receive a click operation on at least one target icon in the first display content, determine the first target icon, and execute the application process corresponding to the first target icon.
  • In this way, when the electronic device detects a click operation on at least one target icon in the first display content, it can determine the first target icon that the user wants to operate and then start the application process corresponding to the first target icon, which effectively improves the interaction efficiency between the electronic device and the user, thereby effectively improving the user experience.
  • In a possible design, the electronic device may acquire the first viewpoint coordinates in the following manner: determining the first distance between the user's eyes and the infrared camera, as well as the coordinates of the corneal reflection spot and the coordinates of the pupil center; and determining the first viewpoint coordinates according to the first distance, the coordinates of the corneal reflection spot, and the coordinates of the pupil center.
  • In this way, the electronic device determines the first viewpoint coordinates from the first distance between the user's eyes and the infrared camera, the coordinates of the corneal reflection spot, and the coordinates of the pupil center. This effectively improves the interaction efficiency between the electronic device and the user, thereby effectively improving the user experience. A simplified numerical sketch follows.
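A deliberately simplified pupil-center/corneal-reflection style mapping consistent with the quantities named above. The linear calibration gains kx and ky are assumptions standing in for the geometric model; they are not values disclosed by the application.

```kotlin
// Simplified gaze mapping: the pupil-glint offset, scaled by the eye-camera
// distance and per-axis calibration gains, gives an on-screen viewpoint.
data class ImagePoint(val x: Float, val y: Float)

fun firstViewpoint(
    pupilCenter: ImagePoint,   // pupil-center coordinates in the infrared eye image
    cornealSpot: ImagePoint,   // corneal reflection spot in the same image
    firstDistance: Float,      // distance between the user's eyes and the infrared camera
    kx: Float, ky: Float       // calibration gains (assumed, obtained by calibration)
): ImagePoint {
    val dx = pupilCenter.x - cornealSpot.x
    val dy = pupilCenter.y - cornealSpot.y
    // The offset grows with viewing angle; scaling by the distance converts it
    // into a displacement on the display plane.
    return ImagePoint(kx * firstDistance * dx, ky * firstDistance * dy)
}
```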
  • In a possible design, the electronic device acquires the finger coordinates in response to the first operation in the following manner: determining the second distance between the finger and the infrared light sensor in the electronic device, the third distance between the finger and the display screen of the electronic device, and the orientation of the finger relative to the infrared light sensor; determining, according to the second distance and the third distance, the fourth distance between the projection point of the finger on the display screen and the infrared light sensor; and determining the finger coordinates according to the fourth distance and the orientation.
  • In this way, the electronic device determines the finger coordinates from the fourth distance between the projection point of the finger on the display screen and the infrared light sensor, and from the orientation of the finger relative to the infrared light sensor. This effectively improves the interaction efficiency between the electronic device and the user, thereby effectively improving the user experience. A geometric sketch of this computation is shown below.
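The fourth distance follows from the second and third distances by the Pythagorean theorem if the infrared light sensor is assumed to lie in the plane of the display; the sensor position and the bearing convention below are assumptions made for the example.

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

data class DisplayPoint(val x: Float, val y: Float)

// Sketch of the finger-coordinate geometry described above.
fun fingerCoordinates(
    secondDistance: Float, // finger to infrared light sensor
    thirdDistance: Float,  // finger to display screen (height above the glass)
    bearingRad: Float,     // orientation of the finger relative to the sensor, in radians
    sensor: DisplayPoint   // assumed position of the infrared light sensor on the display
): DisplayPoint {
    // Fourth distance: from the finger's projection point on the screen to the sensor.
    val fourthDistance = sqrt(secondDistance * secondDistance - thirdDistance * thirdDistance)
    return DisplayPoint(
        sensor.x + fourthDistance * cos(bearingRad),
        sensor.y + fourthDistance * sin(bearingRad)
    )
}
```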
  • In a second aspect, an electronic device is provided, comprising means for performing the method of the first aspect.
  • In a possible design, the electronic device may include a processor and a memory, where the memory is used to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is caused to execute the following steps:
  • the memory is located outside the electronic device.
  • the electronic device includes a memory connected to the at least one processor, and the memory stores instructions executable by the at least one processor.
  • In a possible design, the electronic device further includes a display screen, and the first operation is: the electronic device being lifted or shaken; or a click operation or sliding operation on the display screen; or a voice command; or an operation on a hardware key of the electronic device.
  • In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: receiving a first sliding operation in the second display area; determining the second display content according to the first sliding operation; and switching the display content in the second display area from the first display content to the second display content.
  • In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: acquiring the coordinates of the second viewpoint and the third display content in the third display area corresponding to the second viewpoint coordinates, and switching the display content in the second display area from the first display content to the third display content.
  • In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: judging whether there is an application icon or a control icon at the first position corresponding to the finger coordinates in the second display area; if there is, displaying the first display content at the second position in the second display area, where the second position is separated from the first position by the first preset value; if there is not, displaying the first display content at the first position.
  • In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following step: determining that the duration for which the user's line of sight stays on the first viewpoint coordinates exceeds the second preset threshold.
  • In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: receiving a click operation on at least one target icon in the first display content, determining the first target icon, and executing the application process corresponding to the first target icon.
  • In a possible design, the electronic device further includes an infrared camera; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: determining the first distance between the user's eyes and the infrared camera, the coordinates of the corneal reflection spot, and the coordinates of the pupil center; and determining the first viewpoint coordinates according to the first distance, the coordinates of the corneal reflection spot, and the coordinates of the pupil center.
  • In a possible design, the electronic device further includes an infrared light sensor; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: determining the second distance between the finger and the infrared light sensor in the electronic device, the third distance between the finger and the display screen of the electronic device, and the orientation of the finger relative to the infrared light sensor; determining, according to the second distance and the third distance, the fourth distance between the projection point of the finger on the display screen and the infrared light sensor; and determining the finger coordinates according to the fourth distance and the orientation.
  • In a third aspect, a computer-readable medium is provided, which stores program code for execution by a device.
  • When the program code is executed by the device, the method in the above-mentioned first aspect or any possible design of the first aspect is performed.
  • In a fourth aspect, a computer program product comprising instructions is provided; when the instructions are run on a computer, the method in the above-mentioned first aspect or any possible design of the first aspect is performed.
  • In a fifth aspect, a chip is provided. The chip includes a processor and a data interface; the processor is configured to read and execute, through the data interface, instructions stored in a memory, so that the method in the first aspect or any possible design of the first aspect is implemented.
  • the chip may further include the memory, where the instructions are stored.
  • FIG. 1A is a scene diagram of a user operating an electronic device with one hand
  • FIG. 1B is a schematic diagram of a user graphical interface of a mobile phone provided by the application
  • FIG. 1C is a schematic diagram of a user graphical interface of a mobile phone provided by the application.
  • FIG. 2A is a schematic diagram of a hardware structure of a mobile phone 100 according to an embodiment of the present application
  • FIG. 2B is a schematic diagram of the software structure of the mobile phone 100 according to an embodiment of the application.
  • FIG. 3 is a schematic flowchart of an interface display method according to an embodiment of the present application.
  • FIG. 4A is one of schematic diagrams of operations of the mobile phone 100 entering a one-handed operation mode according to an embodiment of the present application
  • FIG. 4B is the second schematic diagram of the operation of the mobile phone 100 entering the one-hand operation mode according to an embodiment of the present application
  • FIG. 4C is the third schematic diagram of the operation of the mobile phone 100 entering the one-hand operation mode according to an embodiment of the present application.
  • FIG. 4D is the fourth schematic diagram of the operation of the mobile phone 100 entering the one-hand operation mode according to an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of a possible determination of user viewpoint coordinates according to an embodiment of the present application.
  • FIG. 6A is one of the schematic diagrams of acquiring viewpoint coordinates of the mobile phone 100 according to an embodiment of the present application.
  • FIG. 6B is the second schematic diagram of acquiring viewpoint coordinates of the mobile phone 100 according to an embodiment of the application.
  • FIG. 6C is a schematic diagram of a user graphical interface of the mobile phone 100 according to an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of a possible determination of the coordinates of a user's finger provided by an embodiment of the present application.
  • FIG. 8A is a schematic diagram of acquiring finger coordinates of the mobile phone 100 according to an embodiment of the application.
  • FIG. 8B is a schematic diagram of a user graphical interface of the mobile phone 100 according to an embodiment of the present application.
  • FIG. 9A is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the present application.
  • FIG. 9B is a schematic diagram of a user graphical interface of the mobile phone 100 according to an embodiment of the application.
  • FIG. 10 is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the application.
  • FIG. 11 is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the application.
  • FIG. 12A is a schematic diagram of a possible game interface provided by an embodiment of the application.
  • FIG. 12B is a schematic diagram of another possible game interface provided by an embodiment of the application.
  • FIG. 13 is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the application.
  • FIG. 14 is a schematic diagram of a user graphical interface for implementing one-handed operation with multiple devices according to an embodiment of the application
  • FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1A shows a scene diagram of a user operating an electronic device with one hand, where the electronic device is a mobile phone as an example.
  • In FIG. 1A, a graphical user interface (GUI) of the electronic device is displayed, with icons of a plurality of application programs (APPs).
  • some electronic devices may provide a one-handed operation mode.
  • As shown in FIG. 1B, after the electronic device detects a top-to-bottom sliding operation on the screen, it determines, in response to the operation, to activate the one-handed mode, reduces the operation interface 101 as a whole, and displays it in the display area close to the operation position (i.e., the user's finger), obtaining the operation interface 102 shown in FIG. 1B.
  • However, the reduction ratio of the operation interface 102 relative to the operation interface 101 is fixed by the electronic device; if the screen of the electronic device is large or the user's fingers are short, one-handed operation is still inconvenient for the user.
  • In another example, when the electronic device detects two consecutive click operations on its start key within a preset time period, it moves the operation interface 101 of the electronic device downward to obtain the operation interface 103. However, in this way the operation interface 101 is only reduced in the vertical direction and not in the horizontal direction; if the user holds the electronic device with the right hand, the right thumb still cannot reach the area where the APP1 icon is located, so the problem of a poor user experience remains.
  • an embodiment of the present application provides an interface display method.
  • After the electronic device detects the first operation, it enters a one-handed operation mode.
  • In the one-handed operation mode, the electronic device obtains the finger coordinates and the first viewpoint coordinates, and displays the first display content in the first display area corresponding to the first viewpoint coordinates in the second display area corresponding to the finger coordinates (i.e., within the operable range of the finger).
  • In this way, the user only needs to control the position of the viewpoint to operate, with one hand and within the same area, an application icon or control icon located at any position in the operation interface, which improves the user experience.
  • the specific implementation manner of this technical solution will be described in detail later.
  • An application program (application, app for short) involved in the embodiments of the present application is a software program capable of implementing one or more specific functions.
  • multiple applications can be installed in an electronic device.
  • The application program mentioned below may be an application program installed when the terminal leaves the factory, or it may be an application program downloaded from the network or acquired from another terminal while the user uses the electronic device.
  • the operation interface involved in the embodiments of the present application may also be referred to as a user interface (User Interface, UI) or a graphical interface, or other names.
  • the operation interface is an interface between an electronic device and a user for human-computer interaction.
  • Through the operation interface, the electronic device can output information to the user, such as displaying images or text, and can also receive user operations, such as touch operations.
  • the operation interface 1101 and the like in FIG. 11 are all operation interfaces.
  • the application icons involved in the embodiments of the present application are graphics with explicit meanings, which explicitly refer to an application.
  • Application icons can be displayed on the desktop (or on the home screen interface) of the electronic device.
  • When the electronic device detects a click operation on one of these application icons, it can run the corresponding application program and start the corresponding application process. For example, assuming that the APP1 icon in FIG. 1A is the application icon of WeChat, when the mobile phone detects a click operation on the APP1 icon, it runs the WeChat application and starts the corresponding application process.
  • the control icon (referred to as the control) involved in the embodiment of the present application may be an icon in the interface of a certain application used to implement a specific function of the application.
  • When the electronic device detects a click operation on a control icon, it can start the corresponding child process under the application. For example, when the game application in FIG. 12A has been started and the mobile phone detects a click or long-press operation on control 1 in the game application's operation interface, it can start the process that controls the forward movement of the game character in the game application.
  • the one-handed operation mode involved in the embodiments of the present application is an interface display mode set for the convenience of the user to operate any icon in the display interface with one hand.
  • In this mode, the electronic device can display the content in the display area corresponding to the coordinates of the user's viewpoint in the display area corresponding to the coordinates of the user's finger.
  • For example, the APP8 icon corresponding to the viewpoint coordinates is displayed in the display area corresponding to the finger coordinates, that is, in the operation interface 901 shown in (b) of FIG. 9A.
  • the electronic device may be a portable terminal including a display screen, such as a mobile phone, a tablet computer, and the like.
  • Portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • The above-mentioned portable electronic device may also be another portable electronic device, such as a digital camera. It should also be understood that, in some other embodiments of the present application, the above-mentioned electronic device may not be a portable electronic device but, for example, a desktop computer having a display screen.
  • electronic devices can support multiple applications.
  • For example, the electronic device may support one or more of the following applications: messaging applications, instant messaging applications, gaming applications, and the like.
  • There may be many kinds of instant messaging applications.
  • Users can send text, voice, pictures, video files, and various other files to other contacts through instant messaging applications; alternatively, users can make video or audio calls with other contacts through instant messaging applications.
  • FIG. 2A shows a schematic structural diagram of the mobile phone 100 .
  • the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, Mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and user Identity module (subscriber identification module, SIM) card interface 195 and so on.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an orientation sensor 180C, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, a bone conduction sensor 180M, and the like.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the mobile phone 100 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may run the software code of the interface display method provided by the embodiment of the present application, so as to realize the one-hand operation mode of the mobile phone 100 .
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the mobile phone 100, and can also be used to transmit data between the mobile phone 100 and peripheral devices.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in handset 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the mobile phone 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the wireless communication module 160 can provide applications on the mobile phone 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify the signal, and convert it into electromagnetic
  • the antenna 1 of the mobile phone 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (global positioning system, GPS), global navigation satellite system (global navigation satellite system, GLONASS), Beidou navigation satellite system (beidou navigation satellite system, BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
  • the mobile phone 100 realizes the display function through the GPU, the display screen 194, and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, application icons, control icons, and the like.
  • Display screen 194 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the handset 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 may be used to display an operation interface provided by an embodiment of the present application (eg, the operation interface 901 shown in FIG. 9A ), and application icons or control icons in the operation interface.
  • the camera 193 is used to capture still images or video.
  • the camera 193 may include a front camera and a rear camera.
  • the camera 193 may be used to capture an image of the user's face.
  • the front camera may be an infrared camera, which may be used to capture infrared light reflected from the user's eyes and generate a depth image of the eyes.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone 100 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store the operating system, and the software code of at least one application (such as game application, WeChat application, etc.).
  • the storage data area can store data (such as images, videos, etc.) and the like generated during the use of the mobile phone 100 .
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the internal memory 121 may also store software codes of the interface display method provided by the embodiments of the present application.
  • the processor 110 executes the software codes, the process steps of the interface display method are executed to realize a one-handed operation mode.
  • the internal memory 121 can also store information generated or received by the electronic device during operation, such as user-defined shortcut gesture information for entering the one-hand operation mode, face image, viewpoint coordinates, finger coordinates, fingerprint information, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example to save files such as music and videos in the external memory card.
  • the software code of the interface display method provided in the embodiment of the present application may also be stored in an external memory, and the processor 110 may run the software code through the external memory interface 120 to execute the process steps of the interface display method to realize the one-handed operation mode .
  • the user's face information, viewpoint coordinates, finger coordinates, etc. collected by the mobile phone 100 may also be stored in an external memory.
  • the mobile phone 100 can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as answering calls, recording, etc.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the mobile phone 100 can detect the click operation on the display screen 194 or on the application icon in the operation interface displayed on the display screen 194 through the pressure sensor 180A.
  • In a possible embodiment, the pressure sensor 180A may be used to detect shortcut gestures drawn by sliding operations on the display screen 194, for example the shortcut gesture letters "Z" and "C" in FIG. 4B.
  • the gyroscope sensor 180B can be used to determine the motion attitude of the mobile phone 100 .
  • The angular velocity of the mobile phone 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 180B, which can in turn be used to determine whether the mobile phone 100 is lifted.
  • the orientation sensor 180C can detect the absolute attitude value of the mobile phone 100 , and then determine the angle change of the mobile phone 100 .
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (generally three axes). When the mobile phone 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc. In some possible embodiments, together with the direction sensor 180C, it can also detect whether the mobile phone 100 is lifted.
  • the cell phone 100 can measure the distance through infrared or laser. In some embodiments, when shooting a scene, the mobile phone 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the mobile phone 100 emits infrared light through the light emitting diodes.
  • Cell phone 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100 . When insufficient reflected light is detected, the cell phone 100 may determine that there is no object near the cell phone 100 .
  • the mobile phone 100 can use the proximity light sensor 180G to detect that the user holds the mobile phone 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • In a possible embodiment, the proximity light sensor 180G is an infrared light sensor.
  • The distance between the user's eyes and the display screen 194 and the coordinates of the user's finger relative to the display screen 194 can also be obtained through its infrared light-emitting diodes.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the mobile phone 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • the size of the fingerprint sensor 180H may be the same as the size of the display screen 194.
  • In a possible embodiment, the display screen 194 displays the fingerprint unlock pattern in the display area corresponding to the finger coordinates; when the mobile phone 100 detects a click operation on the fingerprint unlock pattern, the fingerprint sensor 180H is activated, starts to collect the user's fingerprint information, and sends the collected fingerprint information to the processor 110.
  • The processor 110 matches the collected fingerprint information against the user's fingerprint information stored in the internal memory 121. If the matching is successful, the unlocking application process is started; if the matching is unsuccessful, the screen remains in the locked mode.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor 180K may pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the mobile phone 100 , which is different from the position where the display screen 194 is located. For example, the touch sensor 180K detects a touch action on the display screen 194, and the coordinates of the user's finger can be determined.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the cell phone 100 can receive key input and generate key signal input related to user settings and function control of the cell phone 100 . In a possible embodiment, the mobile phone 100 detects multiple click operations on the button 190 within a preset time period, and starts the one-handed operation mode of the mobile phone 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the mobile phone 100 by being inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 .
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone 100 .
  • the mobile phone 100 may include more or less components than shown, or some components may be combined, or some components may be separated, or different component arrangements.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the hardware structure of the mobile phone 100 is described above, and the software structure of the mobile phone 100 is described below.
  • the software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • The embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 100.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the applications layer (hereinafter referred to as the "application layer”), the application framework (Application Framework) layer (herein referred to as “framework layer”), Android runtime (Android runtime) and system library layer (herein referred to as “system runtime layer”), and kernel layer.
  • At least one application program runs in the application program layer, and these application programs can be a Window program, a system setting program, a contact program, a short message program, a clock program, a camera application, etc.
  • Applications developed by third-party developers such as instant messaging programs, photo enhancement programs, game programs, etc.
  • the application package in the application layer is not limited to the above examples, and may actually include other application packages, which are not limited in this embodiment of the present application.
  • the framework layer provides an application programming interface (API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • The application framework layer is equivalent to a processing center, which decides the actions to be taken by the applications in the application layer.
  • the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
  • a window manager is used to manage window programs.
  • The window manager can get the size of the display screen and control changes of the display window. For example, in FIG. 9A and FIG. 9B, a selected part of the display content is displayed in a separate display window (e.g., the operation interface 901). For another example, in FIG. 11, when the mobile phone 100 detects a sliding operation on the operation interface 1101 (i.e., the display window), the icons in the operation interface 1101 are displayed reduced or enlarged.
  • The window manager can also determine whether there is a status bar, lock the screen, capture the screen, and so on.
  • Content providers are used to store and retrieve data and make these data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • A display interface (e.g., the operation interface 101 shown in FIG. 1B, or the operation interface 901 shown in FIG. 9A) may be composed of one or more views.
  • the phone manager is used to provide the communication function of the mobile phone 100 .
  • the management of call status including connecting, hanging up, etc.
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • the notification manager controls the display 194 and displays the dialog 302 .
  • the system runtime layer provides support for the upper layer, that is, the framework layer.
  • the Android operating system will run the C/C++ library contained in the system runtime layer to implement the functions to be implemented by the framework layer.
  • The kernel layer is the layer between hardware and software. As shown in FIG. 2B, the kernel layer includes at least a display driver, sensor drivers (such as drivers for the infrared light sensor, touch sensor, and pressure sensor), a camera driver, an audio driver, and the like.
  • FIG. 3 is a schematic flowchart of an interface display method provided by an embodiment of the present application. As shown in Figure 3, the method may include the following steps:
  • S301 The mobile phone 100 receives the first operation, and in response to the first operation, enters a one-handed operation mode.
  • the first operation is an operation to trigger the mobile phone 100 to enter the one-hand operation mode.
  • the processor 110 in the mobile phone 100 can control the display screen 194 to display, in the one-handed operation mode, an operation interface that the user can operate with one hand (eg, parameter control icons, function control icons in an application, etc.). For specific solutions, see the introduction below.
  • the first operation is an operation of lifting the mobile phone 100.
  • the mobile phone 100 can detect whether the mobile phone 100 is lifted through the acceleration sensor 180E and the direction sensor 180C.
  • the acceleration sensor 180E converts the detected signal into information that can be processed by the processor 110 and transmits it to the processor 110, and the kernel layer running in the processor 110 generates corresponding acceleration data based on the information;
  • the direction sensor 180C converts the detected signal into information that the processor 110 can process and transmits it to the processor 110, and the kernel layer running in the processor 110 generates the angle change data of the mobile phone 100 relative to its own horizontal axis based on this information;
  • if the kernel layer determines that the difference between the acceleration data and the preset acceleration data exceeds a preset threshold, and determines that the angle change data satisfies a preset condition (for example, the angle value changes from 0 to 180), it is determined that the mobile phone 100 is lifted, and
  • the processor 110 controls the display screen 194 to enter a one-handed operation mode.
  • the mobile phone 100 may continue to detect whether there is a click operation or a shaking operation on the display screen 194 of the mobile phone 100, and if so, enter the one-handed operation mode. In this way, user misoperations can be effectively avoided, thereby improving user experience.
  • for example, the mobile phone 100 detects through the acceleration sensor 180E that the mobile phone 100 is lifted, and detects a double-tap action or a shaking action on the display screen 194; the processor 110 then controls the display screen 194 to enter the one-handed operation mode.
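  • the lift-detection branch above amounts to two checks made by the kernel layer, optionally confirmed by a double tap: the acceleration data deviates from a preset value by more than a threshold, and the angle of the phone about its own horizontal axis satisfies a preset condition. The Kotlin sketch below only illustrates that logic; the baseline, the thresholds, and the tap-count confirmation are assumed values, since the text only speaks of "preset" data and conditions.

```kotlin
import kotlin.math.abs

data class MotionSample(val acceleration: Double, val angleDeg: Double)

// Hypothetical constants; the patent only refers to "preset" values.
const val PRESET_ACCELERATION = 9.8   // m/s^2, assumed resting value
const val ACCEL_THRESHOLD = 2.0       // minimum deviation that counts as a lift
const val ANGLE_TARGET = 180.0        // example condition: angle changes from 0 to 180

// The two kernel-layer checks: acceleration deviates from the preset value by more
// than the threshold, and the angle change satisfies the preset condition.
fun isLifted(before: MotionSample, after: MotionSample): Boolean {
    val accelChanged = abs(after.acceleration - PRESET_ACCELERATION) > ACCEL_THRESHOLD
    val angleChanged = before.angleDeg == 0.0 && after.angleDeg >= ANGLE_TARGET
    return accelChanged && angleChanged
}

// Optional confirmation: require a double tap (or a shake) after the lift before
// actually entering the one-handed operation mode, to avoid misoperations.
fun shouldEnterOneHandedMode(before: MotionSample, after: MotionSample, tapCount: Int): Boolean =
    isLifted(before, after) && tapCount >= 2

fun main() {
    val before = MotionSample(acceleration = 9.8, angleDeg = 0.0)
    val after = MotionSample(acceleration = 13.1, angleDeg = 180.0)
    println(shouldEnterOneHandedMode(before, after, tapCount = 2))  // true
}
```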
  • the first operation is a user-defined shortcut gesture.
  • the mobile phone 100 can detect the shortcut gesture through the pressure sensor 180A.
  • the pressure sensor 180A converts the detected signal into information that can be processed by the processor 110 and transmits it to the processor 110; the kernel layer running in the processor 110 generates the position data corresponding to the operation based on the information (which may specifically include contact coordinates, time stamps corresponding to the contact coordinates, etc.); the kernel layer draws the data of the shortcut gesture according to the finger position data collected within the first preset time period, and determines whether the data of the shortcut gesture matches the data of the preset shortcut gesture stored in the internal memory 121; if so, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
  • the internal memory 121 of the mobile phone 100 stores shortcut gestures (the letter Z and the letter C) customized by the user according to their own preferences.
  • when the processor 110 in the mobile phone 100 detects the letter Z through the pressure sensor 180A, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
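  • the shortcut-gesture branch essentially compares the trace of contact points collected within the first preset time period against a stored template such as the letter Z or the letter C. The following sketch uses a hypothetical matcher (resample both traces and compare the average point-to-point distance against a tolerance); the actual matching algorithm is not specified in the text.

```kotlin
import kotlin.math.sqrt

data class ContactPoint(val x: Double, val y: Double, val timestampMs: Long)

// Resample a trace to a fixed number of points so traces of different lengths can be compared.
fun resample(trace: List<ContactPoint>, n: Int = 16): List<Pair<Double, Double>> =
    List(n) { i ->
        val p = trace[(i * (trace.size - 1)) / (n - 1).coerceAtLeast(1)]
        p.x to p.y
    }

// Hypothetical similarity test: the mean distance between corresponding resampled points
// must stay below a tolerance (coordinates assumed to be normalized to [0, 1]).
fun matchesTemplate(trace: List<ContactPoint>, template: List<ContactPoint>, tolerance: Double = 0.15): Boolean {
    if (trace.size < 2 || template.size < 2) return false
    val meanDist = resample(trace).zip(resample(template)).map { (p, q) ->
        val dx = p.first - q.first
        val dy = p.second - q.second
        sqrt(dx * dx + dy * dy)
    }.average()
    return meanDist < tolerance
}

fun main() {
    val z = listOf(ContactPoint(0.0, 0.0, 0), ContactPoint(1.0, 0.0, 40), ContactPoint(0.0, 1.0, 80), ContactPoint(1.0, 1.0, 120))
    println(matchesTemplate(z, z))  // true: a trace trivially matches its own template
}
```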
  • the first operation may be sound information.
  • the audio module 170 converts the detected sound signal into information that can be processed by the processor 110 and transmits it to the processor 110, and the kernel layer running in the processor 110 generates sound instruction data corresponding to the operation based on the information.
  • the kernel layer matches the voice command data collected within the preset duration with the preset voice command data. If the matching is successful, the processor 110 controls the display screen 194 to enter the one-handed operation mode; if the matching is unsuccessful, the original display mode is maintained.
  • when the processor 110 in the mobile phone 100 detects the voice command “turn on the one-handed mode” through the audio module 170, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
  • the first operation may be an operation on a hardware control (eg, key 190 ) of the mobile phone 100 .
  • the key 190 may be a power key or a volume key, which is not limited in this embodiment of the present application.
  • the first operation may be two click operations on the button 190 detected by the mobile phone 100 within a preset duration, or three click operations, or a touch operation, which is not specifically limited in this embodiment of the present application.
  • the processor 110 controls the display screen 194 to enter the one-handed operation mode.
  • the first operation may be multiple click operations or touch operations on the display screen 194 within a preset time period.
  • the touch sensor 180K in the display screen 194 detects the multiple click operations or touch operations, converts the detected signals into information that can be processed by the processor 110, and transmits it to the processor 110; the processor 110 determines that the multiple click operations or touch operations do not trigger any application process, and controls the window manager to output a dialog box for asking the user whether to enter the one-handed operation mode (refer to the dialog box 402 shown in FIG. 4D). If the processor 110 detects a confirmation command, it controls the display screen 194 to enter the one-handed operation mode; if the processor 110 detects a negative command, it controls the display screen 194 to maintain the original display mode.
  • the above first operation can be set by the user according to his or her own needs in the accessibility function of the settings application of the mobile phone 100.
  • S302 The mobile phone 100 obtains the coordinates of the viewpoint and the coordinates of the finger.
  • viewpoint coordinates in this embodiment of the present application are used to represent the specific position where the user's line of sight falls on the display screen 194 of the mobile phone 100 .
  • the viewpoint coordinate system can be a two-dimensional coordinate system, and the two-dimensional plane corresponding to the two-dimensional coordinate system can be the plane where the display screen 194 of the mobile phone 100 is located.
  • after the mobile phone 100 determines the coordinates of the viewpoint, it can also judge whether the time that the user's line of sight dwells at the position of the viewpoint coordinates exceeds the first time threshold; if so, the mobile phone 100 obtains the coordinates of the user's finger; if not, the mobile phone 100 continues to detect new viewpoint coordinates until viewpoint coordinates whose gaze dwell time exceeds the first time threshold appear, and then obtains the coordinates of the user's finger. In this way, the obtained viewpoint coordinates are more accurate and closer to the viewpoint coordinates of the target that the user actually wants to operate.
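  • the dwell check above — only accept a viewpoint once the gaze has stayed at its position longer than the first time threshold — can be modelled as a small state machine over timestamped gaze samples. In the sketch below, the threshold value and the spatial tolerance used to decide that two samples are at "the same position" are assumptions; the patent only names a first time threshold.

```kotlin
import kotlin.math.sqrt

data class GazeSample(val x: Double, val y: Double, val timestampMs: Long)

class DwellDetector(
    private val dwellThresholdMs: Long = 500,   // hypothetical "first time threshold"
    private val radiusPx: Double = 40.0         // hypothetical tolerance for "the same position"
) {
    private var anchor: GazeSample? = null

    /** Returns the dwelled-on viewpoint once the gaze has stayed near it long enough, else null. */
    fun update(sample: GazeSample): GazeSample? {
        val a = anchor
        if (a == null || distance(a, sample) > radiusPx) {
            anchor = sample                      // gaze moved: restart the dwell timer
            return null
        }
        return if (sample.timestampMs - a.timestampMs >= dwellThresholdMs) a else null
    }

    private fun distance(p: GazeSample, q: GazeSample): Double {
        val dx = p.x - q.x
        val dy = p.y - q.y
        return sqrt(dx * dx + dy * dy)
    }
}

fun main() {
    val detector = DwellDetector()
    println(detector.update(GazeSample(100.0, 200.0, 0)))    // null: dwell just started
    println(detector.update(GazeSample(105.0, 198.0, 600)))  // the dwelled-on viewpoint is reported
}
```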
  • the finger coordinates in this embodiment of the present application are used to represent the position information of the projection of the user's finger on the display screen 194 of the mobile phone 100 or the position information of the contact point between the user's finger and the display screen 194 on the display screen 194 .
  • the coordinates of the user's finger may be three-dimensional coordinates or two-dimensional coordinates, which are not specifically limited in the embodiments of the present application.
  • when the finger coordinates are two-dimensional coordinates, the finger coordinate system and the viewpoint coordinate system may be the same coordinate system.
  • there are multiple ways for the mobile phone 100 to obtain the coordinates of the finger, which are not limited in this application.
  • Example 1 the mobile phone 100 detects the position data corresponding to the touch operation through the touch sensor 180K (specifically may include contact coordinates, time stamps corresponding to the contact coordinates, etc.), and then determines the coordinates of the user's finger.
  • the mobile phone 100 emits infrared rays through the infrared optical sensor, detects the infrared light reflected by the finger, and determines the coordinates of the user's finger.
  • the mobile phone 100 acquires the first display content of the first display area corresponding to the viewpoint coordinates, and displays the first display content to the second display area corresponding to the finger coordinates.
  • the mobile phone 100 obtains the user's viewpoint coordinates and finger coordinates and converts them into information that the processor 110 can process; the processor 110 obtains the first display content of the first display area corresponding to the viewpoint coordinates, and generates a new display window according to the first display content; the processor 110 then controls the display screen 194 to display the new display window in the second display area corresponding to the finger coordinates.
  • in one possible implementation manner, the first display area may be a circular display area with the viewpoint coordinates as the center and the first preset value as the radius; in another possible implementation manner, the first display area may be a square display area with the viewpoint coordinates as the center and the second preset value as the side length.
  • the above two are only examples and not limitations.
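  • both example shapes of the first display area — a circle of radius equal to the first preset value centred on the viewpoint coordinates, or a square of side length equal to the second preset value centred on the viewpoint coordinates — reduce to a membership test that collects the icons forming the first display content. A minimal sketch, with hypothetical icon and point types:

```kotlin
import kotlin.math.abs

data class Point(val x: Double, val y: Double)
data class Icon(val name: String, val center: Point)

// Circular first display area: center = viewpoint coordinates, radius = first preset value.
fun inCircle(viewpoint: Point, p: Point, radius: Double): Boolean {
    val dx = p.x - viewpoint.x
    val dy = p.y - viewpoint.y
    return dx * dx + dy * dy <= radius * radius
}

// Square first display area: center = viewpoint coordinates, side length = second preset value.
fun inSquare(viewpoint: Point, p: Point, side: Double): Boolean {
    val half = side / 2
    return abs(p.x - viewpoint.x) <= half && abs(p.y - viewpoint.y) <= half
}

// First display content = the icons that fall inside the first display area.
fun firstDisplayContent(icons: List<Icon>, viewpoint: Point, radius: Double): List<Icon> =
    icons.filter { inCircle(viewpoint, it.center, radius) }

fun main() {
    val icons = listOf(Icon("APP8", Point(120.0, 300.0)), Icon("APP1", Point(600.0, 80.0)))
    println(firstDisplayContent(icons, viewpoint = Point(100.0, 320.0), radius = 60.0).map { it.name })  // [APP8]
}
```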
  • the first display content may include at least one application icon or control icon.
  • the specific content may be determined according to the display content of the first display area where the viewpoint coordinates are located.
  • when the mobile phone 100 detects a click operation on any application icon or control icon in the first display content, the mobile phone 100 determines that icon as the target icon, and starts the process corresponding to the target icon.
  • after the mobile phone 100 determines the coordinates of the viewpoint and the coordinates of the finger, it can also judge whether the time that the user's line of sight dwells at the position of the viewpoint coordinates exceeds the second time threshold; if so, the mobile phone 100 displays the first display content in the second display area; if not, the mobile phone 100 continues to detect new viewpoint coordinates until viewpoint coordinates whose gaze dwell time exceeds the second time threshold appear, and then displays the first display content in the second display area. In this way, the acquired first display content is more accurate and closer to the display content that the user actually wants to operate.
  • after detecting the first operation, the mobile phone 100 enters the one-handed operation mode.
  • the mobile phone 100 can obtain the coordinates of the user's viewpoint and finger, and display the first display content in the first display area corresponding to the viewpoint coordinates in the second display area corresponding to the finger coordinates. This can effectively solve the technical problem that it is inconvenient for a user to operate the mobile phone with one hand, and effectively improve the user experience.
  • the method includes:
  • S501 The mobile phone 100 detects the face image of the user.
  • the processor 110 in the mobile phone 100 starts the camera 193; the camera 193 captures environmental image data around the mobile phone 100, converts the environmental image data into information that can be processed by the processor 110, and transmits it to the processor 110; the processor 110 determines whether there is human face data in the environmental image data; if there is, the processor 110 continues to execute S502; if not, the processor 110 controls the display screen 194 to exit the one-handed operation mode.
  • the legitimacy of the user identity may be further verified.
  • the specific implementations for verifying the legitimacy of the user identity include but are not limited to the following three ways:
  • Mode 1 The processor 110 in the mobile phone 100 determines whether the face image in the environmental image data matches the stored face image, and if so, executes S402; if not, controls the display screen 194 to exit the one-handed operation mode.
  • Mode 2 The processor 110 in the mobile phone 100 detects the user's voiceprint information, and determines whether the current user's voiceprint information matches the stored voiceprint information; if so, execute S402; if not, control the display screen 194 to exit the one-handed operation mode.
  • Mode 3 The processor 110 in the mobile phone 100 detects the iris information of the user, and judges whether the iris information of the current user matches the stored iris information; if so, execute S402; if not, control the display screen 194 to exit the one-handed operation mode.
  • the mobile phone 100 determines the distance between the user's eyes and the infrared camera, as well as the coordinates of the corneal reflection spot and the center of the pupil.
  • specifically, the processor 110 in the mobile phone 100 activates, through the kernel layer, the infrared light sensor and the infrared camera arranged at the center of the upper edge of the display screen 194; infrared rays are emitted toward the user's face; the infrared camera captures the light returned by the user's eyes, converts the light data into information that can be processed by the processor 110, and transmits it to the processor 110; based on this information, the processor 110 runs the image processing library in the system library to generate a depth image of the user's eyes; the processor 110 processes the depth image according to an algorithm (for example, a coordinate transformation algorithm) stored in the internal memory 121, and then determines the distance between the user's eyes and the infrared camera.
  • the processor 110 can also determine the corneal reflection spot coordinates and the pupil center coordinates according to algorithms such as pupil segmentation, pupil coarse positioning, edge extraction, and edge fitting.
  • the mobile phone 100 determines the coordinates of the user's viewpoint according to the distance, the coordinates of the corneal reflection spot, and the coordinates of the pupil center.
  • the mobile phone 100 takes the center point of the upper edge of the display screen 194 as the origin, takes the upper edge of the display screen 194 as the Y axis, and takes the straight line that passes through the center point of the upper edge and is perpendicular to the upper edge of the display screen 194 as the X axis; the mobile phone 100 can then determine the position where the user's line of sight falls on the display screen 194 (ie, the viewpoint coordinates) according to the distance between the user's eyes and the infrared camera, the coordinates of the corneal reflection spot, and the coordinates of the pupil center.
  • the processor 110 may acquire the offset of the corneal reflection spot relative to the center of the pupil in the depth image of the eye, and determine the first coordinate of the user's line of sight in the world coordinate system according to the preset mapping relationship between the offset and the viewpoint coordinates; the processor 110 then performs a matrix transformation on the first coordinate according to the distance between the user's eyes and the infrared camera, and obtains the viewpoint coordinates (Ex, Ey) of the user's line of sight relative to the display screen 194. The processor 110 determines the position information corresponding to the viewpoint coordinates, and controls the display screen 194 to display prompt information (eg, a cursor) at the position corresponding to the viewpoint coordinates.
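  • the viewpoint computation described above uses a preset mapping from the spot-to-pupil offset to gaze coordinates followed by a matrix transformation that depends on the eye-to-camera distance; neither the mapping nor the matrix is given in the text. The sketch below therefore substitutes a hypothetical linear gain plus a distance-dependent scale purely as an illustration of the data flow.

```kotlin
data class Vec2(val x: Double, val y: Double)

// Hypothetical calibration: a linear gain applied to the corneal-spot-to-pupil offset,
// scaled by the eye-to-camera distance. The real mapping and matrix transform are not
// specified in the patent text.
class GazeEstimator(
    private val gainX: Double = 120.0,
    private val gainY: Double = 120.0,
    private val referenceDistanceMm: Double = 300.0
) {
    fun viewpoint(cornealSpot: Vec2, pupilCenter: Vec2, eyeDistanceMm: Double): Vec2 {
        val offset = Vec2(cornealSpot.x - pupilCenter.x, cornealSpot.y - pupilCenter.y)
        val scale = eyeDistanceMm / referenceDistanceMm   // a given offset maps to a larger shift when the eyes are farther away
        return Vec2(offset.x * gainX * scale, offset.y * gainY * scale)   // viewpoint coordinates (Ex, Ey)
    }
}

fun main() {
    val estimator = GazeEstimator()
    println(estimator.viewpoint(cornealSpot = Vec2(0.6, 0.2), pupilCenter = Vec2(0.5, 0.1), eyeDistanceMm = 300.0))
}
```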
  • the processor 110 of the mobile phone 100 activates the human eye viewpoint tracking sensor; the human eye viewpoint tracking sensor acquires the viewpoint coordinates (Ex, Ey), converts them into information that the processor 110 can process, and sends the information to the processor 110; the processor 110 determines the position information corresponding to the viewpoint coordinates, and controls the display screen 194 to display prompt information (eg, a floating icon) at the position corresponding to the viewpoint coordinates.
  • the viewpoint coordinates obtained by the human eye tracking sensor are converted into information that can be processed by the processor 110 and transmitted to the processor 110; the processor 110 controls the window manager in the application framework layer run by the processor 110 to obtain the display content within the preset range corresponding to the viewpoint coordinates. For example, referring to FIG. 6C, the processor 110 determines that the viewpoint coordinates fall on the APP8 icon, and the mobile phone 100 obtains the display information corresponding to the APP8 icon (ie, the content of the dotted box).
  • the method includes:
  • S701 The mobile phone 100 emits infrared rays to the user's finger.
  • the processor 110 in the mobile phone 100 may control the front camera of the mobile phone 100 to start; the front camera collects the surrounding environment image data, converts the environment image data into information that can be processed by the processor 110, and transmits it to the processor 110; the processor 110 executes the finger recognition program stored in the internal memory 121; if it is determined that there are multiple fingers in the environment image data and the multiple fingers include the user's thumb, the processor 110 executes S701-S702 to obtain the coordinates of the thumb; if it is determined that there is only one finger in the environment image data, the processor 110 executes S701-S702 to obtain the coordinates of that finger.
  • the user's finger may be a finger closest to the display screen 194 .
  • the mobile phone 100 determines, through its built-in gesture tracking sensor, whether the distance between the finger and the display screen 194 is less than a preset value; if it is less than the preset value, the mobile phone 100 performs step S701; if it is not less than the preset value, the processor in the mobile phone 100 controls the display screen 194 to exit the one-handed operation mode.
  • S702 The mobile phone 100 receives the infrared light reflected by the finger.
  • the processor 110 in the mobile phone 100 activates, through the kernel layer running in the processor 110, the infrared light sensor; the infrared light sensor controls the infrared light-emitting diode inside it to emit infrared rays to the user's finger, and receives the infrared light returned by the finger.
  • the mobile phone 100 determines the coordinates of the finger according to the infrared light reflected by the finger.
  • the infrared light sensor is disposed at the center point of the upper edge of the display screen 194 , and the two infrared light emitting diodes in the infrared light sensor are disposed on both sides of the center point of the upper edge of the display screen 194 respectively.
  • the processor 110 obtains the second distance between the finger and the display screen 194 according to the built-in gesture tracking sensor, and performs geometric operations on the first distance and the second distance to obtain the third distance between the projection point of the finger on the display screen 194 and the center point of the upper edge of the display screen 194; the processor 110 controls the infrared camera to take an image of the finger, and determines the orientation of the finger relative to the infrared light sensor according to the image; the processor 110 can then determine the coordinates of the finger according to the third distance from the center point of the upper edge of the display screen 194 (ie, the infrared light sensor) and the orientation of the finger relative to the infrared light sensor. For example, as shown in the figure, if the second distance is Ez, the third distance is Ex, and the user's finger is located in the true north direction of the infrared light sensor, the finger coordinates are (Ex, 0, Ez).
  • the finger coordinates here are three-dimensional coordinates.
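  • the geometry above combines the first distance (finger to infrared light sensor), the second distance Ez (finger to display screen) and the bearing of the finger relative to the sensor at the centre of the upper edge. The patent only says "geometric operations" are performed; the sketch below assumes the Pythagorean relation between the three distances and expresses the bearing as an angle in the screen plane, with 0 corresponding to the true-north case in the example.

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

/**
 * Sketch of the finger-coordinate geometry. firstDistance is the finger-to-sensor distance,
 * secondDistance (Ez) the finger-to-screen distance, and bearingRad the direction of the
 * finger's projection relative to the sensor at the origin (the centre of the upper edge).
 */
fun fingerCoordinates(firstDistance: Double, secondDistance: Double, bearingRad: Double): Triple<Double, Double, Double> {
    require(firstDistance >= secondDistance) { "finger-to-sensor distance cannot be smaller than finger-to-screen distance" }
    // Assumed "geometric operation": Pythagoras gives the third distance, i.e. from the
    // projection point of the finger on the screen to the sensor.
    val thirdDistance = sqrt(firstDistance * firstDistance - secondDistance * secondDistance)
    val ex = thirdDistance * cos(bearingRad)
    val ey = thirdDistance * sin(bearingRad)
    return Triple(ex, ey, secondDistance)   // three-dimensional finger coordinates (Ex, Ey, Ez)
}

fun main() {
    // True-north case from the example: bearing 0 yields coordinates of the form (Ex, 0, Ez).
    println(fingerCoordinates(firstDistance = 5.0, secondDistance = 3.0, bearingRad = 0.0))  // (4.0, 0.0, 3.0)
}
```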
  • a corresponding icon may be displayed on the display screen 194 of the mobile phone 100 .
  • the mobile phone 100 can also acquire the coordinates of the user's finger by the infrared light sensor, and control the display screen 194 to display a fingerprint unlock icon on the operable interface corresponding to the coordinates of the finger.
  • the touch sensor 180K of the mobile phone 100 detects the touch operation on the unlock icon and converts the touch operation into an electrical signal; the kernel layer drives the fingerprint sensor 180H, the fingerprint sensor 180H collects the user's fingerprint, the collected fingerprint is matched against the stored fingerprint, and after the match is successful, the unlock mode is entered.
  • in this way, the fingerprint unlocking position does not need to be fixed, which effectively improves the user experience.
  • the touch sensor 180K detects the sliding touch operation on the display screen 194, converts the detected signal into information that can be processed by the processor 110, and transmits it to the processor 110; the kernel layer running in the processor 110 generates the position data corresponding to the operation based on the information (which may specifically include contact coordinates, time stamps corresponding to the contact coordinates, etc.); the processor 110 determines that the operation does not trigger any application process, and the processor 110 controls the window manager to output a dialog box for asking the user whether to enter the one-handed operation mode (refer to the dialog box 302 shown in FIG. 4D).
  • if the processor 110 detects the confirmation instruction, it controls the display screen 194 to enter the one-handed operation mode; the mobile phone 100 activates the infrared camera to obtain an infrared image and transmits the infrared image to the processor 110; the processor 110 determines the user's viewpoint coordinates and finger coordinates according to the infrared image, and transmits the viewpoint coordinates and finger coordinates to the application framework layer; the view system in the framework layer acquires the first display content in the first display area corresponding to the viewpoint coordinates and the second display area corresponding to the finger coordinates, and the first display content is displayed in the second display area.
  • the above only takes the first operation as a sliding touch operation detected by the touch sensor 180K as an example to illustrate the interface display method provided by the embodiment of the present application.
  • during specific implementation, the first operation may also be implemented in other manners, which are not limited in this embodiment of the present application.
  • the following describes the interface display method in the one-hand operation mode in the embodiments of the present application in combination with several specific application scenarios.
  • Scenario 1 Launch the app.
  • there are multiple ways for the mobile phone 100 to display the first display content in the first display area corresponding to the viewpoint coordinates in the second display area corresponding to the finger coordinates, including but not limited to the following:
  • Mode 1 when the processor 110 in the mobile phone 100 determines that the position corresponding to the viewpoint coordinates overlaps the position of an APP icon, the processor 110 of the mobile phone 100 determines the APP icon as the first display content, and displays the first display content in the second display area corresponding to the finger coordinates.
  • when the processor 110 in the mobile phone 100 determines that the position corresponding to the viewpoint coordinates overlaps the position of the APP8 icon, the mobile phone 100 uses the APP8 icon as the first display content, and controls the display screen 194 to display it in the second display area corresponding to the finger coordinates, that is, the operation interface 901 in (b) of FIG. 9A.
  • the mobile phone 100 may monitor the change of the viewpoint coordinates in real time, that is, acquire the third display content in the first display area corresponding to the new viewpoint coordinates in real time, and switch the display content in the second display area corresponding to the finger coordinates from the first display content to the third display content.
  • for example, the mobile phone 100 detects that the position of the viewpoint coordinates corresponds to the APP8 icon at the first moment, and detects that the position of the viewpoint coordinates corresponds to the APP9 icon at the second moment; then, at the first moment, the APP8 icon is displayed on the operation interface 901 corresponding to the finger coordinates, and at the second moment, the APP9 icon is displayed on the operation interface 901 corresponding to the finger coordinates.
  • the display content in the second display area (eg, the operation interface 901 ) corresponding to the coordinates of the finger may change in real time as the coordinates of the viewpoint change.
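  • Mode 1 with real-time monitoring boils down to: find the icon the viewpoint currently falls on and, whenever that icon changes, switch what the operation interface at the finger coordinates shows. A sketch of that loop, with hypothetical icon bounds:

```kotlin
import kotlin.math.abs

data class Pt(val x: Double, val y: Double)
data class AppIcon(val name: String, val center: Pt, val halfSize: Double) {
    fun contains(p: Pt) = abs(p.x - center.x) <= halfSize && abs(p.y - center.y) <= halfSize
}

// Tracks which icon the gaze falls on and reports when the content shown in the
// operation interface at the finger coordinates should be switched.
class OperationInterface(private val icons: List<AppIcon>) {
    var shownIcon: AppIcon? = null
        private set

    /** Returns true when the displayed content changed because the viewpoint moved. */
    fun onViewpoint(viewpoint: Pt): Boolean {
        val gazed = icons.firstOrNull { it.contains(viewpoint) } ?: return false
        if (gazed == shownIcon) return false
        shownIcon = gazed          // e.g. switch from the APP8 icon to the APP9 icon
        return true
    }
}

fun main() {
    val icons = listOf(AppIcon("APP8", Pt(100.0, 400.0), 40.0), AppIcon("APP9", Pt(200.0, 400.0), 40.0))
    val ui = OperationInterface(icons)
    ui.onViewpoint(Pt(110.0, 390.0)); println(ui.shownIcon?.name)   // APP8 at the first moment
    ui.onViewpoint(Pt(205.0, 410.0)); println(ui.shownIcon?.name)   // APP9 at the second moment
}
```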
  • Mode 2 when the processor 110 in the mobile phone 100 determines that the position of the viewpoint coordinates does not overlap with the position of any APP icon, it determines the APP icon whose distance from the position of the viewpoint coordinates is within a preset value as the first display content, and displays the first display content in the second display area corresponding to the finger coordinates.
  • for example, the processor 110 determines that the coordinates of the viewpoint are located between the APP8 icon and the APP13 icon; the processor 110 then determines the APP8 icon and the APP13 icon as the first display content, and displays the APP8 and APP13 icons in the second display area corresponding to the finger coordinates, that is, the operation interface 1001 shown in (b) in FIG. 10.
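  • Mode 2 can be read as a distance filter: when the viewpoint overlaps no icon, every icon within a preset distance of the viewpoint position becomes part of the first display content. A self-contained sketch, with made-up coordinates chosen only to mirror the APP8/APP13 example:

```kotlin
data class Pos(val x: Double, val y: Double)
data class Icon(val name: String, val center: Pos)

// Mode 2: the viewpoint overlaps no icon, so take the icon(s) whose distance from the
// viewpoint position is within a preset value as the first display content.
fun iconsNearViewpoint(icons: List<Icon>, viewpoint: Pos, presetDistance: Double): List<Icon> =
    icons.filter { icon ->
        val dx = icon.center.x - viewpoint.x
        val dy = icon.center.y - viewpoint.y
        dx * dx + dy * dy <= presetDistance * presetDistance
    }

fun main() {
    val icons = listOf(Icon("APP8", Pos(100.0, 400.0)), Icon("APP13", Pos(100.0, 500.0)), Icon("APP1", Pos(500.0, 80.0)))
    // A viewpoint between APP8 and APP13 selects both, matching the FIG. 10 example.
    println(iconsNearViewpoint(icons, viewpoint = Pos(100.0, 450.0), presetDistance = 80.0).map { it.name })  // [APP8, APP13]
}
```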
  • further, when a sliding operation on the second display area is detected, the processor 110 in the mobile phone 100 can control the display screen 194 to display new display content in the second display area.
  • the mobile phone 100 detects a sliding operation (swiping left and right, sliding up and down, etc.) on the operation interface 1001 , and the operation interface 1001 displays the APP8 icon, APP9 icon, APP13 icon, and APP14 icon.
  • the processor 110 in the mobile phone 100 can also control the display screen 194 to display, in the second display area, the first display content scaled at a preset scaling ratio. The scaling ratio may also be set by the user through a voice command when using the mobile phone 100.
  • the processor 110 in the mobile phone 100 determines that the coordinates of the viewpoint are located between the APP8 icon and the APP13 icon, and the processor 110 takes the APP8 icon and the APP13 icon as the first display content and displays the APP8 and APP13 icons on the operation interface 1101 corresponding to the coordinates of the finger.
  • the mobile phone 100 detects a touch operation on the operation interface 1101, and displays the APP8 icon and the APP13 icon on the operation interface 1101 at a preset magnification ratio; in FIG. 11(c), the mobile phone 100 detects two touch operations on the operation interface 1101 within a preset time period, and displays the APP8 icon and the APP13 icon on the operation interface 1101 at a preset reduced scale.
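  • the zoom behaviour in FIG. 11 can be summarized as: one touch within the time window enlarges the relocated icons by a preset ratio, two touches shrink them. The ratios below are hypothetical; the text only refers to preset ratios that the user may also change by voice command.

```kotlin
// Hypothetical scaling factors; the patent only speaks of "preset" ratios.
const val ENLARGE_RATIO = 1.5
const val REDUCE_RATIO = 0.75

data class IconView(val name: String, var widthPx: Double, var heightPx: Double)

fun applyTapScaling(icons: List<IconView>, tapsWithinWindow: Int) {
    val ratio = when {
        tapsWithinWindow >= 2 -> REDUCE_RATIO   // two touch operations within the preset time: shrink
        tapsWithinWindow == 1 -> ENLARGE_RATIO  // a single touch operation: enlarge
        else -> return
    }
    for (icon in icons) {
        icon.widthPx *= ratio
        icon.heightPx *= ratio
    }
}

fun main() {
    val icons = listOf(IconView("APP8", 96.0, 96.0), IconView("APP13", 96.0, 96.0))
    applyTapScaling(icons, tapsWithinWindow = 1)   // single touch: enlarge
    println(icons.map { it.widthPx })              // [144.0, 144.0]
}
```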
  • Scenario 2 The controls of the game interface.
  • the following introduces the process in which the mobile phone 100 displays the display content corresponding to the viewpoint coordinates in the display area corresponding to the finger coordinates.
  • before displaying the display content corresponding to the viewpoint coordinates in the second display area corresponding to the finger coordinates, the mobile phone 100 further judges whether there is an application icon or control icon at the first position corresponding to the finger coordinates in the second display area; if there is, the display content is displayed at a second position of the second display area; if there is not, the display content is displayed at the first position, wherein the second position and the first position are separated by the first preset value (a sketch of this placement check is given after the examples below).
  • FIG. 12A shows a schematic diagram of a possible game interface, in which control 1, control 2, control 3, control 4, control 5, control 6, and control 7 are set; when the mobile phone 100 detects a click operation or a touch operation on control 1, control 2, or control 3, the processor 110 in the mobile phone 100 can control the walking speed of the characters in the game interface; when the mobile phone 100 detects a click operation or a touch operation on control 4, control 5, or control 6, the processor 110 in the mobile phone 100 can control the characters in the game interface to make an attack action; when the mobile phone 100 detects a click operation or a touch operation on control 7, the processor 110 in the mobile phone 100 can control the characters in the game interface to change game equipment.
  • Example 1 in (a) of FIG. 12A, the mobile phone 100 has turned on the one-handed operation mode, starts to obtain the coordinates of the user's viewpoint and the finger coordinates, and detects that the display area corresponding to the coordinates of the user's viewpoint includes control 7; in (b) of FIG. 12A, the mobile phone 100 determines the display area corresponding to the user's thumb, namely the operation interface 1201; the mobile phone 100 further determines that there is a control 6 at the position corresponding to the thumb coordinates, so the mobile phone 100 determines another position in the operation interface 1201 that is a preset distance away from the thumb coordinates, and then displays control 7 at that position, so that the user can operate with one hand.
  • Example 2 in (a) of FIG. 12B, the mobile phone 100 has turned on the one-handed operation mode, starts to obtain the coordinates of the user's viewpoint and the finger coordinates, and detects that the display area corresponding to the coordinates of the user's viewpoint includes control 7; in (b) of FIG. 12B, the mobile phone 100 determines the display area corresponding to the user's thumb, namely the operation interface 1001; the mobile phone 100 further determines that there is no control 6 at the position corresponding to the thumb coordinates, and then the mobile phone 100 displays control 7 at that position, enabling the user to operate with one hand.
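  • the placement rule used in both examples — show the relocated control at the finger position unless an existing icon already occupies it, in which case shift by the first preset value — is sketched below. The offset direction and the occupancy test are assumptions; the patent only fixes the offset magnitude.

```kotlin
import kotlin.math.abs

data class P(val x: Double, val y: Double)

/**
 * Decide where to place the relocated content inside the second display area.
 * `occupied` answers whether an application icon or control icon already sits at a position.
 */
fun placementFor(firstPosition: P, firstPresetValue: Double, occupied: (P) -> Boolean): P =
    if (occupied(firstPosition))
        P(firstPosition.x + firstPresetValue, firstPosition.y)   // second position, offset from the first
    else
        firstPosition                                            // first position is free, use it directly

fun main() {
    val controlCenters = listOf(P(300.0, 900.0))                 // e.g. control 6 sitting under the thumb
    val occupied = { p: P -> controlCenters.any { c -> abs(c.x - p.x) < 40 && abs(c.y - p.y) < 40 } }
    println(placementFor(P(300.0, 900.0), firstPresetValue = 120.0, occupied = occupied))  // shifted: P(x=420.0, y=900.0)
    println(placementFor(P(600.0, 900.0), firstPresetValue = 120.0, occupied = occupied))  // free: P(x=600.0, y=900.0)
}
```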
  • Scenario 3 The user answers the phone with one hand.
  • the operation interface of the mobile phone 100 includes a rejection control 1301 for instructing to reject a call and an answering control 1302 for instructing to answer a call.
  • the mobile phone 100 detects the voice command for enabling the one-handed mode, enters the one-handed operation mode, and obtains the coordinates of the user's finger and the viewpoint coordinates; if the mobile phone 100 detects that the display area corresponding to the viewpoint coordinates contains the rejection control 1301, the icon of the rejection control 1301 is displayed within the operable range of the user's finger (for example, as shown in (b) in FIG. 13); if the display area corresponding to the viewpoint coordinates contains the answering control 1302, the original display mode is maintained (for example, as shown in (a) in FIG. 13).
  • the computer 1401 and the mobile phone 100 are connected in communication (wired connection, Bluetooth connection, wifi connection, etc., no limitation).
  • the computer 1401 obtains the coordinates of the user's viewpoint on the computer 1401
  • the mobile phone 100 obtains the coordinates of the user's finger on the mobile phone 100
  • the computer 1401 displays the display content in the display area corresponding to the coordinates of the viewpoint on the operation interface 1402 of the mobile phone 100 (ie, within the operable range of the user's finger).
  • the process of obtaining the coordinates of the viewpoint, the coordinates of the finger, and controlling the display can be referred to above, and will not be repeated here.
  • the methods provided by the embodiments of the present application are introduced from the perspective of an electronic device as an execution subject.
  • the electronic device may include a hardware structure and/or software modules, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether one of the above functions is performed in the form of a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraints of the technical solution.
  • the present application further provides an electronic device 1500 for implementing the methods in the embodiments shown in FIG. 3 , FIG. 5 , and FIG. 7 .
  • the electronic device 1500 may include a processor 1501 for executing programs or instructions stored in the memory 1502 .
  • the processor is configured to execute the interface display method shown in FIG. 3.
  • the electronic device 1500 may further include a communication interface 1503 .
  • FIG. 15 shows with dashed lines that the communication interface 1503 is optional to the electronic device 1500 .
  • the numbers of the processors 1501, the memories 1502, and the communication interfaces 1503 do not constitute a limitation on the embodiments of the present application, and can be arbitrarily configured according to business requirements during specific implementation.
  • the memory 1502 is located outside the electronic device 1500 .
  • the electronic device 1500 includes the memory 1502 , the memory 1502 is connected to the at least one processor 1501 , and the memory 1502 stores instructions executable by the at least one processor 1501 .
  • FIG. 15 shows with dashed lines that memory 1502 is optional to electronic device 1500 .
  • the processor 1501 and the memory 1502 may be coupled through an interface circuit, or may be integrated together, which is not limited here.
  • the specific connection medium between the processor 1501, the memory 1502, and the communication interface 1503 is not limited in the embodiments of the present application.
  • the processor 1501, the memory 1502, and the communication interface 1503 are connected through a bus 1504 in FIG. 15.
  • the bus is represented by a thick line in FIG. 15.
  • the connection manner between other components is only a schematic illustration and is not limited.
  • the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is shown in FIG. 15, but it does not mean that there is only one bus or one type of bus.
  • the processor mentioned in the embodiments of the present application may be implemented by hardware or software.
  • the processor When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor When implemented in software, the processor may be a general-purpose processor implemented by reading software codes stored in memory.
  • the processor may be a central processing unit (Central Processing Unit, CPU), or other general-purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the memory mentioned in the embodiments of the present application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • the non-volatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable read-only memory (Programmable ROM, PROM), an erasable programmable read-only memory (Erasable PROM, EPROM), an electrically erasable programmable read-only memory (Electrically EPROM, EEPROM), or a flash memory.
  • Volatile memory may be Random Access Memory (RAM), which acts as an external cache.
  • many forms of RAM are available, for example: static RAM (Static RAM, SRAM), dynamic RAM (Dynamic RAM, DRAM), synchronous DRAM (Synchronous DRAM, SDRAM), double data rate synchronous dynamic random access memory (Double Data Rate SDRAM, DDR SDRAM), enhanced SDRAM (Enhanced SDRAM, ESDRAM), synchronous link dynamic random access memory (Synchlink DRAM, SLDRAM), and direct Rambus RAM (Direct Rambus RAM).
  • when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (storage module) may be integrated in the processor.
  • memory described herein is intended to include, but not be limited to, these and any other suitable types of memory.
  • embodiments of the present application further provide a computer-readable medium, where the computer-readable medium stores program code for device execution, and the program code includes instructions for executing the interface display method in the foregoing embodiments.
  • the embodiments of the present application also provide a computer program product containing instructions, when the computer program product runs on a computer, the computer can execute the interface display method in the foregoing embodiments.
  • an embodiment of the present application further provides a chip, where the chip includes a processor and a data interface, and the processor reads, through the data interface, the instructions stored in the memory to execute the interface display method in the foregoing embodiments.
  • the chip may further include a memory, in which instructions are stored; the processor is configured to execute the instructions stored in the memory, and when the instructions are executed, the processor is configured to execute the interface display method in the foregoing embodiment.
  • each functional module in the embodiments of the present application may be integrated into one processing module, or each module may exist physically alone, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules.
  • in the above-mentioned embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof.
  • when software is used for implementation, the implementation may be realized in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be downloaded from a website site, computer, server, or data center Transmission to another website site, computer, server, or data center by wire (eg, coaxial cable, optical fiber, digital subscriber line) or wireless (eg, infrared, wireless, microwave, etc.).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state drives), and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interface display method and an electronic device. The method comprises: after receiving a first operation, an electronic device starting a one-handed operation mode in response to the first operation; and in the one-handed operation mode, the electronic device acquiring first viewpoint coordinates and finger coordinates, acquiring first display content in a first display area corresponding to the first viewpoint coordinates, and displaying the first display content in a second display area corresponding to the finger coordinates. By means of the method, a user can operate any content in an operation interface with one hand, thereby effectively improving the user experience.

Description

Interface display method and electronic device
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to the Chinese patent application with application number 202011179164.7, entitled "Interface display method and electronic device", filed with the China Patent Office on October 29, 2020, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present application relates to the field of terminal technologies, and in particular, to an interface display method and an electronic device.
BACKGROUND
With the advancement of terminal technology, the functions of electronic devices have gradually improved. To satisfy people's visual experience, the screens of electronic devices are usually made large. However, when a user operates a large-screen electronic device with one hand, the limited length of the user's fingers results in a poor user experience.
Some electronic devices provide a one-handed operation mode, but due to the limitations of the screen width of the electronic device and the length of the user's fingers, the user still needs to use multiple fingers together (for example, if the user wants to tap an application icon on the far right of the mobile phone screen with the left hand, the finger needs to span the entire width of the screen), so the problem of poor user experience remains.
Therefore, how to provide a convenient one-handed operation mode is an urgent problem to be solved.
SUMMARY OF THE INVENTION
The present application provides an interface display method and an electronic device, to solve the problem of inconvenient user operation in existing one-handed operation modes.
According to a first aspect, an interface display method is provided. The method may be executed by an electronic device, and includes: receiving a first operation; in response to the first operation, starting a one-handed operation mode; in the one-handed operation mode, acquiring first viewpoint coordinates and finger coordinates; acquiring first display content in a first display area corresponding to the first viewpoint coordinates; and displaying the first display content in a second display area corresponding to the finger coordinates.
In the foregoing technical solution, when the electronic device receives the first operation, the electronic device starts the one-handed operation mode in response to the first operation; in the one-handed operation mode, the electronic device can acquire the first viewpoint coordinates and the finger coordinates, acquire the first display content in the first display area corresponding to the first viewpoint coordinates, and display the first display content in the second display area corresponding to the finger coordinates. In this way, the user can operate any content in the display interface with one hand, thereby effectively improving the user experience.
To improve the flexibility of the solution, in the embodiments of the present application, the first operation may be that the electronic device is lifted or shaken; or a click operation or a sliding operation on the display screen of the electronic device; or a voice command; or an operation on a hardware key of the electronic device.
It should be noted that the foregoing are only several specific implementations of the first operation; the first operation may also be implemented in other manners in the embodiments of the present application, which is not limited in the embodiments of the present application.
In a possible design, after the electronic device displays the first display content in the second display area corresponding to the finger coordinates, the electronic device may further receive a first sliding operation in the second display area, determine second display content according to the first sliding operation, and switch the display content in the second display area from the first display content to the second display content.
In the foregoing technical solution, after the electronic device displays the first display content in the second display area corresponding to the finger coordinates, if the electronic device detects a sliding operation on the second display area, the electronic device can update the display content of the second display area, which effectively meets the user's operation requirements and makes the display of the user interface more intelligent.
In a possible design, after displaying the first display content in the second display area corresponding to the finger coordinates, the electronic device may further acquire second viewpoint coordinates and third display content in a third display area corresponding to the second viewpoint coordinates, and switch the display content in the second display area from the first display content to the third display content.
In the foregoing technical solution, after the electronic device displays the first display content in the second display area corresponding to the finger coordinates, the electronic device can also monitor changes of the viewpoint coordinates in real time; for example, when the electronic device detects that the first viewpoint coordinates change to the second viewpoint coordinates, the electronic device can acquire the third display content in the third display area corresponding to the second viewpoint coordinates, and switch the display content in the second display area from the first display content to the third display content. In this way, the user's operation requirements are effectively met, and the intelligence of the user operation interface is effectively improved.
In a possible design, before displaying the first display content in the second display area corresponding to the finger coordinates, the electronic device further needs to judge whether there is an application icon or a control icon at a first position corresponding to the finger coordinates in the second display area; if there is, the first display content is displayed at a second position in the second display area, where the second position is separated from the first position by a first preset value; if there is not, the first display content is displayed at the first position.
In the foregoing technical solution, before displaying the first display content in the second display area corresponding to the finger coordinates, the electronic device judges whether an application icon or control icon exists at the first position corresponding to the finger coordinates; if so, the first display content is displayed at the second position in the second display area, which is separated from the first position by the first preset value; if not, the first display content is displayed at the first position. This effectively prevents the first display content from covering other control icons in the second display area and causing inconvenience to the user, thereby effectively improving the user experience.
In a possible design, before the electronic device displays the first display content in the second display area corresponding to the finger, the electronic device may further determine that the duration for which the user's line of sight stays on the first viewpoint coordinates exceeds a second preset threshold.
In the foregoing technical solution, the electronic device displays the first display content in the second display area corresponding to the finger only after determining that the duration for which the user's line of sight stays on the first viewpoint coordinates exceeds the second preset threshold. In this way, misoperations are effectively avoided, the accuracy of the user operation interface display is effectively improved, and the user experience is effectively improved.
In a possible design, the electronic device may further receive a click operation on at least one target icon in the first display content, determine a first target icon, and execute an application process corresponding to the first target icon.
In the foregoing technical solution, if the electronic device detects a click operation on at least one target icon in the first display content, the electronic device can determine the first target icon that the user wants to operate, and then start the application process corresponding to the first target icon, which effectively improves the interaction efficiency between the electronic device and the user, thereby effectively improving the user experience.
In a possible design, a specific manner in which the electronic device acquires the first viewpoint coordinates in response to the first operation may be: determining a first distance between the user's eyes and an infrared camera, as well as corneal reflection spot coordinates and pupil center coordinates; and determining the first viewpoint coordinates according to the first distance, the corneal reflection spot coordinates, and the pupil center coordinates.
In the foregoing technical solution, the electronic device determines the first viewpoint coordinates from the first distance between the user's eyes and the infrared camera, the corneal reflection spot coordinates, and the pupil center coordinates. In this way, the interaction efficiency between the electronic device and the user is effectively improved, thereby effectively improving the user experience.
In a possible design, a specific manner in which the electronic device acquires the finger coordinates in response to the first operation may be: determining a second distance between the finger and an infrared light sensor in the electronic device, a third distance between the finger and the display screen of the electronic device, and an orientation of the finger relative to the infrared light sensor; determining, according to the second distance and the third distance, a fourth distance between a projection point of the finger on the display screen and the infrared light sensor; and determining the finger coordinates according to the fourth distance and the orientation.
In the foregoing technical solution, the electronic device determines the finger coordinates according to the fourth distance between the projection point of the finger on the display screen and the infrared light sensor and the orientation of the finger relative to the infrared light sensor. In this way, the interaction efficiency between the electronic device and the user is effectively improved, thereby effectively improving the user experience.
第二方面,提供一种电子设备,该电子设备包括用于执行第一方面中的方法的模块。In a second aspect, there is provided an electronic device comprising means for performing the method of the first aspect.
作为一种示例,该电子设备可以包括:处理器和存储器;其中,存储器用于存储一个或多个计算机程序;当存储器存储的一个或多个计算机程序被处理器执行时,使得电子设备执行如下步骤:As an example, the electronic device may include: a processor and a memory; wherein the memory is used to store one or more computer programs; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is made to execute as follows step:
接收第一操作;响应于第一操作,启动单手操作模式;在单手操作模式下,获取第一视点坐标和手指坐标;获取第一视点坐标对应的第一显示区域内的第一显示内容,将第一显示内容显示于手指坐标对应的第二显示区域内。Receive a first operation; in response to the first operation, start a single-handed operation mode; in the single-handed operation mode, obtain the coordinates of the first viewpoint and the coordinates of the finger; obtain the first display content in the first display area corresponding to the coordinates of the first viewpoint , the first display content is displayed in the second display area corresponding to the finger coordinates.
可选的,存储器位于电子设备之外。Optionally, the memory is located outside the electronic device.
可选的,电子设备包括存储器,存储器与至少一个处理器相连,存储器存储有可被至少一个处理器执行的指令。Optionally, the electronic device includes a memory connected to the at least one processor, and the memory stores instructions executable by the at least one processor.
在一种可能的设计中,电子设备还包括显示屏,第一操作为:电子设备被抬起或被摇晃;或者,针对显示屏的点击操作或滑动操作;或者,语音指令;或者,针对电子设备的硬件按键的操作。In a possible design, the electronic device further includes a display screen, and the first operation is: the electronic device being lifted or shaken; or a tap operation or a sliding operation on the display screen; or a voice instruction; or an operation on a hardware key of the electronic device.
在一种可能的设计中,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:接收在第二显示区域内的第一滑动操作;根据第一滑动操作,确定第二显示内容,将第二显示区域内的显示内容由第一显示内容切换为第二显示内容。In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: receiving a first sliding operation in the second display area; and determining second display content according to the first sliding operation, and switching the display content in the second display area from the first display content to the second display content.
在一种可能的设计中,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:获取第二视点坐标以及第二视点坐标对应的第三显示区域内的第三显示内容,将第二显示区域内的显示内容由第一显示内容切换为第三显示内容。In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: acquiring second viewpoint coordinates and third display content in a third display area corresponding to the second viewpoint coordinates, and switching the display content in the second display area from the first display content to the third display content.
在一种可能的设计中,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:判断手指坐标在第二显示区域中对应的第一位置上是否存在应用图标或控件图标;若存在,则将第一显示内容显示于第二显示区域中的第二位置上;第二位置与第一位置相距第一预设值;若不存在,则将第一显示内容显示于第一位置上。In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: judging whether an application icon or a control icon exists at a first position corresponding to the finger coordinates in the second display area; if so, displaying the first display content at a second position in the second display area, the second position being separated from the first position by a first preset value; and if not, displaying the first display content at the first position.
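The placement rule in this design can be pictured with a small sketch, offered for illustration only; the `IconMap` helper and the preset offset value are hypothetical, not part of the disclosure.

```java
/** Illustrative placement rule; IconMap and the offset value are assumed helpers. */
interface IconMap {
    boolean hasIconAt(int x, int y); // true if an app icon or control icon occupies (x, y)
}

final class PlacementPolicy {
    private static final int PRESET_OFFSET_PX = 120; // assumed "first preset value"

    /** Returns the position at which the first display content should be drawn. */
    static int[] choosePosition(IconMap icons, int fingerX, int fingerY) {
        if (icons.hasIconAt(fingerX, fingerY)) {
            // An icon sits under the finger: draw the mirrored content at a second
            // position offset from the first so the icon stays visible and tappable.
            return new int[] { fingerX, fingerY - PRESET_OFFSET_PX };
        }
        return new int[] { fingerX, fingerY }; // no icon: draw at the first position
    }
}
```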
在一种可能的设计中,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:确定用户视线停留在第一视点坐标的时长超过第二时间阈值。In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following step: determining that the duration for which the user's line of sight stays at the first viewpoint coordinates exceeds a second time threshold.
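One plausible way to realise the dwell check, sketched for illustration only: the viewpoint is treated as the "first viewpoint" once the gaze has stayed within a small radius of a point for longer than the second time threshold. The threshold and the fixation radius below are assumed values.

```java
/** Illustrative gaze-dwell detector; threshold and radius are assumed values. */
final class DwellDetector {
    private static final long SECOND_TIME_THRESHOLD_MS = 800; // assumed second time threshold
    private static final double RADIUS_PX = 60;               // assumed fixation tolerance

    private double anchorX, anchorY;
    private long anchorTimeMs = -1;

    /** Feed gaze samples; returns true once the gaze has dwelt long enough at one point. */
    boolean onGazeSample(double x, double y, long nowMs) {
        if (anchorTimeMs < 0 || Math.hypot(x - anchorX, y - anchorY) > RADIUS_PX) {
            // Gaze moved away: restart the dwell timer at the new point.
            anchorX = x;
            anchorY = y;
            anchorTimeMs = nowMs;
            return false;
        }
        return nowMs - anchorTimeMs > SECOND_TIME_THRESHOLD_MS;
    }
}
```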
在一种可能的设计中,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:接收针对第一显示内容中至少一个目标图标的点击操作,确定第一目标图标,执行第一目标图标对应的应用进程。In a possible design, when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: receiving a click operation on at least one target icon in the first display content, determining a first target icon, and executing the application process corresponding to the first target icon.
在一种可能的设计中,电子设备还包括红外摄像头,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:确定用户的眼睛与红外摄像头的第一距离、角膜反射光斑坐标以及瞳孔中心坐标;根据第一距离、角膜反射光斑坐标以及瞳孔中心坐标,确定第一视点坐标。In a possible design, the electronic device further includes an infrared camera; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: determining a first distance between the user's eyes and the infrared camera, the coordinates of the corneal reflection spot, and the coordinates of the pupil center; and determining the first viewpoint coordinates according to the first distance, the coordinates of the corneal reflection spot, and the coordinates of the pupil center.
在一种可能的设计中,电子设备还包括红外光传感器,当存储器存储的一个或多个计算机程序被处理器执行时,还使得电子设备执行如下步骤:确定手指与电子设备中的红外光传感器的第二距离、手指距离电子设备的显示屏的第三距离、手指相对于红外光传感器的方位;根据第二距离和第三距离,确定手指在显示屏上的投影点与红外光传感器的第四距离;根据第四距离和方位,确定手指坐标。In a possible design, the electronic device further includes an infrared light sensor; when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps: determining a second distance between the finger and the infrared light sensor in the electronic device, a third distance between the finger and the display screen of the electronic device, and the orientation of the finger relative to the infrared light sensor; determining, according to the second distance and the third distance, a fourth distance between the projection point of the finger on the display screen and the infrared light sensor; and determining the finger coordinates according to the fourth distance and the orientation.
这些电子设备可以执行上述第一方面或第一方面任一种可能的设计中的方法示例中的相应功能,具体参见方法示例中的详细描述,此处不作赘述。These electronic devices may perform the corresponding functions in the method example in the first aspect or any possible design of the first aspect. For details, please refer to the detailed description in the method example, which will not be repeated here.
第三方面,提供一种计算机可读介质,该计算机可读介质存储有用于设备执行的程序代码,该程序代码被所述设备执行时,上述第一方面或第一方面任一种可能的设计中的方法将被执行。In a third aspect, a computer-readable medium is provided. The computer-readable medium stores program code for execution by a device; when the program code is executed by the device, the method in the above first aspect or any possible design of the first aspect is performed.
第四方面,提供一种包含指令的计算机程序指令,当该程序指令在计算机上运行时,使得上述第一方面或第一方面任一种可能的设计中的方法被执行。In a fourth aspect, a computer program comprising instructions is provided; when the program instructions are run on a computer, the method in the above first aspect or any possible design of the first aspect is performed.
第五方面,提供一种芯片,所述芯片包括处理器与数据接口,所述处理器用于通过所述数据接口读取并执行存储器上存储的指令,使得第一方面或第一方面任一种可能的设计中的方法被执行。In a fifth aspect, a chip is provided. The chip includes a processor and a data interface, and the processor is configured to read and execute, through the data interface, instructions stored in a memory, so that the method in the first aspect or any possible design of the first aspect is performed.
在一种可能的设计中,所述芯片还可以包括所述存储器,所述存储器中存储有所述指令。In a possible design, the chip may further include the memory, where the instructions are stored.
上述第三方面至第五方面中任一方面中的各种设计方案可以达到的技术效果请参照上述第一方面或第一方面任一种可能的设计中的方法可以带来的技术效果,这里不再予以重复赘述。For the technical effects that can be achieved by the various designs in any of the third to fifth aspects above, refer to the technical effects brought about by the method in the first aspect or any possible design of the first aspect; details are not repeated here.
附图说明Description of drawings
图1A为用户单手操作电子设备的场景图;1A is a scene diagram of a user operating an electronic device with one hand;
图1B为本申请提供的一种手机的用户图形界面的示意图;1B is a schematic diagram of a user graphical interface of a mobile phone provided by the application;
图1C为本申请提供的一种手机的用户图形界面的示意图;1C is a schematic diagram of a user graphical interface of a mobile phone provided by the application;
图2A为本申请一实施例提供的手机100的硬件结构示意图;FIG. 2A is a schematic diagram of a hardware structure of a mobile phone 100 according to an embodiment of the present application;
图2B为本申请一实施例提供的手机100的软件结构示意图;FIG. 2B is a schematic diagram of the software structure of the mobile phone 100 according to an embodiment of the application;
图3为本申请一实施例提供的一种界面显示方法的流程示意图;3 is a schematic flowchart of an interface display method according to an embodiment of the present application;
图4A为本申请一实施例提供的手机100进入单手操作模式的操作示意图之一;FIG. 4A is one of schematic diagrams of operations of the mobile phone 100 entering a one-handed operation mode according to an embodiment of the present application;
图4B为本申请一实施例提供的手机100进入单手操作模式的操作示意图之二;FIG. 4B is the second schematic diagram of the operation of the mobile phone 100 entering the one-hand operation mode according to an embodiment of the present application;
图4C为本申请一实施例提供的手机100进入单手操作模式的操作示意图之三;4C is the third schematic diagram of the operation of the mobile phone 100 entering the one-hand operation mode according to an embodiment of the present application;
图4D为本申请一实施例提供的手机100进入单手操作模式的操作示意图之四;FIG. 4D is the fourth schematic diagram of the operation of the mobile phone 100 entering the one-hand operation mode according to an embodiment of the present application;
图5为本申请一实施例提供的一种可能的确定用户视点坐标的流程示意图;5 is a schematic flowchart of a possible determination of user viewpoint coordinates according to an embodiment of the present application;
图6A为本申请一实施例提供的手机100的获取视点坐标的示意图之一;FIG. 6A is one of the schematic diagrams of acquiring viewpoint coordinates of the mobile phone 100 according to an embodiment of the present application;
图6B为本申请一实施例提供的手机100的获取视点坐标的示意图之二;FIG. 6B is the second schematic diagram of acquiring viewpoint coordinates of the mobile phone 100 according to an embodiment of the application;
图6C为本申请一实施例提供的手机100的用户图形界面的示意图;6C is a schematic diagram of a user graphical interface of the mobile phone 100 according to an embodiment of the present application;
图7为本申请一实施例提供的一种可能的确定用户手指坐标的流程示意图;FIG. 7 is a schematic flowchart of a possible determination of the coordinates of a user's finger provided by an embodiment of the present application;
图8A为本申请一实施例提供的手机100的获取手指坐标的示意图;FIG. 8A is a schematic diagram of acquiring finger coordinates of the mobile phone 100 according to an embodiment of the application;
图8B为本申请一实施例提供的手机100的用户图形界面的示意图;FIG. 8B is a schematic diagram of a user graphical interface of the mobile phone 100 according to an embodiment of the present application;
图9A为本申请一实施例提供的手机100的用户图形界面的示意图;9A is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the present application;
图9B为本申请一实施例提供的手机100的用户图形界面的示意图;FIG. 9B is a schematic diagram of a user graphical interface of the mobile phone 100 according to an embodiment of the application;
图10为本申请一实施例提供的手机100的用户图形界面的示意图;FIG. 10 is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the application;
图11为本申请一实施例提供的手机100的用户图形界面的示意图;FIG. 11 is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the application;
图12A为本申请一实施例提供的一种可能的游戏界面示意图;12A is a schematic diagram of a possible game interface provided by an embodiment of the application;
图12B为本申请一实施例提供的另一种可能的游戏界面示意图;FIG. 12B is a schematic diagram of another possible game interface provided by an embodiment of the application;
图13为本申请一实施例提供的手机100的用户图形界面的示意图;FIG. 13 is a schematic diagram of a user graphical interface of a mobile phone 100 according to an embodiment of the application;
图14为本申请一实施例提供的一种多设备实现单手操作的用户图形界面的示意图;14 is a schematic diagram of a user graphical interface for implementing one-handed operation with multiple devices according to an embodiment of the application;
图15为本申请一实施例提供的一种电子设备的结构示意图。FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
具体实施方式Detailed description of embodiments
图1A中示出了用户单手操作电子设备的场景图,其中电子设备是以手机为例。图1A中的电子设备的图形用户界面(graphical user interface,GUI)电子设备101上显示有多个应用程序(application,APP)的图标。用户单手握持电子设备时,若用户需要对距离用户当前用于操作的手指较远位置处的区域(例如图1A所示操作界面101上方的APP1图标),由于用户手指长度的限制,无法对该些区域进行操作,使得用户体验较差。FIG. 1A shows a scenario in which a user operates an electronic device with one hand, where the electronic device is a mobile phone as an example. In FIG. 1A, the graphical user interface (GUI) 101 of the electronic device displays icons of a plurality of application programs (APPs). When the user holds the electronic device with one hand, if the user needs to operate an area far from the finger currently used for operation (for example, the APP1 icon at the top of the operation interface 101 shown in FIG. 1A), the user cannot operate that area because of the limited length of the user's finger, resulting in a poor user experience.
为了让用户能够较为方便地单手操作电子设备,一些电子设备可以提供单手操作模式。In order to allow the user to operate the electronic device with one hand more conveniently, some electronic devices may provide a one-handed operation mode.
一种具体的示例,请参见图1B,在图1B中,电子设备检测到针对屏幕上从上到下的滑动的操作之后,响应于该操作,确定启动单手模式,将操作界面101整体缩小,并在靠近操作位置(即用户手指)的显示区域进行显示,得到图1B所示的操作界面102。在这种显示方式中,电子设备设定的操作界面102相对于操作界面101的缩小比例是固定的,若电子设备屏幕较大或用户手指较短,用户仍然存在单手操作不方便的问题。For a specific example, refer to FIG. 1B. In FIG. 1B, after the electronic device detects a top-to-bottom sliding operation on the screen, in response to the operation, it determines to start the one-handed mode, scales down the operation interface 101 as a whole, and displays it in the display area close to the operation position (i.e., the user's finger), obtaining the operation interface 102 shown in FIG. 1B. In this display mode, the reduction ratio of the operation interface 102 relative to the operation interface 101 set by the electronic device is fixed; if the screen of the electronic device is large or the user's fingers are short, one-handed operation is still inconvenient for the user.
另一种具体的示例,请参见图1C,电子设备在预设时长内检测到针对该电子设备的起始键的连续两次点击操作时,将电子设备的操作界面101下移到该电子设备的显示屏下方显示,得到操作界面103,但这种方式中仅仅是纵向缩小操作界面101,横向并未缩小操作界面101,用户仍然很难操作横向距离用户手指较远的区域。例如在图1C中,用户右手持电子设备,右手的拇指是无法触摸到APP1图标所在的区域的,仍然存在用户体验差的问题。For another specific example, refer to FIG. 1C. When the electronic device detects two consecutive click operations on the start key of the electronic device within a preset time period, it moves the operation interface 101 down to the lower part of the display screen of the electronic device for display, obtaining the operation interface 103. However, in this way the operation interface 101 is only reduced vertically, not horizontally, so it is still difficult for the user to operate an area that is horizontally far from the user's finger. For example, in FIG. 1C, the user holds the electronic device with the right hand, and the thumb of the right hand cannot reach the area where the APP1 icon is located, so the problem of poor user experience remains.
鉴于此,本申请实施例提供了一种界面显示方法,在电子设备检测到第一操作之后,进入单手操作模式,在单手操作模式下,电子设备获取手指坐标和第一视点坐标,并将第一视点坐标对应的第一显示区域内的第一显示内容显示在手指坐标对应的第二显示区域(即手指的可操作范围)。这样,用户只需要控制视点位置,就可以在同一区域单手操作位于操作界面中任意位置的应用图标或控件图标,提升了用户体验。该技术方案的具体实现方式将在后文详细介绍。In view of this, an embodiment of the present application provides an interface display method. After detecting the first operation, the electronic device enters a one-handed operation mode. In the one-handed operation mode, the electronic device acquires the finger coordinates and the first viewpoint coordinates, and displays the first display content in the first display area corresponding to the first viewpoint coordinates in the second display area corresponding to the finger coordinates (i.e., the operable range of the finger). In this way, the user only needs to control the position of the viewpoint to operate, with one hand and within the same area, an application icon or control icon located at any position in the operation interface, which improves the user experience. The specific implementation of this technical solution is described in detail later.
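To make the overall flow concrete, the following is a minimal, non-authoritative sketch of the sequence just described; all type names (`GazeTracker`, `FingerTracker`, `Screen`) are hypothetical placeholders introduced for illustration, not interfaces from the disclosure.

```java
/** High-level sketch of the one-handed display flow; all collaborators are hypothetical. */
final class OneHandedMode {
    interface GazeTracker   { double[] currentViewpoint(); }   // first viewpoint coordinates
    interface FingerTracker { double[] currentFinger(); }      // finger coordinates
    interface Screen {
        Object contentAt(double[] viewpoint);                  // first display content
        void showAt(Object content, double[] fingerPosition);  // draw it within finger reach
    }

    private final GazeTracker gaze;
    private final FingerTracker finger;
    private final Screen screen;

    OneHandedMode(GazeTracker gaze, FingerTracker finger, Screen screen) {
        this.gaze = gaze;
        this.finger = finger;
        this.screen = screen;
    }

    /** Called after the first operation has switched the device into one-handed mode. */
    void refresh() {
        double[] viewpoint = gaze.currentViewpoint();  // where the user is looking
        double[] fingerPos = finger.currentFinger();   // where the thumb can reach
        Object content = screen.contentAt(viewpoint);  // content in the first display area
        screen.showAt(content, fingerPos);             // mirror it into the second display area
    }
}
```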
首先,对本申请实施例中的部分用语进行解释说明,以便于本领域技术人员理解。First, some terms in the embodiments of the present application are explained to facilitate understanding by those skilled in the art.
本申请实施例涉及的应用程序(application,简称app),为能够实现某项或多项特定功能的软件程序。通常,电子设备中可以安装多个应用程序。比如,相机应用、图库应用、短信应用、彩信应用、各种邮箱应用、微信、腾讯聊天软件(QQ)、WhatsApp Messenger、连我(Line)、照片分享(instagram)、Kakao Talk、钉钉等。下文中提到的应用程序,可以是终端出厂时已安装的应用程序,也可以是用户在使用电子设备的过程中从网络下载或其他终端获取的应用程序。An application program (application, app for short) involved in the embodiments of the present application is a software program capable of implementing one or more specific functions. Typically, multiple applications can be installed in an electronic device. For example, camera application, gallery application, SMS application, MMS application, various mailbox applications, WeChat, Tencent chat software (QQ), WhatsApp Messenger, Link (Line), photo sharing (instagram), Kakao Talk, DingTalk, etc. The application program mentioned below may be an application program installed when the terminal leaves the factory, or it may be an application program downloaded from the network or acquired by other terminals when the user uses the electronic device.
本申请实施例涉及的操作界面,也可以称为用户界面(User Interface,UI)或图形界面,或者其它名称,操作界面是电子设备与用户进行人机交互的界面,电子设备可以通过操作界面向用户输出信息,例如显示图像或文字等,还可以通过操作界面接收用户的操作,例如接收用户的触摸操作。比如,图1A中的操作界面101,或图1B中的操作界面102,或图1C中的操作界面103,或图9A或图9B中的操作界面901,或图10中的操作界面1001,或图11中的操作界面1101等都是操作界面。The operation interface involved in the embodiments of the present application may also be called a user interface (UI) or a graphical interface, or have another name. The operation interface is the interface through which the electronic device and the user perform human-computer interaction: the electronic device can output information to the user through the operation interface, for example by displaying images or text, and can also receive user operations through the operation interface, for example touch operations. For example, the operation interface 101 in FIG. 1A, the operation interface 102 in FIG. 1B, the operation interface 103 in FIG. 1C, the operation interface 901 in FIG. 9A or FIG. 9B, the operation interface 1001 in FIG. 10, and the operation interface 1101 in FIG. 11 are all operation interfaces.
本申请实施例涉及的应用图标,是具有明确指代含义的图形,明确指代一个应用。应用图标可以显示在电子设备的桌面(或者称为主屏幕界面上)。电子设备检测到针对这些应用图标的点击操作,可以运行相应的应用程序,启动相应的应用进程。例如,假设图1A中的APP1图标为微信的应用图标,手机检测到针对APP1图标的点击操作时,运行微信,启动微信。The application icons involved in the embodiments of the present application are graphics with explicit meanings, which explicitly refer to an application. Application icons can be displayed on the desktop (or on the home screen interface) of the electronic device. The electronic device detects the click operation on these application icons, and can run the corresponding application program and start the corresponding application process. For example, assuming that the APP1 icon in FIG. 1A is an application icon of WeChat, when the mobile phone detects a click operation on the APP1 icon, it runs WeChat and starts WeChat.
本申请实施例涉及的控件图标(简称控件),可以是某个应用的界面内用于实现该应用的某个具体功能的图标,电子设备检测到针对该控件图标的点击操作,可以启动该某个应用下对应的子进程。例如,图12A中在游戏应用已启动时,手机检测到针对游戏应用操作界面中的控件①的点击或长按操作,可以启动该游戏应用中控制游戏人物的前进方向的进程。The control icon (control for short) involved in the embodiments of the present application may be an icon in the interface of an application that is used to implement a specific function of that application; when the electronic device detects a click operation on the control icon, it can start the corresponding sub-process of that application. For example, in FIG. 12A, when the game application has been started and the mobile phone detects a click or long-press operation on control ① in the operation interface of the game application, it can start the process that controls the forward direction of the game character in the game application.
本申请实施例涉及的单手操作模式,是为了方便用户可以单手操作显示界面中的任意图标而设置的一种界面显示模式,在这种界面显示模式中,电子设备可以将用户的视点坐标对应的显示区域内的显示内容显示到该用户的手指坐标对应的显示区域内。例如,图9A中将视点坐标对应的APP8图标显示到手指坐标对应的显示区域,即图9A的(b)中所示操作界面901。The one-handed operation mode involved in the embodiments of the present application is an interface display mode set so that the user can conveniently operate any icon in the display interface with one hand. In this interface display mode, the electronic device can display the display content in the display area corresponding to the coordinates of the user's viewpoint in the display area corresponding to the coordinates of the user's finger. For example, in FIG. 9A, the APP8 icon corresponding to the viewpoint coordinates is displayed in the display area corresponding to the finger coordinates, that is, the operation interface 901 shown in (b) of FIG. 9A.
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。显然,所描述的实施例仅仅是本申请一部分实施例,并不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of the embodiments. Based on the embodiments in the present application, all other embodiments obtained by those of ordinary skill in the art without creative efforts shall fall within the protection scope of the present application.
本申请实施例涉及的多个,是指大于或等于两个。需要说明的是,本文中术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中字符“/”,如无特殊说明,一般表示前后关联对象是一种“或”的关系。且在本申请实施例的描述中,“第一”、“第二”等词汇,仅用于区分描述的目的,而不能理解为指示或暗示相对重要性,也不能理解为指示或暗示顺序。The multiple involved in the embodiments of the present application refers to greater than or equal to two. It should be noted that the term "and/or" in this document is only an association relationship to describe associated objects, indicating that there can be three kinds of relationships, for example, A and/or B, which can mean that A exists alone, and A exists at the same time and B, there are three cases of B alone. In addition, the character "/" in this text, unless otherwise specified, generally indicates that the related objects before and after are an "or" relationship. In the description of the embodiments of the present application, words such as "first" and "second" are only used for the purpose of distinguishing and describing, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
以下介绍用于执行本申请实施例提供的界面显示方法的电子设备、用于这样的电子设备的图形用户界面(graphical user interface,GUI)、和用于使用这样的电子设备的实施例。在本申请一些实施例中,电子设备可以是包含显示屏的便携式终端,诸如手机、平板电脑等。便携式电子设备的示例性实施例包括但不限于搭载
Figure PCTCN2021118075-appb-000001
或者其它操作系统的便携式电子设备。上述便携式电子设备也可以是其它便携式电子设备,例如数码相机。还应当理解的是,在本申请其他一些实施例中,上述电子设备也可以不是便携式电子设备,而是具有显示屏的台式计算机等。The following describes an electronic device for performing the interface display method provided by the embodiments of the present application, a graphical user interface (GUI) for such an electronic device, and embodiments of using such an electronic device. In some embodiments of the present application, the electronic device may be a portable terminal that includes a display screen, such as a mobile phone or a tablet computer. Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running
Figure PCTCN2021118075-appb-000001
or other operating systems. The above-mentioned portable electronic device may also be another portable electronic device, such as a digital camera. It should also be understood that, in some other embodiments of the present application, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer with a display screen, or the like.
通常情况下,电子设备可以支持多种应用。比如以下应用中的一个或多个:通讯应用、即时消息收发应用、游戏应用等。其中,即时消息收发应用可以有多种。比如微信(Wechat)、微博、腾讯聊天软件(QQ)、WhatsApp Messenger、连我(Line)、照片分享(Instagram)、Kakao Talk、钉钉等。用户通过即时消息收发应用,可以将文字、语音、图片、视频文件以及其他各种文件等信息发送给其他联系人(或其它联系人);或者,用户可以通过即时消息收发应用实现与其他联系人的视频或音频通话。Typically, an electronic device can support multiple applications, for example one or more of the following: a communication application, an instant messaging application, a game application, and the like. There may be many kinds of instant messaging applications, such as WeChat, Weibo, Tencent QQ, WhatsApp Messenger, Line, Instagram, Kakao Talk, DingTalk, etc. Through an instant messaging application, a user can send information such as text, voice, pictures, video files and various other files to other contacts; or the user can make video or audio calls with other contacts through the instant messaging application.
下文以电子设备是手机为例,图2A示出了手机100的结构示意图。Hereinafter, the electronic device is a mobile phone as an example, and FIG. 2A shows a schematic structural diagram of the mobile phone 100 .
手机100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,方向传感器180C,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,触摸传感器180K,骨传导传感器180M等。The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, Mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, and user Identity module (subscriber identification module, SIM) card interface 195 and so on. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an orientation sensor 180C, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a touch sensor 180K, a bone conduction sensor 180M, and the like.
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
其中,控制器可以是手机100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。The controller may be the nerve center and command center of the mobile phone 100 . The controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
处理器110可以运行本申请实施例提供的界面显示方法的软件代码,实现手机100的单手操作模式。The processor 110 may run the software code of the interface display method provided by the embodiment of the present application, so as to realize the one-hand operation mode of the mobile phone 100 .
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为手机100充电,也可以用于手机100与外围设备之间传输数据。The USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like. The USB interface 130 can be used to connect a charger to charge the mobile phone 100, and can also be used to transmit data between the mobile phone 100 and peripheral devices.
充电管理模块140用于从充电器接收充电输入。电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。The charging management module 140 is used to receive charging input from the charger. The power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 . The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
手机100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
天线1和天线2用于发射和接收电磁波信号。手机100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。 Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in handset 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example, the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
移动通信模块150可以提供应用在手机100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经 调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。The mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the mobile phone 100 . The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like. The mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation. The mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1. In some embodiments, at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 . In some embodiments, at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
无线通信模块160可以提供应用在手机100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。The wireless communication module 160 can provide applications on the mobile phone 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication technology (near field communication, NFC), infrared technology (infrared, IR) and other wireless communication solutions. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 . The wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify the signal, and convert it into electromagnetic waves for radiation through the antenna 2 .
在一些实施例中,手机100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得手机100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。In some embodiments, the antenna 1 of the mobile phone 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc. The GNSS may include global positioning system (global positioning system, GPS), global navigation satellite system (global navigation satellite system, GLONASS), Beidou navigation satellite system (beidou navigation satellite system, BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
手机100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。The mobile phone 100 realizes the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
显示屏194用于显示图像、视频、应用图标、控件图标等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机100可以包括1个或N个显示屏194,N为大于1的正整数。示例性的,显示屏194可以用于显示本申请实施例提供的操作界面(例如图9A所示的操作界面901),以及操作界面中的应用图标或控件图标。The display screen 194 is used to display images, videos, application icons, control icons, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, where N is a positive integer greater than 1. Exemplarily, the display screen 194 may be used to display an operation interface provided by an embodiment of the present application (for example, the operation interface 901 shown in FIG. 9A), as well as application icons or control icons in the operation interface.
摄像头193用于捕获静态图像或视频。摄像头193可以包括前置摄像头和后置摄像头。示例性,摄像头193可以用于捕捉用户的人脸图像。可选的,前置摄像头可以是红外摄像头,可以用于捕捉用户的眼睛反射回来的红外光,并生成眼睛的深度图像。Camera 193 is used to capture still images or video. The camera 193 may include a front camera and a rear camera. Exemplarily, the camera 193 may be used to capture an image of the user's face. Optionally, the front camera may be an infrared camera, which may be used to capture infrared light reflected from the user's eyes and generate a depth image of the eyes.
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存 储操作系统,以及至少一个应用程序(比如游戏应用,微信应用等)的软件代码等。存储数据区可存储手机100使用过程中所产生的数据(比如图像、视频等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。 Internal memory 121 may be used to store computer executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the mobile phone 100 by executing the instructions stored in the internal memory 121 . The internal memory 121 may include a storage program area and a storage data area. Wherein, the storage program area can store the operating system, and the software code of at least one application (such as game application, WeChat application, etc.). The storage data area can store data (such as images, videos, etc.) and the like generated during the use of the mobile phone 100 . In addition, the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
内部存储器121还可以存储本申请实施例提供的界面显示方法的软件代码,当处理器110运行所述软件代码时,执行界面显示方法的流程步骤,实现单手操作模式。The internal memory 121 may also store software codes of the interface display method provided by the embodiments of the present application. When the processor 110 executes the software codes, the process steps of the interface display method are executed to realize a one-handed operation mode.
内部存储器121还可以存储电子设备在运行过程中生成或接收到的信息,例如用户自定义的进入单手操作模式的快捷手势信息、人脸图像、视点坐标、手指坐标、指纹信息等。The internal memory 121 can also store information generated or received by the electronic device during operation, such as user-defined shortcut gesture information for entering the one-hand operation mode, face image, viewpoint coordinates, finger coordinates, fingerprint information, and the like.
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100 . The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example to save files like music, video etc in external memory card.
当然,本申请实施例提供的界面显示方法的软件代码也可以存储在外部存储器中,处理器110可以通过外部存储器接口120运行所述软件代码,执行界面显示方法的流程步骤,实现单手操作模式。手机100采集到的用户的人脸信息、视点坐标、手指坐标等也可以存储在外部存储器中。Of course, the software code of the interface display method provided in the embodiment of the present application may also be stored in an external memory, and the processor 110 may run the software code through the external memory interface 120 to execute the process steps of the interface display method to realize the one-handed operation mode . The user's face information, viewpoint coordinates, finger coordinates, etc. collected by the mobile phone 100 may also be stored in an external memory.
手机100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如接听电话,录音等。The mobile phone 100 can implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, and an application processor. Such as answering calls, recording, etc.
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。在一些实施例中,手机100可以通过压力传感器180A检测针对显示屏194或针对显示屏194所显示的操作界面中的应用图标的点击操作。在另一些实施例中,压力传感器180A可以用于检测骨关节针对显示屏194的滑动操作确定的快捷手势,例如,图4B中的快捷手势字母"Z"、"C"。The pressure sensor 180A is used to sense pressure signals and can convert a pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be provided on the display screen 194. In some embodiments, the mobile phone 100 can use the pressure sensor 180A to detect a click operation on the display screen 194 or on an application icon in the operation interface displayed on the display screen 194. In other embodiments, the pressure sensor 180A may be used to detect a shortcut gesture formed by a knuckle sliding on the display screen 194, for example the shortcut gesture letters "Z" and "C" in FIG. 4B.
陀螺仪传感器180B可以用于确定手机100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定手机100围绕三个轴(即,x,y和z轴)的角速度,进而确定手机100是否被抬起。The gyroscope sensor 180B can be used to determine the motion attitude of the mobile phone 100 . In some embodiments, the angular velocity of the mobile phone 100 around three axes (ie, the x, y and z axes) can be determined by the gyro sensor 180B to determine whether the mobile phone 100 is lifted.
方向传感器180C可以检测手机100的绝对姿态值,进而确定手机100的角度变化。The orientation sensor 180C can detect the absolute attitude value of the mobile phone 100 , and then determine the angle change of the mobile phone 100 .
加速度传感器180E可检测手机100在各个方向上(一般为三轴)加速度的大小。当手机100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。在一些可能的实施例中,还可以和方向传感器180C一起,检测手机100是否被抬起。The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (generally three axes). When the mobile phone 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc. In some possible embodiments, together with the direction sensor 180C, it can also detect whether the mobile phone 100 is lifted.
距离传感器180F,用于测量距离。手机100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,手机100可以利用距离传感器180F测距以实现快速对焦。 Distance sensor 180F for measuring distance. The cell phone 100 can measure the distance through infrared or laser. In some embodiments, when shooting a scene, the mobile phone 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。手机100通过发光二极管向外发射红外光。手机100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定手机100附近有物体。当检测到不充分的反射光时,手机100可以确定手机100附近没有物体。手机100可以利用接近光传感器180G检测用户手持手机100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。 Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes. The light emitting diodes may be infrared light emitting diodes. The mobile phone 100 emits infrared light through the light emitting diodes. Cell phone 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the mobile phone 100 . When insufficient reflected light is detected, the cell phone 100 may determine that there is no object near the cell phone 100 . The mobile phone 100 can use the proximity light sensor 180G to detect that the user holds the mobile phone 100 close to the ear to talk, so as to automatically turn off the screen to save power. The proximity light sensor 180G can also be used in holster mode, pocket mode automatically unlocks and locks the screen.
可选的,接近光传感器180G为红外光传感器时,还可以通过红外发光二极管获取用 户眼睛与屏幕194的距离、以及用户手指相对于显示屏194的坐标。Optionally, when the proximity light sensor 180G is an infrared light sensor, the distance between the user's eyes and the screen 194 and the coordinates of the user's finger relative to the display screen 194 can also be obtained through infrared light emitting diodes.
指纹传感器180H用于采集指纹。手机100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。在一种可能的实施例中,指纹传感器180H的大小可以与显示屏194一样大,手机100获取到用户的手指坐标之后,显示屏194在手指坐标对应的显示区域显示指纹解锁的图案,手机100检测到针对该指纹解锁图案的点击操作,启动指纹传感器180H,指纹传感器180H开始采集用户的指纹信息,指纹传感器180H将采集到的指纹信息发送至处理器110,处理器110根据内部存储器121所存储的用户指纹信息对该指纹信息进行匹配,若匹配成功,则启动解锁的应用进程,若匹配不成功,继续保持锁屏模式。The fingerprint sensor 180H is used to collect fingerprints. The mobile phone 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint call answering, and the like. In a possible embodiment, the fingerprint sensor 180H may be as large as the display screen 194. After the mobile phone 100 obtains the coordinates of the user's finger, the display screen 194 displays a fingerprint unlock pattern in the display area corresponding to the finger coordinates. When the mobile phone 100 detects a click operation on the fingerprint unlock pattern, the fingerprint sensor 180H is activated and starts to collect the user's fingerprint information; the fingerprint sensor 180H sends the collected fingerprint information to the processor 110, and the processor 110 matches the fingerprint information against the user fingerprint information stored in the internal memory 121. If the matching succeeds, the unlocking application process is started; if the matching fails, the screen-lock mode is maintained.
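As a rough, hedged sketch of the unlock flow described here, for illustration only: the `FingerprintReader`, `Matcher` and `LockScreen` interfaces below are invented placeholders, not the handset's real driver or framework API. The unlock pattern is drawn at the finger coordinates, and a tap on it triggers capture and matching against the stored fingerprint.

```java
/** Illustrative unlock flow at the finger position; all interfaces are hypothetical. */
final class OneHandedUnlock {
    interface FingerprintReader { byte[] capture(); }               // reads the sensor under the pattern
    interface Matcher           { boolean matches(byte[] sample); } // compares against the stored template
    interface LockScreen {
        void drawUnlockPatternAt(int x, int y); // show the unlock pattern in the finger's display area
        void unlock();                          // start the unlocking application process
        void stayLocked();                      // keep the screen-lock mode
    }

    void showPatternAtFinger(LockScreen screen, int fingerX, int fingerY) {
        screen.drawUnlockPatternAt(fingerX, fingerY);
    }

    void onPatternTapped(FingerprintReader reader, Matcher matcher, LockScreen screen) {
        byte[] sample = reader.capture();
        if (matcher.matches(sample)) {
            screen.unlock();
        } else {
            screen.stayLocked();
        }
    }
}
```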
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器180K可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于手机100的表面,与显示屏194所处的位置不同。例如,触摸传感器180K检测到针对显示屏194的触摸动作,可以确定用户的手指坐标。 Touch sensor 180K, also called "touch panel". The touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”. The touch sensor 180K is used to detect a touch operation on or near it. The touch sensor 180K may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to touch operations may be provided through display screen 194 . In other embodiments, the touch sensor 180K may also be disposed on the surface of the mobile phone 100 , which is different from the position where the display screen 194 is located. For example, the touch sensor 180K detects a touch action on the display screen 194, and the coordinates of the user's finger can be determined.
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the pulse of the human body and receive the blood pressure beating signal.
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。手机100可以接收按键输入,产生与手机100的用户设置以及功能控制有关的键信号输入。在一种可能的实施例中,手机100在预设时长内检测到针对按键190的多次点击操作,启动手机100的单手操作模式。The keys 190 include a power-on key, a volume key, and the like. Keys 190 may be mechanical keys. It can also be a touch key. The cell phone 100 can receive key input and generate key signal input related to user settings and function control of the cell phone 100 . In a possible embodiment, the mobile phone 100 detects multiple click operations on the button 190 within a preset time period, and starts the one-handed operation mode of the mobile phone 100 .
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如接听电话,音频播放等)的触摸操作,可以对应不同的振动反馈效果。 Motor 191 can generate vibrating cues. The motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback. For example, touch operations acting on different applications (such as answering a call, playing audio, etc.) can correspond to different vibration feedback effects.
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。The indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和手机100的接触和分离。The SIM card interface 195 is used to connect a SIM card. The SIM card can be contacted and separated from the mobile phone 100 by being inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 .
可以理解的是,本申请实施例示意的结构并不构成对手机100的具体限定。在本申请另一些实施例中,手机100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。It can be understood that the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the mobile phone 100 . In other embodiments of the present application, the mobile phone 100 may include more or less components than shown, or some components may be combined, or some components may be separated, or different component arrangements. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
以上介绍了手机100的硬件结构,下面介绍手机100的软件架构。The hardware structure of the mobile phone 100 is described above, and the software structure of the mobile phone 100 is described below.
具体的,手机100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的安卓(android)系统为例,示例性说明手机100的软件结构。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。Specifically, the software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. The embodiments of the present application take an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 100 as an example. The layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
请参见图2B,在一些可能的实施例中,将Android系统分为四层,从上至下分别为应用程序(Applications)层(本文中简称“应用层”),应用程序框架(Application Framework) 层(本文中简称“框架层”),安卓运行时(Android runtime)和系统库层(本文中简称“系统运行库层”),以及内核层。Referring to FIG. 2B, in some possible embodiments, the Android system is divided into four layers, from top to bottom, the applications layer (hereinafter referred to as the "application layer"), the application framework (Application Framework) layer (herein referred to as "framework layer"), Android runtime (Android runtime) and system library layer (herein referred to as "system runtime layer"), and kernel layer.
其中,应用程序层中运行有至少一个应用程序,这些应用程序可以是操作系统自带的窗口(Window)程序、系统设置程序、联系人程序、短信程序、时钟程序、相机应用等;也可以是第三方开发者所开发的应用程序,比如即时通信程序、相片美化程序、游戏程序等。当然,在具体实施时,应用程序层中的应用程序包不限于以上举例,实际还可以包括其它应用程序包,本申请实施例对此不做限制。At least one application program runs in the application layer. These applications may be programs that come with the operating system, such as a window program, a system settings program, a contacts program, an SMS program, a clock program, and a camera application; they may also be applications developed by third-party developers, such as instant messaging programs, photo beautification programs, and game programs. Of course, in specific implementations, the application packages in the application layer are not limited to the above examples and may include other application packages, which is not limited in the embodiments of the present application.
框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。应用程序框架层相当于一个处理中心,这个中心决定让应用层中的应用程序做出动作。The framework layer provides an application programming interface (API) and a programming framework for applications in the application layer. The application framework layer includes some predefined functions. The application framework layer is equivalent to a processing center, which decides to let the applications in the application layer take action.
如图2B所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。As shown in Figure 2B, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,控制显示窗口变化。例如,图9A、图9B中选取部分显示内容以单独的显示窗口显示(例如操作界面901),又例如,在图11中手机100检测到针对操作界面1101(即显示窗口)的滑动操作,将操作界面1101中的图标缩小或放大显示。窗口管理器还可以判断是否有状态栏,锁定屏幕,截取屏幕等。内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。A window manager is used to manage window programs. The window manager can get the size of the display screen and control the change of the display window. For example, in FIG. 9A and FIG. 9B, a selected part of the display content is displayed in a separate display window (eg, the operation interface 901). For another example, in FIG. 11, the mobile phone 100 detects a sliding operation for the operation interface 1101 (ie, the display window). The icons in the operation interface 1101 are displayed in reduced or enlarged size. The window manager can also tell if there is a status bar, lock screen, screen capture, etc. Content providers are used to store and retrieve data and make these data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面(例如图1B所示的操作界面101,或图9A所示的操作界面901)可以由一个或多个视图组成的。The view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications. A display interface (eg, the operation interface 101 shown in FIG. 1B , or the operation interface 901 shown in FIG. 9A ) may be composed of one or more views.
电话管理器用于提供手机100的通信功能。例如通话状态的管理(包括接通,挂断等)。资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。The phone manager is used to provide the communication function of the mobile phone 100 . For example, the management of call status (including connecting, hanging up, etc.). The resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。The notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc. The notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
示例性的,如图4D所示,在手机100检测到针对显示屏194显示的操作界面301的多次点击操作,但未触发任何应用进程时,通知管理器控制显示194,显示对话框302。Exemplarily, as shown in FIG. 4D , when the mobile phone 100 detects multiple click operations on the operation interface 301 displayed on the display screen 194 but no application process is triggered, the notification manager controls the display 194 and displays the dialog 302 .
系统运行库层为上层即框架层提供支撑,当框架层被使用时,安卓操作系统会运行系统运行库层中包含的C/C++库以实现框架层要实现的功能。The system runtime layer provides support for the upper layer, that is, the framework layer. When the framework layer is used, the Android operating system will run the C/C++ library contained in the system runtime layer to implement the functions to be implemented by the framework layer.
内核层是硬件和软件之间的层。如图2B所示,内核层至少包含显示驱动和传感器驱动(如红外光传感器,触摸传感器、压力传感器等)、摄像头驱动,音频驱动等。The kernel layer is the layer between hardware and software. As shown in FIG. 2B , the core layer at least includes a display driver, a sensor driver (such as an infrared light sensor, a touch sensor, a pressure sensor, etc.), a camera driver, an audio driver, and the like.
为了便于理解,本申请以下实施例将以具有图2A和图2B所示结构的手机100为例,结合附图对本申请实施例提供的界面显示方法进行具体阐述。For ease of understanding, the following embodiments of the present application will take the mobile phone 100 having the structure shown in FIG. 2A and FIG. 2B as an example, and specifically describe the interface display method provided by the embodiments of the present application with reference to the accompanying drawings.
请参见图3所示,图3为本申请实施例提供的界面显示方法的流程示意图。如图3所示,该方法可以包括以下几个步骤:Please refer to FIG. 3 , which is a schematic flowchart of an interface display method provided by an embodiment of the present application. As shown in Figure 3, the method may include the following steps:
S301: The mobile phone 100 receives a first operation, and in response to the first operation, enters a one-handed operation mode.
The first operation is an operation that triggers the mobile phone 100 to enter the one-handed operation mode.
After the mobile phone 100 receives the first operation, the processor 110 in the mobile phone 100 can control the display screen 194 to display, in the one-handed operation mode, an operation interface that the user can operate with one hand. Any icon in this operation interface (for example, an application icon, a parameter control icon within an application, or a function control icon) can be operated by the user with one hand. Specific solutions are described below.
There are various specific implementations of the first operation, including but not limited to the following:
Mode 1: lifting the mobile phone 100.
In Mode 1, the first operation is an operation of lifting the mobile phone 100.
Exemplarily, the mobile phone 100 can detect whether it is lifted through the acceleration sensor 180E and the orientation sensor 180C.
Specifically, when the mobile phone 100 is lifted, the acceleration sensor 180E converts the detected signal into information that the processor 110 can process and transmits it to the processor 110, and the kernel layer running on the processor 110 generates corresponding acceleration data based on this information. The orientation sensor 180C converts the detected signal into information that the processor 110 can process and transmits it to the processor 110, and the kernel layer generates data describing the angle change of the mobile phone 100 relative to its own horizontal axis based on this information. If the kernel layer determines that the difference between the acceleration data and preset acceleration data exceeds a preset threshold and that the angle change data satisfies a preset condition (for example, the angle changes from 0 to 180 degrees), it determines that the mobile phone 100 has been lifted, and the processor 110 controls the display screen 194 to enter the one-handed operation mode.
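For illustration only, the following Kotlin sketch shows one way the lift check described above could be expressed. The MotionSample type, the sampling scheme, and the default 180-degree tilt change are assumptions introduced here; they are not part of the claimed implementation.

```kotlin
// Minimal sketch of the lift check described above. The thresholds, sample
// structure and function names are illustrative assumptions, not the patented
// implementation.
data class MotionSample(val acceleration: Float, val tiltDegrees: Float)

fun isLiftGesture(
    samples: List<MotionSample>,
    presetAcceleration: Float,
    accelerationThreshold: Float,
    requiredTiltChange: Float = 180f
): Boolean {
    if (samples.size < 2) return false
    // Condition 1: acceleration deviates from the preset value by more than the threshold.
    val maxDeviation = samples.maxOf { kotlin.math.abs(it.acceleration - presetAcceleration) }
    // Condition 2: the tilt angle sweeps through the required range (e.g. 0 -> 180 degrees).
    val tiltChange = samples.last().tiltDegrees - samples.first().tiltDegrees
    return maxDeviation > accelerationThreshold && kotlin.math.abs(tiltChange) >= requiredTiltChange
}
```

Both conditions must hold before the one-handed mode is triggered, which mirrors the "acceleration difference and angle change" test described in the paragraph above.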
Optionally, after the mobile phone 100 detects that it has been lifted, it may further detect whether there is a tap operation or a shake operation on the display screen 194 of the mobile phone 100, and enter the one-handed operation mode only if such an operation exists. This effectively avoids accidental triggering by the user and thus improves user experience.
Exemplarily, referring to FIG. 4A, when the mobile phone 100 detects through the acceleration sensor 180E that the mobile phone 100 is lifted, and also detects a double-tap action or a shake action on the display screen 194, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
Mode 2: a shortcut gesture.
In Mode 2, the first operation is a user-defined shortcut gesture.
Exemplarily, the mobile phone 100 can detect the shortcut gesture through the pressure sensor 180A.
Specifically, the pressure sensor 180A converts the detected signal into information that the processor 110 can process and transmits it to the processor 110, and the kernel layer running on the processor 110 generates position data corresponding to the operation based on this information (which may include contact coordinates and timestamps corresponding to the contact coordinates). The kernel layer constructs the shortcut-gesture data from the finger position data collected within a first preset duration, and determines whether this gesture data matches the preset shortcut-gesture data stored in the internal memory 121. If they match, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
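The embodiments do not prescribe how the collected contact points are compared with the stored shortcut-gesture data. The sketch below illustrates one simple possibility, resampling the drawn stroke and comparing it point by point with a stored template; the Point type, the resampling step, and the average-distance criterion are all assumptions for illustration.

```kotlin
// Illustrative sketch of matching a drawn gesture against a stored template.
// Resampling to a fixed point count and a simple average point distance are
// assumptions; the embodiments do not prescribe a matching algorithm.
data class Point(val x: Float, val y: Float)

fun resample(points: List<Point>, count: Int = 32): List<Point> =
    List(count) { i -> points[(i * (points.size - 1)) / (count - 1)] }

fun matchesTemplate(drawn: List<Point>, template: List<Point>, tolerance: Float): Boolean {
    if (drawn.size < 2 || template.size < 2) return false
    val a = resample(drawn)
    val b = resample(template)
    // Average Euclidean distance between corresponding resampled points.
    val avgDistance = a.zip(b) { p, q ->
        val dx = p.x - q.x
        val dy = p.y - q.y
        kotlin.math.sqrt(dx * dx + dy * dy)
    }.average()
    return avgDistance < tolerance
}
```

A production gesture matcher would normally also normalize for translation and scale; that refinement is omitted here to keep the sketch short.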
Exemplarily, referring to FIG. 4B, assume that the internal memory 121 of the mobile phone 100 stores shortcut gestures customized by the user according to his or her own preferences (the letter Z and the letter C). As shown in (a) of FIG. 4B, when the processor 110 in the mobile phone 100 detects the letter Z through the pressure sensor 180A, the processor 110 controls the display screen 194 to enter the one-handed operation mode. As shown in (b) of FIG. 4B, when the processor 110 in the mobile phone 100 detects the letter C through the pressure sensor 180A, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
Mode 3: a voice instruction.
In Mode 3, the first operation may be sound information.
Specifically, the audio module 170 converts the detected sound signal into information that the processor 110 can process and transmits it to the processor 110, and the kernel layer running on the processor 110 generates voice-instruction data corresponding to the operation based on this information. The kernel layer matches the voice-instruction data collected within a preset duration against preset voice-instruction data. If the match succeeds, the processor 110 controls the display screen 194 to enter the one-handed operation mode; if the match fails, the original display mode is maintained.
Exemplarily, referring to FIG. 4C, assume that the internal memory 121 of the mobile phone 100 stores a preset voice instruction "enable one-handed mode". When the processor 110 in the mobile phone 100 detects the voice instruction "enable one-handed mode" through the audio module 170, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
Mode 4: a key operation.
In Mode 4, the first operation may be an operation on a hardware control of the mobile phone 100 (for example, the key 190).
The key 190 may be a power key or a volume key, which is not limited in the embodiments of this application. Specifically, the first operation may be two tap operations on the key 190 detected by the mobile phone 100 within a preset duration, or three tap operations, or a touch operation, which is not specifically limited in the embodiments of this application.
Exemplarily, when the mobile phone 100 detects two tap operations on the power key within the preset duration, the processor 110 controls the display screen 194 to enter the one-handed operation mode.
Mode 5: a tap operation or a touch operation.
In Mode 5, the first operation may be multiple tap operations or touch operations on the display screen 194 within a preset duration. Specifically, the touch sensor 180K in the display screen 194 detects the multiple tap operations or touch operations, converts the detected signals into information that the processor 110 can process, and transmits it to the processor 110. If the processor 110 determines that the multiple tap operations or touch operations did not trigger any application process, the processor 110 controls the window manager to output a dialog box asking the user whether to enter the one-handed operation mode (see the dialog box 302 shown in FIG. 4D). If the processor 110 detects a confirmation instruction, it controls the display screen 194 to enter the one-handed operation mode; if the processor 110 detects a negative instruction, it controls the display screen 194 to maintain the original display mode.
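A minimal sketch of the Mode 5 decision follows: the one-handed-mode dialog is suggested only when several recent taps hit no interactive element. The Rect and Tap types, the default tap count, and the time window are illustrative assumptions.

```kotlin
// Sketch of the Mode 5 decision: if none of the recent taps landed on an
// interactive element, suggest entering the one-handed mode.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}
data class Tap(val x: Float, val y: Float, val timestampMs: Long)

fun shouldPromptOneHandedMode(
    taps: List<Tap>,
    interactiveAreas: List<Rect>,
    minTaps: Int = 3,
    windowMs: Long = 2_000
): Boolean {
    if (taps.size < minTaps) return false
    // Keep only taps that fall within the preset duration before the latest tap.
    val recent = taps.filter { taps.last().timestampMs - it.timestampMs <= windowMs }
    // Prompt only when enough recent taps exist and none of them hit a control.
    return recent.size >= minTaps && recent.none { tap ->
        interactiveAreas.any { it.contains(tap.x, tap.y) }
    }
}
```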
For the five modes provided above, the user can configure them according to his or her own needs in the accessibility features of the Settings application of the mobile phone 100.
It should be understood that the above five modes are merely examples rather than limitations, and other specific implementations are also possible in practical applications.
S302: The mobile phone 100 obtains viewpoint coordinates and finger coordinates.
It should be understood that the viewpoint coordinates in the embodiments of this application are used to represent the specific position on the display screen 194 of the mobile phone 100 where the user's line of sight falls.
The viewpoint coordinate system may be a two-dimensional coordinate system, and the two-dimensional plane corresponding to this coordinate system may be the plane in which the display screen 194 of the mobile phone 100 lies. There are various specific ways for the mobile phone 100 to obtain the user's viewpoint coordinates, for example, pupil-corneal reflection techniques, gaze tracking based on a 3D eyeball model, and so on.
Optionally, after the mobile phone 100 determines the viewpoint coordinates, it may further determine whether the time during which the user's line of sight dwells at the position of the viewpoint coordinates exceeds a first time threshold. If it does, the mobile phone 100 then obtains the user's finger coordinates; if it does not, the mobile phone 100 continues to detect new viewpoint coordinates until viewpoint coordinates whose dwell time exceeds the first time threshold appear, and then obtains the user's finger coordinates. In this way, the obtained viewpoint coordinates are more accurate and closer to the viewpoint coordinates of the target that the user actually wants to operate.
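The dwell-time check can be pictured as a simple filter over successive gaze samples, as in the following sketch. The GazeSample type, the 40-pixel stability radius, and the loop structure are assumptions introduced for illustration.

```kotlin
// Sketch of the optional dwell-time filter: a gaze sample is accepted only after
// the gaze has stayed within a small radius of it for longer than the threshold.
data class GazeSample(val x: Float, val y: Float, val timestampMs: Long)

fun firstDwelledViewpoint(
    samples: List<GazeSample>,
    dwellThresholdMs: Long,
    radius: Float = 40f
): GazeSample? {
    for (i in samples.indices) {
        val anchor = samples[i]
        // Find the last sample that still falls within `radius` of this candidate viewpoint.
        val end = samples.drop(i).takeWhile { s ->
            val dx = s.x - anchor.x
            val dy = s.y - anchor.y
            dx * dx + dy * dy <= radius * radius
        }.lastOrNull() ?: continue
        if (end.timestampMs - anchor.timestampMs >= dwellThresholdMs) return anchor
    }
    return null
}
```

Returning null corresponds to the case where no viewpoint has dwelled long enough yet, so the device keeps detecting new viewpoint coordinates.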
It should be understood that the finger coordinates in the embodiments of this application are used to represent the position, on the display screen 194, of the projection of the user's finger onto the display screen 194 of the mobile phone 100, or the position, on the display screen 194, of the contact point between the user's finger and the display screen 194.
It can be understood that the user's finger coordinates may be three-dimensional coordinates or two-dimensional coordinates, which is not specifically limited in the embodiments of this application. When the finger coordinates are two-dimensional coordinates, the finger coordinate system and the viewpoint coordinate system may be the same coordinate system.
There are various specific implementations for the mobile phone 100 to obtain the finger coordinates, which are not limited in this application.
Example 1: The mobile phone 100 detects, through the touch sensor 180K, the position data corresponding to a touch operation (which may include contact coordinates and timestamps corresponding to the contact coordinates), and thereby determines the user's finger coordinates.
Example 2: The mobile phone 100 emits infrared rays through the infrared optical sensor, detects the infrared light reflected by the finger, and determines the user's finger coordinates.
S303: The mobile phone 100 obtains the first display content of the first display area corresponding to the viewpoint coordinates, and displays it in the second display area corresponding to the finger coordinates.
Specifically, after the mobile phone 100 obtains the user's viewpoint coordinates and finger coordinates, they are converted into information that the processor 110 can process. The processor 110 controls the window manager in the application framework layer running on the processor 110 to obtain the first display content of the first display area corresponding to the viewpoint coordinates and to generate a new display window based on the first display content; the processor 110 then controls the display screen 194 to display this new display window in the second display area corresponding to the finger coordinates.
In one possible implementation, the first display area may be a circular display area centered on the viewpoint coordinates with a radius equal to a first preset value; in another possible implementation, the first display area may be a square display area centered on the viewpoint coordinates with a side length equal to a second preset value. Of course, these two are merely examples and not limitations.
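As a rough illustration of these two shapes, the sketch below computes the bounding rectangle of the first display area around the viewpoint (for the circular case, the bounding box of the circle) and clips it to the screen. The names and the clipping behavior are assumptions, not part of the described embodiments.

```kotlin
import kotlin.math.max
import kotlin.math.min

// Bounding rectangle of the first display area around the viewpoint, clipped
// to the screen. For the circular variant this is the circle's bounding box;
// for the square variant halfExtent is half the preset side length.
data class Region(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun firstDisplayArea(
    viewpointX: Float,
    viewpointY: Float,
    screenWidth: Float,
    screenHeight: Float,
    halfExtent: Float
): Region = Region(
    left = max(0f, viewpointX - halfExtent),
    top = max(0f, viewpointY - halfExtent),
    right = min(screenWidth, viewpointX + halfExtent),
    bottom = min(screenHeight, viewpointY + halfExtent)
)
```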
Optionally, the first display content may include at least one application icon or control icon. The specific content may be determined according to the display content of the first display area in which the viewpoint coordinates lie.
Optionally, when the mobile phone 100 detects a tap operation on any application icon or control icon within the first display content, the mobile phone 100 determines that icon as the target icon and starts the process corresponding to the target icon.
Optionally, after the mobile phone 100 determines the viewpoint coordinates and the finger coordinates, it may further determine whether the time during which the user's line of sight dwells at the position of the viewpoint coordinates exceeds a second time threshold. If it does, the mobile phone 100 then displays the first display content in the second display area; if it does not, the mobile phone 100 continues to detect new viewpoint coordinates until viewpoint coordinates whose dwell time exceeds the second time threshold appear, and then displays the first display content in the second display area. In this way, the obtained first display content is more accurate and closer to the display content that the user actually wants to operate.
It can be seen from the above that, in the embodiments of this application, after the mobile phone 100 detects the first operation, it enters the one-handed operation mode. In the one-handed operation mode, the mobile phone 100 can obtain the user's viewpoint coordinates and finger coordinates, and display the first display content in the first display area corresponding to the viewpoint coordinates in the second display area corresponding to the finger coordinates. This effectively solves the technical problem that the user cannot conveniently operate the mobile phone with one hand, and effectively improves user experience.
The following describes a method for determining viewpoint coordinates provided by an embodiment of this application. Referring to FIG. 5, the method includes:
S501: The mobile phone 100 detects the user's face image.
Specifically, the processor 110 in the mobile phone 100 starts the camera 193; the camera 193 captures image data of the environment around the mobile phone 100, converts the environment image data into information that the processor 110 can process, and transmits it to the processor 110. The processor 110 determines whether face data exists in the environment image data. If face data exists, the processor 110 continues to perform S502; if no face data exists, the processor 110 controls the display screen 194 to exit the one-handed operation mode.
Optionally, before the mobile phone 100 performs step S502, the legitimacy of the user's identity may be further verified. Specific implementations of verifying the legitimacy of the user's identity include but are not limited to the following three modes:
Mode 1: The processor 110 in the mobile phone 100 determines whether the face image in the environment image data matches a stored face image. If it matches, S502 is performed; if it does not match, the display screen 194 is controlled to exit the one-handed operation mode.
Mode 2: The processor 110 in the mobile phone 100 detects the user's voiceprint information and determines whether the current user's voiceprint information matches stored voiceprint information. If it matches, S502 is performed; if it does not match, the display screen 194 is controlled to exit the one-handed operation mode.
Mode 3: The processor 110 in the mobile phone 100 detects the user's iris information and determines whether the current user's iris information matches stored iris information. If it matches, S502 is performed; if it does not match, the display screen 194 is controlled to exit the one-handed operation mode.
S502: The mobile phone 100 determines the distance between the user's eyes and the infrared camera, as well as the coordinates of the corneal reflection spot and the coordinates of the pupil center.
Specifically, referring to FIG. 6A, the processor 110 in the mobile phone 100 starts the infrared light sensor at the kernel layer and the infrared camera disposed at the center of the upper edge of the display screen 194. The infrared light sensor 180G controls its internal infrared light-emitting diodes to emit infrared rays toward the user's face; the infrared camera captures the light returned by the user's eyes, converts the light data into information that the processor 110 can process, and transmits it to the processor 110. Based on this information, the processor 110 controls the image processing library in the system library running on the processor 110 to generate a depth image of the user's eyes. The processor 110 processes the depth image according to an algorithm stored in the internal memory 121 (for example, a coordinate transformation algorithm) and thereby determines the distance between the user's eyes and the infrared camera. When the user's eyes move relative to the infrared camera, the processor 110 can also determine the coordinates of the corneal reflection spot and the coordinates of the pupil center according to algorithms such as pupil segmentation, coarse pupil localization, edge extraction, and edge fitting.
S503: The mobile phone 100 determines the user's viewpoint coordinates according to this distance, the coordinates of the corneal reflection spot, and the coordinates of the pupil center.
In one possible implementation, referring to FIG. 6A, the mobile phone 100 takes the center point of the upper edge of the display screen 194 as the origin, the upper edge of the display screen 194 as the Y axis, and the straight line that passes through the center point of the upper edge of the display screen 194 and is perpendicular to the upper edge as the X axis. Based on the distance between the user's eyes and the infrared camera, the coordinates of the corneal reflection spot, and the coordinates of the pupil center, the mobile phone 100 can determine the position of the user's line of sight relative to the display screen 194 (that is, the viewpoint coordinates). Specifically, the processor 110 may obtain the offset of the corneal reflection spot relative to the pupil center in the depth image of the eyes, and determine a first coordinate of the user's line of sight in the world coordinate system according to a preset mapping relationship between the offset and viewpoint coordinates; the processor 110 then performs a matrix transformation on the first coordinate according to the distance between the user's eyes and the infrared camera, and thereby obtains the viewpoint coordinates (Ex, Ey) of the user's line of sight relative to the display screen 194. The processor 110 determines the position information corresponding to the viewpoint coordinates and controls the display screen 194 to display prompt information (for example, a cursor) at the position corresponding to the viewpoint coordinates.
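The mapping from the glint-to-pupil offset to screen coordinates is only described at a high level above. The following sketch shows a deliberately simplified, linear version of that mapping, scaled by the measured eye-to-camera distance; the calibration parameters, the linear form, and all names are assumptions for illustration only, and real pupil-corneal-reflection trackers use richer calibrated models.

```kotlin
// Highly simplified gaze-to-viewpoint sketch: the glint-to-pupil offset is
// mapped to screen coordinates through an assumed linear calibration, then
// scaled by the eye-to-camera distance.
data class Vec2(val x: Float, val y: Float)

class GazeCalibration(
    private val gainX: Float, private val gainY: Float,   // pixels per unit offset at the reference distance
    private val offsetX: Float, private val offsetY: Float,
    private val referenceDistanceMm: Float
) {
    fun viewpoint(pupilCenter: Vec2, cornealGlint: Vec2, eyeDistanceMm: Float): Vec2 {
        val dx = cornealGlint.x - pupilCenter.x
        val dy = cornealGlint.y - pupilCenter.y
        // Farther eyes produce smaller image-plane offsets for the same gaze angle,
        // so scale the mapping by the measured distance.
        val scale = eyeDistanceMm / referenceDistanceMm
        return Vec2(offsetX + gainX * dx * scale, offsetY + gainY * dy * scale)
    }
}
```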
In another possible implementation, referring to FIG. 6B, the processor 110 of the mobile phone 100 starts a human-eye gaze tracking sensor; the gaze tracking sensor obtains the viewpoint coordinates (Ex, Ey), converts them into information that the processor 110 can process, and transmits it to the processor 110. The processor 110 determines the position information corresponding to the viewpoint coordinates and controls the display screen 194 to display prompt information (for example, a floating icon) at the position corresponding to the viewpoint coordinates.
Optionally, the viewpoint coordinates obtained by the human-eye gaze tracking sensor are converted into information that the processor 110 can process and transmitted to the processor 110; the processor 110 controls the window manager in the application framework layer running on the processor 110 to obtain the display content within the preset range corresponding to the viewpoint coordinates. Exemplarily, referring to FIG. 6C, if the processor 110 determines that the viewpoint coordinates fall on the APP8 icon, the mobile phone 100 obtains the display information corresponding to the APP8 icon (that is, the content of the dashed box).
The following describes a method for determining finger coordinates provided by an embodiment of this application. Referring to FIG. 7, the method includes:
S701: The mobile phone 100 emits infrared rays toward the user's finger.
Optionally, before the mobile phone 100 performs S701 to S702, the processor 110 in the mobile phone 100 may control the front camera of the mobile phone 100 to start; the front camera begins to capture image data of the surrounding environment, which is converted into information that the processor 110 can process and transmitted to the processor 110. The processor 110 executes the finger recognition program stored in the internal memory 121. If it determines that multiple fingers exist in the environment image data and that the multiple fingers include the user's thumb, the processor 110 performs S701 to S702 to obtain the coordinates of the thumb; if it determines that only one finger exists in the environment image data, the processor 110 performs S701 to S702 to obtain the coordinates of that finger.
Optionally, the user's finger may be the finger closest to the display screen 194.
Specifically, before emitting infrared rays toward the user's finger, the mobile phone 100 determines, through its built-in gesture tracking sensor, whether the distance between the finger and the display screen 194 is less than a preset value. If it is less than the preset value, the mobile phone 100 performs step S701; if it is not less than the preset value, the processor in the mobile phone 100 controls the display screen 194 to exit the one-handed operation mode.
S702: The mobile phone 100 receives the infrared light reflected by the finger.
Specifically, the processor 110 in the mobile phone 100 starts the infrared optical sensor at the kernel layer running on the processor 110; the infrared optical sensor controls its internal infrared light-emitting diodes to emit infrared rays toward the user's finger and receives the infrared light returned by the finger.
S703: The mobile phone 100 determines the coordinates of the finger according to the infrared light reflected by the finger.
Specifically, referring to FIG. 8A, the infrared light sensor is disposed at the center point of the upper edge of the display screen 194, and the two infrared light-emitting diodes in the infrared light sensor are disposed on either side of that center point. After the infrared sensor receives the infrared light reflected by the finger, the processor 110 in the mobile phone 100 can determine the first distance between the finger and the infrared light sensor according to the formula S = ((t2 - t1)·c)/2, where t1 is the moment at which the infrared sensor emits the infrared rays toward the finger, t2 is the moment at which the infrared sensor receives the infrared light reflected by the finger, c is the speed of light, and S is the first distance between the finger and the infrared light sensor.
Further, the processor 110 obtains, through the built-in gesture tracking sensor, the second distance between the finger and the display screen 194, and performs a geometric calculation on the first distance and the second distance to obtain the third distance between the projection point of the finger on the display screen 194 and the center point of the upper edge of the display screen 194. The processor 110 controls the infrared camera to capture an image of the finger and determines, from this image, the direction of the finger relative to the infrared light sensor. The processor 110 can then determine the finger coordinates from the finger's position relative to the center point of the upper edge of the display screen 194 (that is, the position of the infrared light sensor) and the direction of the finger relative to the infrared light sensor. For example, as shown in FIG. 8A, if the second distance is Ez, the third distance is Ex, and the user's finger is located due north of the infrared light sensor, the finger coordinates are (Ex, 0, Ez). The finger coordinates here are three-dimensional coordinates. Referring to FIG. 8B, after the coordinates of the user's finger are determined, a corresponding icon can be displayed on the display screen 194 of the mobile phone 100.
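The time-of-flight formula and the geometry above can be written out directly, as in the sketch below. The function names are illustrative, and the bearing of the finger is assumed to lie along the X axis to match the "due north" example of FIG. 8A.

```kotlin
import kotlin.math.sqrt

const val SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

// S = (t2 - t1) * c / 2, with t1 and t2 in seconds; result in metres.
fun slantDistanceMetres(emitTimeS: Double, receiveTimeS: Double): Double =
    (receiveTimeS - emitTimeS) * SPEED_OF_LIGHT_M_PER_S / 2.0

// Given the slant distance S (sensor to finger) and the finger's height Ez above
// the screen plane, the finger's projection lies Ex = sqrt(S^2 - Ez^2) from the
// sensor at the upper-edge centre; (Ex, 0, Ez) reproduces the example of FIG. 8A.
fun fingerCoordinates(slantDistance: Double, heightAboveScreen: Double): Triple<Double, Double, Double> {
    val ex = sqrt((slantDistance * slantDistance - heightAboveScreen * heightAboveScreen).coerceAtLeast(0.0))
    return Triple(ex, 0.0, heightAboveScreen)
}
```

When the image from the infrared camera indicates a bearing other than "due north", the Ex component would be split into X and Y components accordingly; that step is omitted here.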
Optionally, when the mobile phone 100 is in a locked state, the mobile phone 100 may also obtain the user's finger coordinates through the above infrared light sensor, and control the display screen 194 to display a fingerprint unlock icon on the operable interface corresponding to the finger coordinates. The touch sensor 180K of the mobile phone 100 detects a touch operation on the unlock icon and converts the touch operation into an electrical signal; the kernel layer drives the fingerprint sensor 180H, the fingerprint sensor 180H collects the user's fingerprint, the fingerprint is matched against the stored fingerprint, and after the match succeeds, the unlocked mode is entered. With this implementation, the fingerprint unlock position does not need to be fixed at one location, which effectively improves user experience.
The following exemplarily describes the workflow of the software and hardware of the electronic device in combination with the interface display method provided by the embodiments of this application.
The touch sensor 180K detects a sliding touch operation on the display screen 194, converts the detected signal into information that the processor 110 can process, and transmits it to the processor 110; the kernel layer running on the processor 110 generates the position data corresponding to the operation based on this information (which may include contact coordinates and timestamps corresponding to the contact coordinates). If the processor 110 determines that the detected operation did not trigger any application process, the processor 110 controls the window manager to output a dialog box asking the user whether to enter the one-handed operation mode (see the dialog box 302 shown in FIG. 4D). If the processor 110 detects a confirmation instruction, it controls the display screen 194 to enter the one-handed operation mode; the mobile phone 100 then starts the infrared camera, obtains an infrared image, and transmits the infrared image to the processor 110. The processor 110 determines the user's viewpoint coordinates and finger coordinates from the infrared image and transmits them to the application framework layer. The view system in the framework layer obtains the first display content in the first display area corresponding to the viewpoint coordinates and the second display area corresponding to the finger coordinates, and displays the first display content in the second display area.
Of course, the above merely takes the first operation being a sliding touch operation detected by the touch sensor 180K as an example to illustrate the interface display method provided by the embodiments of this application. Other implementations are also possible in specific implementations, which are not limited in the embodiments of this application.
To better understand the technical solutions provided by the embodiments of this application, the following describes the interface display method in the one-handed operation mode in the embodiments of this application in combination with several specific application scenarios.
Scenario 1: starting an application.
In Scenario 1, there are various specific implementations for the mobile phone 100 to display the first display content in the first display area corresponding to the viewpoint coordinates in the second display area corresponding to the finger coordinates, including but not limited to the following:
Mode 1: When the processor 110 in the mobile phone 100 determines that the position corresponding to the viewpoint coordinates overlaps the position of an APP icon, it determines that APP icon as the first display content and displays the first display content in the second display area corresponding to the finger coordinates.
Exemplarily, referring to FIG. 9A: in (a) of FIG. 9A, the processor 110 in the mobile phone 100 determines that the position corresponding to the viewpoint coordinates overlaps the position of the APP8 icon; the mobile phone 100 takes the APP8 icon as the first display content and controls the display screen 194 to display it in the second display area corresponding to the finger coordinates, that is, the operation interface 901 in (b) of FIG. 9A.
Optionally, the mobile phone 100 may monitor changes of the viewpoint coordinates in real time, that is, obtain in real time the third display content in the first display area corresponding to the new viewpoint coordinates, and switch the display content in the second display area corresponding to the finger coordinates from the first display content to the third display content.
Exemplarily, referring to FIG. 9B, assume that the mobile phone 100 detects at a first moment that the position of the viewpoint coordinates corresponds to the APP8 icon, and detects at a second moment that the position of the viewpoint coordinates corresponds to the APP9 icon. Then at the first moment the APP8 icon is displayed on the operation interface 901 corresponding to the finger coordinates, and at the second moment the APP9 icon is displayed on the operation interface 901 corresponding to the finger coordinates.
In the embodiments of this application, the display content in the second display area corresponding to the finger coordinates (for example, the operation interface 901) can change in real time as the viewpoint coordinates change.
Mode 2: When the processor 110 in the mobile phone 100 determines that the position of the viewpoint coordinates does not overlap the position of any APP icon, it determines the APP icons whose distance from the position of the viewpoint coordinates is within a preset value as the first display content and displays the first display content in the second display area corresponding to the finger coordinates.
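The two selection rules of Scenario 1 can be summarized as in the following sketch: take the icon under the viewpoint if one exists, otherwise take the icons within a preset distance of the viewpoint. The Icon type and the Euclidean distance metric are assumptions for illustration.

```kotlin
import kotlin.math.hypot

// Icons are modelled as squares around their centre; the hit-test and the
// nearby-icon rule below are illustrative, not the claimed implementation.
data class Icon(val name: String, val centerX: Float, val centerY: Float, val halfSize: Float) {
    fun contains(x: Float, y: Float) =
        x in (centerX - halfSize)..(centerX + halfSize) &&
        y in (centerY - halfSize)..(centerY + halfSize)
}

fun selectFirstDisplayContent(
    icons: List<Icon>,
    viewpointX: Float,
    viewpointY: Float,
    presetDistance: Float
): List<Icon> {
    // Mode 1: the viewpoint falls directly on an icon.
    icons.firstOrNull { it.contains(viewpointX, viewpointY) }?.let { return listOf(it) }
    // Mode 2: no overlap, so take the nearby icons within the preset distance.
    return icons.filter {
        hypot(it.centerX - viewpointX, it.centerY - viewpointY) <= presetDistance
    }
}
```

With the layout of FIG. 10, a viewpoint between two icons would return both of them, matching the APP8/APP13 example that follows.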
Exemplarily, referring to FIG. 10: in (a) of FIG. 10, after the mobile phone 100 obtains the viewpoint coordinates and the finger coordinates, the processor 110 determines that the viewpoint coordinates lie between the APP8 icon and the APP13 icon; the processor 110 therefore determines the APP8 icon and the APP13 icon as the first display content and displays them in the second display area corresponding to the finger coordinates, that is, the operation interface 1001 shown in (b) of FIG. 10.
Optionally, after the mobile phone 100 displays the first display content in the second display area and detects a sliding operation on the second display area, the processor 110 in the mobile phone 100 can control the display screen 194 to display new display content in the second display area.
Exemplarily, in (c) of FIG. 10, the mobile phone 100 detects a sliding operation on the operation interface 1001 (sliding left or right, sliding up or down, and so on) and displays the APP8 icon, the APP9 icon, the APP13 icon, and the APP14 icon on the operation interface 1001.
Optionally, after the mobile phone 100 displays the first display content in the second display area and detects a sliding operation or a touch operation on the second display area, the processor 110 in the mobile phone 100 can control the display screen 194 to display, in the second display area, the first display content scaled at a preset scaling ratio. The scaling ratio may also be set by the user through a voice instruction when using the mobile phone 100.
Exemplarily, referring to FIG. 11: in (a) of FIG. 11, after the mobile phone 100 obtains the viewpoint coordinates and the finger coordinates, the processor 110 in the mobile phone 100 determines that the viewpoint coordinates lie between the APP8 icon and the APP13 icon; the processor 110 takes the APP8 icon and the APP13 icon as the first display content and displays them on the operation interface 1101 corresponding to the finger coordinates. In (b) of FIG. 11, the mobile phone 100 detects a single touch operation on the operation interface 1101 and displays the APP8 icon and the APP13 icon on the operation interface 1101 at a preset enlargement ratio; in (c) of FIG. 11, the mobile phone 100 detects two touch operations on the operation interface 1101 within a preset duration and displays the APP8 icon and the APP13 icon on the operation interface 1101 at a preset reduction ratio.
Scenario 2: controls of a game interface.
In Scenario 2, the mobile phone 100 is again taken as an example to describe the process in which the mobile phone 100 displays the display content corresponding to the viewpoint coordinates in the display area corresponding to the finger coordinates. Specifically, before displaying the display content corresponding to the viewpoint coordinates in the second display area corresponding to the finger coordinates, the mobile phone 100 further determines whether an application icon or a control icon already exists at the first position, within the second display area, corresponding to the finger coordinates. If one exists, the display content is displayed at a second position of the second display area; if none exists, the display content is displayed at the first position, where the second position is separated from the first position by a first preset value.
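The placement rule just described can be sketched as follows: the relocated control is shown at the finger position unless another icon already occupies that position, in which case it is shifted by the preset offset. The types and the single-axis offset direction are illustrative assumptions.

```kotlin
// Sketch of the Scenario 2 placement rule: place the relocated control at the
// finger position (first position) unless an existing icon occupies it, in
// which case shift it by the preset offset (second position).
data class ScreenPoint(val x: Float, val y: Float)
data class IconBounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: ScreenPoint) = p.x in left..right && p.y in top..bottom
}

fun placementFor(
    fingerPosition: ScreenPoint,
    occupiedAreas: List<IconBounds>,
    presetOffset: Float
): ScreenPoint =
    if (occupiedAreas.any { it.contains(fingerPosition) })
        ScreenPoint(fingerPosition.x, fingerPosition.y - presetOffset)  // second position
    else
        fingerPosition                                                  // first position
```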
Refer to (a) of FIG. 12A, which shows a schematic diagram of a possible game interface in which controls ①, ②, ③, ④, ⑤, ⑥, and ⑦ are arranged. When the mobile phone 100 detects a tap operation or a touch operation on control ①, control ②, or control ③, the processor 110 in the mobile phone 100 can control the walking speed of the character in the game interface; when the mobile phone 100 detects a tap operation or a touch operation on control ④, control ⑤, or control ⑥, the processor 110 in the mobile phone 100 can control the character in the game interface to perform an attack action; when the mobile phone 100 detects a tap operation or a touch operation on control ⑦, the processor 110 in the mobile phone 100 can control the character in the game interface to change game equipment.
Example 1: In (a) of FIG. 12A, the mobile phone 100 has enabled the one-handed operation mode; the mobile phone 100 begins to obtain the user's viewpoint coordinates and finger coordinates and detects that the display area corresponding to the user's viewpoint coordinates includes control ⑦. In (b) of FIG. 12A, the mobile phone 100 determines the display area corresponding to the user's thumb, that is, the operation interface 1201; the mobile phone 100 further determines that control ⑥ exists at the position corresponding to the thumb coordinates, so it determines another position in the operation interface 1201 at a preset distance from the thumb coordinates and displays control ⑦ at that position, enabling one-handed operation by the user.
Example 2: In (a) of FIG. 12B, the mobile phone 100 has enabled the one-handed operation mode; the mobile phone 100 begins to obtain the user's viewpoint coordinates and finger coordinates and detects that the display area corresponding to the user's viewpoint coordinates includes control ⑦. In (b) of FIG. 12B, the mobile phone 100 determines the display area corresponding to the user's thumb, that is, the operation interface 1001; the mobile phone 100 further determines that no control exists at the position corresponding to the thumb coordinates, so it displays control ⑦ at that position, enabling one-handed operation by the user.
Scenario 3: the user answers a call with one hand.
Refer to (a) of FIG. 13. In (a) of FIG. 13, the operation interface of the mobile phone 100 includes a control 1301 for indicating rejection of a call and a control 1302 for indicating answering of a call.
When the user holds the mobile phone with the right hand and an incoming-call reminder arrives, the finger of the user's right hand can only reach the control 1301. The mobile phone 100 detects the voice instruction for enabling the one-handed mode, enters the one-handed operation mode, and obtains the user's finger coordinates and viewpoint coordinates. If the mobile phone 100 detects that the display area corresponding to the viewpoint coordinates contains the reject control 1301, the icon of the reject control 1301 is displayed within the operable range of the user's finger (for example, as shown in (b) of FIG. 13); if the display area corresponding to the viewpoint coordinates contains the answer control 1302, the original display mode is maintained (for example, as shown in (a) of FIG. 13).
It should be understood that the above embodiments are all described by taking a single device implementing the one-handed operation mode as an example. The one-handed operation method provided by the embodiments of this application can also be applied to multiple electronic devices or a system composed of multiple electronic devices. The following takes application to two electronic devices as an example.
Exemplarily, referring to FIG. 14, taking the computer 1401 and the mobile phone 100 as an example, the computer 1401 and the mobile phone 100 are communicatively connected (a wired connection, a Bluetooth connection, a Wi-Fi connection, or the like, without limitation). The computer 1401 obtains the user's viewpoint coordinates on the computer 1401, the mobile phone 100 obtains the user's finger coordinates on the mobile phone 100, and the computer 1401 displays the display content in the display area corresponding to the viewpoint coordinates on the operation interface 1402 on the mobile phone 100 (that is, within the operable range of the user's finger). For the processes of obtaining the viewpoint coordinates, obtaining the finger coordinates, and controlling the display, refer to the description above; details are not repeated here.
In the above embodiments provided in this application, the methods provided by the embodiments of this application are described from the perspective of the electronic device as the execution entity. To implement the functions in the methods provided by the above embodiments of this application, the electronic device may include a hardware structure and/or a software module, and implement the above functions in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether a given function is performed by a hardware structure, a software module, or a hardware structure plus a software module depends on the specific application and design constraints of the technical solution.
Based on the same technical concept, this application further provides an electronic device 1500 for implementing the methods in the embodiments shown in FIG. 3, FIG. 5, and FIG. 7. As shown in FIG. 15, the electronic device 1500 may include a processor 1501 configured to execute programs or instructions stored in a memory 1502; when the programs or instructions stored in the memory 1502 are executed, the processor is configured to perform the interface display method shown in FIG. 3.
Optionally, the electronic device 1500 may further include a communication interface 1503. FIG. 15 uses dashed lines to indicate that the communication interface 1503 is optional for the electronic device 1500.
The numbers of processors 1501, memories 1502, and communication interfaces 1503 do not constitute a limitation on the embodiments of this application and can be configured as needed according to service requirements in specific implementations.
Optionally, the memory 1502 is located outside the electronic device 1500.
Optionally, the electronic device 1500 includes the memory 1502, the memory 1502 is connected to the at least one processor 1501, and the memory 1502 stores instructions executable by the at least one processor 1501. FIG. 15 uses dashed lines to indicate that the memory 1502 is optional for the electronic device 1500.
The processor 1501 and the memory 1502 may be coupled through an interface circuit or may be integrated together, which is not limited here.
The specific connection medium between the processor 1501, the memory 1502, and the communication interface 1503 is not limited in the embodiments of this application. In FIG. 15, the processor 1501, the memory 1502, and the communication interface 1503 are connected through a bus 1504; the bus is represented by a thick line in FIG. 15, and the connection manner between other components is merely a schematic illustration and is not limiting. The bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is used in FIG. 15, but this does not mean that there is only one bus or one type of bus.
It should be understood that the processor mentioned in the embodiments of this application may be implemented by hardware or by software. When implemented by hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented by software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
Exemplarily, the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It should be understood that the memory mentioned in the embodiments of this application may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It should be noted that when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (storage module) may be integrated in the processor.
It should be noted that the memory described herein is intended to include, but not be limited to, these and any other suitable types of memory.
Based on the same technical concept, the embodiments of this application further provide a computer-readable medium, where the computer-readable medium stores program code for execution by a device, and the program code includes instructions for performing the interface display method in the foregoing embodiments.
Based on the same technical concept, the embodiments of this application further provide a computer program product containing instructions; when the computer program product runs on a computer, the computer is caused to perform the interface display method in the foregoing embodiments.
Based on the same technical concept, the embodiments of this application further provide a chip, where the chip includes a processor and a data interface, and the processor reads, through the data interface, instructions stored in a memory to perform the interface display method in the foregoing embodiments.
In a possible design, the chip may further include a memory, where the memory stores instructions, and the processor is configured to execute the instructions stored in the memory; when the instructions are executed, the processor is configured to perform the interface display method in the foregoing embodiments.
It should be noted that the division of modules in the embodiments of the present application is illustrative and is merely a division based on logical functions; there may be other division manners in actual implementation. The functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware, or may be implemented in the form of a software functional module.
The embodiments of the present application may be used alone or in combination with each other to achieve different technical effects.
The foregoing embodiments are merely intended to describe the technical solutions of the present application in detail. The description of the foregoing embodiments is merely intended to help understand the methods of the embodiments of the present application, and shall not be construed as a limitation on the embodiments of the present application. Any variation or replacement readily conceivable by a person skilled in the art shall fall within the protection scope of the embodiments of the present application.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used for implementation, the embodiments may be implemented completely or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present application are generated completely or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state drive).
For purposes of explanation, the foregoing description has been given with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the application to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to fully explain the principles of the application and its practical applications, thereby enabling others skilled in the art to make full use of the application and the various embodiments with the various modifications suited to the particular use contemplated.

Claims (20)

  1. An interface display method, applied to an electronic device, the method comprising:
    receiving a first operation;
    in response to the first operation, starting a one-handed operation mode; and
    in the one-handed operation mode, obtaining first viewpoint coordinates and finger coordinates, obtaining first display content in a first display area corresponding to the first viewpoint coordinates, and displaying the first display content in a second display area corresponding to the finger coordinates.
  2. The method according to claim 1, wherein the first operation is:
    the electronic device being lifted or shaken; or
    a tap operation or a slide operation on a display screen of the electronic device; or
    a voice instruction; or
    an operation on a hardware button of the electronic device.
  3. The method according to claim 1 or 2, wherein after the displaying the first display content in the second display area corresponding to the finger coordinates, the method further comprises:
    receiving a first slide operation in the second display area; and
    determining second display content according to the first slide operation, and switching the display content in the second display area from the first display content to the second display content.
  4. The method according to claim 1 or 2, wherein after the displaying the first display content in the second display area corresponding to the finger coordinates, the method further comprises:
    obtaining second viewpoint coordinates and third display content in a third display area corresponding to the second viewpoint coordinates, and switching the display content in the second display area from the first display content to the third display content.
  5. The method according to any one of claims 1 to 4, wherein before the displaying the first display content in the second display area corresponding to the finger coordinates, the method further comprises:
    determining whether an application icon or a control icon exists at a first position corresponding to the finger coordinates in the second display area;
    if an application icon or a control icon exists, displaying the first display content at a second position in the second display area, wherein the second position is a first preset value away from the first position; and
    if no application icon or control icon exists, displaying the first display content at the first position.
  6. The method according to any one of claims 1 to 5, wherein before the displaying the first display content in the second display area corresponding to the finger, the method further comprises: determining that a duration for which a user's gaze stays at the first viewpoint coordinates exceeds a second preset value.
  7. The method according to any one of claims 1 to 6, wherein the method further comprises:
    receiving a tap operation on at least one target icon in the first display content, determining a first target icon, and executing an application process corresponding to the first target icon.
  8. The method according to any one of claims 1 to 7, wherein the obtaining first viewpoint coordinates in response to the first operation comprises:
    determining a first distance between the user's eye and an infrared camera, corneal reflection spot coordinates, and pupil center coordinates; and
    determining the first viewpoint coordinates according to the first distance, the corneal reflection spot coordinates, and the pupil center coordinates.
  9. The method according to any one of claims 2 to 8, wherein the obtaining finger coordinates in response to the first operation comprises:
    determining a second distance between the finger and an infrared light sensor in the electronic device, a third distance between the finger and the display screen, and an orientation of the finger relative to the infrared light sensor;
    determining, according to the second distance and the third distance, a fourth distance between a projection point of the finger on the display screen and the infrared light sensor; and
    determining the finger coordinates according to the fourth distance and the orientation.
  10. An electronic device, comprising a processor and a memory, wherein the memory is configured to store one or more computer programs; and
    when the one or more computer programs stored in the memory are executed by the processor, the electronic device is caused to perform the following steps:
    receiving a first operation;
    in response to the first operation, starting a one-handed operation mode; and
    in the one-handed operation mode, obtaining first viewpoint coordinates and finger coordinates, obtaining first display content in a first display area corresponding to the first viewpoint coordinates, and displaying the first display content in a second display area corresponding to the finger coordinates.
  11. The electronic device according to claim 10, wherein the electronic device further comprises a display screen, and the first operation is:
    the electronic device being lifted or shaken; or
    a tap operation or a slide operation on the display screen; or
    a voice instruction; or
    an operation on a hardware button of the electronic device.
  12. The electronic device according to claim 10 or 11, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps:
    receiving a first slide operation in the second display area; and
    determining second display content according to the first slide operation, and switching the display content in the second display area from the first display content to the second display content.
  13. The electronic device according to claim 10 or 11, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following step:
    obtaining second viewpoint coordinates and third display content in a third display area corresponding to the second viewpoint coordinates, and switching the display content in the second display area from the first display content to the third display content.
  14. The electronic device according to any one of claims 10 to 13, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps:
    determining whether an application icon or a control icon exists at a first position corresponding to the finger coordinates in the second display area;
    if an application icon or a control icon exists, displaying the first display content at a second position in the second display area, wherein the second position is a first preset value away from the first position; and
    if no application icon or control icon exists, displaying the first display content at the first position.
  15. The electronic device according to any one of claims 10 to 14, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following step: determining that a duration for which a user's gaze stays at the first viewpoint coordinates exceeds a second preset value.
  16. The electronic device according to any one of claims 10 to 15, wherein when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following step:
    receiving a tap operation on at least one target icon in the first display content, determining a first target icon, and executing an application process corresponding to the first target icon.
  17. The electronic device according to any one of claims 10 to 16, wherein the electronic device further comprises an infrared camera, and when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps:
    determining a first distance between the user's eye and the infrared camera, corneal reflection spot coordinates, and pupil center coordinates; and
    determining the first viewpoint coordinates according to the first distance, the corneal reflection spot coordinates, and the pupil center coordinates.
  18. The electronic device according to any one of claims 11 to 17, wherein the electronic device further comprises an infrared light sensor, and when the one or more computer programs stored in the memory are executed by the processor, the electronic device is further caused to perform the following steps:
    determining a second distance between the finger and the infrared light sensor, a third distance between the finger and the display screen, and an orientation of the finger relative to the infrared light sensor;
    determining, according to the second distance and the third distance, a fourth distance between a projection point of the finger on the display screen and the infrared light sensor; and
    determining the finger coordinates according to the fourth distance and the orientation.
  19. A computer-readable storage medium, comprising a computer program, wherein when the computer program runs on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 9.
  20. A chip, comprising a processor and a data interface, wherein the processor is configured to read, through the data interface, and execute instructions stored in a memory, so that the method according to any one of claims 1 to 9 is performed.
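
As an illustrative sketch only, and not part of the claims, the geometric computations recited in claims 8 and 9 (mirrored in claims 17 and 18) can be summarized in a few lines of code. The sketch rests on stated assumptions rather than on the application itself: the gaze point is approximated from the pupil-center-to-corneal-reflection offset scaled with the eye-to-camera distance (a simplified PCCR-style mapping), and the finger's projection point is taken from the right triangle formed by the finger, the infrared light sensor, and the display plane. All function names, parameters, and the calibration gain are hypothetical.

    import math

    def estimate_viewpoint(eye_to_camera_mm, glint_xy, pupil_xy, gain_mm_per_px=0.45):
        """Map the pupil-center/corneal-reflection offset to approximate screen coordinates.

        eye_to_camera_mm: first distance (user's eye to the infrared camera).
        glint_xy:         corneal reflection spot coordinates in the camera image.
        pupil_xy:         pupil center coordinates in the camera image.
        gain_mm_per_px:   assumed calibration gain; a real device would obtain it
                          from a per-user calibration step.
        """
        dx = pupil_xy[0] - glint_xy[0]
        dy = pupil_xy[1] - glint_xy[1]
        # The farther the eye is from the camera, the larger the on-screen displacement
        # produced by the same image-space offset (300 mm used as a reference distance).
        scale = gain_mm_per_px * (eye_to_camera_mm / 300.0)
        return (dx * scale, dy * scale)  # first viewpoint coordinates (screen mm)

    def estimate_finger_coordinates(d_finger_sensor, d_finger_screen, azimuth_rad):
        """Project the hovering finger onto the display plane, relative to the sensor.

        d_finger_sensor: second distance (finger to the infrared light sensor).
        d_finger_screen: third distance (finger to the display surface).
        azimuth_rad:     orientation of the finger relative to the sensor.
        """
        # Fourth distance: the leg of the right triangle whose hypotenuse is the
        # finger-to-sensor distance and whose other leg is the hover height.
        d_projection = math.sqrt(max(d_finger_sensor**2 - d_finger_screen**2, 0.0))
        # Resolve the projection distance into x/y on the screen plane using the azimuth.
        return (d_projection * math.cos(azimuth_rad),
                d_projection * math.sin(azimuth_rad))

    if __name__ == "__main__":
        print(estimate_viewpoint(320.0, (412.0, 305.0), (418.0, 298.0)))
        print(estimate_finger_coordinates(80.0, 15.0, math.radians(30)))

In the same spirit, the check in claim 5 reduces to testing whether the projected finger position falls inside an icon's bounding box and, if so, offsetting the reduced display area by the first preset value.
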
PCT/CN2021/118075 2020-10-29 2021-09-13 Interface display method and electronic device WO2022089060A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011179164.7 2020-10-29
CN202011179164.7A CN114510174A (en) 2020-10-29 2020-10-29 Interface display method and electronic equipment

Publications (1)

Publication Number Publication Date
WO2022089060A1 WO2022089060A1 (en)

Family

ID=81381880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118075 WO2022089060A1 (en) 2020-10-29 2021-09-13 Interface display method and electronic device

Country Status (2)

Country Link
CN (1) CN114510174A (en)
WO (1) WO2022089060A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116052261A (en) * 2022-05-31 2023-05-02 荣耀终端有限公司 Sight estimation method and electronic equipment
CN117135256A (en) * 2023-04-06 2023-11-28 荣耀终端有限公司 Data processing method and electronic equipment


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130307797A1 (en) * 2012-05-18 2013-11-21 Fujitsu Limited Tablet terminal and recording medium
CN108540642A (en) * 2018-02-28 2018-09-14 维沃移动通信有限公司 The operating method and mobile terminal of mobile terminal
CN109246292A (en) * 2018-08-17 2019-01-18 珠海格力电器股份有限公司 A kind of moving method and device of terminal desktop icon
CN109819102A (en) * 2018-12-19 2019-05-28 努比亚技术有限公司 A kind of navigation bar control method and mobile terminal, computer readable storage medium
CN111078044A (en) * 2019-11-22 2020-04-28 深圳传音控股股份有限公司 Terminal interaction method, terminal and computer storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114911394A (en) * 2022-05-26 2022-08-16 青岛海信移动通信技术股份有限公司 Terminal device and one-hand operation method
CN114866641A (en) * 2022-07-07 2022-08-05 荣耀终端有限公司 Icon processing method, terminal equipment and storage medium
WO2024037379A1 (en) * 2022-08-18 2024-02-22 华为技术有限公司 Notification checking method and system, and related apparatus
CN116700659A (en) * 2022-09-02 2023-09-05 荣耀终端有限公司 Interface interaction method and electronic equipment
CN116700659B (en) * 2022-09-02 2024-03-08 荣耀终端有限公司 Interface interaction method and electronic equipment

Also Published As

Publication number Publication date
CN114510174A (en) 2022-05-17

Similar Documents

Publication Publication Date Title
WO2022089060A1 (en) Interface display method and electronic device
EP3982247A1 (en) Screen projection method and electronic device
WO2021063074A1 (en) Method for split-screen display and electronic apparatus
WO2021057868A1 (en) Interface switching method and electronic device
JP2022546006A (en) Devices with foldable screens and touch control methods for foldable screen devices
US20170315777A1 (en) Method, terminal, and storage medium for starting voice input function of terminal
WO2021023030A1 (en) Message display method and electronic device
JP7354308B2 (en) Display method and electronic device
WO2021063090A1 (en) Method for establishing application combination, and electronic device
WO2021037223A1 (en) Touch control method and electronic device
KR20210068097A (en) Method for controlling display of system navigation bar, graphical user interface and electronic device
WO2019000287A1 (en) Icon display method and device
US20180063130A1 (en) Terminal control method and device
CN112615947B (en) Method for rapidly entering application and folding screen electronic equipment
EP4033339A1 (en) User interface display method and electronic device
WO2019100298A1 (en) Photographing method and terminal
EP3736691A1 (en) Display method and apparatus for authentication window
CN111510482B (en) Method and device for determining failed network request and computer storage medium
WO2022028290A1 (en) Method for interaction between devices based on pointing operation, and electronic device
WO2023179751A9 (en) Object searching method, system and electronic device
WO2022022443A1 (en) Method for moving control and electronic device
CN111049968B (en) Control method and electronic equipment
CN114513479B (en) Message transmitting and receiving method, device, terminal, server and storage medium
CN115390934A (en) Method and terminal for opening function of back module
CN112882823A (en) Screen display method and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21884790; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21884790; Country of ref document: EP; Kind code of ref document: A1)