WO2021104015A1 - One-handed operation method and electronic device - Google Patents

One-handed operation method and electronic device

Info

Publication number: WO2021104015A1
Application number: PCT/CN2020/128000
Authority: WO — WIPO (PCT)
Prior art keywords: screen, user, interface, floating, thumb
Other languages: English (en), French (fr)
Inventors: 谭宇超, 马中骐, 唐钊, 钱申
Original Assignee / Applicant: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority applications: EP20892299.7A (published as EP4053688A4), US 17/780,678 (published as US20230009389A1)
Publication of WO2021104015A1


Classifications

    • G06F3/0481 — Interaction techniques based on GUIs, based on specific properties of the displayed interaction object or a metaphor-based environment (e.g. desktop elements such as windows or icons, or a cursor's changing behaviour or appearance)
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F1/1686 — Constructional details of portable computers: the integrated I/O peripheral being an integrated camera
    • G06F3/04883 — GUI interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting or gesture
    • G06F3/04886 — GUI interaction by partitioning the display area of the touch-screen or digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06V40/107 — Static hand or arm recognition
    • G06V40/117 — Biometrics derived from hands
    • G06V40/12 — Fingerprints or palmprints
    • G06F2203/04806 — Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • G06F2203/04808 — Several contacts: gestures triggering a specific function (e.g. scrolling, zooming, right-click) when the user establishes several simultaneous contacts with the surface, e.g. several fingers or a combination of fingers and pen

Definitions

  • The present invention relates to the field of terminal technologies, and in particular to a one-handed operation method and an electronic device.
  • Generally, the user interface (UI) of a terminal is adapted to the screen size of the terminal.
  • Some users with small palms may find it inconvenient to tap when operating such terminals with one hand (targets may be unreachable, resulting in accidental touches, etc.).
  • The existing one-handed operation mode can only reduce the UI of the terminal to a fixed size at a fixed position, or let the user set the size manually; it cannot scale the UI automatically to the individual user.
  • In a first aspect, the present application provides a one-handed operation method executed by an electronic device, including: displaying a first interface that occupies the entire screen of the electronic device; detecting a first trigger operation by a user; and, according to the first trigger operation, displaying a floating interface whose size is smaller than that of the first interface. The floating interface is located in a first area of the screen that the user can operate with one hand, and the first area is determined by the position at which the user holds the electronic device and the length of the user's thumb.
  • For example, the first interface is the full-screen UI of the mobile phone, the floating interface is the interface displayed when the user turns on the "floating screen" function, the first trigger operation is the user turning on the "floating screen" function, and the first area is the area of the screen that the user can operate with one hand, referred to as the comfort zone.
  • When the "floating screen" function is enabled, the electronic device obtains the length of the user's thumb and the position at which the user holds the device, calculates the comfort zone (the area of the screen that the user's fingers can reach), and then displays the floating interface in the comfort zone. The user operates on the floating interface, so that a large-screen device can be operated with one hand.
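  • To make the geometry concrete, here is a minimal Kotlin sketch that models the comfort zone as the set of screen points within thumb reach of the grip point. It is illustrative only and not from the patent; the names `GripEstimate`, `inComfortZone`, and `comfortZoneBounds` are invented:

```kotlin
import kotlin.math.hypot

/** Estimated grip point on the screen edge and the user's thumb length, in pixels. */
data class GripEstimate(val gripX: Float, val gripY: Float, val thumbLen: Float)

/** True if screen point (x, y) lies inside the comfort zone (thumb-reachable disc). */
fun inComfortZone(g: GripEstimate, x: Float, y: Float): Boolean =
    hypot(x - g.gripX, y - g.gripY) <= g.thumbLen

/** Axis-aligned bounding box of the comfort zone, clipped to the screen. */
fun comfortZoneBounds(g: GripEstimate, screenW: Float, screenH: Float): FloatArray {
    val left = maxOf(0f, g.gripX - g.thumbLen)
    val top = maxOf(0f, g.gripY - g.thumbLen)
    val right = minOf(screenW, g.gripX + g.thumbLen)
    val bottom = minOf(screenH, g.gripY + g.thumbLen)
    return floatArrayOf(left, top, right, bottom)
}
```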
  • In some embodiments, the method further includes: when the floating interface is the reduced first interface, the first interface is no longer displayed on the screen.
  • That is, the floating interface is a reduced version of the full-screen UI. If the content displayed on the floating interface is the same as, or replaces, the content the full-screen UI would display, the original full-screen UI need not display anything: the area of the screen outside the floating screen is left black, which reduces power consumption and prevents content on the full-screen UI from distracting the user.
  • In some embodiments, before displaying the floating interface, the method further includes: turning on at least one camera and prompting the user to capture an image of the hand; obtaining the length of the user's thumb then includes calculating the thumb length from the hand image.
  • That is, one way to obtain the thumb length is to prompt the user to turn on the camera when the "floating screen" function is opened for the first time, or while configuring it (for example, when setting the gesture that triggers it).
  • With the camera on, the user's hand image is captured, and the length of the user's thumb is calculated from it. Once obtained, the thumb length is stored in memory, so it need not be acquired again when the user subsequently turns on the "floating screen" function.
  • The timing for obtaining the thumb length and the holding position may be the first time the "floating screen" function is turned on or configured, or any other timing; this embodiment of the present invention does not limit this.
  • In some embodiments, before displaying the floating interface, the method further includes: prompting the user to draw an arc on the screen while holding the electronic device with one hand, and obtaining the arc trajectory; obtaining the length of the user's thumb then includes calculating the thumb length from the arc trajectory. For example, reference may be made to the description of FIG. 3 in the embodiments.
  • Another way to obtain the thumb length is to prompt the user to perform fingerprint recognition on the UI of the screen: the fingerprint sensor collects the user's fingerprint, and the thumb length is determined from the size of the fingerprint combined with the relationship between fingerprint size and thumb length.
  • The fingerprints registered earlier for the fingerprint lock screen can also be reused, sparing the user a separate acquisition step and improving the user experience.
  • Specifically, in the arc-based approach, after the user draws an arc on the UI of the screen, the processor calculates the radius of the arc according to its curvature and takes that radius as the length of the user's thumb.
  • The ways of obtaining the thumb length in this application are not limited to the above three: the thumb length may also be determined by performing a dedicated measurement operation, by directly inputting the thumb-length value, and so on.
  • In some embodiments, before displaying the floating interface, the method further includes: prompting the user to draw an arc on the screen while holding the electronic device with one hand, and obtaining the arc trajectory; obtaining the position at which the user holds the electronic device then includes calculating that position from the arc trajectory.
  • That is, one way to obtain the holding position is: when the electronic device obtains the thumb length from the arc the user draws on the UI of the screen, the processor not only calculates the radius of the arc from its curvature to determine the thumb length, but can also take the calculated circle center as the position at which the user holds the electronic device. In this way the electronic device obtains both the thumb length and the holding position without requiring the user to perform multiple dedicated operations, which improves the user experience.
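  • As an illustration of how a radius and a center might be recovered from an arc trajectory, the sketch below fits a circle through three samples of the arc (first, middle, last). The three-point fit and all names are assumptions; the patent only states that the processor derives the radius from the curvature and may adjust it so the center falls on the screen edge:

```kotlin
import kotlin.math.abs
import kotlin.math.hypot

/**
 * Fit a circle through three samples of the arc trajectory.
 * Returns (center x, center y, radius): the radius approximates the thumb length,
 * the center approximates the grip point (the patent would nudge it onto the edge).
 */
fun fitArc(points: List<Pair<Float, Float>>): Triple<Float, Float, Float>? {
    if (points.size < 3) return null
    val (ax, ay) = points.first()
    val (bx, by) = points[points.size / 2]
    val (cx, cy) = points.last()
    val d = 2f * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if (abs(d) < 1e-6f) return null            // samples are (nearly) collinear
    val a2 = ax * ax + ay * ay
    val b2 = bx * bx + by * by
    val c2 = cx * cx + cy * cy
    val ux = (a2 * (by - cy) + b2 * (cy - ay) + c2 * (ay - by)) / d
    val uy = (a2 * (cx - bx) + b2 * (ax - cx) + c2 * (bx - ax)) / d
    val r = hypot(ax - ux, ay - uy)            // circumradius ≈ thumb length
    return Triple(ux, uy, r)
}
```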
  • In some embodiments, before displaying the floating interface, the method further includes: obtaining at least one operation point at which the user operated on the screen within a period of time; obtaining the position at which the user holds the electronic device then includes calculating that position from the at least one operation point.
  • Specifically, calculating the holding position from the operation points includes: among N circles, determining the circle that covers the largest number of the at least one operation point, and taking the center of that circle as the holding point, where the N circles are centered at N positions on the edge of the screen with the thumb length as radius, and N is an integer greater than 2.
  • The holding point is the position at which the user's palm or thumb contacts the edge of the screen when the user holds the mobile phone.
  • That is, one way to obtain the position at which the user holds the electronic device is: after the "floating screen" function is turned on, the processor counts the taps on the screen 194, and the location of each tap, during the period before the function was turned on; checks how many operation points are covered by each circle centered on the screen edge with the thumb length as radius; determines the circle covering the most taps; and takes the center of that circle as the holding point of the user's hand, thereby determining where the user holds the electronic device. Because this is carried out entirely by the processor, the user does not need to perform any dedicated operation, which improves the user experience.
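  • A hedged sketch of this circle-counting idea follows; the candidate spacing, the edges scanned, and all names are assumptions rather than details fixed by the patent:

```kotlin
import kotlin.math.hypot

/**
 * Pick the grip point from recent tap locations: slide a candidate center along the
 * screen edges and keep the circle (radius = thumb length) covering the most taps.
 */
fun estimateGripFromTaps(
    taps: List<Pair<Float, Float>>,   // recent tap positions, in px
    thumbLen: Float,
    screenW: Float,
    screenH: Float,
    step: Float = 20f                 // spacing between candidate centers, in px
): Pair<Float, Float>? {
    // Candidate centers on the left, right, and bottom edges (where a hand grips).
    val candidates = mutableListOf<Pair<Float, Float>>()
    var y = 0f
    while (y <= screenH) {
        candidates += 0f to y
        candidates += screenW to y
        y += step
    }
    var x = 0f
    while (x <= screenW) {
        candidates += x to screenH
        x += step
    }
    // Keep the candidate whose thumb-radius circle covers the most taps.
    return candidates.maxByOrNull { (cx, cy) ->
        taps.count { (tx, ty) -> hypot(tx - cx, ty - cy) <= thumbLen }
    }
}
```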
  • The ways of obtaining the position at which the user holds the electronic device in this application are not limited to the above two: the position can also be determined by detecting the temperature of the hand with the temperature sensor 180J, detecting the grip pressure with the pressure sensor, or having the user directly input the position at which the palm or thumb contacts the screen edge when holding the phone.
  • In some embodiments, determining the first area includes: determining, according to the position at which the user holds the electronic device, the grip point at which the user's palm or thumb contacts the edge of the screen; the first area is then the area formed on the screen by a circle with the grip point as center and the thumb length as radius.
  • In some embodiments, determining the first area includes: determining, according to at least two positions at which the user holds the electronic device, at least two grip points at which the palm or thumb contacts the edge of the screen; the first area is then the overlapping area between the at least two areas formed on the screen by circles with the grip points as centers and the thumb length as radius.
  • For example, the processor obtains the position at which the user's left hand holds the phone and the position at which the right hand holds it, and then, combined with the length of the user's thumb, forms on the screen a comfort zone generated by the left-hand grip point and a comfort zone generated by the right-hand grip point. The processor takes the area where the two comfort zones overlap as the area in which the floating screen is displayed, so that the floating screen can be operated with either the left hand or the right hand.
  • In some embodiments, the floating interface is displayed in the first area at the largest size whose shape is the same as that of the first interface; at least one corner of the floating interface then lies on the edge of the comfort zone. Presenting it in the comfort zone at the largest possible size makes it convenient for the user to view the content of the floating interface and operate the applications in it.
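  • One simple way to realize "largest size whose shape matches the first interface" is to cap the window's diagonal at the thumb length, assuming one corner sits at the grip point. This is a simplification of the patent's geometry, not its exact rule:

```kotlin
import kotlin.math.hypot

/**
 * Largest floating window with the screen's aspect ratio that fits in the comfort
 * zone when one corner sits at the grip point: its diagonal must not exceed the
 * thumb length, so scale = thumbLen / screenDiagonal (capped at full size).
 */
fun floatingWindowSize(screenW: Float, screenH: Float, thumbLen: Float): Pair<Float, Float> {
    val scale = (thumbLen / hypot(screenW, screenH)).coerceAtMost(1f)
    return screenW * scale to screenH * scale
}
```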
  • In some embodiments, the method further includes: detecting that the position at which the user holds the electronic device has changed, determining a second area according to the updated holding position and the thumb length, and displaying the second area on the first interface. The second area is the region of the screen that can be operated after the holding position changes, that is, the new comfort zone.
  • In some embodiments, the method further includes: detecting a tap operation by the user in the second area and, in response to the tap operation, displaying the floating interface in the second area; that is, the floating interface is moved into the new comfort zone. In some embodiments, this response occurs when the floating interface does not overlap the second area.
  • In some embodiments, the method further includes: detecting a drag operation in which the user moves the floating interface from the first area to the second area and, in response to the drag operation, displaying the floating interface at the end position of the drag. That is, once the user's operation on the floating interface is detected, the floating screen follows the finger until the drag into the new comfort zone ends; the distance and trajectory that the floating screen moves on the screen are the distance and trajectory of the user's drag.
  • In some embodiments, when the dragged floating interface overlaps the second area, the floating interface is displayed at the end position of the drag operation in response to the drag.
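  • A sketch of the two follow-the-hand behaviors just described: tap-to-relocate when the window does not already overlap the new zone, and drag-to-move by the drag delta. The structure and names are illustrative:

```kotlin
/** Current floating-window rectangle, as (left, top) plus fixed width and height. */
data class WindowRect(var left: Float, var top: Float, val w: Float, val h: Float)

/**
 * Move the window into the new comfort zone when the user taps inside that zone,
 * but only if the window does not already overlap it. `zone` is (l, t, r, b).
 */
fun onTapInNewZone(win: WindowRect, zone: FloatArray, tapX: Float, tapY: Float) {
    val overlaps = win.left < zone[2] && win.left + win.w > zone[0] &&
                   win.top < zone[3] && win.top + win.h > zone[1]
    if (!overlaps) {
        // Re-anchor the window at the tap point, clamped so it stays inside the zone
        // (the maxOf guard keeps coerceIn valid even if the zone is smaller than the window).
        win.left = (tapX - win.w / 2).coerceIn(zone[0], maxOf(zone[0], zone[2] - win.w))
        win.top = (tapY - win.h / 2).coerceIn(zone[1], maxOf(zone[1], zone[3] - win.h))
    }
}

/** During a drag, the window follows the finger by the drag delta. */
fun onDrag(win: WindowRect, dx: Float, dy: Float) {
    win.left += dx
    win.top += dy
}
```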
  • In some embodiments, the method further includes: detecting an operation acting on a first position, where the floating interface includes a first control displayed at the first position on the screen, the first interface includes a second control displayed at a second position on the screen, and the first position and the second position at least partially overlap; prompting the user whether to close the display of the floating interface; and, if an instruction to close the display is received, no longer displaying the floating interface, with the application corresponding to the first interface responding to the operation on the first position and displaying the corresponding interface according to the second control.
  • Here the first control and the second control are each implemented by part of the touch device on the screen of the electronic device, and the first position and the second position are each part of the area shown by the display of the electronic device.
  • In other words, the position of the click is where the UI of the screen overlaps the floating interface. The user can close the "floating screen" function before the operation: the electronic device asks the user whether to close the floating interface, and closes the "floating screen" function after receiving the user's closing instruction.
  • In some embodiments, the method further includes: if the instruction to close the display is not received, determining whether at least one of the pressure value or the duration of the operation triggering the first control is greater than a set value. If at least one of them is greater than the set value, the application corresponding to the first interface responds to the operation on the first position and displays the corresponding interface according to the second control; if neither is greater than the set value, the application corresponding to the floating interface responds to the operation on the first position and displays the corresponding interface in the area occupied by the floating interface according to the first control.
  • That is, the electronic device decides, based on factors such as how long or how hard the user presses the screen, whether the click is an operation on the full-screen UI or on the floating interface, and thus which response event the click triggers.
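  • A minimal sketch of this duration/pressure dispatch rule; the thresholds are invented placeholders for the "set value" the patent leaves open:

```kotlin
// Assumed thresholds, standing in for the patent's unspecified "set value".
const val LONG_PRESS_MS = 500L
const val PRESSURE_THRESHOLD = 0.8f // normalized 0..1

enum class Target { FULL_SCREEN_UI, FLOATING_WINDOW }

/** A long or firm press goes to the full-screen UI; a light tap to the floating window. */
fun routeOverlappingPress(durationMs: Long, pressure: Float): Target =
    if (durationMs > LONG_PRESS_MS || pressure > PRESSURE_THRESHOLD)
        Target.FULL_SCREEN_UI
    else
        Target.FLOATING_WINDOW
```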
  • In some embodiments, the method further includes: detecting at least two operating points acting on the floating interface at the same time whose positions move farther apart over time, and in response enlarging the floating interface.
  • In some embodiments, the method further includes: detecting at least two operating points acting on the floating interface at the same time whose positions move closer together over time, and in response reducing the floating interface.
  • That is, the floating interface can be enlarged or reduced according to the user's zoom-in and zoom-out gestures on it, so that floating interfaces of different sizes can be displayed according to the user's needs.
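  • The zoom gesture reduces to comparing the distance between the two contact points over time; a sketch, assuming raw contact coordinates in pixels:

```kotlin
import kotlin.math.hypot

/**
 * Scale factor for a two-finger gesture on the floating window: the ratio of the
 * current finger distance to the initial one. > 1 means the fingers moved apart
 * (zoom in), < 1 means they moved together (zoom out).
 */
fun pinchScale(
    start1: Pair<Float, Float>, start2: Pair<Float, Float>,
    now1: Pair<Float, Float>, now2: Pair<Float, Float>
): Float {
    val d0 = hypot(start1.first - start2.first, start1.second - start2.second)
    val d1 = hypot(now1.first - now2.first, now1.second - now2.second)
    return if (d0 > 0f) d1 / d0 else 1f
}
```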
  • In a second aspect, the present application provides a one-handed operation apparatus that performs the method described in any one of the first aspect.
  • In a third aspect, the present application provides an electronic device, including: a screen for displaying a first interface that occupies the entire screen of the electronic device and, according to a first trigger operation, displaying a floating interface; one or more processors; one or more memories; one or more sensors; and one or more computer programs, where the one or more computer programs are stored in the one or more memories and include instructions that, when executed by the electronic device, cause the electronic device to perform the method according to any one of the first aspect.
  • In a fourth aspect, the present application provides a computer-readable storage medium having instructions stored therein which, when executed on an electronic device, cause the electronic device to execute the method of any one of the first aspect.
  • In a fifth aspect, the present application provides a computer program product containing instructions which, when run on an electronic device, cause the electronic device to execute the method described in any one of the first aspect.
  • It can be understood that the electronic device of the third aspect, the computer storage medium of the fourth aspect, and the computer program product of the fifth aspect are all used to execute the corresponding methods provided above; for the beneficial effects they can achieve, reference may be made to those of the corresponding methods, which will not be repeated here.
  • FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of this application;
  • FIG. 2 is a schematic diagram of the positions of the touch point and the grip point on the palm of the hand;
  • FIG. 3 is a schematic diagram of a user operating on the screen with one hand;
  • FIG. 4 is a schematic diagram of the position of the comfort zone on the screen;
  • FIG. 5 is a schematic diagram of the position of the floating screen displayed in the comfort zone;
  • FIG. 6 is a block diagram of the software structure of a mobile phone provided by an embodiment of this application;
  • FIG. 7 is a schematic diagram of a camera acquiring the user's palm information;
  • FIG. 8 is a schematic diagram of determining the position of the holding point when a user holds a mobile phone according to an embodiment of this application;
  • FIG. 9 is another schematic diagram of determining the position of the holding point when a user holds a mobile phone according to an embodiment of this application;
  • FIG. 10 is a schematic diagram of the shapes of the comfort zones generated by different grip points provided by embodiments of this application;
  • FIG. 11(a) is a schematic diagram of the position of the floating screen in the comfort zone when the mobile phone is held in landscape orientation according to an embodiment of this application;
  • FIG. 11(b) is a schematic diagram of the relationship between the UI of the screen and the content displayed on the floating screen when the mobile phone is held in landscape orientation according to an embodiment of this application;
  • FIG. 12 is a schematic diagram of the position of the floating screen in the comfort zone when the holding point is in the middle of the screen edge according to an embodiment of this application;
  • FIG. 13 is a schematic diagram of determining the position of the comfort zone according to an embodiment of this application;
  • FIG. 14 is a schematic diagram of the position of the floating screen in the comfort zone provided by an embodiment of this application;
  • FIG. 15 is a flowchart of a method for making the floating screen follow the hand according to an embodiment of this application;
  • FIG. 16 is a schematic diagram of a scenario of moving the floating screen provided by an embodiment of this application;
  • FIG. 17 is another schematic diagram of a scenario of moving the floating screen provided by an embodiment of this application;
  • FIG. 18 is a flowchart of a method for resolving a conflict between the floating screen and the UI of the screen provided by an embodiment of this application;
  • FIG. 19 is a schematic diagram of a screen UI and a floating screen that both include a status indicator provided by an embodiment of this application;
  • FIG. 20 is a schematic diagram of a zoom-in operation on the floating screen provided by an embodiment of this application;
  • FIG. 21 is a schematic diagram of a zoom-out operation provided by an embodiment of this application;
  • FIG. 22 is a schematic structural diagram of an electronic device provided by an embodiment of this application.
  • The one-handed operation method provided by the embodiments of this application can be applied to mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDAs), and other electronic devices.
  • FIG. 1 shows a schematic diagram of the structure of the mobile phone.
  • The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a radio frequency module 150, a communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 100.
  • The mobile phone 100 may include more or fewer components than shown in the figure, combine certain components, split certain components, or have a different arrangement of components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU).
  • different processing units may be independent devices, or may be integrated in one or more processors 110.
  • the controller may be the nerve center and command center of the mobile phone 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • The memory in the processor 110 is a cache. This memory can store instructions or data that the processor 110 has just used or uses cyclically; if the processor 110 needs the instructions or data again, it can fetch them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
  • the processor 110 may include one or more interfaces.
  • The interfaces can include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • The MIPI interface can be used to connect the processor 110 with peripheral devices such as the screen 194 and the camera 193. MIPI interfaces include the camera serial interface (CSI), the display serial interface (DSI), and so on.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal. In some embodiments, the GPIO interface may be used to connect the processor 110 with the camera 193, the screen 194, the communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, and so on.
  • The USB interface 130 is an interface that complies with the USB standard specifications, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type-C interface, and the like.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the mobile phone 100.
  • The mobile phone 100 may also adopt an interface connection manner different from that in the foregoing embodiment, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the screen 194, the camera 193, the communication module 160, and the like.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the radio frequency module 150, the communication module 160, the modem processor 110, and the baseband processor 110.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the radio frequency module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the mobile phone 100.
  • the radio frequency module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the radio frequency module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor 110 for demodulation.
  • the radio frequency module 150 can also amplify the signal modulated by the modem processor 110, and convert it into electromagnetic waves for radiation by the antenna 1.
  • the modem processor 110 may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the low frequency baseband signal obtained by demodulation to the baseband processor 110 for processing. After being processed by the baseband processor 110, the low frequency baseband signal is passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the screen 194.
  • the communication module 160 can provide applications on the mobile phone 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), and global navigation satellite systems ( Global navigation satellite system, GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the communication module 160 may be one or more devices integrating at least one communication processing module.
  • the communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the communication module 160 may also receive the signal to be sent from the processor 110, perform frequency modulation, amplify it, and convert it into electromagnetic waves and radiate it through the antenna 2.
  • the antenna 1 of the mobile phone 100 is coupled with the radio frequency module 150, and the antenna 2 is coupled with the communication module 160, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), 5G, BT, GNSS, WLAN, NFC, FM, and/or IR technologies.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the mobile phone 100 can implement shooting functions through an ISP, a camera 193, a video codec, a GPU, a screen 194, and an application processor.
  • the ISP is used to process the data fed back from the camera 193. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera 193 through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera 193 transmits the electrical signal to the ISP for processing and is converted into a visible image.
  • ISP can also optimize the image noise, brightness and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and is projected to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the mobile phone 100 may include one or N cameras 193, and N is a positive integer greater than one.
  • the digital signal processor 110 is used to process digital signals, and in addition to processing digital image signals, it can also process other digital signals. For example, when the mobile phone 100 selects a frequency point, the digital signal processor 110 is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the mobile phone 100 may support one or more video codecs. In this way, the mobile phone 100 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • a camera may be used to collect the user's hand information.
  • the mobile phone 100 prompts the user to enter hand information.
  • the processor 110 turns on the camera 193 for shooting to obtain the user's hand information.
  • the hand information may include information such as the palm size, the length of each finger, and the fingerprint of each finger.
  • the embodiment of the application obtains the user's hand information, mainly to obtain the length of the user's thumb.
  • The length of the thumb is the distance, when the user adopts the one-handed holding posture of the mobile phone 100 shown in FIG. 2, between the position where the user's thumb touches the screen 194 during operation (hereinafter referred to as the touch point) and the position where the hand contacts the edge of the screen (hereinafter referred to as the holding point).
  • The touch point is generally in the fingerprint area of the thumb, as shown at point M in FIG. 2; the holding point is generally at a certain point from the palm toward the right, as shown in the dotted area in FIG. 2 at the positions of points A, B, C, and D. The length of the thumb is thus the distance from point M to point A, B, C, or D.
  • When the processor 110 calculates the length of the thumb, it selects the holding point according to the size of the palm in the acquired image: the larger the palm, the further left in the dotted area the selected holding point lies; the smaller the palm, the further right.
  • In some embodiments, the processor 110 controls one of the at least two cameras 193 to obtain a red-green-blue (RGB) image including the user's hand, controls another of the at least two cameras 193 to obtain image depth information of the user's hand, and then computes a red-green-blue-depth (RGB-D) image from the RGB image and the depth information obtained by the at least two cameras 193.
  • The processor 110 recognizes the touch point M and the holding point A in the RGB-D image (taking holding point A as an example); then, combining the resolution H(Height)×W(Width) of the RGB image, it calculates the positions M1(Xp, Yp) and A1(Xf, Yf) of the touch point M and the holding point A in the RGB image; and then, according to the image depth information, it calculates their positions M2(Xp, Yp, Zp) and A2(Xf, Yf, Zf) in the RGB-D image.
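  • Although the patent does not spell it out at this point, the thumb length then presumably follows as the Euclidean distance between the two 3D points (after conversion to Cartesian coordinates, described next):

$$L_{\text{thumb}} = \lVert M_2 - A_2 \rVert = \sqrt{(X_p - X_f)^2 + (Y_p - Y_f)^2 + (Z_p - Z_f)^2}$$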
  • The processor 110 then converts the coordinate points in RGB-D coordinates into coordinate points in a Cartesian coordinate system in space. In the conversion, Cx, Cy, Fx, and Fy are intrinsic parameters of the camera that captured the RGB image: Cx and Cy are the horizontal and vertical offsets (in pixels) of the image origin relative to the imaging point at the aperture center; Fx = f/dx and Fy = f/dy, where f is the focal length of the camera, dx is the physical length occupied by one pixel in the x direction, and dy is the physical length occupied by one pixel in the y direction.
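  • The conversion formula itself did not survive extraction. Under the standard pinhole camera model that these intrinsics describe, the usual back-projection of a pixel (u, v) with depth Z to camera-space Cartesian coordinates is (offered here as a reconstruction, not the patent's verbatim equation):

$$X = \frac{(u - C_x)\,Z}{F_x}, \qquad Y = \frac{(v - C_y)\,Z}{F_y},$$

with the Z coordinate taken directly from the depth channel.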
  • Here, the first direction is generally parallel to the shorter side of the phone screen, and the second direction is parallel to the longer side of the phone screen.
  • In some other embodiments, the processor 110 controls two of the at least two cameras 193 to obtain two RGB images including the user's hand, and then calculates the RGB-D image according to the principle of a binocular camera. After the processor 110 obtains the RGB-D image including the user's hand, it calculates the length of the thumb according to the resolution H(Height)×W(Width) of the RGB images and the image depth information.
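  • For reference, the binocular principle mentioned here conventionally recovers depth from the disparity d between the two views as

$$Z = \frac{f \cdot B}{d},$$

where f is the focal length and B the baseline between the two cameras; this standard relation is supplied for context and is not quoted from the patent.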
  • The embodiments of this application obtain the user's hand information mainly to obtain the length of the thumb, which is used to determine the display size of the floating screen when the "floating screen" function is subsequently turned on, so as to ensure that the user can operate every position of the floating screen while holding the phone.
  • the mobile phone 100 implements a positioning function and a display function through a GPU, a screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the screen 194 may include a display and a touch device.
  • the display is used to output display content to the user, and the touch device is used to receive touch events input by the user on the screen 194.
  • the processor 110 may also determine the click position when the user performs an operation after receiving the touch event sent by the screen 194.
  • The touch device may be a touch sensor 180K.
  • the processor 110 obtains the click position of the user's operation on the screen 194, and then combines the length of the thumb to determine the position of the holding point when the user holds the mobile phone.
  • For example, a prompt instructing the user to draw an arc on the screen 194 is displayed on the screen 194. When the user draws the arc, the touch device receives the operation and sends the click positions to the processor 110.
  • The processor 110 calculates, according to the curvature of the arc, the position of the circle center on the periphery of the screen 194, and uses that center position as the position of the holding point when the user holds the mobile phone.
  • In some embodiments, the touch device sends the received click positions of the operation to the processor 110. After the processor 110 receives the click positions, it calculates the center of the arc according to its curvature; at this point the center may fall inside the screen or outside the screen 194. The processor then increases or decreases the curvature of the fitted arc so that the calculated center lies on the edge of the screen 194.
  • After the processor 110 obtains the position of the holding point when the user holds the mobile phone, it takes that position (that is, the position of the center of the arc) as the circle center and the length of the thumb as the radius, and determines on the screen 194 an area reachable by the user's thumb (hereinafter referred to as the comfort zone), shown as the dotted area in FIG. 4.
  • the processor 110 presents a floating screen in the comfort zone on the screen 194, as shown in FIG. 5.
  • The floating screen is a reduced version of the UI of the screen 194; for example, it may be a proportionally reduced version whose aspect ratio is the same as that of the UI of the screen 194, or a non-proportionally reduced version. Here the long side of the UI refers to the longer side of the screen, and the wide side refers to the shorter side.
  • Since the floating screen lies within the user's maximum operable range, the user can operate any position on the floating screen, which makes it more convenient to operate a large-screen device with one hand.
  • The NPU is a neural-network (NN) computing processor.
  • the internal memory 121 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile phone 100 by running instructions stored in the internal memory 121.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program (such as a sound playback function, an image playback function, etc.) required by at least one function, and the like.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A also called “speaker” is used to convert audio electrical signals into sound signals.
  • the mobile phone 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the mobile phone 100 answers a call or a voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C, inputting the sound signal into the microphone 170C.
  • the mobile phone 100 may be provided with at least one microphone 170C.
  • the earphone interface 170D is used to connect wired earphones.
  • The earphone interface 170D may be a USB interface 130, a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the button 190 includes a power-on button, a volume button, and the like.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the mobile phone 100 can receive key input, and generate key signal input related to user settings and function control of the mobile phone 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations that act on different applications can correspond to different vibration feedback effects.
  • Touch operations acting on different areas of the screen 194 can also correspond to different vibration feedback effects of the motor 191.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be connected to and separated from the mobile phone 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195.
  • the mobile phone 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the mobile phone 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the mobile phone 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
  • the above-mentioned software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the mobile phone 100 by way of example.
  • FIG. 6 is a block diagram of the software structure of the mobile phone 100 according to an embodiment of the present application.
  • The layered architecture divides the software into several layers, each with a clear role and division of labor; the layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer may include a series of application packages. As shown in Figure 6, applications such as camera, gallery, calendar, call, map, navigation, Bluetooth, music, video, short message, etc. can be installed in the application layer.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include display policy services and display management services.
  • the application framework layer may also include an activity manager, a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, etc.
  • the embodiment of the present application does not impose any limitation on this.
  • the display policy service can obtain data reported by the underlying system, such as image data obtained by a camera, touch points obtained by a touch device, and other data.
  • the mobile phone 100 may receive a trigger operation of the user to obtain hand information.
  • the trigger operation may be an operation of turning on the camera 193, tapping the screen 194, unlocking with a fingerprint, inputting data, or the like.
  • the underlying system of the mobile phone 100 sends data such as touch points to the display policy service.
  • the system library and kernel layer below the application framework layer can be referred to as the underlying system.
  • the underlying system includes the underlying display system for providing display services.
  • the underlying display system includes the display driver in the kernel layer and the surface manager in the system library.
  • the underlying system in this application also includes a status monitoring service for obtaining the length of the user's thumb.
  • the status monitoring service can be independently set in the underlying display system, or in the system library and/or kernel layer.
  • the status monitoring service may call the sensor service to start sensors such as the camera 193 and the touch device for detection.
  • the status monitoring service can send the detection data reported by each sensor to the display policy service.
  • the display policy service calculates the thumb length based on the acquired data.
  • the display policy service can also obtain other data reported by the underlying system, for example, multiple touch points obtained by the touch device and temperature data obtained by the temperature sensor.
  • the touch device can also detect a trigger operation of the user for acquiring the position at which the mobile phone is held.
  • the trigger operation may be an operation of tapping the screen 194, sliding on the screen 194, turning on a sensor, and so on.
  • after receiving the operation, the underlying system sends data such as touch points to the display policy service.
  • the status monitoring service can also call the sensor service to start the touch device of the screen 194, the temperature sensor, and other sensors for detection.
  • the status monitoring service can send the detection data reported by each sensor to the display policy service.
  • the display policy service calculates the position of the holding point at which the user holds the mobile phone based on the acquired data.
  • the display policy service determines the area of the screen 194 that the user can reach (that is, the comfort zone) according to the length of the user's thumb and the position of the holding point, and then notifies the display management service to display the floating screen in the comfort zone.
  • in this embodiment of the application, the underlying system obtains the length of the user's thumb and the position of the holding point at which the user holds the mobile phone and reports them to the display policy service; the display policy service calculates the position of the comfort zone on the screen 194 from the thumb length and the holding-point position, and then notifies the display management service to display the floating screen in the comfort zone. A sketch of this pipeline follows.
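As a rough sketch of this reporting pipeline, the flow can be pictured as below; all class and method names are illustrative stand-ins, not actual Android framework APIs:

```python
class DisplayPolicyService:
    """Illustrative sketch: the underlying system reports the thumb length
    and the holding point upward; once both are known, the comfort zone is
    derived and the display management service is asked to show the
    floating screen there."""

    def __init__(self, display_management):
        self.display_management = display_management
        self.thumb_len = None       # reported by the status monitoring service
        self.holding_point = None   # derived from reported touch points

    def on_thumb_length(self, length_px):
        self.thumb_len = length_px
        self._maybe_show()

    def on_holding_point(self, point_px):
        self.holding_point = point_px
        self._maybe_show()

    def _maybe_show(self):
        if self.thumb_len is not None and self.holding_point is not None:
            # comfort zone modeled as (center, radius), clipped to the screen
            zone = (self.holding_point, self.thumb_len)
            self.display_management.show_floating_screen(zone)
```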
  • when the mobile phone activates the "floating screen" function for the first time, the mobile phone reminds the user to enter hand information. After the mobile phone records the user's hand information, the processor 110 calculates the length of the user's thumb from the acquired hand information.
  • the mobile phone 100 turns on the camera 193 to shoot after receiving the “turn on the camera 193” instruction input by the user.
  • the camera 193 obtains an image including the user's hand, and the processor 110 processes the image to obtain the length of the user's thumb.
  • At least two cameras 193 collect hand information and send it to the processor 110.
  • the processor 110 generates an RGB-D image from the hand information collected by the cameras 193, and then calculates the thumb length from the image resolution H(Height)*W(Width) and the image depth information, as sketched below.
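A minimal sketch of that computation, assuming a standard pinhole-camera model with the intrinsics Cx, Cy, Fx, Fy defined later in this document; the numeric values below are made up for illustration:

```python
import math

def pixel_to_camera(x_px, y_px, depth, fx, fy, cx, cy):
    """Back-project an image point plus its depth into camera-space
    coordinates using the pinhole model: X = (x - Cx) * Z / Fx."""
    x = (x_px - cx) * depth / fx
    y = (y_px - cy) * depth / fy
    return (x, y, depth)

def thumb_length(touch_px, grip_px, touch_depth, grip_depth, intrinsics):
    """Euclidean distance between the touch point M and the holding point A
    after both are lifted into 3-D camera space."""
    fx, fy, cx, cy = intrinsics
    m = pixel_to_camera(*touch_px, touch_depth, fx, fy, cx, cy)
    a = pixel_to_camera(*grip_px, grip_depth, fx, fy, cx, cy)
    return math.dist(m, a)

# Hypothetical 1920x1080 RGB-D frame, depths in millimetres.
intrinsics = (1400.0, 1400.0, 960.0, 540.0)
print(thumb_length((1200, 400), (1500, 900), 350.0, 360.0, intrinsics))
```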
  • the method of obtaining the length of the user's thumb in this embodiment is not limited to using the camera 193 as described above; the length may also be obtained through the fingerprint sensor 180H.
  • the size of human fingerprints is closely related to the size of the palm and the length of the thumb. The larger the fingerprint, the larger the palm, and the longer the thumb; the smaller the fingerprint, the smaller the palm and the shorter the thumb.
  • the fingerprint information of the user's thumb is acquired during the fingerprint identification process of the fingerprint sensor 180H, and then sent to the processor 110. After the processor 110 receives the fingerprint information, it can determine the length of the user's thumb according to the size of the fingerprint.
  • the processor 110 may collect palm size information and fingerprint size information of a large number of users, and then perform statistics and processing to determine the relationship between the thumb length and the fingerprint size. After the processor 110 obtains the user's fingerprint information from the fingerprint sensor 180H, it determines the length of the user's thumb according to the relationship between the length of the thumb and the size of the fingerprint.
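One plausible realization of such a statistical relationship is a simple least-squares line over collected (fingerprint size, thumb length) pairs; the linear form and all data below are illustrative assumptions, not values from this application:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b over paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration data: fingerprint area (mm^2) vs thumb length (mm).
areas = [110.0, 125.0, 140.0, 160.0, 180.0]
lengths = [52.0, 55.0, 58.0, 62.0, 66.0]
a, b = fit_line(areas, lengths)
print(a * 150.0 + b)   # thumb length predicted from a new fingerprint
```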
  • the length of the user's thumb can also be determined according to a specific operation performed by the user on the screen 194 for determining the thumb length (such as the operation shown in FIG. 3), by directly inputting thumb-length data, and the like.
  • this embodiment of the application does not limit the manner of obtaining the length of the thumb.
  • This application obtains the length of the user's thumb to determine the display size of the floating screen when the "floating screen" function is subsequently turned on, ensuring that the user can reach every position on the floating screen while holding the mobile phone.
  • after the processor 110 obtains the length of the user's thumb, it also needs to determine the specific position at which the floating screen is displayed on the screen 194 according to the position at which the user holds the mobile phone 100.
  • as shown in FIG. 8, the processor 110 takes various positions on the edge of the screen 194 as centers (FIG. 9 takes the points O1 and O2 as examples; any position on the edge of the screen 194 may be used as a center) and forms N circles with the thumb length as the radius.
  • the edge position of the screen 194 refers to the position where the screen 194 is in contact with the casing of the mobile phone 100.
  • the touch device of the screen 194 sends the positions at which the user clicks on the display (black dots in the figure) to the processor 110, and the processor 110 can count the number of clicks within a certain period of time and the distribution of the click positions on the screen 194.
  • the statistical period can be set to five minutes, ten minutes, half an hour, one hour, or another duration, depending on how frequently the user clicked the screen 194 before the "floating screen" function was turned on.
  • the processor 110 detects the number of clicks covered by each circle, determines the circle that covers the most clicks, and uses the center of that circle as the holding point at which the user holds the mobile phone 100.
  • when the "floating screen" function is turned on, the processor 110 may also count the N clicks made before the function was turned on and determine the distribution of the click positions on the screen 194.
  • the number of clicks may be 10, 20, or another number, depending on how frequently the user uses the mobile phone.
  • as shown in FIG. 8, because the circle centered on O1 covers more clicks than the circle centered on O2, the processor 110 uses the center O1 as the holding point at which the user holds the mobile phone 100; a sketch of this vote follows.
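A minimal sketch of this holding-point vote, with click positions and candidate edge centers given as hypothetical (x, y) pixel coordinates:

```python
import math

def estimate_holding_point(clicks, edge_candidates, thumb_radius):
    """Return the screen-edge position whose circle of radius thumb_radius
    covers the most recorded click positions."""
    def covered(center):
        return sum(math.dist(center, c) <= thumb_radius for c in clicks)
    return max(edge_candidates, key=covered)

clicks = [(980, 1850), (850, 1700), (1010, 1640), (700, 1900)]
candidates = [(1080, 2340), (1080, 1800)]   # e.g. edge points O1 and O2
print(estimate_holding_point(clicks, candidates, thumb_radius=600))
```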
  • the screen 194 displays a mark prompting the user to click a specific position on the screen 194 to determine the holding point, and the clicked position is used as the holding point at which the user holds the mobile phone 100.
  • a subtitle "Please confirm the origin of the hand-held position (please click on the white area around the screen 194)" is displayed in the middle of the screen 194, and the background around the edges of the screen 194 is displayed in white or another eye-catching color.
  • the center of the screen 194 is gray or another color easily distinguished from the colors around the edges of the screen 194, prompting the user to click the white or other eye-catching areas around the screen 194.
  • after the user taps in that area, the touch device sends the clicked position to the processor 110, and the processor 110 uses this position as the holding point at which the user holds the mobile phone 100.
  • the processor 110 may also determine the holding point at which the user holds the mobile phone 100 according to the temperature sensor 180J detecting the temperature of the hand, the pressure sensor detecting the holding pressure, or a specific operation performed on the screen 194 as shown in FIG. 3.
  • this embodiment of the application does not limit the method of determining the position of the holding point at which the user holds the mobile phone.
  • the processor 110 determines on the screen 194 a comfort zone that can be operated by the thumb of the user.
  • the comfort zone is a fan-shaped area whose center is the holding point on the edge of the screen 194 at which the mobile phone 100 is held and whose radius is the thumb length.
  • if the thumb length is greater than the length of the shorter side of the screen, the above-mentioned fan-shaped area forms an irregular shape on the screen, as shown in FIG. 10.
  • the initial display size of the floating screen is the largest shape that fits in the comfort zone on the screen 194, as shown in FIG. 5.
  • the lower right corner of the floating screen coincides with the lower right corner of the screen 194, and the upper left corner of the floating screen is located on the edge of the comfort zone.
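Under this reading, because the floating screen keeps the screen's aspect ratio and is anchored at the holding corner, its diagonal must equal the thumb length, so the initial size follows from a single scale factor; a minimal sketch in assumed pixel units:

```python
import math

def initial_floating_size(screen_w, screen_h, thumb_len):
    """Largest rectangle with the screen's aspect ratio, anchored at the
    holding corner, whose opposite corner lies on the comfort-zone arc."""
    diagonal = math.hypot(screen_w, screen_h)
    scale = min(thumb_len / diagonal, 1.0)  # never larger than the screen
    return screen_w * scale, screen_h * scale

print(initial_floating_size(1080, 2340, thumb_len=700))  # roughly (293, 636)
```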
  • the processor 110 may also determine the initial display size of the floating screen according to the floating-screen sizes the user has used historically.
  • when the user holds the mobile phone in landscape orientation, the aspect ratio of the floating screen presented on the screen 194 is still the same as that of the screen 194, but the longer side of the floating screen is parallel to the shorter side of the UI of the screen 194, and the shorter side of the floating screen is parallel to the longer side of the UI of the screen 194.
  • because the aspect ratio of the floating screen is the same as that of the UI of the screen 194, the lower left corner of the floating screen coincides with the lower right corner of the screen 194 and the upper right corner of the floating screen lies on the edge of the comfort zone, so the floating screen formed in this way has the largest area.
  • the size of the floating screen is related to the thumb length and also to the position of the holding point. As shown in Figure 5, the longer the thumb, the larger the comfort zone formed and the larger the floating screen that can be formed in it; the shorter the thumb, the smaller the comfort zone and the smaller the floating screen. As shown in Figure 12, if the user's gesture of holding the mobile phone changes and the holding point moves from the lower right corner of the screen 194 to a position in the middle of the right side of the screen 194, the comfort zone formed on the screen 194 becomes larger, and the floating screen that can be formed in it is larger than the floating screen in Figure 8. Therefore, the size of the floating screen formed on the UI of the screen 194 varies with the position at which the user holds the mobile phone.
  • the content displayed on the floating screen may be different from the content displayed on the UI of the screen 194, for example, as shown in Figure 11(b). In this case, regardless of whether the user performs chat, video, or other operations on the WeChat interface on the floating screen, or performs operations such as "fast forward" and "rewind" on the video in the UI of the screen 194, after the processor 110 receives the operation instruction, the feedback of the operation instruction is displayed on the floating screen or the UI of the screen 194.
  • the content displayed on the floating screen may also be the same as the content displayed on the UI of the screen 194. In this case, regardless of whether the user operates on the floating screen or on the UI of the screen 194, after receiving the operation instruction the processor 110 displays the feedback of the operation instruction on both the floating screen and the UI of the screen 194, keeping the content displayed on the two interfaces the same.
  • considering that the user may use the mobile phone with the two hands in turn, the processor 110 obtains the position of the holding point when the user holds the mobile phone with the left hand and the position of the holding point when the user holds it with the right hand, and then, combined with the length of the user's thumb (generally speaking, a user's two hands are the same size, so the left and right hands are not distinguished here), forms on the screen 194 one comfort zone produced by the left-hand holding point and one produced by the right-hand holding point.
  • the processor 110 uses the area where the two comfort zones overlap as the area in which the floating screen is displayed. A floating screen displayed in this way can be operated with either the left hand or the right hand.
  • the processor 110 forms on the screen 194 a right comfort zone, a quarter-circle area centered on the lower right corner of the screen 194 with the thumb length as the radius, and a left comfort zone, a quarter-circle area centered on the lower left corner of the screen 194 with the thumb length as the radius.
  • the area where the right comfort zone and the left comfort zone overlap is the overlapping area.
  • when the processor 110 executes the "floating screen" function, it displays a floating screen in the overlapping area on the screen 194, as shown in FIG. 14.
  • the bottom edge of the floating screen coincides with the bottom edge of the screen 194.
  • the upper left corner of the floating screen lies on the edge of the comfort zone formed with the lower right corner of the screen 194 as the holding point.
  • the upper right corner of the floating screen lies on the edge of the comfort zone formed with the lower left corner of the screen 194 as the holding point, so the floating screen formed in this way has the largest area; a sketch of this computation follows.
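A sketch of this two-handed sizing: by symmetry the rectangle is horizontally centered, and requiring each top corner to lie on the arc drawn from the opposite bottom corner yields one quadratic in the width. The geometry below is our reading of the figure, with pixel units assumed:

```python
import math

def two_hand_floating_width(screen_w, screen_h, thumb_len):
    """Width w of the largest floating screen with the UI's aspect ratio whose
    bottom edge lies on the screen bottom and whose top corners lie on the
    opposite comfort arcs; solves ((screen_w + w)/2)^2 + (k*w)^2 = thumb_len^2
    with k = screen_h / screen_w."""
    k = screen_h / screen_w
    a = 0.25 + k * k
    b = screen_w / 2
    c = screen_w ** 2 / 4 - thumb_len ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return 0.0   # the arcs are too short to admit such a rectangle
    return max((-b + math.sqrt(disc)) / (2 * a), 0.0)

w = two_hand_floating_width(1080, 2340, thumb_len=700)
print(w, w * 2340 / 1080)   # width and height of the floating screen
```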
  • the processor 110 calculates the range and position on the screen 194 that the user can operate by obtaining the length of the user's thumb and the position of the holding point, and then displays the floating screen in this area. Because the floating screen is presented within the user's maximum operable range, the user can operate at any position on the floating screen, so the user can operate a large-screen device with one hand.
  • FIG. 15 is a flowchart of a method for making the floating screen follow the hand according to an embodiment of the application. As shown in FIG. 15, the specific implementation process in which the mobile phone 100 makes the floating screen follow the hand is as follows:
  • step S1501: after the mobile phone 100 determines the specific position and size of the activated floating screen on the screen 194, the floating screen is displayed at the corresponding position on the screen 194.
  • step S1502 the mobile phone 100 detects the position where the user holds the mobile phone in real time.
  • when the position at which the user holds the mobile phone changes, that is, the position of the holding point changes, the processor 110 re-detects the holding point. After determining the position of the new holding point O2, the processor 110 determines a new comfort zone on the screen 194 according to the thumb length and the new holding point O2.
  • to re-detect the position of the holding point, the mobile phone 100 can use the methods shown in FIGS. 3 and 9-10 and the corresponding descriptions to determine the holding point at which the user holds the mobile phone 100, and can also determine it by the temperature sensor 180J detecting the temperature of the hand, the pressure sensor detecting the holding pressure, and other methods.
  • step S1503 the mobile phone 100 determines whether the position where the user holds the mobile phone has changed; when it is detected that the position where the user holds the mobile phone 100 has not changed, step S1502 is executed; when it is detected that the position where the user holds the mobile phone 100 has changed, execute Step S1504;
  • step S1504 the mobile phone 100 obtains the position of the new holding point where the user holds the mobile phone, and determines the position of the new comfort zone on the screen 194 based on the length of the user's thumb.
  • after the mobile phone 100 re-determines, according to the thumb length and the position of the new holding point O2, the specific position and size of the activated floating screen on the screen 194, that is, the position of the new comfort zone on the screen 194, the user can move the floating screen into the new comfort zone by dragging it, or by double-tapping or long-pressing in the new comfort zone.
  • step S1505: the mobile phone judges whether the floating screen overlaps the new comfort zone; when they overlap, step S1506 is performed; when they do not overlap, step S1507 is performed.
  • step S1506 the mobile phone drags the floating screen to a corresponding position in the new comfort zone according to the drag of the user.
  • when a new comfort zone is displayed in another color on the screen 194, the mobile phone indicates to the user that the position at which the mobile phone 100 is held has changed. If the floating screen currently displayed on the screen 194 is not in the new comfort zone but partially overlaps it, the mobile phone can display a "Please drag the floating screen to this area" subtitle in the new comfort zone, letting the user drag the floating screen into the comfort zone. By tapping the screen 194 and dragging, the user can drag the floating screen to any position in the new comfort zone, drag only part of the floating screen into the new comfort zone, or even drag the floating screen to another location on the screen 194.
  • of course, the user can also move the floating screen into the new comfort zone with a specific gesture such as double-tapping, long-pressing, or drawing a small circle in it; this application is not limited here.
  • the final display position of the dragged floating screen is the end position of the drag operation.
  • when the user drags the floating screen, the distance and trajectory that the floating screen moves on the screen are the distance and trajectory of the user's drag on the screen.
  • step S1507: the mobile phone moves the floating screen into the new comfort zone in response to a specific gesture performed by the user in the new comfort zone, such as double-tapping, long-pressing, or drawing a small circle.
  • when a new comfort zone is displayed in another color on the screen 194, the mobile phone indicates to the user that the position at which the mobile phone 100 is held has changed. If the floating screen currently displayed on the screen 194 is not in the new comfort zone and does not overlap it at all, the mobile phone 100 can display a "Please double-tap this area" subtitle in the new comfort zone, prompting the user to double-tap in the new comfort zone. After receiving the "double-tap" operation instruction, the mobile phone moves the floating screen into the new comfort zone.
  • the size of the floating screen displayed in the new comfort zone can be the same as the size of the previously displayed floating screen, or it can be changed to the largest size that fits in the new comfort zone.
  • the mobile phone can also move the floating screen in response to a specific gesture such as a long press or drawing a small circle, or in response to a drag by the user.
  • This application is not limited here.
  • this embodiment of the application detects in real time the position at which the user holds the mobile phone; when a change in that position is detected, the comfort zone is re-determined and the floating screen is moved into the new comfort zone, ensuring that the user can reach any position in the floating screen with one hand. The whole loop is summarized in the sketch below.
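The S1501-S1507 loop can be condensed into the following sketch; the `phone` helper methods are hypothetical placeholders, not a real handset API:

```python
import math

def overlaps(rect, zone):
    """Approximate test: does any corner of the floating-screen rectangle
    fall inside the comfort circle (center, radius)?"""
    (cx, cy), r = zone
    x0, y0, x1, y1 = rect
    return any(math.hypot(x - cx, y - cy) <= r
               for x in (x0, x1) for y in (y0, y1))

def follow_hand_loop(phone):
    """Sketch of steps S1501-S1507: keep the floating screen reachable as
    the detected holding point moves."""
    holding = phone.detect_holding_point()                 # S1502
    while phone.floating_screen_enabled():
        new_holding = phone.detect_holding_point()
        if new_holding == holding:                         # S1503: unchanged
            continue
        holding = new_holding
        zone = (holding, phone.thumb_len)                  # S1504: new zone
        if overlaps(phone.floating_rect(), zone):          # S1505
            phone.prompt("Please drag the floating screen to this area")
            phone.await_drag_into(zone)                    # S1506
        else:
            phone.prompt("Please double-tap this area")
            phone.await_double_tap_then_move(zone)         # S1507
```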
  • FIG. 18 is a flowchart of a method for resolving the UI conflict between the floating screen and the screen 194 according to an embodiment of the application.
  • when the floating screen conflicts with the UI of the screen 194, the specific implementation process of the mobile phone 100 is as follows:
  • step S1801: the mobile phone 100 receives a user operation whose clicked position is where the UI of the screen 194 overlaps the floating screen.
  • step S1802 the mobile phone 100 reminds the user whether to close the floating screen; when the user chooses to close the floating screen, step S1803 is executed; when the user chooses not to close the floating screen, step S1804 is executed.
  • step S1803 the mobile phone 100 closes the floating screen function after receiving the user's instruction to close the floating screen.
  • after the user closes the "floating screen" function, the user can directly operate the UI of the screen, and the mobile phone 100 directly responds to operation events on the UI of the screen 194.
  • the floating screen can be closed by clicking a floating-screen close button (similar to the "X" in the upper right corner of a Windows program), by a shortcut (such as double-clicking the power button), and so on.
  • the embodiments of the present application are not limited herein.
  • step S1804: the mobile phone 100 activates the function of selecting, by press duration, the button at the corresponding position in the UI of the screen 194 or in the floating screen.
  • step S1805: the mobile phone 100 detects the press duration on the screen 194 when the user clicks the screen 194 and determines whether it exceeds the set threshold; when the press duration exceeds the set threshold, step S1806 is executed; when it does not, step S1807 is executed.
  • the processor 110 may also determine whether the user is operating the floating-screen interface or the UI of the screen 194 according to the press duration of each operation on the screen 194. When the press duration exceeds the set threshold, the processor 110 treats the operation event as an operation on the UI of the screen 194; when it does not, the processor 110 treats the operation event as an operation on the floating screen.
  • virtual indication marks are set on the UI of the screen 194 and on the floating screen; when a user operation is received, the corresponding mark changes from one color to another to indicate whether the operation was a click on the UI of the screen 194 or on the floating screen.
  • step S1806 the mobile phone 100 responds to the operation event on the UI of the screen 194.
  • step S1807 the mobile phone 100 responds to the operation event on the floating screen.
  • the mobile phone 100 in this embodiment of the application can also detect the force with which the user presses the screen 194 when clicking it, and determine whether the pressing force exceeds a set threshold.
  • when the force pressing the screen 194 is greater than the preset threshold, the mobile phone 100 responds to the operation event on the UI of the screen 194; when the force is not greater than the preset threshold, the mobile phone 100 responds to the operation event on the floating screen.
  • This application is not limited here.
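A condensed sketch of this dispatch (steps S1804-S1807); the threshold values are assumed rather than specified by this application:

```python
LONG_PRESS_MS = 500          # assumed duration threshold

def route_tap(tap, floating_rect, press_ms=0, pressure=None,
              pressure_threshold=None):
    """Decide whether a tap in the area where the floating screen overlaps
    the full-screen UI targets the UI (S1806) or the floating screen (S1807)."""
    x, y = tap
    x0, y0, x1, y1 = floating_rect
    if not (x0 <= x <= x1 and y0 <= y <= y1):
        return "ui"          # outside the overlap, no ambiguity to resolve
    if pressure is not None and pressure_threshold is not None:
        return "ui" if pressure > pressure_threshold else "floating"
    return "ui" if press_ms > LONG_PRESS_MS else "floating"

print(route_tap((300, 2000), (100, 1700, 900, 2300), press_ms=650))  # "ui"
```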
  • when the user performs an operation on the UI of the screen 194 and the clicked position is where the UI of the screen 194 overlaps the floating screen, the mobile phone 100 first reminds the user whether to close the floating screen; if the user does not close it, the mobile phone 100 determines from the user's press duration whether the click is an operation on the UI of the screen 194 or on the floating screen.
  • if the size of the floating screen presented on the screen 194 does not meet the user's requirements after the "floating screen" function is turned on, the user can zoom the floating screen in and out.
  • when the user places two fingers on the floating screen and slides them apart on the screen 194, the processor 110 receives the zoom-in instruction and enlarges the displayed size of the floating screen; when the user slides the two fingers toward each other, the processor 110 receives the zoom-out instruction and shrinks the displayed size.
  • the position of the enlarged floating screen may be inside or outside the comfort zone, which is not limited in this embodiment of the application.
  • through such specific operations, the floating screen can be enlarged or reduced so that the displayed floating screen has the size the user requires; a sketch follows.
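The pinch gesture reduces to scaling the floating screen by the ratio of the finger separations; a minimal sketch:

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end, width, height):
    """Fingers moving apart (ratio > 1) enlarge the floating screen;
    fingers moving together (ratio < 1) shrink it."""
    ratio = math.dist(p1_end, p2_end) / math.dist(p1_start, p2_start)
    return width * ratio, height * ratio

print(pinch_scale((100, 200), (300, 400), (80, 180), (340, 440), 300, 650))
```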
  • the embodiment of the application discloses an electronic device including a processor, and a memory, an input device, and an output device connected to the processor.
  • the input device and the output device can be integrated into one device; for example, the touch device of the screen can be used as the input device and the display of the screen as the output device.
  • the above electronic device may include: a screen 2201, the screen 2201 including a touch device 2206 and a display 2207; one or more processors 2202; one or more memories 2203; one or more sensors 2208; one or more application programs (not shown); and one or more computer programs 2204.
  • the above-mentioned devices can be connected through one or more communication buses 2205.
  • the one or more computer programs 2204 are stored in the aforementioned memory 2203 and configured to be executed by the one or more processors 2202; the one or more computer programs 2204 include instructions, and the instructions can be used to execute the steps in the foregoing embodiments. All related content of the steps in the foregoing method embodiments can be cited in the functional description of the corresponding physical device, and details are not repeated here.
  • the foregoing processor 2202 may specifically be the processor 110 shown in FIG. 1
  • the foregoing memory 2203 may specifically be the internal memory 121 and/or the external memory 120 shown in FIG. 1
  • the foregoing screen 2201 may specifically be the screen 194 shown in FIG. 1
  • the above-mentioned sensor 2208 may specifically be a gyroscope sensor 180B, an acceleration sensor 180E, and a proximity sensor 180G in the sensor module 180 shown in FIG. 1, and may also be one or more of an infrared sensor, a Hall sensor, etc.
  • the embodiment of this application does not impose any restriction on this.
  • the functional units in the various embodiments of the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage media include: flash memory, mobile hard disk, read-only memory, random access memory, magnetic disk or optical disk and other media that can store program codes.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A one-handed operation method and an electronic device. The method includes: displaying a first interface, which is the user interface shown on the screen; detecting a first trigger operation of the user; and displaying a floating interface according to the first trigger operation. After the user turns on the "floating screen" function, the electronic device determines, according to the acquired length of the user's thumb and the position at which the user holds the electronic device, an area that the user can reach with one hand, and then presents the floating interface within that area to facilitate the user's operations.

Description

一种单手操作的方法和电子设备
本申请要求于2019年11月29日提交中国专利局、申请号为201911203234.5、申请名称为“一种单手操作的方法和电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本发明涉及终端技术领域,尤其涉及一种单手操作的方法和电子设备。
背景技术
随着科技的演进和创新,终端的用途越来越广,但终端的用户界面(user interface,UI)是适应终端的屏幕大小的。对于屏幕较大的终端,部分用户在操作终端时,可能出现手掌过小点击不方便(够不到导致误触等)的情况。现有的单手操作模式只能将终端的UI缩小成固定大小,并且缩小的UI位置固定不变,或者由用户手动设置UI的尺寸,不能根据用户自动的进行缩放。
发明内容
本申请的实施例采用如下技术方案:
第一方面,本申请提供一种单手操作的方法,由电子设备执行,包括:显示第一界面,所述第一界面占据所述电子设备的屏幕的全部;检测到用户的第一触发操作;根据所述第一触发操作,显示悬浮界面,所述悬浮界面的尺寸小于所述第一界面的尺寸,所述悬浮界面位于在所述屏幕中用户单手操作的第一区域内,所述第一区域由用户握持所述电子设备的位置和用户的大拇指的长度确定。其中,第一界面为手机的屏幕UI,悬浮界面为用户开启“悬浮屏”功能时显示的界面,第一触发操作为用户开启“悬浮屏”功能,第一区域为用户单手操作时在屏幕上可操作的区域,即舒适区。
本申请实施例在用户单手使用电子设备过程中,当用户手指不能够在电子设备的屏幕中任意位置进行操作时,开启“悬浮屏”功能,通过获取用户的大拇指长度和用户握持电子设备的位置,计算出一个舒适区,该舒适区即为用户手指在屏幕上可操作到的区域,然后将悬浮界面显示在舒适区内,用户通过在悬浮界面上进行操作,以实现用户单手操作大屏设备。
在另一个可能的实现中,所述方法还包括:当所述悬浮界面为缩小的所述第一界面时,所述第一界面在所述屏幕上不再显示。
本申请实施例中,悬浮界面为缩小版的屏幕的UI,如果悬浮界面显示的内容与屏幕的UI显示的内容相同,或为屏幕的UI所要显示的内容,此时可以让原来的屏幕的UI不显示任何内容,即屏幕上除了悬浮屏以外的区域为黑屏,以减少电量的消耗,同时避免因屏幕的UI显示的内容干扰用户注意力。
在另一个可能的实现中,在显示所述悬浮界面之前,所述方法还包括:开启至少一 个摄像头,并提示用户拍摄手部的图像;获取所述用户的大拇指的长度包括:根据所述手部的图像,计算出所述大拇指的长度。
本申请实施例中,一种获取大拇指长度的方式为:当用户第一次开启“悬浮屏”功能或者在设置“悬浮屏”的功能(例如,设置触发开启的手势等)时,提示用户开启摄像头,在开启摄像头,通过摄像头获取用户手部图像,然后根据手部图像计算出用户大拇指长度。得到大拇指长度后,将其存储在存储器中,后续用户再次打开“悬浮屏”功能时,就不需要另行获取大拇指长度。
另外,考虑到用户在第一次打开“悬浮屏”功能时,不方便进行获取大拇指长度操作,所以在启动电子设备、摄像头、指纹传感器等器件时,提示用户录入大拇指长度,以便后续用户开启“悬浮屏”功能时不需要进行获取大拇指长度操作。
类似的,下述的获取大拇指长度和握持位置的时机也可以是在第一次开启“悬浮屏”功能或设置“悬浮屏”的功能时,或者其他时机,本发明实施例对此不作限定。
在另一个可能的实现中,在显示所述悬浮界面之前,所述方法还包括:提示用户单手握持所述电子设备时在所述屏幕上画弧形,并获取所述弧形的轨迹;获取所述用户的大拇指的长度包括:根据所述弧形的轨迹,计算出所述大拇指的长度。例如,可以参考实施例中图3相关的描述。
本申请实施例中,一种获取大拇指长度的方式为:在屏幕的UI提示用户进行指纹识别,在用户将手指放置在指纹传感器上时,指纹传感器采集用户的指纹,然后根据指纹的大小,并结合指纹大小与大拇指长度之间的关系,确定出用户大拇指长度。
另外,还可以在用户打开“悬浮屏”功能时,还可以利用电子设备之前注册指纹锁屏时已获取的指纹,这样避免用户进行获取指纹操作,以提升用户体验。
在另一个可能的实现中,在显示所述悬浮界面之前,所述方法还包括:提示用户单手握持所述电子设备时在所述屏幕上画弧形,并获取所述弧形的轨迹;获取所述用户的大拇指的长度包括:根据所述弧形的轨迹,计算出所述用户握持所述电子设备的位置。
本申请实施例中,一种获取大拇指长度的方式为:在屏幕的UI提示用户进行画一个圆弧,在用户画出一个圆弧后,处理器根据圆弧的曲率计算出圆弧的半径,来确定出用户大拇指长度。
当然,本申请获取大拇指长度的方式,不仅限于上述三种方式,还可以进行用于确定大拇指长度的特定操作、直接输入大拇指长度的数据等方式,来确定用户大拇指长度。
在另一个可能的实现中,在显示所述悬浮界面之前,所述方法还包括:提示用户单手握持所述电子设备时在所述屏幕上画弧形,并获取所述弧形的轨迹;获取所述用户握持所述电子设备的位置包括:根据所述弧形的轨迹,计算出所述用户握持所述电子设备的位置。
本申请实施例中,一种获取用户握持电子设备的位置的方式为:在电子设备获取大拇指长度时,用户在屏幕的UI上画出圆弧后,处理器不仅根据圆弧的曲率计算出圆弧的半径,来确定出用户大拇指长度,还可以根据计算得到的圆心,确定用户握持电子设备的位置。这样避免电子设备获取用户的大拇指长度和用户握持电子设备的位置时,让用户进行多次特定操作,从而提升用户体验。
在另一个可能的实现中,在显示所述悬浮界面之前,所述方法还包括获取用户在一段时间内在所述屏幕上进行操作的至少一个操作点;获取所述用户握持所述电子设备的位置包括:根据所述至少一个操作点,计算出所述用户握持所述电子设备的位置。
在另一个可能的实现中,所述根据所述操作点,计算出所述用户握持所述电子设备的位置,包括:根据所述至少一个操作点在屏幕上的位置,确定N个圆中覆盖所述至少一个操作点最多数量的一个圆的圆心为握持点位置,所述N个圆为在所述屏幕上以所述屏幕边缘位置上的N个位置为圆心和以所述大拇指长度为半径的圆,N为大于2的整数,所述握持点为所述用户握持手机时,掌心或大拇指处与所述屏幕的边缘处相接触的位置。
本申请实施例中,一种获取用户握持电子设备的位置的方式为:在开启“悬浮屏”后,处理器统计开启“悬浮屏”功能之前的一段时间内点击数量和各个点击位置在屏幕194上分布情况,检测以屏幕边缘各个位置为圆心、以大拇指长度为半径的圆覆盖操作点的数量,确定覆盖最多点击的圆,将该圆的圆心位置作为用户手持电子设备的握持点,从而确定用户握持电子设备的位置。由于这种方式获取用户的握持电子设备的位置,由处理器进行执行,所以不需要用户执行指定的操作,从而提升用户体验。
当然,本申请获取用户握持电子设备的位置的方式,不仅限于上述两种方式,还可以根据温度传感器180J检测手的温度、压力传感器检测握持压力、直接输入握持手机时掌心或大拇指与屏幕边缘位置相接触的位置等方式,来确定用户握持电子设备的位置。
在另一个可能的实现中,确定所述第一区域,包括:根据所述用户握持所述电子设备的位置,确定用户握持手机时掌心或大拇指处与所述屏幕的边缘处相接触的握持点;将所述握持点为圆心、所述大拇指长度为半径在所述屏幕上形成的区域作为所述第一区域。
在另一个可能的实现中,确定所述第一区域,包括:根据至少两个所述用户握持所述电子设备的位置,确定至少两个用户握持手机时掌心或大拇指处与所述屏幕的边缘处相接触的握持点;将至少两个握持点为圆心、所述大拇指长度为半径在所述屏幕上形成的至少两个区域之间重叠区域作为所述第一区域。
本申请实施例中,考虑到用户两只手轮流交换使用电子设备时,此时处理器获取用户左手握持手机的位置和用户右手握持手机的位置,然后结合用户的大拇指长度,在屏幕上形成一个以左手握持点产生的舒适区和一个以右手握持点产生的舒适区。处理器将两个舒适区重叠的区域作为显示悬浮屏的区域。这样显示的悬浮屏可以让用户使用左手进行操作,也可以让用户使用右手进行操作。
在另一个可能的实现中,所述悬浮界面以最大尺寸的形状显示于所述第一区域内,所述悬浮界面的形状与所述第一界面的形状相同。此时呈现的悬浮界面至少有一个角位于舒适区的边缘上。通过以最大尺寸的形状在舒适区内呈现,以方便用户观看悬浮界面的内容和对悬浮界面内的应用进行操作。
在另一个可能的实现中,所述方法还包括:检测到用户握持所述电子设备的位置发生变化,根据更新的用户握持所述电子设备的位置和所述大拇指的长度确定第二区域,并在所述第一界面上显示所述第二区域。其中,第二区域为用户握持所述电子设备的位置发生变化时在屏幕上可操作的区域,即新的舒适区。
在另一个可能的实现中,所述方法还包括:检测到用户在所述第二区域内进行点击 操作,响应于所述点击操作,将所述悬浮界面显示在所述第二区域内。
本申请实施例中,在确定新的舒适区后,检测用户在悬浮界面内的操作,当用户在新的舒适区内进行双击、长按、画个小圆圈等特定的手势,将悬浮界面移动到新的舒适区内。
在另一个可能的实现中,所述方法还包括:检测到用户在所述第二区域内进行点击操作,当所述悬浮界面与所述第二区域不重叠时,响应于所述点击操作,将所述悬浮界面显示在所述第二区域内。
在另一个可能的实现中,所述方法还包括:检测到用户将所述悬浮界面从所述第一区域移动到所述第二区域的拖拽操作,响应于所述拖拽操作,在所述拖拽操作的结束位置显示所述悬浮界面。
本申请实施例中,在确定新的舒适区后,检测用户在悬浮界面内的操作,当用户进行拖拽操作后,将所述悬浮界面拖拽到新的舒适区内的拖拽操作的结束位置,且悬浮屏在屏幕上移动的距离和轨迹即为用户在屏幕上拖拽的距离和轨迹。
在另一个可能的实现中,所述方法还包括:检测到用户将所述悬浮界面从所述第一区域移动到所述第二区域的拖拽操作,当所述悬浮界面与所述第二区域重叠时,响应于所述拖拽操作,在所述拖拽操作的结束位置显示所述悬浮界面。
在另一个可能的实现中,所述方法还包括:检测到作用于第一位置的操作,其中,所述悬浮界面包括第一控件,所述第一界面包括第二控件,所述第一控件在所述屏幕上显示的位置为所述第一位置,所述第二控件在所述屏幕上显示的位置为第二位置,且所述第一位置与所述第二位置至少部分重叠;提示是否关闭所述悬浮界面的显示;若收到关闭显示的指令,不再显示所述悬浮界面,所述第一界面对应的应用响应所述作用于所述第一位置的操作,并根据所述第二控件显示对应的界面。其中,第一控件为电子设备上的屏幕中的触控器件,第二控件为触控器件的一部分,第一位置为电子设备上的屏幕中的显示器显示的位置,第二位置为显示器显示的位置的一部分。
本申请实施例中,当用户在屏幕的UI上进行操作时,点击的位置为屏幕的UI与悬浮界面相重叠处,为了避免操作相冲突,用户可以在操作之前关闭“悬浮屏”功能,也可以在用户点击后,电子设备提醒用户是否关闭悬浮界面,在接收到用户关闭指令后,关闭“悬浮屏”功能。
在另一个可能的实现中,所述方法还包括:若未收到所述关闭显示的指令,判断触发所述第一控件的操作的压力值或时长中的至少一个是否大于设定值;若所述触发所述第一控件的操作的压力值或时长中的至少一个大于所述设定值,所述第一界面对应的应用响应所述作用于所述第一位置的操作,并根据所述第二控件显示对应的界面;若所述触发所述第一控件的操作的压力值或时长中的至少一个不大于所述设定值,所述悬浮界面对应的应用响应所述作用于所述第一位置的操作,并根据所述第一控件在所述悬浮界面占据的区域内显示对应的界面。
本申请实施例中,如果用户不关闭“悬浮屏”功能时,电子设备则根据用户进行操作时,对屏幕按压时间或力度等因素来判断用户此次点击为对屏幕的UI的操作还是对悬浮界面的操作,从而确定此次点击对应的响应事件。
在另一个可能的实现中,所述方法还包括:检测到作用于所述悬浮界面内同一时间 有至少两个操作点,且所述至少两个操作点位置随时间的变化相距越来越远时,将所述悬浮界面的尺寸放大;检测到作用于所述悬浮界面内同一时间有至少两个操作点,且所述至少两个操作点位置随时间的变化相距越来越近时,将所述悬浮界面的尺寸缩小。
本申请实施例中,可以根据用户对悬浮界面的放大操作和缩小操作,将悬浮界面的尺寸进行放大或缩小,以实现根据用户的需要,显示不同尺寸的悬浮界面。
第二方面,本申请提供一种单手操作的装置,该装置执行如第一方面中任一项所述的方法。
第三方面,本申请提供一种电子设备,包括:屏幕,用于显示第一界面,所述第一界面占据所述电子设备的屏幕的全部,且根据所述第一触发操作,显示悬浮界面;一个或多个处理器;一个或多个存储器;一个或多个传感器;以及一个或多个计算机程序,其中所述一个或多个计算机程序被存储在所述一个或多个存储器中,所述一个或多个计算机程序包括指令,当所述指令被所述电子设备执行时,使得所述电子设备执行如第一方面中任一项所述的方法。
第四方面,本申请提供一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令在电子设备上运行时,使得所述电子设备执行如第一方面中任一项所述的方法。
第五方面,本申请提供一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如第一方面中任一项所述的方法。
可以理解地,上述提供的第三方面所述的电子设备、第四方面所述的计算机存储介质,以及第五方面所述的计算机程序产品均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的一种电子设备的结构示意图;
图2为触摸点和握持点在手掌中的位置示意图;
图3为用户单手在屏幕上进行操作的示意图;
图4为舒适区在屏幕上的位置示意图;
图5为悬浮屏在舒适区内显示的位置示意图;
图6为本申请实施例提供的手机的软件结构框图;
图7为摄像头获取用户手掌信息的示意图;
图8为本申请实施例提供的一种确定用户握持手机时握持点的位置示意图;
图9为本申请实施例提供的一种确定用户握持手机时握持点的位置示意图;
图10为本申请实施例提供的不同握持点产生的舒适区形状示意图;
图11(a)为本申请实施例提供的横屏握持手机时悬浮屏在舒适区内位置示意图;
图11(b)为本申请实施例提供的横屏握持手机时屏幕的UI和悬浮屏中显示的内容之间的关系示意图;
图12为本申请实施例提供的握持点在屏幕边缘的中间位置时悬浮屏在舒适区内位置示意图;
图13为本申请实施例提供的一种确定舒适区的位置示意图;
图14为本申请实施例提供的悬浮屏在舒适区内位置示意图;
图15为本申请实施例提供的一种悬浮屏跟手方法的流程图;
图16为本申请实施例提供的一种移动悬浮屏场景的示意图;
图17为本申请实施例提供的一种移动悬浮屏场景的示意图;
图18为本申请实施例提供的一种解决悬浮屏与屏幕的UI相冲突的方法的流程图;
图19为本申请实施例提供的包括状态指示标识的屏幕的UI和悬浮屏的示意图;
图20为本申请实施例提供的放大悬浮操作示意图;
图21为本申请实施例提供的缩小悬浮操作示意图;
图22为本申请实施例提供的一种电子设备的结构示意图。
具体实施方式
下面将结合附图对本实施例的实施方式进行详细描述。
本申请实施例提供的一种单手操作的方法,可应用于手机、平板电脑、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、手持计算机、上网本、个人数字助理(personal digital assistant,PDA)、可穿戴设备、虚拟现实设备等具有屏幕194的电子设备中,本申请实施例对此不做任何限制。
以手机100为上述电子设备举例,图1示出了手机的结构示意图。
手机100可以包括处理器110、外部存储器接口120、内部存储器121、通用串行总线(universal serial bus,USB)接口130、充电管理模块140、电源管理模块141、电池142、天线1、天线2、射频模块150、通信模块160、音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、传感器模块180、按键190、马达191、指示器192、摄像头193、屏幕194、以及用户标识模块(subscriber identification module,SIM)卡接口195等。
可以理解的是,本申请实施例示意的结构并不构成对手机100的具体限定。在本申请另一些实施例中,手机100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP)、调制解调处理器110、图形处理器110(graphics processing unit,GPU)、图像信号处理器110(image signal processor,ISP)、控制器、存储器、视频编解码器、数字信号处理器110(digital signal processor,DSP)、基带处理器110、和/或神经网络处理器110(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器110中。
其中,控制器可以是手机100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口、集成电路内置音频(inter-integrated circuit sound,I2S)接口、脉冲编码调制(pulse code modulation,PCM)接口、通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口、移动产业处理器接口(mobile industry processor interface,MIPI)、通用输入输出(general-purpose input/output,GPIO)接口、用户标识模块(subscriber identity module,SIM)接口、和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。PCM接口也可以用于音频通信,将模拟信号抽样、量化和编码。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。
MIPI接口可以被用于连接处理器110与屏幕194,摄像头193等外围器件。MIPI接口包括摄像头193串行接口(camera serial interface,CSI)、显示屏串行接口(display serial interface,DSI)等。在GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193、屏幕194、通信模块160、音频模块170、传感器模块180等。GPIO接口还可以被配置为I2C接口、I2S接口、UART接口、MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口、Micro USB接口、USB Type C接口等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对手机100的结构限定。在本申请另一些实施例中,手机100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。
电源管理模块141用于连接电池142、充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110、内部存储器121、外部存储器、屏幕194、摄像头193、通信模块160等供电。电源管理模块141还可以用于监测电池容量、电池循环次数、电池健康状态(漏电,阻抗)等参数。手机100的无线通信功能可以通过天线1、天线2、射频模块150、通信模块160、调制解调处理器110以及基带处理器110等实现。
天线1和天线2用于发射和接收电磁波信号。手机100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。射频模块150可以提供应用在手机100上的包括2G/3G/4G/5G等无线通信的解决方案。射频模块150可以包括至少一个滤波器、开关、功率放大器、低噪声放大器(low noise amplifier,LNA)等。射频模块150可以由天线1接收电磁波,并对接收 的电磁波进行滤波、放大等处理,传送至调制解调处理器110进行解调。射频模块150还可以对经调制解调处理器110调制后的信号放大,经天线1转为电磁波辐射出去。
调制解调处理器110可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器110处理。低频基带信号经基带处理器110处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过屏幕194显示图像或视频。通信模块160可以提供应用在手机100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络)、蓝牙(Bluetooth,BT)、全球导航卫星系统(global navigation satellite system,GNSS)、调频(frequency modulation,FM)、近距离无线通信技术(near field communication,NFC)、红外技术(infrared,IR)等无线通信的解决方案。通信模块160可以是集成至少一个通信处理模块的一个或多个器件。通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,手机100的天线1和射频模块150耦合,天线2和通信模块160耦合,使得手机100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM)、通用分组无线服务(general packet radio service,GPRS)、码分多址接入(code division multiple access,CDMA)、宽带码分多址(wideband code division multiple access,WCDMA)、时分码分多址(time-division code division multiple access,TD-SCDMA)、长期演进(long term evolution,LTE)、5G、BT、GNSS、WLAN、NFC、FM、和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS)、全球导航卫星系统(global navigation satellite system,GLONASS)、北斗卫星导航系统(beidou navigation satellite system,BDS)、准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
手机100可以通过ISP、摄像头193、视频编解码器、GPU、屏幕194,以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头193感光元件上,光信号转换为电信号,摄像头193感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点、亮度和肤色进行算法优化。ISP还可以对拍摄场景的曝光、色温等参数优化。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器110用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机100在频点选择时,数字信号处理器110用于对频点能量进行傅 里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机100可以支持一种或多种视频编解码器。这样,手机100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1、MPEG2、MPEG3、MPEG4等。
在本申请实施例中,可使用摄像头采集用户的手部信息。例如,当用户首次启动“悬浮屏”功能时,手机100提示用户录入手部信息。处理器110在用户输入“打开摄像头193”的指令后,打开摄像头193进行拍摄,以获取用户的手部信息。其中,手部信息可以包括手掌大小、各个手指的长度、各个手指的指纹等信息。本申请实施例通过获取用户的手部信息,主要是为了得到用户的大拇指长度。
大拇指的长度为用户的大拇指在进行操作时与屏幕194相接触的位置(后续称为触摸点)和用户采用图3所示的单手持手机100的姿势时,用户的手掌与屏幕194边缘相接触的位置(后续称为握持点)之间的距离。如图2所示,触摸点一般在大拇指的指纹区,如图2中M点所示;握持点一般在掌心或掌心偏右的区域中某一点的位置,如图2中虚线区域内的A、B、C和D点位置。这样大拇指的长度即为M点到A点、B点、C点或D点的距离。
对于男性用户来说,其手掌比较大,在握持手机100时,握持点大多在图2中A点和B点的位置,对于女性用户来说,其手掌比较小,在握持手机100时,握持点大多在图2中C点和D点的位置。所以处理器110在计算大拇指的长度时,根据获取的图像中手掌的大小,来选定握持点。也即手掌越大,选取的握持点在虚线区域内越靠近左边;手掌越小,选取的握持点在虚线区域内越靠近右边。
在一种实施例中,处理器110控制至少两个摄像头193中一个摄像头193获取包括用户手部的红绿蓝(red green blue,RGB)图像,控制至少两个摄像头193中另一个摄像头193获取包括用户手部的图像深度信息,然后根据至少两个摄像头193获取RGB图像和图像深度信息,计算得到红绿蓝深度图像(red green blue-depth map,RGB-D)。
作为一种举例,处理器110根据RGB-D图像识别出触摸点M和握持点A(以握持点A为例),然后结合RGB图像的分辨率H(Height)*W(Width),计算出触摸点M和握持点A在RGB图像中的位置M1(Xp,Yp)和A1(Xf,Yf)。然后根据图像深度信息,计算出触摸点M和握持点A在RGB-D图像中位置M2(Xp,Yp,Zp)和A2(Xf,Yf,Zf)。
处理器110将在RGB-D坐标下的坐标点转换为空间中的笛卡尔坐标系下的坐标点,计算过程具体如下:
$$X_{sp}=\frac{(X_p-C_x)\cdot Z_p}{F_x},\quad Y_{sp}=\frac{(Y_p-C_y)\cdot Z_p}{F_y},\quad Z_{sp}=Z_p \tag{1}$$

$$X_{sf}=\frac{(X_f-C_x)\cdot Z_f}{F_x},\quad Y_{sf}=\frac{(Y_f-C_y)\cdot Z_f}{F_y},\quad Z_{sf}=Z_f \tag{2}$$
其中,Cx、Cy、Fx和Fy为获取RGB-D图像的摄像头的内参数据,Cx,Cy为图像原点相对于光圈中心成像点的纵横偏移量(单位:像素),Fx=f/dx,其中f为相机的焦距,dx为x方向的一个像素占多少长度单位,Fy=f/dy,其中f为相机的焦距,dy为y方向的一个像素占多少长度单位。
由上述公式(1)和公式(2)计算得到空间中以获取RGB-D图像的摄像头为原点的笛卡尔坐标系下的触摸点M3(Xsp,Ysp,Zsp)和握持点A3(Xsf,Ysf,Zsf)后,计算大拇指长度d为:
$$d=\sqrt{(X_{sp}-X_{sf})^2+(Y_{sp}-Y_{sf})^2+(Z_{sp}-Z_{sf})^2} \tag{3}$$
其中,分辨率中H(Height)表示图像中在第一方向上占的点数的单位,W(Width)表示图像中在第二方向上占的点数的单位,第一方向一般为与手机屏幕较短的边平行,第二方向与手机屏幕较长的边平行。
在一种实施例中,处理器110控制至少两个摄像头193中两个摄像头193获取两张包括用户手部的RGB图像,然后根据双目摄像头193原理计算出RGB-D图像。处理器110得到包括用户手部的RGB-D图像后,根据RGB图像的分辨率H(Height)*W(Width)和图像深度信息,计算出大拇指的长度。
本申请实施例通过获取用户的手部信息,以获取大拇指的长度,用于在后续开启“悬浮屏”功能时,确定悬浮屏显示的尺寸大小,以保证用户在握持手机时,都能够对悬浮屏的各个位置进行操作。
手机100通过GPU、屏幕194,以及应用处理器等实现定位功能和显示功能。
GPU为图像处理的微处理器,连接屏幕194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。在本申请实施例中,屏幕194中可包括显示器和触控器件。显示器用于向用户输出显示内容,触控器件用于接收用户在屏幕194上输入的触摸事件。处理器110还可以接收屏幕194发送的触控事件后,确定用户进行操作时的点击位置。其中,触控器件可以为触摸传感器180K。
在手机100获取用户大拇指长度后,处理器110获取用户在屏幕194上进行操作的点击位置,然后结合大拇指长度,确定用户握持手机时握持点的位置。
例如,如图3所示,在用户开启“悬浮屏”功能后,屏幕194上显示一个用于指示用户在屏幕194上划一个圆弧的提示。当用户的掌心或大拇指关节处正好与屏幕194的边缘相接触,并按照指示在屏幕194上划一个圆弧(如图3中虚线所示)后,触控器件将接收到的进行操作的点击位置发送给处理器110。处理器110接收到点击位置后,根据圆弧的曲率计算出该圆弧在屏幕194四周的边缘上的圆心位置,将该圆心位置作为用户握持手机时握持点的位置。在一种实施例中,如果用户握持手机时,用户的掌心或大拇指关节处没有与屏幕194的边缘相接触时,也即用户的掌心或大拇指关节处在手机边缘以外,或在屏幕194上面。当用户按照指示在屏幕194上划一个圆弧后,触控器件将接收到的进行操作的点击位置发送给处理器110。处理器110接收到点击位置后,根据圆弧的曲率计算出该圆弧的圆心位置,此时的圆心在屏幕内部或在屏幕194以外的位置。此时处理器根据用户画出的圆弧,对圆弧的曲率进行增大或减少,以使计算得到的圆心在屏幕194四周的边缘上。
在一种实施例中,处理器110在得到用户握持手机时握持点的位置后,然后根据握持点的位置(也即圆弧的圆心位置)和大拇指长度,将以该握持点的位置作为圆心,以大拇指长度作为半径,在屏幕194上确定一个用户大拇指可操作到的区域(下文将该区域称为舒适区)如图4所示的虚线区域。以保证得到的舒适区为用户最大可操作到的区域。
然后处理器110在屏幕194上的舒适区内呈现一个悬浮屏,如图5所示。本申请实施例中,悬浮屏为屏幕194的UI的缩小版的UI,例如,可以是等比例的缩小版,例如,该UI的长宽比与屏幕194的UI的长宽比相同。本领域技术人员也可以理解,悬浮屏也可以是不等比例的缩小版。其中,UI的长边是指屏幕的较长的边,UI的宽边是指屏幕的较短的边。
本申请实施例中,由于悬浮屏呈现的位置是在用户最大可操作范围以内,所以用户可以在该悬浮屏上任意位置进行操作,实现更加方便用户单手操作大屏设备。
NPU为神经网络(neural-network,NN)计算处理器110,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行手机100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等。存储数据区可存储手机100使用过程中所创建的数据(比如音频数据、电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、通用闪存存储器(universal flash storage,UFS)等。
手机100可以通过音频模块170、扬声器170A、受话器170B、麦克风170C、耳机接口170D、以及应用处理器等实现音频功能。例如音乐播放、录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。手机100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当手机100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”或“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。手机100可以设置至少一个麦克风170C。耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
按键190包括开机键、音量键等。按键190可以是机械按键。也可以是触摸式按键。手机100可以接收按键输入,产生与手机100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照、音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于屏幕194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息、未接来电、通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和手机100的接触和分离。手机100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡、Micro SIM卡、SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。手机100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,手机100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在手机100中,不能和手机100分离。
上述手机100的软件系统可以采用分层架构、事件驱动架构、微核架构、微服务架构或云架构。本申请实施例以分层架构的Android系统为例,示例性说明手机100的软件结构。
图6是本申请实施例的手机100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层、应用程序框架层、安卓运行时(Android runtime)和系统库,以及内核层。
其中,应用程序层可以包括一系列应用程序包。如图6所示,应用程序层内可以安装相机、图库、日历、通话、地图、导航、蓝牙、音乐、视频、短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。如图6所示,应用程序框架层可以包括显示策略服务和显示管理服务。当然,应用程序框架层中还可以包括活动管理器、窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等,本申请实施例对此不作任何限制。
其中,显示策略服务可获取底层系统上报的数据,例如,摄像头获取的图片数据、触控器件获取的触摸点等数据。
具体地,当用户开启“悬浮屏”功能时,手机100可以接收到用户获取手部信息的触发操作。例如,该触发操作可以为用户开启摄像头193操作、点击屏幕194的操作、指纹 解锁操作、输入数据等操作。接收到该操作后,手机100的底层系统向显示策略服务发送触摸点等数据。
仍如图6所示,应用程序框架层以下的系统库和内核层等可称为底层系统,底层系统中包括用于提供显示服务的底层显示系统,例如,底层显示系统包括内核层中的显示驱动以及系统库中的surface manager等。并且,本申请中的底层系统还包括用于获取用户大拇指长度的状态监测服务,该状态监测服务可独立设置在底层显示系统内,也可设置在系统库和/或内核层内。
示例性的,状态监测服务可调用传感器服务(sensor service)启动摄像头193、触控器件等传感器进行检测。状态监测服务可根据各个传感器上报的检测数据发送显示策略服务。显示策略服务根据获取的数据计算出大拇指长度。
然后,显示策略服务可获取底层系统上报的其它数据,例如触控器件获取多个触摸点、温度传感器获取温度等数据。
具体地,触控器件还可以检测到用户获取握持手机的位置的触发操作。例如,该触发操作可以为用户点击屏幕194操作、滑动屏幕194操作、开启传感器等操作。接收到该操作后,底层系统向显示策略服务发送触摸点等数据。
状态监测服务还可调用传感器服务(sensor service)启动屏幕194的触控器件、温度传感器等传感器进行检测。状态监测服务可根据各个传感器上报的检测数据发送显示策略服务。显示策略服务根据获取的数据计算出用户握持手机时握持点的位置。
显示策略服务根据用户大拇指长度和用户握持手机时握持点的位置,确定屏幕194上用户可以操作到的区域(也即舒适区),然后通知显示管理服务在舒适区内显示悬浮屏。
本申请实施例通过底层系统获取用户大拇指长度和用户握持手机时握持点的位置,然后上报到显示策略服务,显示策略服务根据大拇指长度和握持手机时握持点的位置,计算出舒适区在屏幕194中的位置,在通知显示管理服务在舒适区内显示悬浮屏。
下面将通过几个具体实施例来讲述本申请上述方案的实现过程。
实施例一
当手机首次启动“悬浮屏”功能时,手机提醒用户录入手部信息。手机在录入用户的手部信息后,处理器110根据获取的手部信息,计算出用户的大拇指长度。
示例性的,如图7所示,手机100在接收到用户输入的“打开摄像头193”指令后,打开摄像头193进行拍摄。当用户将手掌放置在手机100的摄像头193的前方,摄像头193获取包括用户手部的图像后,通过处理器110进行处理,以获取用户大拇指长度。
本申请实施例中,至少两个摄像头193采集手部信息后,发送给处理器110。处理器110根据摄像头193采集手部信息,生成的RGB-D图像,然后根据分辨率H(Height)*W(Width)和图像深度信息,计算出大拇指的长度。
需要特别说明的是,本申请实施例获取用户的大拇指长度的方式不仅限于上述通过摄像头193获取,还可以通过指纹传感器180H来获取。一般而言,人类的指纹大小与手掌大小和大拇指长度有密切关联。指纹越大,手掌越大,大拇指长度越长;指纹越小,手掌越小,大拇指长度越短。具备指纹传感器180H的手机,在指纹传感器180H进行指纹识别过程中,获取到用户的拇指指纹信息后,发送给处理器110。处理器110接收到指纹信息后,可以根据指纹的大小,确定用户的大拇指长度。
作为一种举例,处理器110可以采集大量用户的手掌大小信息和指纹大小信息,然后进行统计并处理,确定大拇指长度和指纹大小之间的关系。在处理器110从指纹传感器180H处得到用户指纹信息后,根据大拇指长度和指纹大小之间的关系,确定该用户的大拇指长度。
另外,还可以根据用户在屏幕194上进行用于确定大拇指长度的特定操作(例如图3所示的操作等)、直接输入大拇指长度的数据等方式,来确定用户大拇指长度。在此,本申请实施例对如何获取大拇指长度的方式上不作限定。
本申请通过获取用户的大拇指长度,用于在后续开启“悬浮屏”功能时,确定悬浮屏显示的尺寸大小,以保证用户在握持手机时,都能够对悬浮屏的各个位置进行操作。
处理器110在获取到用户的大拇指长度后,还需要根据用户的握持手机100的位置,来确定悬浮屏显示在屏幕194的具体位置。
示例性的,如图8所示,处理器110会根据大拇指长度,以屏幕194边缘上各个位置为圆心(图9中以O1和O2点为例,还可以是屏幕194边缘上任意位置作为圆心),以大拇指长度为半径形成N个的圆。其中,屏幕194边缘位置是指屏幕194的与手机100外壳相接触的位置。
在用户使用手机过程中,屏幕194的触控器件将用户对显示器进行操作的点击位置(图中黑色圆点)发送给处理器110,处理器110可以统计一定时间内点击数量和各个点击位置在屏幕194上分布情况。其中,统计的时间可以设置为五分钟、十分钟、半个小时、一个小时或其它时间时长,具体根据在开启“悬浮屏”功能之前用户使用手机时,对屏幕194进行操作的点击频率有关。然后处理器110检测各个圆覆盖点击数量,确定覆盖最多点击的圆,将该圆的圆心位置作为用户手持手机100的握持点。
另外,处理器110还可以在开启“悬浮屏”功能时,统计在开启之前N个点击数量,并确定各个点击位置在屏幕194上分布情况。其中,点击数量可以为10、20或其它数量,具体根据用户使用手机频率有关。
其中,如图8所示,由于以O1为圆心的圆覆盖的点击的数量比以O2为圆心的圆覆盖的点击的数量多,处理器110将圆心O1作为用户手持手机100的握持点。
示例性的,如图9所示,在用户开启"悬浮屏"功能时,屏幕194上显示一个提示用户点击屏幕194特定位置来确定握持点的标识,通过让用户点击该特定位置,来确定用户手持手机100的握持点。
其中,如图9所示,屏幕194中间位置上显示一个“请确定手持位置原点(请点击屏幕194四周的白色区域)”字幕,并将屏幕194四周的位置背景显示白色或其它醒目的颜色,屏幕194中心位置为灰色或其它易与屏幕194四周颜色区分的颜色,以提示用户点击屏幕194四周白色或其它醒目的颜色区域。当用户在屏幕194四周的白色或其它醒目的颜色区域点击了一下或两下,触控器件将该点击位置发送给处理器110,处理器110将该位置作为用户手持手机100的握持点。
另外,处理器110还可以根据温度传感器180J检测手的温度、压力传感器检测握持压力、如图3所示在屏幕194上进行特定操作等方式来确定用户手持手机100的握持点。在此,本申请实施例对如何确定用户握持手机的握持点的位置方法不作限定。
处理器110在得到用户握持手机100的握持点后,结合大拇指长度,在屏幕194上确定一个用户大拇指可操作到的舒适区。其中,舒适区是以在屏幕194边缘的握持手机100的握持点为圆点、以大拇指长度为半径的扇形区域。或者,如果大拇指长度大于屏幕的较短的边的长度时,上述扇形区域在屏幕上形成不规则形状,如图10所示。
示例性的,如图3所示,当用户右手握持手机100,且握持点在手机屏幕194的右下角位置。此时在屏幕194上形成一个以屏幕194右下角为圆点和大拇指长度为半径的1/4圆形区域,即为舒适区,如图4所示。处理器110执行开启“悬浮屏”功能时,在屏幕194上的舒适区内显示一个悬浮屏。
在一些实施例中,悬浮屏初始显示的尺寸为在屏幕194上的舒适区中最大形状,如图5所示。其中,悬浮屏的右下角和屏幕194右下角重合,悬浮屏的左上角位于舒适区的边缘上。当然,处理器110还可以根据用户历史习惯使用的悬浮屏尺寸,来确定此次悬浮屏初始显示的尺寸。
如果用户横屏握持手机时,如图11(a)所示,此时在屏幕194上呈现的悬浮屏的长宽比仍与屏幕194相同,但是悬浮屏的较长的边与屏幕194的UI的较短的边平行,悬浮屏的较短的边与屏幕194的UI的较长的边平行。其中,由于悬浮屏的长宽比和屏幕194的UI相同,所以悬浮屏的左下角和屏幕194右下角重合,悬浮屏的右上角位于舒适区的边缘上,这样形成的悬浮屏的面积最大。
作为一种举例,如图11(b)所示,用户横屏握持手机在屏幕194的UI上观看“爱奇艺”中电影视频时,“微信”来了消息,如果用户并不想退出视频,则可以开启“悬浮屏”功能,然后让悬浮屏显示微信聊天界面。此时为了方便用户在悬浮屏中微信上进行聊天,悬浮屏显示的微信界面与屏幕UI相垂直。
在一些实施例中,悬浮屏的大小跟大拇指的长度有关,也跟握持点的位置有关。如图5所示,如果大拇指的长度越长,形成的舒适区越大,可在舒适区形成的悬浮屏就越大;如果大拇指的长度越短,形成的舒适区越小,可在舒适区形成的悬浮屏就越小。如图12所示,如果用户握持手机的手势发生变化时,握持点从屏幕194的右下角的位置变化到屏幕194的右侧的中间的某位置时,此时的在屏幕194上形成的舒适区比图12中形成的舒适区要大,可在舒适区形成的悬浮屏比图8中的悬浮屏就要大。因此,用户握持手机的位置不同,在屏幕194的UI上形成的悬浮屏的大小也不同。
在一些实施例中,悬浮屏上显示的内容可以与屏幕194的UI显示的内容不相同,例如图11(b)所示,此时无论用户在悬浮屏中的微信界面上进行聊天、视频等操作,还是在屏幕194的UI中的视频上进行“快进”、“倒退”等操作,处理器110在接收到操作指令后,将该操作指令的反馈在悬浮屏或屏幕194的UI都进行显示。
当然,悬浮屏上显示的内容也可以与屏幕194的UI显示的内容相同。此时无论用户在悬浮屏上进行操作,还是在屏幕194的UI上进行操作,处理器110在接收到操作指令后,将该操作指令的反馈在悬浮屏和屏幕194的UI上都进行显示,使两个界面显示的内容保持相同。
本申请实施例中,考虑到用户两只手轮流交换使用手机时,此时处理器110获取用户左手握持手机时握持点的位置和用户右手握持手机时握持点的位置,然后结合用户的大拇指长度(一般来说,用户的两只手大小相同,所以在此不分左右手),在屏幕194上形成一个以左手握持点产生的舒适区和一个以右手握持点产生的舒适区。处理器110将两个 舒适区重叠的区域作为显示悬浮屏的区域。这样显示的悬浮屏可以让用户使用左手进行操作,也可以让用户使用右手进行操作。
示例性的,如图13所示,当用户右手握持手机,得到右握持点,当用户左手握持手机,得到左握持点,两个握持点分别在屏幕194的右下角和左下角的位置。此时处理器110在屏幕194形成一个以屏幕194右下角为圆点和大拇指长度为半径的1/4圆形区域的右舒适区,形成一个以屏幕194左下角为圆点和大拇指长度为半径的1/4圆形区域的左舒适区。其中,右舒适区与左舒适区之间重叠地方为重叠区域。
处理器110执行开启“悬浮屏”功能时,在屏幕194上的重叠区域内显示一个悬浮屏,如图14所示。其中,由于悬浮的长宽比和屏幕194的UI相同,所以悬浮屏的下底边与屏幕194底边重合,悬浮屏的左上角位于以屏幕194的右下角为握持点形成的舒适区的边缘上,悬浮屏的右上角位于以屏幕194的左下角为握持点形成的舒适区的边缘上,这样形成的悬浮屏的面积最大。
本申请实施例中,处理器110通过获取用户的大拇指长度和握持点的位置,计算出用户在屏幕194上可操作范围和位置,然后在该区域内显示悬浮屏,由于悬浮屏呈现的位置是在用户最大可操作范围以内,所以用户可以在该悬浮屏上任意位置进行操作,实现用户单手操作大屏设备。
实施例二
图15为本申请实施例提供的一种悬浮屏跟手方法的流程图。如图15所示,手机100在执行悬浮屏跟手的具体实现流程如下:
步骤S1501,手机100确定开启的悬浮屏在屏幕194上呈现的具体位置和大小后,在屏幕194上相应的位置上呈现出悬浮屏。
步骤S1502,手机100实时检测用户握持手机的位置。
具体的,手机100在根据大拇指长度和握持点O1位置确定开启的悬浮屏在屏幕194上呈现的具体位置和大小后,屏幕194上相应的位置上呈现出悬浮屏。当用户握持手机的位置发生变化时,也即握持点的位置发生变化,此时处理器110重新检测用户握持手机的握持点的位置。处理器110在确定新的握持点O2位置后,根据大拇指长度和新的握持点O2,在屏幕194上确定一个新的舒适区。
其中,手机100重新检测握持点的位置可采用如图3、9-10及其相应的描述的方式来确定用户手持手机100的握持点,还可根据温度传感器180J检测手的温度、压力传感器检测握持压力等方式来确定用户手持手机100的握持点。
步骤S1503,手机100判断用户握持手机的位置是否发生变化;当检测到用户握持手机100的位置没有发生变化时,执行步骤S1502;当检测到用户握持手机100的位置发生变化时,执行步骤S1504;
步骤S1504,手机100获取用户握持手机的新的握持点的位置,并结合用户的大拇指长度,确定出新的舒适区在屏幕194上的位置。
手机100根据大拇指长度和新的握持点O2位置,重新确定开启的悬浮屏在屏幕194上呈现的具体位置和大小后,也即确定新的舒适区在屏幕194上的位置。然后用户可以采用拖拽悬浮屏、双击或长按新的舒适区等方式,将悬浮屏移动到新的舒适区内。
另外,手机根据大拇指长度和新的握持点O2位置,确定新的舒适区在屏幕194上的位置后,通过在屏幕194上的新的舒适区位置上显示其它颜色、周围显示界限等其它方式,以让用户知道新的舒适区在屏幕194上的位置。
步骤S1505,手机判断悬浮屏是否与新的舒适区有重叠的地方;当悬浮屏与新的舒适区有重叠的地方,执行步骤S1506;当悬浮屏与新的舒适区没有重叠的地方,执行步骤S1507。
步骤S1506,手机根据用户的拖拽,将悬浮屏拖拽到新的舒适区内的相应位置。
示例性的,如图16所示,当在屏幕194显示其它颜色的新的舒适区时,此时手机向用户表明握持手机100的位置发生了变化。如果当前在屏幕194上显示的悬浮屏不在新的舒适区内,且有部分与新的舒适区相重叠时,此时手机可以通过在新的舒适区内显示“请拖拽悬浮屏到该区域”字幕的方式,让用户通过拖拽悬浮屏方式将悬浮屏拖拽到舒适区内用户拖拽的位置。用户通过点击屏幕194,进行拖拽悬浮屏,可以将悬浮屏拖拽到新的舒适区内任意位置,也可以将悬浮屏一部分拖拽到新的舒适区内,甚至可以将悬浮屏拖拽到屏幕194上其它位置。
当然,当用户在新的舒适区内进行双击、长按、画个小圆圈等特定的手势,可以将悬浮屏移动到新的舒适区内,本申请在此不限。
另外,拖拽后的悬浮屏最后显示的位置为拖拽操作的结束位置。其中,当用户在屏幕上拖拽悬浮屏时,悬浮屏在屏幕上移动的距离和轨迹即为用户在屏幕上拖拽的距离和轨迹。
步骤S1507,手机根据用户在新的舒适区内进行双击、长按、画个小圆圈等特定的手势,将悬浮屏移动到新的舒适区内。
示例性的,如图17所示,当在屏幕194显示其它颜色的新的舒适区时,此时手机向用户表明握持手机100的位置发生了变化。如果当前在屏幕194上显示的悬浮屏不在新的舒适区内,且与新的舒适区没有任何重叠的地方时,此时手机100可以通过在新的舒适区内显示“请双击该区域”字幕的方式,让用户在新的舒适区内进行双击,手机在接收到“双击”操作指令后,将悬浮屏移动到新的舒适区内。
此时,在新的舒适区显示的悬浮屏的尺寸可以与之前显示的悬浮屏的尺寸相同,也可以变化成在新的舒适区内以最大尺寸的形状显示。
当然,手机也可以响应于长按、画个小圆圈等特定的手势移动悬浮屏,也可以响应于用户的拖拽的移动悬浮屏。本申请在此不限。
本申请实施例通过实时检测用户的握持手机的位置,在检测到用户握持手机的位置发生变化时,重新确定舒适区,然后将悬浮屏移动到新的舒适区内,以保证用户单手可以操作到悬浮屏内的任意位置。
实施例三
图18为本申请实施例提供的一种解决悬浮屏与屏幕194的UI相冲突的方法的流程图。如图18所示,手机100在执行悬浮屏与屏幕194的UI相冲突的具体实现流程如下:
步骤S1801,手机100接收到用户进行操作时,点击的位置为屏幕194的UI与悬浮屏相重叠的位置。
步骤S1802,手机100提醒用户是否关闭悬浮屏;当用户选择关闭悬浮屏时,执行步骤S1803;当用户选择不关闭悬浮屏时,执行步骤S1804。
本申请实施例中,当用户在屏幕194上进行操作过程中,由于悬浮屏所处的位置在屏幕194的UI上,如果用户在屏幕194的UI上进行操作的位置正好与悬浮屏的位置重合,此时造成悬浮屏与屏幕194的UI相冲突。为了解决用户在屏幕194的UI上进行操作时,所处的位置正好与悬浮屏相重合,用户可以主动的将悬浮屏关闭,或将悬浮屏移动到别的位置。从而避免悬浮屏干扰用户对屏幕194的UI进行操作。
步骤S1803,手机100在接收到用户关闭悬浮屏指令后,关闭悬浮屏功能。
在用户关闭“悬浮屏”功能后,就可以直接对屏幕的UI进行操作,手机100也就直接响应屏幕194的UI上的操作事件。其中,关闭悬浮屏的方法可以通过点击悬浮屏关闭按钮(类似windows程序右上角的“X”)、快捷键形式(如双击电源键)等方式,本申请实施例在此不作限定。
步骤S1804,手机100启动通过按压时间长短来选择对应位置中的屏幕194的UI或悬浮屏中的按钮的功能。
步骤S1805,手机100检测用户点击屏幕194时的对屏幕194按压时间,判断按压时间是否超过设定的阈值;当按压时间超过设定的阈值时,执行步骤S1806;当按压时间不超过设定的阈值时,执行步骤S1807。
处理器110还可以根据进行操作时各个操作在屏幕194上按压的时间,来确定用户是对悬浮屏界面进行操作还是对屏幕194的UI进行操作。当按压时间超过设定阈值时,处理器110将该操作事件判定为对屏幕194的UI进行操作;当按压时间不超过设定阈值时,处理器110将该操作事件判定为对悬浮屏进行操作。
另外,在屏幕194的UI和悬浮屏的上设置有虚拟指示标识,用于在屏幕194的UI和悬浮屏上接收到用户的操作时,虚拟指示标识从一个颜色变化为另一个颜色,以提示用户用于此次操作是点击在屏幕194的UI或悬浮屏上。
示例性的,如图19所示,在手机启动悬浮屏时,在屏幕194的UI和悬浮屏的上方均呈现三个圆形虚拟指示标识。当用户在屏幕194的UI的与悬浮屏重叠的位置上进行操作时,处理器110判断此次点击的时间是否超过预设时间。当此次点击的时间超过预设时间时,屏幕194的UI中的指示标识从白色变化为其它颜色,而悬浮屏中的指示标识的颜色不变;当此次点击的时间不超过预设时间时,悬浮屏中的指示标识从白色变化为其它颜色,而屏幕194的UI中的指示标识的颜色不变。
步骤S1806,手机100响应屏幕194的UI上的操作事件。
步骤S1807,手机100响应悬浮屏上的操作事件。
另外,本申请实施例中手机100还可以检测用户点击屏幕194时的对屏幕194按压力度,判断按压力度是否超过设定的阈值。当按压屏幕194的力度大于预设的阈值,手机100响应屏幕194的UI上的操作事件;当按压屏幕194的力度不大于预设的阈值,手机100响应悬浮屏上的操作事件。本申请在此不限。
本申请实施例中,当用户在屏幕194的UI上进行操作时,点击的位置为屏幕194的UI与悬浮屏相重叠处,为了避免操作相冲突,手机100先提醒用户是否关闭悬浮屏,如果用户不关闭悬浮屏的情况下,手机100根据用户按压时间判定此次点击为对屏幕194的UI的操作还是对悬浮屏的操作。
实施例四
如果开启悬浮屏功能后,在屏幕194上呈现的悬浮屏的大小不符合用户要求的尺寸时,用户可以对悬浮屏进行放大和缩小。
示例性的,如图20所示,当用户需要将原始显示的悬浮屏进行放大时,将两只手指放置在悬浮屏上,然后将两只手指在屏幕194上进行滑动,且滑动方向为相互背离的方向。此时处理器110接收到放大指令后,将悬浮屏显示的尺寸进行放大。其中,被放大后的悬浮屏的位置可以在舒适区内,也可以超出舒适区范围,本申请实施例在此不作限定。
示例性的,如图21所示,当用户需要将原始显示的悬浮屏进行缩小时,将两只手指放置在悬浮屏上,然后将两只手指在屏幕194上进行滑动,且滑动方向为相互靠拢的方向。此时处理器110接收到缩小指令后,将悬浮屏显示的尺寸进行缩小。
本申请实施例中,如果用户对原始显示的悬浮屏的尺寸不满意,可以通过特定的操作,对悬浮屏进行放大或缩小操作,实现显示的悬浮屏的尺寸为用户所需要的尺寸。
本申请实施例公开了一种电子设备,包括处理器,以及与处理器相连的存储器、输入设备和输出设备。其中,输入设备和输出设备可集成为一个设备,例如,可将屏幕的触控器件作为输入设备,将屏幕的显示器作为输出设备。
此时,如图22所示,上述电子设备可以包括:屏幕2201,所述屏幕2201包括触控器件2206和显示器2207;一个或多个处理器2202;一个或多个存储器2203;一个或多个传感器2208;一个或多个应用程序(未示出);以及一个或多个计算机程序2204,上述各器件可以通过一个或多个通信总线2205连接。其中该一个或多个计算机程序2204被存储在上述存储器2203中并被配置为被该一个或多个处理器2202执行,该一个或多个计算机程序2204包括指令,上述指令可以用于执行上述实施例中的各个步骤。其中,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应实体器件的功能描述,在此不再赘述。
示例性的,上述处理器2202具体可以为图1所示的处理器110,上述存储器2203具体可以为图1所示的内部存储器121和/或外部存储器120,上述屏幕2201具体可以为图1所示的屏幕194,上述传感器2208具体可以为图1所示的传感器模块180中的陀螺仪传感器180B、加速度传感器180E、接近光传感器180G,还可以是红外传感器、霍尔传感器等一项或多项,本申请实施例对此不做任何限制。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。上述描述的系统,装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请实施例各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器执行本申请各个 实施例所述方法的全部或部分步骤。而前述的存储介质包括:快闪存储器、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请实施例的具体实施方式,但本申请实施例的保护范围并不局限于此,任何在本申请实施例揭露的技术范围内的变化或替换,都应涵盖在本申请实施例的保护范围之内。因此,本申请实施例的保护范围应以所述权利要求的保护范围为准。

Claims (22)

  1. A one-hand operation method, performed by an electronic device, comprising:
    displaying a first interface, wherein the first interface occupies the entire screen of the electronic device;
    detecting a first trigger operation of a user; and
    displaying a floating interface according to the first trigger operation, wherein a size of the floating interface is smaller than a size of the first interface, the floating interface is located within a first region of the screen in which the user operates with one hand, and the first region is determined by a position at which the user holds the electronic device and a length of the user's thumb.
  2. The method according to claim 1, wherein the method further comprises:
    when the floating interface is the first interface reduced in size, no longer displaying the first interface on the screen.
  3. The method according to any one of claims 1 to 2, wherein before the displaying of the floating interface, the method further comprises:
    turning on at least one camera and prompting the user to capture an image of a hand;
    wherein obtaining the length of the user's thumb comprises: calculating the length of the thumb from the image of the hand.
  4. The method according to any one of claims 1 to 2, wherein before the displaying of the floating interface, the method further comprises:
    enabling a fingerprint sensor and prompting the user to collect fingerprint information;
    wherein obtaining the length of the user's thumb comprises: determining the length of the thumb according to a relationship between fingerprint size and thumb length.
  5. The method according to any one of claims 1 to 2, wherein before the displaying of the floating interface, the method further comprises:
    prompting the user to draw an arc on the screen while holding the electronic device with one hand, and obtaining a trajectory of the arc;
    wherein obtaining the length of the user's thumb comprises: calculating the length of the thumb from the trajectory of the arc.
  6. The method according to any one of claims 1 to 5, wherein before the displaying of the floating interface, the method further comprises:
    prompting the user to draw an arc on the screen while holding the electronic device with one hand, and obtaining a trajectory of the arc;
    wherein obtaining the position at which the user holds the electronic device comprises: calculating, from the trajectory of the arc, the position at which the user holds the electronic device.
  7. The method according to any one of claims 1 to 5, wherein before the displaying of the floating interface, the method further comprises: obtaining at least one operation point at which the user operates on the screen within a period of time;
    wherein obtaining the position at which the user holds the electronic device comprises: calculating, from the at least one operation point, the position at which the user holds the electronic device.
  8. The method according to claim 7, wherein the calculating, from the operation point, of the position at which the user holds the electronic device comprises:
    determining, according to positions of the at least one operation point on the screen, that a center of the one of N circles covering the largest number of the at least one operation point is a holding-point position, wherein the N circles are circles on the screen centered at N positions on an edge of the screen and having the thumb length as a radius, N is an integer greater than 2, and the holding point is a position at which the palm or the thumb is in contact with the edge of the screen when the user holds the mobile phone.
  9. The method according to any one of claims 1 to 8, wherein determining the first region comprises:
    determining, according to the position at which the user holds the electronic device, a holding point at which the palm or the thumb is in contact with the edge of the screen when the user holds the mobile phone; and
    using, as the first region, a region formed on the screen with the holding point as a center and the thumb length as a radius.
  10. The method according to any one of claims 1 to 8, wherein determining the first region comprises:
    determining, according to at least two positions at which the user holds the electronic device, at least two holding points at which the palm or the thumb is in contact with the edge of the screen when the user holds the mobile phone; and
    using, as the first region, an overlapping region between at least two regions formed on the screen with the at least two holding points as centers and the thumb length as a radius.
  11. The method according to any one of claims 1 to 10, wherein the floating interface is displayed within the first region in a shape of maximum size, and the shape of the floating interface is the same as the shape of the first interface.
  12. The method according to any one of claims 1 to 11, wherein the method further comprises:
    detecting that the position at which the user holds the electronic device changes, determining a second region according to the updated position at which the user holds the electronic device and the length of the thumb, and displaying the second region on the first interface.
  13. The method according to claim 12, wherein the method further comprises:
    detecting a tap operation by the user within the second region, and in response to the tap operation, displaying the floating interface within the second region.
  14. The method according to claim 12, wherein the method further comprises:
    detecting a tap operation by the user within the second region, and when the floating interface does not overlap the second region, in response to the tap operation, displaying the floating interface within the second region.
  15. The method according to claim 12, wherein the method further comprises:
    detecting a drag operation by the user moving the floating interface from the first region to the second region, and in response to the drag operation, displaying the floating interface at an end position of the drag operation.
  16. The method according to claim 12, wherein the method further comprises:
    detecting a drag operation by the user moving the floating interface from the first region to the second region, and when the floating interface overlaps the second region, in response to the drag operation, displaying the floating interface at an end position of the drag operation.
  17. The method according to any one of claims 1 to 16, wherein the method further comprises:
    detecting an operation acting on a first position, wherein the floating interface comprises a first control, the first interface comprises a second control, a position at which the first control is displayed on the screen is the first position, a position at which the second control is displayed on the screen is a second position, and the first position at least partially overlaps the second position;
    prompting whether to close the display of the floating interface; and
    if an instruction to close the display is received, no longer displaying the floating interface, wherein an application corresponding to the first interface responds to the operation acting on the first position and displays a corresponding interface according to the second control.
  18. The method according to claim 17, wherein the method further comprises:
    if the instruction to close the display is not received, determining whether at least one of a pressure value or a duration of the operation triggering the first control is greater than a set value;
    if at least one of the pressure value or the duration of the operation triggering the first control is greater than the set value, responding, by the application corresponding to the first interface, to the operation acting on the first position, and displaying a corresponding interface according to the second control; and
    if at least one of the pressure value or the duration of the operation triggering the first control is not greater than the set value, responding, by an application corresponding to the floating interface, to the operation acting on the first position, and displaying a corresponding interface, according to the first control, within the region occupied by the floating interface.
  19. The method according to any one of claims 1 to 18, wherein the method further comprises:
    detecting at least two operation points acting on the floating interface at the same time, and when the positions of the at least two operation points move farther apart over time, enlarging the size of the floating interface; and
    detecting at least two operation points acting on the floating interface at the same time, and when the positions of the at least two operation points move closer together over time, shrinking the size of the floating interface.
  20. An electronic device, comprising:
    a screen, configured to display a first interface, wherein the first interface occupies the entire screen of the electronic device, and to display a floating interface according to a first trigger operation;
    one or more processors;
    one or more memories;
    one or more sensors; and
    one or more computer programs, wherein the one or more computer programs are stored in the one or more memories, the one or more computer programs comprise instructions, and when the instructions are executed by the electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 19.
  21. A computer-readable storage medium storing instructions, wherein when the instructions are run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 19.
  22. A computer program product comprising instructions, wherein when the computer program product runs on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 19.
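A non-normative sketch of the geometric construction recited in claims 8 to 10 may help fix ideas: candidate holding points are sampled along the screen edge, the candidate whose thumb-length circle covers the most recorded operation points is selected as the holding point, and the first region is the part of the screen within thumb reach of it. The sampling step, screen dimensions, and all names below are editorial assumptions, not part of the claims.

```python
# Editorial sketch of the geometry in claims 8-10 (assumed names throughout).

import math

def pick_grip_point(edge_candidates, touch_points, thumb_len):
    """Claim 8: among circles centered on edge candidates with radius equal
    to the thumb length, the one covering the most operation points
    identifies the holding point."""
    def covered(center):
        return sum(math.dist(center, p) <= thumb_len for p in touch_points)
    return max(edge_candidates, key=covered)

def in_first_region(point, grip_point, thumb_len):
    """Claim 9: the first region is the on-screen disk around the holding
    point with the thumb length as radius."""
    return math.dist(point, grip_point) <= thumb_len

# Right-edge candidates for a 1080x2340 screen, one every 100 px (assumed).
candidates = [(1080, y) for y in range(0, 2341, 100)]
touches = [(900, 1800), (950, 1900), (800, 2000), (200, 300)]
grip = pick_grip_point(candidates, touches, thumb_len=600)
print(grip, in_first_region((700, 1900), grip, 600))
```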
PCT/CN2020/128000 2019-11-29 2020-11-11 One-hand operation method and electronic device WO2021104015A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20892299.7A EP4053688A4 (en) 2019-11-29 2020-11-11 ONE-HAND OPERATION METHOD AND ELECTRONIC DEVICE
US17/780,678 US20230009389A1 (en) 2019-11-29 2020-11-11 One-hand operation method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911203234.5A CN111124201A (zh) 2019-11-29 2019-11-29 一种单手操作的方法和电子设备
CN201911203234.5 2019-11-29

Publications (1)

Publication Number Publication Date
WO2021104015A1 (zh)

Family

ID=70497209

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/128000 WO2021104015A1 (zh) 2019-11-29 2020-11-11 One-hand operation method and electronic device

Country Status (4)

Country Link
US (1) US20230009389A1 (zh)
EP (1) EP4053688A4 (zh)
CN (1) CN111124201A (zh)
WO (1) WO2021104015A1 (zh)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111124201A (zh) * 2019-11-29 2020-05-08 华为技术有限公司 一种单手操作的方法和电子设备
CN113672133A (zh) * 2020-05-13 2021-11-19 华为技术有限公司 一种多指交互方法及电子设备
TWI769739B (zh) * 2021-03-15 2022-07-01 華碩電腦股份有限公司 可攜式電子裝置與其單手觸控操作方法
TW202238351A (zh) 2021-03-15 2022-10-01 華碩電腦股份有限公司 電子裝置
CN117999537A (zh) * 2022-06-20 2024-05-07 北京小米移动软件有限公司 电子显示设备及其显示控制方法、装置,存储介质


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955339A (zh) * 2014-04-25 2014-07-30 Huawei Technologies Co., Ltd. Terminal operation method and terminal device
CN105302459B (zh) * 2015-10-15 2020-04-17 Guangdong Oppo Mobile Telecommunications Corp., Ltd. One-hand control method and apparatus for terminal
CN105744054A (zh) * 2015-12-31 2016-07-06 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Control method for mobile terminal and mobile terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103619243A (zh) * 2011-06-24 2014-03-05 Murata Manufacturing Co., Ltd. Mobile device
US20140145975A1 (en) * 2012-11-26 2014-05-29 Samsung Electro-Mechanics Co., Ltd. Touchscreen device and screen zoom method thereof
CN106445354A (zh) * 2016-11-24 2017-02-22 Beijing Xiaomi Mobile Software Co., Ltd. Touch control method and apparatus for terminal device
CN109164965A (zh) * 2018-08-10 2019-01-08 Qiku Internet Network Science (Shenzhen) Co., Ltd. Mobile terminal and method, apparatus, and readable storage medium for shrinking its screen interface
CN110262749A (zh) * 2019-06-27 2019-09-20 Beijing Siwei Zaowu Information Technology Co., Ltd. Web page operation method and apparatus, container, device, and medium
CN111124201A (zh) * 2019-11-29 2020-05-08 Huawei Technologies Co., Ltd. One-hand operation method and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4053688A4

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4155865A1 (en) * 2021-09-24 2023-03-29 HTC Corporation Virtual image display device and setting method for input interface thereof
US11644972B2 (en) 2021-09-24 2023-05-09 Htc Corporation Virtual image display device and setting method for input interface thereof
WO2023125052A1 (zh) * 2021-12-27 2023-07-06 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display method and related apparatus
CN114489433A (zh) * 2022-04-14 2022-05-13 Beijing Chuangxin Lezhi Network Technology Co., Ltd. Android-based online community data management method and apparatus
CN114489433B (zh) * 2022-04-14 2022-06-21 Beijing Chuangxin Lezhi Network Technology Co., Ltd. Android-based online community data management method and apparatus

Also Published As

Publication number Publication date
EP4053688A1 (en) 2022-09-07
EP4053688A4 (en) 2023-01-11
US20230009389A1 (en) 2023-01-12
CN111124201A (zh) 2020-05-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 20892299
Country of ref document: EP
Kind code of ref document: A1
ENP Entry into the national phase
Ref document number: 2020892299
Country of ref document: EP
Effective date: 20220531
NENP Non-entry into the national phase
Ref country code: DE