CN108958614B - Display control method and terminal - Google Patents


Info

Publication number: CN108958614B
Application number: CN201810725301.9A
Authority: CN (China)
Prior art keywords: terminal, target, distance, user, finger
Legal status: Active (the legal status is an assumption and not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Other languages: Chinese (zh)
Other versions: CN108958614A
Inventor: 肖石文
Current assignee: Vivo Mobile Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original assignee: Vivo Mobile Communication Co Ltd
History: application filed by Vivo Mobile Communication Co Ltd with priority to CN201810725301.9A; published as CN108958614A; application granted and published as CN108958614B.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to an output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/043: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention disclose a display control method and a terminal, relate to the field of communication technology, and aim to solve the problem that a user who holds the terminal in reverse must readjust the holding direction before the terminal can be used normally, which makes the terminal inconvenient to use. The display control method is applied to a terminal and includes: acquiring a target image, the target image being a projected image of the user's fingers on the screen of the terminal while the user holds the terminal; determining, from the target image, a target parameter value that indicates the orientation of the fingers on the screen; and, when the target parameter value meets a first condition, controlling the display interface of the terminal to be displayed in reverse relative to the screen. The method can be applied in scenarios where a user holds and uses the terminal.

Description

Display control method and terminal
Technical Field
Embodiments of the invention relate to the field of communication technology, and in particular to a display control method and a terminal.
Background
As terminals are used in an ever wider range of scenarios, users increasingly demand convenience when using them.
At present, as the screen-to-body ratio of terminals (i.e., the ratio of the touch-screen area to the front-shell area) grows larger and larger, the front shell occupies an ever smaller share of the terminal's front face, so the top and the bottom of the front face can look very similar. As a result, a user picking up the terminal may not notice which end is the top, and may end up holding it with the bottom of the front face upward (i.e., holding the terminal in reverse). Ordinarily, the display interface of a terminal can only switch between portrait display (with the top of the front face up) and landscape display; therefore, a user who is holding the terminal in reverse must readjust the holding direction before the terminal can be used normally, which makes the terminal inconvenient to use.
Disclosure of Invention
Embodiments of the invention provide a display control method and a terminal, aiming to solve the problem that a user who holds the terminal in reverse must readjust the holding direction before the terminal can be used normally, which makes the terminal inconvenient to use.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a display control method, where the display control method is applied to a terminal, and the method includes: acquiring a target image, wherein the target image is a projected image of fingers of a user on a screen of a terminal when the user holds the terminal; determining a target parameter value according to the target image, wherein the target parameter value is used for indicating the direction of the finger on the screen; and under the condition that the target parameter value meets the first condition, controlling the display interface of the terminal to display reversely relative to the screen.
In a second aspect, an embodiment of the present invention provides a terminal, which includes an obtaining module, a determining module, and a control module. The acquisition module is used for acquiring a target image, wherein the target image is a projected image of fingers of a user on a screen of the terminal when the user holds the terminal; the determining module is used for determining a target parameter value according to the target image acquired by the acquiring module, wherein the target parameter value is used for indicating the direction of the finger on the screen; the control module is used for controlling the display interface of the terminal to display reversely relative to the screen under the condition that the determining module determines that the target parameter value meets the first condition.
In a third aspect, an embodiment of the present invention provides a terminal that includes a processor, a memory, and a computer program stored in the memory and executable on the processor; when executed by the processor, the computer program implements the steps of the display control method in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the display control method in the first aspect.
In embodiments of the invention, a projected image of the user's finger on the screen of the terminal while the user holds it (i.e., the target image) is acquired, and the orientation of the finger on the screen is determined from that image. When the orientation of the finger satisfies a first condition (for example, one indicating that the display interface is reversed relative to the user), the display interface of the terminal is controlled to be displayed in reverse relative to the screen, so that it appears forward to the user. Compared with the prior art, when the user holds the terminal in reverse (i.e., the display interface is reversed relative to the user), embodiments of the invention automatically adjust the display interface so that it is displayed forward relative to the user; the terminal can then be used normally without the user readjusting the holding direction, which improves the convenience of using the terminal.
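The claimed flow (acquire a projected image, derive a parameter indicating the finger's orientation, and flip the interface when the first condition holds) can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the `Terminal` stub, its method names, and the use of two bottom-edge distances as the target parameter value are assumptions for demonstration.

```python
class Terminal:
    """Minimal stand-in for the terminal described in the text."""

    def __init__(self, projection):
        # Stand-in for the projected finger image: vertical distances of the
        # finger-root and fingertip points from the bottom edge of the screen.
        self.projection = projection
        self.display = "forward"

    def acquire_target_image(self):
        # S201: acquire the target image (here, the pre-measured projection).
        return self.projection

    def determine_target_parameter(self, image):
        # S202: derive the target parameter value from the image;
        # here, the pair of distances (H1 for the root, H2 for the tip).
        h1, h2 = image
        return h1, h2

    def control_display(self):
        # First condition: the finger points toward the bottom edge (H1 > H2),
        # indicating the user holds the terminal in reverse, so the interface
        # is displayed in reverse relative to the screen.
        h1, h2 = self.determine_target_parameter(self.acquire_target_image())
        if h1 > h2:
            self.display = "reverse"
        return self.display
```

For example, a projection whose root point lies 120 px from the bottom edge and whose tip lies 40 px from it indicates a downward-pointing finger, so the interface is flipped.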
Drawings
Fig. 1 is a schematic diagram of the architecture of a possible Android operating system according to an embodiment of the present invention;
Fig. 2 is the first schematic diagram of a display control method according to an embodiment of the present invention;
Fig. 3 is the first schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 4 is the second schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 5 is the third schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 6 is the fourth schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 7 is the second schematic diagram of a display control method according to an embodiment of the present invention;
Fig. 8 is the fifth schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 9 is the third schematic diagram of a display control method according to an embodiment of the present invention;
Fig. 10 is the sixth schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 11 is the fourth schematic diagram of a display control method according to an embodiment of the present invention;
Fig. 12 is the seventh schematic interface diagram of an application of the display control method according to an embodiment of the present invention;
Fig. 13 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
Fig. 14 is the first hardware diagram of a terminal according to an embodiment of the present invention;
Fig. 15 is the second hardware diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein describes an association between objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first distance, the second distance, etc. are for distinguishing different distances, and are not for describing a particular order of the distances.
In embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" is not to be construed as preferred or advantageous over other embodiments or designs; rather, these words are intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of elements means two or more elements, and the like.
The following first explains some of the nouns or terms referred to in the claims and the specification of the present invention.
Relative to the screen (also referred to as the touch screen or display screen), the display interface of a terminal may be shown in portrait or landscape orientation, and portrait display may further be divided into forward display and reverse display. Forward display, reverse display, and landscape display are described in detail below.
Forward display of the terminal's display interface relative to the screen: the screen of a terminal has a top (e.g., the end where the earpiece is located) and a bottom (e.g., the end where the microphone is located). If the display interface (text, pictures, video, etc.) appears upright when viewed along the top-to-bottom direction of the terminal, this display form is referred to as forward display of the display interface relative to the screen. For example, when a user holds the terminal normally (top up, bottom down), the display interface is displayed forward relative to both the screen and the user; when the user holds the terminal upside down (top down, bottom up), the display interface is displayed forward relative to the screen but in reverse relative to the user.
Reverse display of the terminal's display interface relative to the screen: if the display interface (text, pictures, video, etc.) appears inverted when viewed along the top-to-bottom direction of the terminal, this display form is referred to as reverse display of the display interface relative to the screen.
Landscape display of the terminal's display interface relative to the screen: if the display interface (text, pictures, video, etc.) appears sideways when viewed along the top-to-bottom direction of the terminal, this display form is referred to as landscape display of the display interface relative to the screen.
It should be noted that, because a landscape display interface can be flipped either way on the screen, no forward/reverse relationship between the display interface and the screen is considered for landscape display. Therefore, forward and reverse display of the display interface relative to the screen are explained in embodiments of the present invention using portrait display as the example.
It should also be noted that, in embodiments of the present invention, "operation interface" and "display interface" have similar meanings: both refer to the human-computer interaction interface on the terminal. The two terms are interchangeable; "display interface" describes the interface from the terminal's perspective, while "operation interface" describes it from the user's perspective.
Embodiments of the invention provide a display control method and a terminal. A projected image of the user's finger on the screen of the terminal while the user holds it (i.e., the target image) is acquired, and the orientation of the finger on the screen is determined from that image. When the orientation of the finger satisfies a first condition (for example, one indicating that the display interface is reversed relative to the user), the display interface is controlled to be displayed in reverse relative to the screen, so that it appears forward to the user. Compared with the prior art, when the user holds the terminal in reverse (i.e., the display interface is reversed relative to the user), embodiments of the invention automatically adjust the display interface so that it is displayed forward relative to the user; the terminal can then be used normally without the user readjusting the holding direction, which improves the convenience of using the terminal.
The terminal in embodiments of the present invention may be a terminal having an operating system. The operating system may be the Android operating system, the iOS operating system, or another possible operating system; embodiments of the present invention are not specifically limited in this respect.
The following describes a software environment to which the display control method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of the architecture of a possible Android operating system according to an embodiment of the present invention. In Fig. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking the Android operating system as an example, a developer may implement the display control method provided by embodiments of the present invention as a software program based on the system architecture shown in Fig. 1, so that the display control method can run on the Android operating system shown in Fig. 1. That is, the processor or the terminal implements the display control method by running that software program in the Android operating system.
The terminal in embodiments of the present invention may be a mobile terminal or a non-mobile terminal. For example, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile terminal may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine. Embodiments of the present invention are not specifically limited in this respect.
The display control method provided by embodiments of the present invention may be executed by the terminal itself, or by a functional module and/or functional entity in the terminal capable of implementing the method; this may be determined according to actual usage requirements, and embodiments of the present invention are not limited in this respect. The following takes the terminal as the example to describe the display control method provided by embodiments of the present invention.
As shown in fig. 2, an embodiment of the present invention provides a display control method, which may be applied to a terminal, and which may include S201 to S204 described below.
S201, the terminal acquires a target image.
The target image may be a projected image of a finger of the user on a screen of the terminal when the user holds the terminal.
In embodiments of the present invention, the terminal may capture images in real time to obtain the projected image of the user's fingers on the screen while the user holds the terminal, and use that projected image as the target image.
Optionally, the user may hold the terminal in one hand (left or right), i.e., with the thumb above the screen of the terminal and the other fingers beneath the terminal; in this case, the terminal may acquire the projected image of the thumb on the screen as the target image. Alternatively, the user may hold the terminal in both hands, with both thumbs above the screen and the remaining fingers of both hands beneath the terminal; in this case, the terminal may acquire the projected images of both thumbs on the screen as the target image. Note that embodiments of the present invention do not exclude the case where the user is not holding the terminal: the terminal may also obtain the projected image of the user's finger on the screen when the terminal is not being held (for example, when it is lying on a desktop), and use it as the target image.
It should be noted that either the projected image of the thumb or that of a finger other than the thumb may be used as the target image; the choice may be made according to actual usage requirements, and embodiments of the present invention are not limited in this respect.
In embodiments of the present invention, the terminal may acquire the target image using a sensor (for example, an ultrasonic sensor) built in beneath the screen of the terminal.
For example, the user may hold the terminal with the left hand (as shown in (a) of Fig. 3) or with the right hand (as shown in (b) of Fig. 3), resting the thumb above the screen; in this case, the terminal may acquire the projected image of the user's thumb on the screen using the built-in ultrasonic sensor and take that projected image as the target image. The following description, which takes the user holding the terminal with the left hand as its example, details how the display interface of the terminal is controlled.
S202, the terminal determines a target parameter value according to the target image.
Wherein the target parameter value may be used to indicate the orientation of the user's finger on the screen.
In embodiments of the present invention, the terminal may analyze the acquired projected image of the user's finger on the screen (i.e., the target image) and obtain from it a target parameter value indicating the orientation of the finger on the screen.
Illustratively, as shown in fig. 4, the terminal may acquire a target image, and select a first target point and a second target point from the target image, where the first target point may be any point in a root region of a finger (as shown by a rectangular hatched region in (a) in fig. 4), and the second target point may be any point in a fingertip region of the finger (as shown by a circular hatched region in (a) in fig. 4). It should be noted that the area sizes of the root region and the fingertip region of the finger may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
Optionally, in embodiments of the present invention, the target parameter value may include a first distance and a second distance. As shown in (a) of Fig. 4, taking the bottom AB of the terminal as a reference line (denoted AB), the first distance may be the vertical distance (denoted H1) between the first target point (any point in the root region of the finger) and the reference line AB, and the second distance may be the vertical distance (denoted H2) between the second target point (any point in the fingertip region) and AB. Together, H1 and H2 indicate the orientation of the finger on the screen: if H1 < H2, the finger points toward the top CD of the terminal, meaning the user holds the terminal forward; if H1 > H2, the finger points toward the bottom AB, meaning the user holds the terminal in reverse (as shown in (a) of Fig. 4); if H1 = H2, the finger lies substantially parallel to the top CD or bottom AB. The terminal can thus determine the orientation of the finger on the screen by comparing H1 and H2, and from that orientation determine whether the user is holding the terminal forward or in reverse.
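The H1/H2 comparison just described can be written out as a small decision function. This is a sketch under the assumption that both distances are measured from the bottom reference line AB:

```python
def finger_orientation(h1, h2):
    """Decide the finger's orientation on the screen from the vertical
    distances of the root point (H1) and fingertip point (H2) to the
    bottom reference line AB."""
    if h1 < h2:
        return "top"       # finger points toward the top CD: forward hold
    if h1 > h2:
        return "bottom"    # finger points toward the bottom AB: reverse hold
    return "parallel"      # finger roughly parallel to AB/CD
```

Using the top reference line CD instead (the H3/H4 variant below) flips the two comparisons, since larger distances from CD mean closer to the bottom.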
Alternatively, in embodiments of the present invention, the target parameter values may be a third distance and a fourth distance. As shown in (a) of Fig. 4, taking the top CD of the terminal as the reference line, the third distance may be the vertical distance (denoted H3) between the first target point (any point in the root region of the finger) and the reference line CD, and the fourth distance may be the vertical distance (denoted H4) between the second target point (any point in the fingertip region) and CD. Determining the orientation of the finger from H3 and H4 is similar to determining it from H1 and H2, except that the reference line differs, so the comparisons are reversed: if H3 < H4, the finger points toward the bottom AB, meaning the user holds the terminal in reverse (as shown in (a) of Fig. 4); if H3 > H4, the finger points toward the top CD, meaning the user holds the terminal forward; if H3 = H4, the finger lies substantially parallel to the bottom AB or top CD.
Alternatively, the first target point may be the point in the root region with the greatest vertical distance from the reference line AB (denoted O1), so that the first distance is the vertical distance between O1 and AB; likewise, the second target point may be the point in the fingertip region with the greatest vertical distance from AB (denoted O2), so that the second distance is the vertical distance between O2 and AB. Alternatively, the first distance may be the vertical distance between the center point of the root region and AB, and the second distance the vertical distance between the center point of the fingertip region and AB. The third and fourth distances may be determined in the same manner and are not described again. The choice may be made according to actual usage requirements; embodiments of the present invention are not limited in this respect.
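The two ways of selecting a target point described above (the point farthest from AB, or a region's center point) can be sketched as follows. For illustration, a region is assumed to be a list of (x, y) points with y measured upward from the bottom reference line AB; this representation is an assumption, not the patent's data format.

```python
def pick_target_point(region, mode="farthest"):
    """Select a target point from a finger region (root or fingertip).

    region: list of (x, y) points, where y is the vertical distance from
    the bottom reference line AB.
    mode: "farthest" picks the point with the greatest distance from AB
    (O1/O2 in the text); "center" picks the centroid of the region.
    """
    if mode == "farthest":
        return max(region, key=lambda p: p[1])
    n = len(region)
    return (sum(p[0] for p in region) / n, sum(p[1] for p in region) / n)
```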
It should be noted that, in the embodiment of the present invention, the reference line AB or the reference line CD is taken as an example for illustration, and of course, AC or BD may also be taken as a reference line, which may be determined according to actual use requirements, and the embodiment of the present invention is not limited thereto.
Optionally, in the embodiment of the present invention, the target parameter value may be obtained in a manner of establishing a coordinate system.
For example, referring to (a) of Fig. 4, a screen coordinate system may be established with the top (e.g., the reference line CD) or the bottom (e.g., the reference line AB) of the terminal as the abscissa axis, a side edge (e.g., the reference line AC or BD) as the ordinate axis, and the intersection of the two axes as the origin (denoted O). The terminal can then obtain the target parameter value by determining the coordinates of the user's finger in this screen coordinate system.
For example, referring to (a) of Fig. 4, a screen coordinate system may be established with the reference line AB as the abscissa axis, the reference line BD as the ordinate axis, and their intersection as the origin. If the terminal determines that the coordinates of the first target point (any point in the root region of the finger) in this coordinate system are (X1, Y1) and those of the second target point (any point in the fingertip region) are (X2, Y2), the terminal obtains the target parameter values directly: the first distance H1 = Y1 and the second distance H2 = Y2.
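In the coordinate system just described (AB as the abscissa axis, BD as the ordinate axis), reading the target parameter values reduces to taking the y-coordinates of the two target points. A trivial sketch, with hypothetical tuple inputs:

```python
def target_parameters_from_coords(first_point, second_point):
    """With AB as the x-axis, the vertical distance of a point from AB is
    simply its y-coordinate, so H1 = Y1 and H2 = Y2."""
    (_, y1), (_, y2) = first_point, second_point
    return y1, y2
```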
Optionally, in embodiments of the present invention, the target parameter value may instead be a first included angle: of the angles between a first straight line and a first edge of the terminal, the one lying on the bottom side of the terminal.
For example, as shown in fig. 4 (b), the first straight line may be the straight line EF on which the central axis of the user's finger lies, and the first edge may be the edge AC of the side surface that the user's hand contacts when holding the terminal, so that the first included angle is shown as α in fig. 4 (b). The first included angle α may be used to indicate the orientation of the finger on the screen. Specifically, if α < 90°, the finger is oriented toward the bottom AB of the terminal, which means that the user is holding the terminal in reverse (as shown in fig. 4 (b)); if α > 90°, the finger is oriented toward the top CD of the terminal, which means that the user is holding the terminal forward; if α = 90°, the finger is substantially parallel to the bottom AB or the top CD of the terminal. Therefore, the terminal can determine the orientation of the finger on the screen according to the first included angle α, and can then determine, from this orientation, whether the user is holding the terminal forward or in reverse.
Optionally, in the embodiment of the present invention, the target parameter value may also be a second included angle.
Illustratively, as shown in fig. 4 (b), the second included angle may be the angle on the top side of the terminal, among the angles between the first straight line EF and the first edge AC of the terminal, shown as β in fig. 4 (b). The second included angle β may likewise be used to indicate the orientation of the finger on the screen. Specifically, if β > 90°, the finger is oriented toward the bottom AB of the terminal, which means that the user is holding the terminal in reverse (as shown in fig. 4 (b)); if β < 90°, the finger is oriented toward the top CD of the terminal, which means that the user is holding the terminal forward; if β = 90°, the finger is substantially parallel to the bottom AB or the top CD of the terminal.
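The angle-based orientation test above can be sketched as follows. This is an illustrative assumption about how the first included angle might be computed from the two target points (screen coordinates with the bottom edge AB as the x-axis, y increasing toward the top CD); the function names are hypothetical:

```python
import math

def first_included_angle(base_point, tip_point):
    """Compute the first included angle alpha (degrees) between the finger's
    central axis (base -> tip) and the terminal's side edge, measured on the
    bottom side of the terminal. If the tip lies below the base, alpha < 90;
    if above, alpha > 90; if level, alpha == 90."""
    dx = tip_point[0] - base_point[0]
    dy = tip_point[1] - base_point[1]
    # atan2(|dx|, -dy) is small when the finger points downward (toward AB)
    return math.degrees(math.atan2(abs(dx), -dy))

def holding_direction(alpha, tol=1e-9):
    """Map the first included angle to the holding direction per the text."""
    if alpha < 90.0 - tol:
        return "reverse"   # finger points toward the bottom AB
    if alpha > 90.0 + tol:
        return "forward"   # finger points toward the top CD
    return "parallel"      # finger roughly parallel to AB / CD
```

The second included angle β is simply 180° minus α, so the β > 90° test in the text is equivalent to the α < 90° test.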
S203, the terminal judges whether the target parameter value meets a first condition.
In the embodiment of the present invention, the terminal determines whether the target parameter value satisfies the first condition, and if the terminal determines that the target parameter value satisfies the first condition, the terminal may continue to execute the following S204; if the terminal determines that the target parameter value does not satisfy the first condition, the terminal may return to continue executing S201 described above.
Alternatively, in the embodiment of the present invention, in a case where the target parameter values are a first distance (i.e., the vertical distance from any point in the base region of the finger to the bottom of the terminal) and a second distance (i.e., the vertical distance from any point in the fingertip region of the finger to the bottom of the terminal), the first condition may be that the first distance is greater than the second distance. In this case, the terminal may determine whether the first distance and the second distance satisfy the first condition: if the terminal determines that the first distance is greater than the second distance, the finger is oriented toward the bottom of the terminal, indicating that the user is holding the terminal in reverse, and the terminal may continue to perform S204 described below; if the terminal determines that the first distance is less than or equal to the second distance, the terminal may return to continue to execute S201 described above.
Optionally, in the embodiment of the present invention, in a case where the target parameter values are a third distance (i.e., the vertical distance from any point in the base region of the finger to the top of the terminal) and a fourth distance (i.e., the vertical distance from any point in the fingertip region of the finger to the top of the terminal), the first condition may be that the third distance is smaller than the fourth distance. In this case, the terminal may determine whether the third distance and the fourth distance satisfy the first condition: if the terminal determines that the third distance is smaller than the fourth distance, the finger is oriented toward the bottom of the terminal, indicating that the user is holding the terminal in reverse, and the terminal may continue to perform S204 described below; if the terminal determines that the third distance is greater than or equal to the fourth distance, the terminal may return to continue to execute S201 described above.
Alternatively, in the embodiment of the present invention, in a case where the target parameter value is the first included angle (i.e., α in fig. 4 (b)), the first condition may be that the first included angle is smaller than 90°. In this case, the terminal may determine whether the first included angle satisfies the first condition: if the first included angle is smaller than 90°, the finger is oriented toward the bottom of the terminal, indicating that the user is holding the terminal in reverse, and the terminal may continue to perform S204 described below; if the first included angle is greater than or equal to 90°, the terminal may return to continue to perform S201 described above.
Alternatively, in the embodiment of the present invention, in a case where the target parameter value is the second included angle (i.e., β in fig. 4 (b)), the first condition may be that the second included angle is greater than 90°. In this case, the terminal may determine whether the second included angle satisfies the first condition: if the second included angle is greater than 90°, the finger is oriented toward the bottom of the terminal, indicating that the user is holding the terminal in reverse, and the terminal may continue to perform S204 described below; if the second included angle is less than or equal to 90°, the terminal may return to continue to perform S201 described above.
Optionally, in the embodiment of the present invention, the first condition may be that the first distance is greater than the second distance and the time for which the target image has been continuously acquired is greater than or equal to a first time threshold (hereinafter T). Alternatively, the first condition may be that the third distance is smaller than the fourth distance and the time for which the target image has been continuously acquired is greater than or equal to the first time threshold T. Alternatively, the first condition may be that the first included angle is smaller than 90° and the time for which the target image has been continuously acquired is greater than or equal to the first time threshold T. Alternatively, the first condition may be that the second included angle is greater than 90° and the time for which the target image has been continuously acquired is greater than or equal to the first time threshold T.
It should be noted that the value of the first time threshold T may be determined according to actual usage requirements, and the embodiment of the present invention is not limited.
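The first condition with the time threshold T can be sketched as a small stateful check that must hold over consecutive frames. This sketch uses the first-distance/second-distance variant; the threshold value and class name are illustrative assumptions:

```python
class ReverseHoldDetector:
    """Report a reverse hold only after the geometric test (first distance
    H1 greater than second distance H2) has held continuously for at least
    the first time threshold T. T is a tunable value, per the text."""

    def __init__(self, t_threshold=0.5):
        self.t = t_threshold   # first time threshold T, in seconds (assumed)
        self.since = None      # timestamp when the condition started holding

    def update(self, h1, h2, now):
        """Feed one frame: H1/H2 are the first/second distances and `now` a
        timestamp in seconds. Returns True once the first condition is met."""
        if h1 > h2:                     # finger points toward the bottom
            if self.since is None:
                self.since = now
            return now - self.since >= self.t
        self.since = None               # condition broken: restart timing
        return False
```

Requiring the test to persist for T avoids flipping the interface on a transient finger pose.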
S204, the terminal controls the display interface of the terminal to be displayed in a reverse direction relative to the screen.
In the embodiment of the present invention, if the target parameter value satisfies the first condition (i.e., the user is holding the terminal in reverse), the terminal may control the display interface of the terminal to be displayed in a reverse direction relative to the screen (i.e., reverse portrait display); that is, the terminal may control the display interface of the terminal to rotate by 90° or 180°. In this way, from the user's perspective, even though the user is holding the terminal in reverse, the display interface of the terminal still appears forward, and the user does not need to readjust the holding direction, thereby improving the convenience of the user in using the terminal.
For example, as shown in (a) of fig. 5, in the case that the display interface of the terminal is displayed in a forward direction with respect to the screen, if the user holds the terminal backwards, the display interface of the terminal is displayed in a reverse direction from the user's perspective, which is inconvenient for the user to operate. As shown in (b) of fig. 5, after determining that the target parameter value satisfies the first condition, the terminal may control the display interface of the terminal to display in a reverse direction with respect to the screen, that is, the terminal may control the display interface of the terminal to rotate by 180 °, so that from the perspective of a user, even if the user reversely holds the terminal, the display interface of the terminal is still in a forward direction, thereby improving convenience of the user in using the terminal.
It should be noted that, in the embodiment of the present invention, the terminal may control the display interface of the terminal to rotate clockwise by 180 degrees, and may also control the display interface of the terminal to rotate counterclockwise by 180 degrees, which may be specifically determined according to actual use requirements, and the embodiment of the present invention is not limited.
For example, as shown in (a) of fig. 6, in the case that the display interface of the terminal is displayed horizontally with respect to the screen, if the user holds the terminal upside down, the display interface of the terminal is still displayed horizontally from the perspective of the user, which is inconvenient for the user to operate. As shown in (b) of fig. 6, after determining that the target parameter value satisfies the first condition, the terminal may control the display interface of the terminal to display in a reverse direction with respect to the screen, that is, the terminal may control the display interface of the terminal to rotate 90 ° counterclockwise, so that from the perspective of the user, even if the user holds the terminal backwards, the display interface of the terminal is still in a forward direction, thereby improving the convenience of the user in using the terminal.
Optionally, in the case that the display interface of the terminal is already displayed in reverse portrait relative to the screen, if the user is holding the terminal in reverse, the display interface of the terminal appears forward from the user's perspective, so the display interface does not need to be adjusted. The terminal may therefore control the display interface of the terminal to remain displayed in the reverse direction relative to the screen after determining that the target parameter value satisfies the first condition.
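The three cases above (the portrait case of fig. 5, the landscape case of fig. 6, and the reverse-portrait no-op case) can be summarized as a lookup from the current display orientation to the rotation applied once a reverse hold is detected. The orientation labels and the specific rotation direction are illustrative assumptions, not fixed by the text:

```python
# Hypothetical mapping from the current display orientation to the rotation
# (in degrees) applied when a reverse hold is detected, per Figs. 5-6.
ROTATION_ON_REVERSE_HOLD = {
    "portrait-forward": 180,   # Fig. 5: flip the portrait interface
    "landscape": 90,           # Fig. 6: rotate 90 deg (direction is tunable)
    "portrait-reverse": 0,     # already forward from the user's view
}

def rotation_for(orientation):
    """Return the rotation to apply for the given current orientation."""
    return ROTATION_ON_REVERSE_HOLD[orientation]
```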
It should be noted that, in the embodiment of the present invention, when the user holds the terminal with the right hand, the process by which the terminal acquires the target image, determines the target parameter value according to the target image, and controls the display interface of the terminal to be displayed in reverse relative to the screen when the target parameter value satisfies the first condition is the same as in the case where the user holds the terminal with the left hand; for details, reference may be made to the description of the left-hand case above, which is not repeated here.
According to the display control method provided by the embodiment of the present invention, the projection image (i.e., the target image) of the user's finger on the screen of the terminal is acquired while the user holds the terminal, the orientation of the finger on the screen is determined from the projection image, and, when the orientation of the finger on the screen satisfies the first condition (i.e., indicating that the display interface of the terminal is displayed in reverse relative to the user), the display interface of the terminal is controlled to be displayed in reverse relative to the screen (i.e., displayed forward relative to the user). Compared with the prior art, the embodiment of the present invention can automatically adjust the display interface of the terminal to be displayed forward relative to the user when the user holds the terminal in reverse, and the terminal can be used normally without the user readjusting the holding direction, thereby improving the convenience of the user in using the terminal.
Optionally, with reference to fig. 2, as shown in fig. 7, after S203, the display control method provided in the embodiment of the present invention may further include S205 described below.
S205, under the condition that the target parameter value does not meet the first condition, controlling the display interface of the terminal to display in the forward direction relative to the screen.
In the embodiment of the present invention, when the user is holding the terminal forward, the terminal determines that the target parameter value does not satisfy the first condition, and the terminal may then control the display interface of the terminal to be displayed forward relative to the screen.
For example, in the case that the display interface of the terminal is displayed in forward portrait relative to the screen, if the user is holding the terminal forward, the display interface appears forward from the user's perspective, and thus does not need to be adjusted. In this case, the terminal may control the display interface of the terminal to remain displayed forward relative to the screen after determining that the target parameter value does not satisfy the first condition.
For example, in the case that the display interface of the terminal is displayed in reverse portrait relative to the screen, if the user is holding the terminal forward, the display interface appears reversed from the user's perspective, and thus needs to be adjusted. In this case, the terminal may control the display interface of the terminal to be displayed forward relative to the screen after determining that the target parameter value does not satisfy the first condition. For example, the terminal may rotate the display interface of the terminal counterclockwise (or clockwise) by 180° so that the display interface is displayed forward relative to the screen.
For example, in the case that the display interface of the terminal is displayed in landscape relative to the screen, if the user is holding the terminal forward, the display interface still appears in landscape from the user's perspective, and thus needs to be adjusted. In this case, the terminal may control the display interface of the terminal to be displayed forward relative to the screen after determining that the target parameter value does not satisfy the first condition. For example, the terminal may rotate the display interface of the terminal counterclockwise (or clockwise) by 90° so that the display interface is displayed forward relative to the screen.
As shown in fig. 8, in the case that the terminal displays a full-screen operation interface, when the user holds the terminal with a single hand (fig. 8 takes a left-hand hold as an example), a finger (e.g., the thumb) can relatively easily touch application icons on the screen that are close to the finger (e.g., the dialing and contacts icons in fig. 8), but cannot easily touch application icons that are farther from the finger (e.g., the photo, shopping, and wireless icons in fig. 8), thereby causing inconvenience to the user's operation.
In view of the above problem, optionally, as shown in fig. 9 in conjunction with fig. 2, after the above step S201, the display control method provided in the embodiment of the present invention may further include the following steps S206 to S208. Taking the case where the user holds the terminal with the left hand as an example, how the terminal controls the display of its display interface is described in detail below.
S206, the terminal determines the holding state of the user holding the terminal according to the target image.
The holding state may include a one-handed holding state and a non-one-handed holding state.
In the embodiment of the present invention, the terminal may analyze the acquired projection image (i.e., the target image) of the user's fingers on the screen of the terminal while the user holds the terminal, so as to determine the holding state of the user holding the terminal. If the target image includes a single finger, the terminal may determine that the holding state is the one-hand holding state (as in fig. 8); if the target image includes a plurality of fingers, the terminal may determine that the holding state is the non-one-hand holding state.
Alternatively, S206 may be implemented by S206a and S206b, which are described below.
S206a, the terminal determines the holding parameters according to the target image.
The holding parameter may be used to indicate a holding state of the terminal held by the user.
S206b, the terminal determines the holding state of the terminal held by the user according to the holding parameter.
The holding parameters may include at least one of an extension state, a target distance, and a target time. The extension state may be the extension state (bent or straightened) of the finger when the user holds the terminal. The target distance is the vertical distance (shown as d in fig. 10 (a)) between any point in the fingertip region of the finger (shown as the circular shaded region in fig. 10 (a)) and the first edge of the terminal, the first edge being the edge of the side surface that the user's hand contacts when holding the terminal (shown as GH in fig. 10 (a)). The target time may be the time for which the terminal has continuously acquired the target image. It should be noted that the size of the fingertip region of the finger may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
Optionally, the target distance may be a maximum value of a vertical distance between any one point in the fingertip area of the finger and the first edge of the terminal, or may be a vertical distance between a center point in the fingertip area of the finger and the first edge of the terminal, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
In the embodiment of the invention, the terminal can determine the holding parameters of the terminal held by the user by analyzing the target image, and determine whether the state of fingers on the screen of the terminal when the user holds the terminal meets the single-hand holding condition by determining whether at least one item of the holding parameters meets the single-hand holding condition, so that whether the holding state of the terminal held by the user is the single-hand holding state or the non-single-hand holding state can be judged.
For example, the above-mentioned one-hand holding condition may include: the finger is in a straightened state, the target distance is greater than or equal to a distance threshold (hereinafter D0), and the target time is greater than or equal to a second time threshold (hereinafter T0). The values of the distance threshold D0 and the second time threshold T0 may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
As shown in fig. 10 (a), assume that the user holds and operates the terminal with the left hand, the user's thumb extends across the screen from the left side of the terminal in a straightened state, and the distance by which the straightened thumb extends from the left side is d (i.e., the above-mentioned target distance). In this case, the terminal may acquire a target image including the thumb, obtain the holding parameters such as the extension state of the thumb, the target distance d, and the target time from the target image, determine whether these holding parameters satisfy the one-hand holding condition (i.e., whether the extension state of the thumb is the straightened state, whether the target distance d is greater than or equal to the distance threshold D0, and whether the target time is greater than or equal to the second time threshold T0), and determine, from the result, whether the holding state of the user holding the terminal is the one-hand holding state or the non-one-hand holding state.
Optionally, in an embodiment of the present invention, the one-hand holding condition may further include: the extended state of the finger is a straightened state, and the angle formed by the pointing direction of the finger and the first edge (shown as GH in (a) in fig. 10) is in the range of [90 ° - Δ, 90 ° + Δ ] (i.e. the pointing direction of the finger is approximately perpendicular to the first edge), where Δ is a very small error value relative to 90 °, for example, Δ may be 5 °, and may be determined according to the actual use requirement, and the embodiment of the present invention is not limited.
It should be noted that, when the user holds the terminal with one hand without performing any operation, the user's fingers are usually in a bent state, and the distance by which a bent finger extends from one side of the terminal is shorter than that of a straightened finger; that is, the one-hand holding condition is not satisfied, so the terminal will not misjudge the holding state of the user holding the terminal.
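The one-hand holding condition described above can be sketched as a single predicate over the holding parameters. The threshold values D0, T0 and Δ are tunable per the text; the values and the function name here are assumptions for illustration:

```python
def is_one_hand_hold(extended, target_distance, target_time,
                     pointing_angle_deg,
                     d0=15.0, t0=0.3, delta=5.0):
    """One-hand holding condition per the text: the finger is straightened,
    the fingertip extends past the side edge by at least D0, the target
    image has been observed for at least T0, and the finger points roughly
    perpendicular to the side edge (within +/- delta of 90 degrees).
    d0 (mm), t0 (s) and delta (deg) are assumed example thresholds."""
    return (extended
            and target_distance >= d0
            and target_time >= t0
            and abs(pointing_angle_deg - 90.0) <= delta)
```

Because a bent finger both fails the straightened-state test and yields a short target distance, either check alone already rejects the passive-hold case noted above.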
S207, the terminal judges whether the holding state of the user holding the terminal is a one-hand holding state.
In the embodiment of the present invention, if the terminal determines that the holding state of the user holding the terminal is one-handed holding, the terminal may continue to execute the following S208; if the terminal determines that the holding state of the user holding the terminal is the non-one-handed holding state, the terminal may return to continue executing the above S201.
S208, the terminal controls the display interface of the terminal to be reduced to a target area on the screen for display.
The size of the target area along a first direction is smaller than or equal to the target distance, where the target distance is the distance between any point in the fingertip region of the finger and the first edge of the terminal, the first edge is the edge of the side surface that the user's hand contacts when holding the terminal, and the first direction is the direction perpendicular to that side surface.
For example, as shown in fig. 10 (a), an edge of a side surface (i.e., a first edge of the terminal) which is in contact with the terminal when the user holds the terminal is GH, the first direction is a direction perpendicular to the side surface (as shown by MN in fig. 10 (a)), and a distance between any one point in a fingertip area of a finger and the first edge GH of the terminal (i.e., the target distance, which is the maximum value in the distances in the figure) is d.
In the embodiment of the present invention, after determining that the holding state of the user holding the terminal is the one-hand holding state (i.e., the posture of the user's finger satisfies the one-hand holding condition), the terminal may control the display interface of the terminal (i.e., the full-screen operation interface) to be reduced to a target area on the screen for display. The target area may be regarded as a comfortable operation area (also referred to as a one-hand operation area) for the user, and the reduced display interface may be referred to as a one-hand operation interface. Illustratively, the display interface of the terminal is reduced from the full-screen operation interface shown in fig. 10 (a) to the lower left corner area on the screen shown in fig. 10 (b) (i.e., reduced to the one-hand operation interface), after which the user can operate on the one-hand operation interface. In this way, the terminal can switch its full-screen operation interface to the one-hand operation interface, the reduced display interface is more convenient for one-handed operation, and the convenience of the user in using the terminal is further improved.
Optionally, in this embodiment of the present invention, a size of the target area along the first direction MN (that is, a width of the reduced display interface) may be equal to or smaller than the target distance d, which may be determined according to an actual use requirement, and this embodiment of the present invention is not limited.
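A sketch of computing the target area from the target distance d. The text constrains only the width along the first direction MN (at most d) and the corner placement (lower corner on the gripping side); preserving the full-screen aspect ratio and the (x, y, w, h) top-left-origin convention are assumptions here:

```python
def one_hand_target_area(screen_w, screen_h, target_distance, hand="left"):
    """Return (x, y, w, h) of the one-hand operation area, with the origin
    at the top-left of the screen. The width w along the direction
    perpendicular to the gripped side edge does not exceed the target
    distance d; the area is anchored in the bottom corner on the gripping
    side (lower-left for a left-hand hold, lower-right for right-hand)."""
    w = min(target_distance, screen_w)
    h = w * screen_h / screen_w          # keep the full-screen aspect ratio
    x = 0 if hand == "left" else screen_w - w
    y = screen_h - h                     # anchor at the bottom of the screen
    return (x, y, w, h)
```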
Optionally, in the embodiment of the present invention, when the terminal is displaying the one-hand operation interface, if the terminal determines that the holding state of the user holding the terminal is a two-hand holding state, the terminal may control the display interface of the terminal to switch from the one-hand operation interface to the full-screen operation interface. In this way, the terminal can control the display interface to switch between the one-hand operation interface and the full-screen operation interface in response to the user's input; that is, the terminal can display interfaces of different sizes as triggered by the user and adapt to the user's different requirements in different scenarios, thereby improving the flexibility of the user in using the terminal.
Optionally, in this embodiment of the present invention, an exit icon (for example, an "x" icon in the upper right corner of the one-handed operation interface as shown in (b) in fig. 10) may be set on the one-handed operation interface, and in a case that the terminal displays the one-handed operation interface, the user may trigger the exit icon (for example, click on the exit icon) to enable the terminal to control the display interface to be switched from the one-handed operation interface to the full-screen operation interface.
Optionally, in the embodiment of the present invention, after the terminal switches the full-screen operation interface to the one-handed operation interface, if the terminal does not detect the input of the user within a preset time period (which may be determined according to an actual use requirement), the terminal may control the one-handed operation interface to be switched to the full-screen operation interface.
In the embodiment of the present invention, after the terminal controls the full-screen operation interface to be switched to the one-handed operation interface, the method for the terminal to control the full-screen operation interface to be switched back from the one-handed operation interface includes, but is not limited to, the above-mentioned methods, which may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
It should be noted that, in the embodiment of the present invention, when the user holds the terminal with the right hand, the terminal may likewise control the display interface of the terminal to switch between the full-screen operation interface and the one-hand operation interface, and the specific process is similar to the left-hand case described above. The difference is that if the user holds the terminal with the left hand, the terminal reduces the area displaying the one-hand operation interface (i.e., the above-mentioned target area) to the lower left corner area of the screen, whereas if the user holds the terminal with the right hand, the terminal reduces it to the lower right corner area of the screen. For the specific process of the terminal controlling the display interface to switch between the full-screen operation interface and the one-hand operation interface, reference may be made to the description of the left-hand case above, which is not repeated here.
Alternatively, as shown in fig. 11 in conjunction with fig. 9, S201 described above may be implemented by S209 described below.
S209, the terminal controls the ultrasonic sensor to acquire a target image, and the working frequency of the ultrasonic sensor is the first frequency.
In the embodiment of the present invention, the terminal may control the ultrasonic sensor to acquire the target image. The ultrasonic sensor may transmit a low-frequency ultrasonic signal when operating at a first frequency (hereinafter referred to as the low frequency, e.g., tens to hundreds of kHz) and may transmit a high-frequency ultrasonic signal when operating at a second frequency (hereinafter referred to as the high frequency, e.g., several MHz). Since the low-frequency ultrasonic signal penetrates more strongly than the high-frequency ultrasonic signal, when the ultrasonic sensor operates at the low frequency it can recognize the shape of a finger and acquire the target image without the user's finger touching the screen of the terminal, thereby allowing the user to operate the terminal contactlessly above the screen (hereinafter referred to as the "hover operation function").
Optionally, in the embodiment of the present invention, the ultrasonic sensor may be embedded below a screen of the terminal, and the ultrasonic sensor may include a plurality of ultrasonic sensing units. Illustratively, as shown in fig. 12 (a), a plurality of ultrasonic sensing units U are uniformly distributed below the screen of the terminal. The terminal can periodically scan each ultrasonic sensing unit U and acquire a projection image of the user's finger on the screen of the terminal.
For example, as shown in fig. 12 (b), in the case where the ultrasonic sensor operates at the low frequency, if the user's finger is placed at a certain distance (e.g., D1) above the screen S of the terminal and the projection area of the finger on the screen S is the area from P1 to P2, the terminal may acquire the projection image of the user's finger on the screen S through the ultrasonic sensing units U in the range from P1 to P2 and use it as the target image, thereby implementing the hover operation function.
For example, as shown in fig. 12 (c), in the case where the ultrasonic sensor operates at the low frequency, if the user's finger touches the screen S of the terminal and the projection area of the finger (e.g., the fingertip) on the screen S is the area from P3 to P4, the terminal may acquire the projection image of the user's finger on the screen S through the ultrasonic sensing units U in the range from P3 to P4, thereby implementing the touch operation function of the terminal.
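The dual-frequency use of the under-screen ultrasonic sensor can be sketched as a small controller that selects the operating frequency per task. The class name, the `scan` interface, and the specific frequency values are illustrative assumptions within the ranges given in the text:

```python
class UltrasonicSensorController:
    """Drive the under-screen ultrasonic sensor at a low first frequency for
    projection imaging (hover operation) and at a high second frequency for
    fingerprint capture. Frequencies in Hz are assumed example values."""
    LOW_FREQ = 200e3    # tens to hundreds of kHz: penetrating, images finger
    HIGH_FREQ = 5e6     # several MHz: fine-grained, resolves fingerprints

    def __init__(self, sensor):
        self.sensor = sensor        # object exposing scan(freq) (assumed)
        self.freq = self.LOW_FREQ   # default: low-frequency imaging

    def acquire_target_image(self):
        """S209: scan at the first frequency for the projection image."""
        self.freq = self.LOW_FREQ
        return self.sensor.scan(self.freq)

    def acquire_fingerprint(self):
        """S211/S212: switch to the second frequency and scan a fingerprint."""
        self.freq = self.HIGH_FREQ
        return self.sensor.scan(self.freq)
```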
Optionally, as shown in fig. 11, after S209 described above, the display control method provided in the embodiment of the present invention may further include S210 to S212 described below.
S210, the terminal detects the input of the user in the fingerprint acquisition area.
In the embodiment of the present invention, if the terminal detects the user's input in the fingerprint acquisition area, the terminal may continue to execute the following S211; if the terminal does not detect the user's input in the fingerprint acquisition area, the terminal may perform S206 to S208 described above.
S211, the terminal controls the operating frequency of the ultrasonic sensor to switch to a second frequency, the second frequency being greater than the first frequency.
In the embodiment of the present invention, the ultrasonic sensor may emit a high-frequency ultrasonic signal when operating at the second frequency (i.e., a high frequency, for example, several MHz); in this case, the terminal may acquire the user's fingerprint information through the ultrasonic sensor, thereby implementing the fingerprint identification function.
S212, the terminal controls the ultrasonic sensor to collect fingerprint information of the user.
In the embodiment of the present invention, the terminal may control the ultrasonic sensor to periodically detect the user's input (such as a press) in the fingerprint acquisition area on the screen of the terminal. When the terminal detects the user's input in the fingerprint acquisition area, if the ultrasonic sensor is operating at the first frequency (i.e., a low frequency, for example, tens to hundreds of kHz), the terminal may control the operating frequency of the ultrasonic sensor to switch from the first frequency to the second frequency (i.e., a high frequency, for example, several MHz), and then control the ultrasonic sensor to collect the user's fingerprint information. For example, in a fingerprint unlocking scene or a fingerprint payment scene, after detecting the user's input in the fingerprint acquisition area, the terminal may determine whether the collected fingerprint information matches the pre-stored fingerprint information; if it matches, the terminal may unlock the screen or complete the payment.
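The S210-S212 flow above can be sketched as a small state transition. The frequency constants, class name, and return strings are illustrative assumptions; only the switch-then-collect-then-match sequence comes from the method.

```python
LOW_FREQ_HZ = 200_000      # first frequency (tens to hundreds of kHz)
HIGH_FREQ_HZ = 2_000_000   # second frequency (several MHz)

class UltrasonicSensor:
    """Minimal stand-in for the under-screen ultrasonic sensor."""
    def __init__(self):
        self.frequency = LOW_FREQ_HZ

def handle_fingerprint_area(sensor, input_detected, collect, stored_print):
    """S210-S212: on user input in the fingerprint acquisition area, switch
    the sensor to the second frequency, collect the fingerprint, and match
    it against the pre-stored fingerprint information."""
    if not input_detected:
        return "continue_air_operation"    # fall back to S206-S208
    if sensor.frequency == LOW_FREQ_HZ:    # S211: low -> high frequency
        sensor.frequency = HIGH_FREQ_HZ
    fingerprint = collect()                # S212: acquire fingerprint info
    return "unlock" if fingerprint == stored_print else "reject"

sensor = UltrasonicSensor()
result = handle_fingerprint_area(sensor, True, lambda: "print-A", "print-A")
```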
As shown in fig. 13, an embodiment of the present invention provides a terminal 300, where the terminal 300 may include an obtaining module 301, a determining module 302, and a control module 303.
An obtaining module 301, configured to obtain a target image, where the target image is a projection image of a finger of a user on a screen of a terminal when the user holds the terminal; a determining module 302, configured to determine a target parameter value according to the target image acquired by the acquiring module 301, where the target parameter value is used to indicate an orientation of a finger on a screen; a control module 303, configured to control the display interface of the terminal to be displayed in a reverse direction with respect to the screen if the determination module 302 determines that the target parameter value satisfies the first condition.
Optionally, in an embodiment of the present invention, the target parameter value may include a first distance and a second distance, where the first condition is that the first distance is greater than the second distance, the first distance is a vertical distance between the first target point and the bottom of the terminal, and the second distance is a vertical distance between the second target point and the bottom of the terminal; or, the target parameter values are a third distance and a fourth distance, the first condition is that the third distance is smaller than the fourth distance, the third distance is a vertical distance between the first target point and the top of the terminal, and the fourth distance is a vertical distance between the second target point and the top of the terminal; the first target point is any point in the root area of the finger, and the second target point is any point in the tip area of the finger.
Optionally, in an embodiment of the present invention, the target parameter value is a first included angle, where the first condition is that the first included angle is smaller than 90 °, and the first included angle is an included angle located on one side of a bottom of the terminal, among included angles between the first straight line and the first edge of the terminal; or, the target parameter value is a second included angle, the first condition is that the second included angle is greater than 90 °, and the second included angle is an included angle located on one side of the top of the terminal in included angles between the first straight line and the first edge of the terminal; the first straight line is a straight line where a central axis of a finger is located, and the first edge is an edge of a side face which is in contact with the terminal when the user holds the terminal.
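The two forms of the first condition described above reduce to simple comparisons. A minimal sketch, assuming distances and angles have already been measured from the target image (the function names and example values are illustrative):

```python
def reversed_by_distance(first_distance, second_distance):
    """Distance form of the first condition: the first distance (finger-root
    point to the bottom of the terminal) exceeds the second distance
    (fingertip point to the bottom), i.e. the finger points downward."""
    return first_distance > second_distance

def reversed_by_angle(first_included_angle_deg):
    """Angle form of the first condition: the included angle on the bottom
    side, between the finger's central axis and the first edge, is < 90 deg."""
    return first_included_angle_deg < 90
```

With a root point 120 mm and a tip point 80 mm above the bottom edge, `reversed_by_distance(120, 80)` holds, so the display interface would be reversed relative to the screen.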
Optionally, in this embodiment of the present invention, the determining module 302 is further configured to determine, after the obtaining module 301 obtains the target image, a holding state in which the user holds the terminal according to the target image, where the holding state includes a one-hand holding state and a non-one-hand holding state; the control module 303 is further configured to, if the determination module 302 determines that the holding state is the one-handed holding state, narrow the display interface of the terminal to a target area on the screen for display, where a size of the target area along a first direction is smaller than or equal to a target distance, where the target distance is a distance between any one of fingertip areas of fingers and a first edge of the terminal, the first edge is an edge of a side surface that is in contact with the terminal when the user holds the terminal, and the first direction is a direction perpendicular to the side surface.
Optionally, in an embodiment of the present invention, the determining module 302 is specifically configured to determine a holding parameter according to the target image, where the holding parameter is used to indicate a holding state; determining the holding state according to the holding parameters; the holding parameters comprise at least one of a stretching state, a target distance and a target time, the stretching state is the stretching state of fingers when a user holds the terminal, and the target time is the time for continuously acquiring the target image.
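The one-handed narrowing rule above (target-area size along the first direction no larger than the target distance) can be sketched as a clamp. The millimeter units and function name are assumptions for illustration:

```python
def one_hand_display_width(screen_width_mm, target_distance_mm, one_handed):
    """When the holding state is one-handed, narrow the display interface so
    its size along the direction perpendicular to the gripped side does not
    exceed the target distance (fingertip-area point to the first edge);
    otherwise keep the full screen width."""
    if not one_handed:
        return screen_width_mm
    return min(screen_width_mm, target_distance_mm)
```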
Optionally, in this embodiment of the present invention, the control module 303 is further configured to control the display interface of the terminal to display forward relative to the screen when the target parameter value does not satisfy the first condition.
Optionally, in this embodiment of the present invention, the obtaining module 301 is specifically configured to control the ultrasonic sensor to obtain the target image, where a working frequency of the ultrasonic sensor is a first frequency. The control module 303 is further configured to control the operating frequency of the ultrasonic sensor to be switched to a second frequency when the input of the user in the fingerprint acquisition area is detected, where the second frequency is greater than the first frequency; and controlling the ultrasonic sensor to acquire the fingerprint information of the user.
The terminal provided by the embodiment of the present invention can implement each process implemented by the terminal in the above method embodiments, and is not described herein again to avoid repetition.
According to the terminal provided by the embodiment of the present invention, the terminal obtains the projection image (i.e., the target image) of the user's finger on the screen while the user holds the terminal, and determines the orientation of the finger on the screen from the projection image. When the orientation of the finger satisfies the first condition (i.e., the condition indicating that the display interface is displayed in reverse relative to the user), the terminal can control its display interface to display in reverse relative to the screen (i.e., forward relative to the user). Compared with the prior art, when the user holds the terminal upside down (i.e., the display interface is reversed relative to the user), the embodiment of the present invention can automatically adjust the display interface to display forward relative to the user, and the terminal can be used normally without the user readjusting the holding direction, which improves the convenience of using the terminal.
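The decision step of the method can be sketched as follows; the return strings and the lambda encoding of the distance-form condition are illustrative assumptions, not the patent's implementation:

```python
def display_direction(target_parameter_value, first_condition):
    """Final decision of the method: display the interface in reverse
    relative to the screen when the target parameter value satisfies the
    first condition, and forward otherwise."""
    return "reverse" if first_condition(target_parameter_value) else "forward"

# Distance form of the first condition: a (first distance, second distance)
# pair, i.e. finger-root and fingertip distances from the bottom edge.
distance_condition = lambda p: p[0] > p[1]
```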
Fig. 14 is a schematic diagram of a hardware structure of a terminal for implementing various embodiments of the present invention. As shown in fig. 14, the terminal 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal configuration shown in fig. 14 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 110 is configured to control the sensor 105 to acquire a target image, where the target image is a projected image of a finger of a user on a screen of the terminal when the user holds the terminal; determining a target parameter value according to the target image, wherein the target parameter value is used for indicating the orientation of the finger of the user on the screen of the terminal; and controlling the display unit 106 to display the display interface of the terminal in a reverse direction with respect to the screen of the terminal in case the target parameter value satisfies the first condition.
In an embodiment of the present invention, the sensor 105 may include an ultrasonic sensor. The ultrasonic sensor may operate at a high frequency (e.g., several MHz) or a low frequency (e.g., tens to hundreds of kHz).
According to the terminal provided by the embodiment of the present invention, the terminal obtains the projection image (i.e., the target image) of the user's finger on the screen while the user holds the terminal, determines the orientation of the finger on the screen from the projection image, and controls the display interface to display in reverse relative to the screen (i.e., forward relative to the user) when the orientation satisfies the first condition (i.e., the condition indicating that the display interface is reversed relative to the user). Compared with the prior art, when the user holds the terminal upside down, the embodiment of the present invention can automatically adjust the display interface to display forward relative to the user, and the terminal can be used normally without the user readjusting the holding direction, which improves the convenience of using the terminal.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse web pages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 101.
The terminal 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 14, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 100 or may be used to transmit data between the terminal 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal 100 includes some functional modules that are not shown, and thus, the detailed description thereof is omitted.
Optionally, an embodiment of the present invention further provides a terminal, including a central processing unit 112, a micro processing unit 113, a frequency conversion control module 114, a sensor 115, a digital-to-analog converter 116, and a memory 117, as shown in fig. 15.
The central processing unit 112 may be the processor 110 shown in fig. 14. The sensor 115 may be the sensor 105 shown in fig. 14; the sensor 115 may include an ultrasonic transmitter for transmitting an ultrasonic signal and an ultrasonic receiver for receiving the ultrasonic signal, or may be an ultrasonic transceiver for both transmitting and receiving ultrasonic signals. The memory 117 may be the memory 109 shown in fig. 14, and may be used to store a computer program and preset fingerprint information. The computer program stored in the memory 117, when executed by the central processing unit 112, can implement the processes of the above display control method embodiments and achieve the same technical effects; details are not repeated here to avoid repetition.
Optionally, in an embodiment of the present invention, the frequency conversion control module 114 may be a frequency converter.
As shown in fig. 15, the central processing unit 112 is configured to send, in response to a user input, a first instruction to the micro processing unit 113 for controlling the sensor 115 to transmit and receive the ultrasonic signal at the first frequency or the second frequency. The micro processing unit 113 is configured to receive the first instruction sent by the central processing unit 112, send to the frequency conversion control module 114, according to the first instruction, a second instruction for converting the frequency of the sensor 115 from the first frequency to the second frequency or from the second frequency to the first frequency, and send to the sensor 115 a third instruction for controlling the sensor 115 to transmit and receive the ultrasonic signal. The frequency conversion control module 114 is configured to receive the second instruction sent by the micro processing unit 113 and, according to the second instruction, control the frequency of the sensor 115 to convert from the first frequency to the second frequency or from the second frequency to the first frequency. The sensor 115 is configured to receive the third instruction sent by the micro processing unit 113, transmit and receive the ultrasonic signal at the first frequency or the second frequency according to the third instruction, and send the received ultrasonic signal to the digital-to-analog converter 116. The digital-to-analog converter 116 is configured to receive the ultrasonic signal sent by the sensor 115, convert it into an analog ultrasonic signal, and send the analog ultrasonic signal to the micro processing unit 113.
The micro processing unit 113 is further configured to receive and process the analog ultrasonic signal sent by the digital-to-analog converter 116. If the ultrasonic signal contains the user's fingerprint information, the micro processing unit 113 processes the fingerprint information, determines whether the fingerprint information in the processed ultrasonic signal matches the fingerprint information stored in the memory 117, and reports the determination result to the central processing unit 112; if the ultrasonic signal contains the image of the user's finger (i.e., the target image), the micro processing unit 113 obtains the holding parameter, determines whether the holding parameter satisfies the first condition, and reports the determination result to the central processing unit 112. The central processing unit 112 is further configured to receive the result reported by the micro processing unit 113 and perform a corresponding operation according to the result.
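The instruction relay among the components of fig. 15 can be sketched as follows. The class names, frequency constants, and return string are illustrative assumptions; only the first/second/third instruction sequence comes from the description above.

```python
LOW_FREQ_HZ = 200_000       # first frequency
HIGH_FREQ_HZ = 2_000_000    # second frequency

class Sensor:
    """Stand-in for sensor 115."""
    def __init__(self):
        self.frequency = LOW_FREQ_HZ

class FrequencyConversionModule:
    """Stand-in for frequency conversion control module 114."""
    def convert(self, sensor, target_freq):      # acts on the second instruction
        sensor.frequency = target_freq

class MicroProcessingUnit:
    """Stand-in for micro processing unit 113: relays the CPU's first
    instruction as a second instruction (frequency switch) and a third
    instruction (transmit/receive at the new frequency)."""
    def __init__(self, fcm):
        self.fcm = fcm
    def handle_first_instruction(self, sensor, target_freq):
        self.fcm.convert(sensor, target_freq)    # second instruction
        return f"transceive@{sensor.frequency}"  # third instruction

mcu = MicroProcessingUnit(FrequencyConversionModule())
sensor = Sensor()
third = mcu.handle_first_instruction(sensor, HIGH_FREQ_HZ)
```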
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the display control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may include a read-only memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (13)

1. A display control method is applied to a terminal and is characterized by comprising the following steps:
acquiring a target image, wherein the target image is a projected image of a finger of a user on a screen of the terminal when the user holds the terminal, the projected image is an image acquired by identifying the appearance of the finger under the condition that the finger is not in contact with the screen, and the finger is the thumb of the user;
determining a target parameter value according to the target image, wherein the target parameter value is used for indicating the orientation of the fingers on the screen, the orientation is used for determining the holding direction of the terminal held by a user, and the holding direction is forward holding or backward holding;
under the condition that the target parameter value meets a first condition, controlling a display interface of the terminal to display reversely relative to the screen;
the target parameter value is a distance value or an angle value, the distance value is a vertical distance between any one point in the root region of the finger and the top or the bottom of the terminal, or a vertical distance between any one point in the tip region of the finger and the top or the bottom of the terminal, and the angle value is an included angle value between a straight line where a central axis of the finger is located and a first edge of the terminal.
2. The method of claim 1, wherein the target parameter value comprises: a first distance and a second distance, the first condition being that the first distance is greater than the second distance, the first distance being a vertical distance between a first target point and the bottom of the terminal, the second distance being a vertical distance between a second target point and the bottom of the terminal;
or,
the target parameter values are a third distance and a fourth distance, the first condition is that the third distance is smaller than the fourth distance, the third distance is a vertical distance between the first target point and the top of the terminal, and the fourth distance is a vertical distance between the second target point and the top of the terminal;
the first target point is any point in the root area of the finger, and the second target point is any point in the tip area of the finger.
3. The method according to claim 1, wherein the target parameter value is a first angle, the first condition is that the first angle is smaller than 90 °, and the first angle is an angle on a bottom side of the terminal out of angles between a first straight line and a first edge of the terminal;
or,
the target parameter value is a second included angle, the first condition is that the second included angle is larger than 90 degrees, and the second included angle is an included angle which is positioned on one side of the top of the terminal in included angles between a first straight line and the first edge of the terminal;
the first straight line is a straight line where the central axis of the finger is located, and the first edge is an edge of a side face which is in contact with the terminal when the user holds the terminal.
4. The method of any of claims 1 to 3, wherein after the acquiring the target image, the method further comprises:
determining a holding state of the terminal held by the user according to the target image, wherein the holding state comprises a one-hand holding state and a non-one-hand holding state;
and under the condition that the holding state is a one-hand holding state, reducing a display interface of the terminal to a target area on the screen for display, wherein the size of the target area along a first direction is smaller than or equal to a target distance, the target distance is the distance between any one point in a fingertip area of the fingers and a first edge of the terminal, the first edge is the edge of a side face which is in contact with the terminal when a user holds the terminal, and the first direction is the direction perpendicular to the side face.
5. The method according to claim 4, wherein the determining a holding state of the terminal held by the user according to the target image comprises:
determining holding parameters according to the target image, wherein the holding parameters are used for indicating the holding state;
determining the holding state according to the holding parameters;
the holding parameters comprise at least one of a stretching state, the target distance and target time, the stretching state is the stretching state of the fingers when the user holds the terminal, and the target time is the time for continuously acquiring the target image.
6. The method of claim 1, further comprising:
and controlling a display interface of the terminal to display in a forward direction relative to the screen under the condition that the target parameter value does not meet the first condition.
7. A terminal is characterized by comprising an acquisition module, a determination module and a control module;
the acquisition module is used for acquiring a target image, the target image is a projected image of a finger of a user on a screen of the terminal when the user holds the terminal, the projected image is an image acquired by identifying the appearance of the finger under the condition that the finger does not contact the screen, and the finger is the thumb of the user;
the determining module is configured to determine a target parameter value according to the target image acquired by the acquiring module, where the target parameter value is used to indicate an orientation of the finger on the screen, the orientation is used to determine a holding direction in which a user holds the terminal, and the holding direction is forward holding or backward holding;
the control module is used for controlling the display interface of the terminal to display in a reverse direction relative to the screen under the condition that the determination module determines that the target parameter value meets a first condition;
the target parameter value is a distance value or an angle value, the distance value is a vertical distance between any one point in the root region of the finger and the top or the bottom of the terminal, or a vertical distance between any one point in the tip region of the finger and the top or the bottom of the terminal, and the angle value is an included angle between a straight line on which a central axis of the finger lies and a first edge of the terminal.
8. The terminal of claim 7, wherein the target parameter value comprises a first distance and a second distance, wherein the first condition is that the first distance is greater than the second distance, wherein the first distance is a vertical distance between the first target point and the bottom of the terminal, and wherein the second distance is a vertical distance between the second target point and the bottom of the terminal;
alternatively,
the target parameter values are a third distance and a fourth distance, the first condition is that the third distance is smaller than the fourth distance, the third distance is a vertical distance between the first target point and the top of the terminal, and the fourth distance is a vertical distance between the second target point and the top of the terminal;
the first target point is any point in the root area of the finger, and the second target point is any point in the tip area of the finger.
9. The terminal of claim 7, wherein the target parameter value is a first included angle, the first condition is that the first included angle is smaller than 90°, and the first included angle is, of the included angles between a first straight line and a first edge of the terminal, the included angle located on the bottom side of the terminal;
alternatively,
the target parameter value is a second included angle, the first condition is that the second included angle is larger than 90°, and the second included angle is, of the included angles between the first straight line and the first edge of the terminal, the included angle located on the top side of the terminal;
the first straight line is a straight line on which the central axis of the finger lies, and the first edge is an edge of the side face of the terminal that is in contact with the user's hand when the user holds the terminal.
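The angle conditions of claim 9 can be illustrated with simple trigonometry. The coordinate frame below is an assumption (x grows rightward, y grows downward, the first edge treated as a vertical side edge), and the function names are illustrative only:

```python
import math

def first_angle_deg(root_point, tip_point):
    """Bottom-side included angle between the finger axis and a vertical
    first edge (claim 9), in a hypothetical frame where x grows rightward
    and y grows downward."""
    dx = abs(tip_point[0] - root_point[0])   # lateral component
    dy = tip_point[1] - root_point[1]        # positive when the tip is lower
    # Angle between the axis and the downward direction along the edge:
    # acute when the tip is below the root, obtuse when it is above.
    return math.degrees(math.atan2(dx, dy))

def held_upside_down_by_angle(root_point, tip_point):
    # First branch of claim 9: reverse the display when the bottom-side
    # included angle is smaller than 90 degrees.
    return first_angle_deg(root_point, tip_point) < 90.0
```

The second branch (top-side angle larger than 90°) is the supplementary angle of the same configuration, so both branches select the same holding directions.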
10. The terminal according to any of claims 7 to 9,
the determining module is further configured to determine, after the obtaining module obtains the target image, a holding state in which the user holds the terminal according to the target image, where the holding state includes a one-hand holding state and a non-one-hand holding state;
the control module is further configured to, when the determination module determines that the holding state is a one-hand holding state, control a display interface of the terminal to be reduced to a target area on the screen for display, where a size of the target area along a first direction is smaller than or equal to a target distance, the target distance is a distance between any one point in a fingertip area of the finger and a first edge of the terminal, the first edge is an edge of the side face of the terminal that is in contact with the user's hand when the user holds the terminal, and the first direction is a direction perpendicular to that side face.
11. The terminal according to claim 10, wherein the determining module is specifically configured to determine a holding parameter according to the target image, where the holding parameter is used to indicate the holding state; determining the holding state according to the holding parameters;
the holding parameters comprise at least one of a stretching state, the target distance and a target time, wherein the stretching state is the state of extension of the user's finger when the user holds the terminal, and the target time is the duration for which the target image is continuously acquired.
12. The terminal of claim 7,
the control module is further configured to control a display interface of the terminal to display in a forward direction relative to the screen when the target parameter value does not satisfy the first condition.
13. A terminal, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the display control method according to any one of claims 1 to 6.
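Taken together, claims 6, 10 and 12 amount to a two-part display update: the orientation follows from whether the first condition is met, and the layout follows from the holding state. An illustrative sketch, with return values and names that are not from the patent:

```python
def update_display(first_condition_met, one_hand):
    """Combine the orientation rule (claims 6 and 12) with the one-hand
    layout rule (claim 10). The returned labels are illustrative only."""
    # First condition met -> the terminal is held backward -> reverse display.
    orientation = "reverse" if first_condition_met else "forward"
    # Independently of orientation, a one-hand grip shrinks the interface
    # to a target area the thumb can reach.
    layout = "target-area" if one_hand else "full-screen"
    return orientation, layout
```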
CN201810725301.9A 2018-07-04 2018-07-04 Display control method and terminal Active CN108958614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810725301.9A CN108958614B (en) 2018-07-04 2018-07-04 Display control method and terminal

Publications (2)

Publication Number Publication Date
CN108958614A CN108958614A (en) 2018-12-07
CN108958614B true CN108958614B (en) 2021-03-23

Family

ID=64485610

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810725301.9A Active CN108958614B (en) 2018-07-04 2018-07-04 Display control method and terminal

Country Status (1)

Country Link
CN (1) CN108958614B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110321056A (en) * 2019-07-15 2019-10-11 深圳传音控股股份有限公司 Control moving method, mobile phone and storage medium based on terminal
CN110717463A (en) * 2019-10-12 2020-01-21 深圳芯启航科技有限公司 Non-contact type biological identification method and device
CN112527226A (en) * 2019-12-10 2021-03-19 上海擎感智能科技有限公司 Method, system, medium and intelligent terminal for acquiring display picture suitable for vehicle-mounted screen
CN111796715A (en) * 2020-06-24 2020-10-20 歌尔光学科技有限公司 Detection method and detection device of touch control light film
CN112104784A (en) * 2020-09-16 2020-12-18 珠海格力电器股份有限公司 Display method, display device, computer-readable storage medium, and electronic apparatus
CN117693946A (en) * 2022-04-20 2024-03-12 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method, image display method, unmanned aerial vehicle and control terminal

Citations (2)

Publication number Priority date Publication date Assignee Title
CN102736742A (en) * 2011-04-01 2012-10-17 宏碁股份有限公司 Portable electronic apparatus and control method for display direction thereof
CN103076960A (en) * 2011-10-26 2013-05-01 华为终端有限公司 Method for controlling screen displaying direction and terminal thereof

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US20130083074A1 (en) * 2011-10-03 2013-04-04 Nokia Corporation Methods, apparatuses and computer program products utilizing hovering, in part, to determine user interface orientation
US10229657B2 (en) * 2015-06-17 2019-03-12 International Business Machines Corporation Fingerprint directed screen orientation
CN107704179B (en) * 2016-08-08 2020-09-22 宏达国际电子股份有限公司 Method for determining picture display direction and electronic device using same
CN106569677A (en) * 2016-11-11 2017-04-19 努比亚技术有限公司 Screen display direction control apparatus and method
CN107562353A (en) * 2017-07-17 2018-01-09 努比亚技术有限公司 A kind of display interface control method, terminal and computer-readable recording medium
CN107656792B (en) * 2017-10-19 2021-03-16 Oppo广东移动通信有限公司 User interface display method and device and terminal

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN102736742A (en) * 2011-04-01 2012-10-17 宏碁股份有限公司 Portable electronic apparatus and control method for display direction thereof
CN103076960A (en) * 2011-10-26 2013-05-01 华为终端有限公司 Method for controlling screen displaying direction and terminal thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant