CN107741814B - Display control method and mobile terminal

Info

Publication number: CN107741814B
Authority: CN (China)
Prior art keywords: display, real-time, mobile terminal, target object, distance
Legal status: Active (granted)
Application number: CN201710959040.2A
Other languages: Chinese (zh)
Other versions: CN107741814A
Inventor: 韦冠宇
Current Assignee: Vivo Mobile Communication Co Ltd
Original Assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/10: Image acquisition
    • G06V10/12: Details of acquisition arrangements; Constructional details thereof
    • G06V10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143: Sensing or illuminating at different wavelengths
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048: Indexing scheme relating to G06F3/048
    • G06F2203/04806: Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention provides a display control method and a mobile terminal, and relates to the technical field of communications. The display control method comprises the following steps: acquiring a real-time distance between the face of an operator of a mobile terminal and a display screen of the mobile terminal; and adjusting the display scale of a target object displayed on the display screen of the mobile terminal based on the real-time distance. The scheme of the invention addresses the problem that existing zoom controls, which depend on manual operation by the user, limit the convenience of using the mobile terminal.

Description

Display control method and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a display control method and a mobile terminal.
Background
With the rapid development of communication technology, mobile terminals offer increasingly diverse functions and bring great convenience to people's daily lives. In particular, when browsing content displayed on a mobile terminal, a user can zoom in or out on certain content according to personal needs.
In existing zooming modes, one approach requires the user to perform a two-finger spread or pinch gesture on the screen of the mobile terminal, and another requires a double-tap on the displayed content to trigger zooming. However, the first approach is inconvenient in scenarios that call for one-handed operation, and the second is limited to a fixed zoom step and cannot scale the content by an amount chosen by the user. Existing zooming modes therefore have certain limitations that reduce the convenience of using the mobile terminal.
Disclosure of Invention
The embodiments of the present invention provide a display control method and a mobile terminal, aiming to solve the problem that existing zoom controls, which depend on manual operation by the user, limit the convenience of using the mobile terminal.
In order to solve the technical problem, the invention is realized as follows: a display control method comprising:
acquiring a real-time distance between the face of a mobile terminal operator and a display screen of the mobile terminal;
and adjusting the display scale of the target object displayed on the display screen of the mobile terminal based on the real-time distance.
In a first aspect, an embodiment of the present invention further provides a mobile terminal, including:
an acquisition module, configured to acquire the real-time distance between the face of an operator of the mobile terminal and the display screen of the mobile terminal;
and the adjusting module is used for adjusting the display proportion of the target object displayed on the display screen of the mobile terminal based on the real-time distance.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the display control method described above.
In a third aspect, the present invention further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps of the display control method described above.
In this way, in the embodiments of the invention, the real-time distance between the face of the operator and the display screen of the mobile terminal is acquired, and the display scale of the target object displayed on the display screen is then adjusted based on that real-time distance, so that the user can view the content as required. Because the display scale of the target object is adjusted according to the real-time distance between the display screen and the operator's face, the method applies to more scenarios, effectively overcomes the limitations of manual zoom control, and improves the convenience of using the mobile terminal.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart illustrating a display control method according to an embodiment of the present invention;
FIG. 2 is a second flowchart illustrating a display control method according to an embodiment of the present invention;
FIG. 3 is one of the schematic distance diagrams of a user's face from a display screen;
FIG. 4 is a second schematic diagram illustrating the distance between the user's face and the display screen;
FIG. 5 is a third schematic diagram illustrating the distance between the user's face and the display screen;
FIG. 6 is a third flowchart illustrating a display control method according to an embodiment of the present invention;
FIG. 7 is a fourth flowchart illustrating a display control method according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 9 is a second schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 10 is a third schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a mobile terminal according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without any inventive step, are within the scope of the present invention.
As shown in fig. 1, the display control method according to the embodiment of the present invention includes:
step 101, obtaining a real-time distance between a face of an operator of a mobile terminal and a display screen of the mobile terminal.
In this step, the mobile terminal to which the display control method of the embodiment of the present invention is applied provides a basis for the control of the next step by obtaining the real-time distance between the display screen of the mobile terminal and the face of the operator operating the mobile terminal.
And 102, adjusting the display scale of the target object displayed on the display screen of the mobile terminal based on the real-time distance.
In this step, after the real-time distance between the display screen and the face of the operator is obtained in step 101, the viewing requirement of the user on the target object can be known according to the real-time distance, and then the corresponding control operation is performed, so that the user can view the target object conveniently.
Through the above steps 101 and 102, the display control method of the embodiment of the present invention first obtains a real-time distance between a display screen of the mobile terminal and a face of an operator operating the mobile terminal; and then, based on the real-time distance, adjusting the display scale of the target object displayed on the display screen, and performing corresponding zooming-out or zooming-in operation to complete the viewing required by the user. The display scale of the target object is adjusted mainly according to the real-time distance between the display screen and the face of the operator, so that the method is suitable for more scenes, the limitation of manual zooming control operation in the existing mode is effectively overcome, and the use convenience of the mobile terminal is improved.
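As a rough illustration of these two steps, the sketch below wires a face-distance source to a zoomable view. All of the interfaces and names are assumptions introduced here for illustration; the patent itself does not define any concrete API, and the distance-to-scale rule is left abstract because the later embodiments refine it.

```kotlin
// Minimal sketch of steps 101 and 102; every name here is an illustrative assumption.
interface FaceDistanceSource {
    /** Real-time face-to-screen distance in millimetres, or null while no face is recognized. */
    fun realTimeDistanceMm(): Float?
}

interface ZoomableView {
    /** Applies a display scale, where 1.0f means the original size. */
    fun setDisplayScale(scale: Float)
}

class DisplayController(
    private val source: FaceDistanceSource,
    private val view: ZoomableView,
    private val scaleFor: (Float) -> Float  // distance-to-scale rule, refined by the embodiments below
) {
    fun onFrame() {
        val distanceMm = source.realTimeDistanceMm() ?: return  // step 101: acquire the real-time distance
        view.setDisplayScale(scaleFor(distanceMm))              // step 102: adjust the display scale
    }
}
```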
It should be appreciated that if the mobile terminal to which the display control method is applied measured the distance between the display screen and the operator's face at all times, resources would be wasted. The method therefore only needs to run when the content currently displayed on the screen may need to be zoomed. Preferably, the mobile terminal checks the currently displayed content: if it belongs to a preset application type with a zooming requirement, steps 101 and 102 are executed; otherwise, they are not.
The preset application types with a zooming requirement may include pictures, maps, news applications and the like. These are defined as application types with scaling requirements because the user sometimes needs to enlarge them, or some part of them, to see a detail clearly. A music player, an application store or a game store, by contrast, has no scenario that calls for scaling and is therefore not defined as an application type with a zooming requirement.
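A minimal sketch of this gating check is shown below; the category names and the whitelist are assumptions chosen to mirror the examples in the paragraph above.

```kotlin
// Only application types with a zooming requirement trigger distance tracking (steps 101 and 102).
enum class AppCategory { PICTURE, MAP, NEWS, MUSIC_PLAYER, APP_STORE, GAME_STORE }

private val zoomableCategories = setOf(AppCategory.PICTURE, AppCategory.MAP, AppCategory.NEWS)

fun shouldTrackFaceDistance(current: AppCategory): Boolean =
    current in zoomableCategories
```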
When the current display content is a preset application type with a zooming requirement, as shown in fig. 1, the adjustment of the display scale of the target object is realized according to step 101 and step 102. To obtain a more precise distance, specifically, step 101 includes:
and carrying out face recognition on an operator through the structured light module of the mobile terminal, and acquiring a real-time distance between the face of the operator and the display screen after the face is recognized.
The structured light module of the mobile terminal includes two parts: an infrared emitter and an infrared camera. The infrared emitter projects infrared light with a specific pattern onto the surface of an object; the light is reflected back, and once the infrared camera receives the reflected pattern, the distance between the object and the mobile terminal can be calculated. Because the infrared camera can also perform face recognition on the object, the real-time distance acquired through the structured light module can be confirmed as the distance between the operator's face and the display screen, which avoids errors caused by other objects blocking the view.
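The following sketch shows one way such a module could sit behind the FaceDistanceSource interface introduced earlier. The StructuredLightModule interface, its methods, and the median-depth heuristic are all assumptions for illustration; no real vendor SDK is implied.

```kotlin
// Assumed types: a per-pixel depth frame and a recognized face region with its depth samples.
data class FaceRegion(val depthSamplesMm: List<Float>)

interface StructuredLightModule {
    fun latestDepthFrame(): FloatArray?             // null if no depth frame is available yet
    fun detectFace(frame: FloatArray): FaceRegion?  // null while no face is recognized
}

class StructuredLightDistanceSource(
    private val module: StructuredLightModule
) : FaceDistanceSource {
    override fun realTimeDistanceMm(): Float? {
        val frame = module.latestDepthFrame() ?: return null
        val face = module.detectFace(frame) ?: return null  // only report a distance once a face is recognized
        if (face.depthSamplesMm.isEmpty()) return null
        // Median depth inside the face region, so nearby obstructions do not skew the measurement.
        return face.depthSamplesMm.sorted()[face.depthSamplesMm.size / 2]
    }
}
```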
After the real-time distance is obtained, the display proportion of the target object can be adjusted based on the real-time distance. Preferably, step 102 comprises:
detecting the change information of the real-time distance;
determining a proportion adjustment parameter of the target object according to the change information;
and adjusting the display ratio of the target object displayed on the display screen of the mobile terminal according to the ratio adjusting parameter.
In this embodiment, after the structured light module is turned on, the distance between the display screen and the operator's face is detected in real time through the module. The change information of this real-time distance can then be detected, the scale adjustment parameter of the target object is determined from that change information, and the display scale of the target object displayed on the display screen of the mobile terminal is adjusted according to the scale adjustment parameter so as to reduce or enlarge the target object.
In order to better understand the change of the distance, in the above embodiment, the step of detecting the change information of the real-time distance includes:
acquiring an initial distance between the face of the operator and a display screen;
determining the initial distance as a reference distance;
and detecting the change information of the real-time distance relative to the reference distance.
Here, an initial distance between the operator's face and the display screen is first acquired and then set as the reference distance, after which the change information of the real-time distance relative to this reference distance can be detected, giving better-defined change information. The initial distance is typically the first distance between the display screen and the operator's face obtained after the structured light module is started.
In practical applications, however, the distance may change in many scenarios that have nothing to do with zooming: while browsing pictures, for example, the user may simply put the mobile terminal down on a desk to attend to something else, and the distance changes without any intention of zooming the target object in or out. Simply reducing or enlarging the target object whenever the distance increases or decreases therefore cannot meet the needs of real use. Preferably, the mobile terminal in this embodiment of the invention performs the corresponding operation only when it determines that the distance has changed rapidly and then rapidly returned to the reference distance.
Thus, the change information includes: the change direction of the real-time distance relative to the reference distance; the time from when the absolute difference between the real-time distance and the reference distance begins to increase until it falls back below a preset threshold; and the maximum value of that absolute difference.
The change direction of the real-time distance relative to the reference distance indicates whether the real-time distance has increased or decreased compared with the reference distance: if the operator's face is farther from the display screen than the reference distance, the change direction is determined to be increasing; if the operator's face is closer to the display screen than the reference distance, the change direction is determined to be decreasing. When judging the return to the reference distance, a person cannot reproduce the distance exactly, so a preset threshold is used to avoid misjudgment; this threshold is preferably a range of values rather than a single specific value.
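One possible representation of this change information is sketched below. The field names, units, and the tolerance band standing in for the "preset threshold" are assumptions.

```kotlin
// Assumed representation of the change information described above.
enum class ChangeDirection { INCREASED_THEN_DECREASED, DECREASED_THEN_INCREASED, INCREASED, DECREASED }

data class ChangeInfo(
    val direction: ChangeDirection,
    /** Time from when |realTime - reference| starts to grow until it falls back below the threshold. */
    val excursionMillis: Long,
    /** Maximum of |realTime - reference| reached during the excursion, in millimetres. */
    val maxAbsDifferenceMm: Float
)

/** The "preset threshold" is treated as a small band around the reference distance rather than an exact value. */
fun isBackAtReference(realTimeMm: Float, referenceMm: Float, toleranceMm: Float = 15f): Boolean =
    kotlin.math.abs(realTimeMm - referenceMm) <= toleranceMm
```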
Further, the step of adjusting the display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter includes:
if the time is less than a preset time length and the change direction indicates that the change of the real-time distance relative to the reference distance is increased and then decreased, amplifying and displaying the target object according to a first display scale corresponding to the maximum value of the absolute difference value;
if the time is less than the preset time length and the change direction indicates that the change of the real-time distance relative to the reference distance is reduced first and then increased, the target object is displayed in a reduced mode according to a second display scale corresponding to the maximum value of the absolute difference.
Thus, when the time in the change information is shorter than the preset duration, it can be determined that the distance changed rapidly and rapidly returned to the reference distance, and the display scale of the target object needs to be adjusted by enlarging or reducing it. The mobile terminal is preset with correspondences between different display scales and the maximum value of the absolute difference, and the display scale to use is determined from the maximum value of the absolute difference in the change information. When the change direction shows that the distance first increased relative to the reference distance and then decreased, the target object is enlarged according to the first display scale corresponding to the maximum absolute difference; when the change direction shows that the distance first decreased relative to the reference distance and then increased, the target object is reduced according to the second display scale corresponding to the maximum absolute difference.
Of course, to distinguish reduction from enlargement, the respective display scales are looked up, according to the change direction, in the corresponding preset reduction-scale relationship or preset enlargement-scale relationship.
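A hedged sketch of this decision rule follows. The preset duration and the two lookup tables (maximum absolute difference to first/second display scale) are invented example values; the patent only requires that such preset relationships exist.

```kotlin
// Quick "move away and back" enlarges; quick "move closer and back" reduces.
class ExcursionZoomRule(
    private val presetDurationMillis: Long = 800L,  // assumed preset time length
    // Assumed preset relationships: maximum |difference| in mm -> display scale.
    private val enlargeScaleByMaxDiff: List<Pair<Float, Float>> =
        listOf(50f to 1.25f, 100f to 1.5f, 200f to 2.0f),
    private val reduceScaleByMaxDiff: List<Pair<Float, Float>> =
        listOf(50f to 0.8f, 100f to 0.67f, 200f to 0.5f)
) {
    /** Returns the new display scale, or null when the change should be ignored. */
    fun scaleFor(change: ChangeInfo): Float? {
        if (change.excursionMillis >= presetDurationMillis) return null  // not a quick-return gesture
        val table = when (change.direction) {
            ChangeDirection.INCREASED_THEN_DECREASED -> enlargeScaleByMaxDiff  // first display scale: enlarge
            ChangeDirection.DECREASED_THEN_INCREASED -> reduceScaleByMaxDiff   // second display scale: reduce
            else -> return null
        }
        // Pick the scale of the first distance bucket that the maximum difference fits into.
        return table.firstOrNull { change.maxAbsDifferenceMm <= it.first }?.second ?: table.last().second
    }
}
```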
For example, user A operates a mobile terminal to browse a picture. Since a picture is a preset application type with a zooming requirement, the procedure is as shown in fig. 2:
step 201, identifying the face of an operator (user A) through a structured light module of the mobile terminal, and acquiring the real-time distance between the face of the operator and a display screen.
In step 202, the acquired initial distance is determined as a reference distance.
When the structured light module is started, the first detected distance between the display screen and the operator's face is L1, as shown in fig. 3.
Step 203, determining the change information of the real-time distance relative to the reference distance.
The activated structured light module detects in real time the distance between the face of user A and the display screen while the picture is being browsed, and obtains the change information relative to the reference distance determined in step 202. The change information includes: the change direction of the real-time distance relative to the reference distance, the time from when the absolute difference between the real-time distance and the reference distance starts to increase until it falls back below a preset threshold, and the maximum value of the absolute difference.
Step 204, if the time is less than the preset time length and the change direction indicates that the change of the real-time distance relative to the reference distance is increased first and then decreased, amplifying and displaying the target object according to the display scale corresponding to the maximum value of the absolute difference; and if the time is less than the preset time length and the change direction indicates that the change of the real-time distance relative to the reference distance is reduced firstly and then increased, reducing and displaying the target object according to the display scale corresponding to the maximum value of the absolute difference value.
From the change information, the fact that the time is shorter than the preset duration identifies the event in which the distance changed rapidly and rapidly returned to the reference distance, and the corresponding operation on the target object is then completed using the change direction and the maximum absolute difference. Suppose the real-time distance first increased and then decreased relative to the reference distance, and, as shown in fig. 4, the face of user A moved to a farthest distance L2 from the display screen; the corresponding display scale is then determined by |L2-L1| in the preset enlargement-scale relationship, and the picture is enlarged according to that scale. Suppose instead the distance first decreased and then increased relative to the reference distance, and, as shown in fig. 5, the face of user A moved to a closest distance L3 from the display screen; the corresponding display scale is then determined by |L3-L1| in the preset reduction-scale relationship, and the picture is reduced according to that scale.
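Continuing the sketch, the scenario of fig. 4 could be exercised as follows; the concrete distances (L1 = 300 mm, L2 = 420 mm) and the 500 ms excursion are assumed numbers, not values from the patent.

```kotlin
fun demoExcursionZoom() {
    val rule = ExcursionZoomRule()
    val change = ChangeInfo(
        direction = ChangeDirection.INCREASED_THEN_DECREASED,  // face moved away, then back (fig. 4)
        excursionMillis = 500L,                                // below the preset duration, so the gesture counts
        maxAbsDifferenceMm = 420f - 300f                       // |L2 - L1| = 120 mm
    )
    println(rule.scaleFor(change))  // 120 mm falls into the 200 mm bucket -> prints 2.0
}
```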
In addition, in this embodiment of the invention, the mobile terminal may also perform the corresponding operation when it determines that the distance has changed and a zoom instruction has been triggered. Therefore, before the step of determining the scale adjustment parameter of the target object according to the change information, the method further includes:
detecting whether a zooming instruction is received;
and if a zooming instruction input by a user is received, executing the step of determining the proportion adjusting parameter of the target object according to the change information.
Wherein the change information includes: the change direction of the real-time distance relative to the reference distance and the maximum value of the absolute difference value of the real-time distance and the reference distance.
Thus, when a zoom instruction is received, the mobile terminal knows that the user wishes to zoom the target object, and the corresponding operation is performed according to the change direction of the real-time distance relative to the reference distance and the maximum value of the absolute difference between them. For ease of input, the zoom instruction may be triggered by a touch of the operator's finger on the display screen, or by pressing a virtual key or a physical key (e.g., a fingerprint identification key disposed on the front, side, or back of the mobile terminal), and so on, which will not be described further here.
Further specifically, the step of adjusting the display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter includes:
when the change direction indicates that the change of the real-time distance relative to the reference distance is increased, amplifying and displaying the target object according to a third display scale corresponding to the maximum value of the absolute difference value;
and when the change direction shows that the change of the real-time distance relative to the reference distance is reduced, zooming out the target object according to a fourth display scale corresponding to the maximum value of the absolute difference value.
Thus, after the zoom instruction is received, the display scale to be used can be determined from the maximum value of the absolute difference in the change information, based on the correspondence preset in the mobile terminal between different display scales and the maximum absolute difference. When the change direction indicates that the real-time distance has increased relative to the reference distance, the target object is enlarged according to the third display scale corresponding to the maximum absolute difference; when the change direction indicates that the real-time distance has decreased relative to the reference distance, the target object is reduced according to the fourth display scale corresponding to the maximum absolute difference. As before, to distinguish reduction from enlargement, the respective display scales are looked up, according to the change direction, in the corresponding preset reduction-scale or enlargement-scale relationship.
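A sketch of this second rule, reusing the ChangeInfo type from above, might look as follows; the lookup tables standing in for the third and fourth display scales are again assumed example values.

```kotlin
// Once a zoom instruction has been received, only the change direction and the
// maximum |difference| are consulted; no quick return to the reference distance is required.
class InstructionZoomRule(
    private val enlargeScaleByMaxDiff: List<Pair<Float, Float>> =
        listOf(50f to 1.25f, 100f to 1.5f, 200f to 2.0f),   // assumed third display scales
    private val reduceScaleByMaxDiff: List<Pair<Float, Float>> =
        listOf(50f to 0.8f, 100f to 0.67f, 200f to 0.5f)    // assumed fourth display scales
) {
    fun scaleFor(change: ChangeInfo, zoomInstructionReceived: Boolean): Float? {
        if (!zoomInstructionReceived) return null
        val table = when (change.direction) {
            ChangeDirection.INCREASED -> enlargeScaleByMaxDiff  // face moved farther away: enlarge
            ChangeDirection.DECREASED -> reduceScaleByMaxDiff   // face moved closer: reduce
            else -> return null
        }
        return table.firstOrNull { change.maxAbsDifferenceMm <= it.first }?.second ?: table.last().second
    }
}
```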
For example, the user B operates the mobile terminal to browse a map, and since the map is a preset application type with zooming requirements, as shown in fig. 6:
step 601, identifying the face of an operator (user B) through a structured light module of the mobile terminal, and acquiring the real-time distance between the face of the operator and a display screen.
In step 602, the acquired initial distance is determined as a reference distance.
When the structured light module is started, the first distance between the display screen and the operator's face is detected.
Step 603, determining the change information of the real-time distance relative to the reference distance.
The activated structured light module detects in real time the distance between the face of user B and the display screen while the map is being browsed, and obtains the change information relative to the reference distance determined in step 602. The change information includes: the change direction of the real-time distance relative to the reference distance, and the maximum value of the absolute difference between the real-time distance and the reference distance.
Step 604, when a zoom instruction input by a user is received and the change direction indicates that the change of the real-time distance relative to the reference distance is increased, amplifying and displaying the target object according to the display scale corresponding to the maximum value of the absolute difference; and when the change direction indicates that the change of the real-time distance relative to the reference distance is reduced, reducing and displaying the target object according to the display scale corresponding to the maximum value of the absolute difference value.
After the zoom instruction is received, the mobile terminal knows that the user wishes to zoom the map, and the corresponding operation on the target object is completed using the change direction and the maximum absolute difference in the change information. Suppose the real-time distance has increased relative to the reference distance, i.e., the face of user B is farther from the display screen than the reference distance; the corresponding display scale is determined from the preset enlargement-scale relationship using the maximum absolute difference in the change information, and the map is enlarged according to that scale. Suppose instead the real-time distance has decreased relative to the reference distance, i.e., the face of user B is closer to the display screen than the reference distance; the corresponding display scale is determined from the preset reduction-scale relationship using the maximum absolute difference, and the map is reduced according to that scale.
It should also be appreciated that in the above embodiments, the target object to be reduced or enlarged is nothing more than the entire currently displayed content. However, the user may only want to zoom a certain part of the target object. Therefore, on the basis of the above embodiments, before the step of determining the scale adjustment parameter of the target object according to the change information, the method includes:
detecting touch operation of a user in a display screen;
if the touch operation is detected, acquiring an operation position of the touch operation;
detecting whether the operation position is located in a display area of a display assembly according to the current display content of the display screen;
if the operation position is located in a display area of the display assembly, determining the display assembly as a target object;
and if the operation position is not located in the display area of the display assembly, determining the current display content as a target object.
Therefore, when a touch operation by the operator on the display screen is detected, the operation position of the touch is acquired and, in combination with the content currently displayed, it is checked whether that position lies within the display area of a display assembly. If it does, the display assembly is determined as the target object; if it does not, the currently displayed content is determined as the target object. The corresponding operation is then performed on the determined target object according to the change information.
The size and coordinates of a display assembly within the currently displayed content are usually predefined by the developer, and the system can obtain this information; therefore, once the position touched by the operator is acquired, whether it lies within the display area of the display assembly can be determined by checking whether it falls inside the coordinate range of that assembly. The step of determining the target object and the step of obtaining the change information of the real-time distance are not ordered with respect to each other: the change information may be obtained first and the target object determined afterwards, or vice versa, which will not be described further here.
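The hit test described above can be sketched as a simple rectangle-containment check; the Rect and Component types and the sealed result type are assumptions used only for illustration.

```kotlin
// If the touch lands inside a component's display area, zoom that component;
// otherwise zoom the currently displayed content as a whole.
data class Rect(val x: Int, val y: Int, val width: Int, val height: Int) {
    fun contains(px: Int, py: Int): Boolean =
        px in x until x + width && py in y until y + height
}

data class Component(val name: String, val bounds: Rect)

sealed class TargetObject {
    data class SingleComponent(val component: Component) : TargetObject()
    object WholeContent : TargetObject()
}

fun resolveTarget(touchX: Int, touchY: Int, components: List<Component>): TargetObject =
    components.firstOrNull { it.bounds.contains(touchX, touchY) }
        ?.let { TargetObject.SingleComponent(it) }
        ?: TargetObject.WholeContent
```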
In general, the selection of the target object may be triggered by the same touch with which the user triggers the zoom instruction on the display screen.
For example, the user C operates the mobile terminal to browse a news application, and since the news application is a preset application type with a zooming requirement, as shown in fig. 7:
step 701, identifying the face of an operator (user C) through a structured light module of the mobile terminal, and acquiring the real-time distance between the face of the operator and a display screen.
In step 702, the obtained initial distance is determined as a reference distance.
When the structured light module is started, the first distance between the display screen and the operator's face is detected.
Step 703, determining the change information of the real-time distance relative to the reference distance.
The activated structured light module detects in real time the distance between the face of user C and the display screen while the news application is being browsed, and obtains the change information relative to the reference distance determined in step 702. The change information includes: the change direction of the real-time distance relative to the reference distance, and the maximum value of the absolute difference between the real-time distance and the reference distance.
Step 704, when a zoom instruction input by a user is received, acquiring a trigger position of the instruction on a display screen (namely, an operation position of the user); detecting whether the trigger position is located in a display area of the display assembly according to the current display content of the display screen; if the trigger position is located in the display area of the display assembly, determining the display assembly as a target object; and if the trigger position is not located in the display area of the display component, determining the current display content as the target object.
Suppose the coordinates touched by user C's finger are (80, 90), the starting coordinates of the area occupied by a certain news item in the news application are (60, 70), and the news display area has a length of 100 and a width of 40. The touched point then lies within that area, so the target object to be zoomed is that news item. If the coordinates touched by the user's finger are (10, 20), the touch falls in the area shared by all of the news items rather than inside any single one, and the target object to be zoomed is the content of the entire news application interface currently displayed.
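The numeric example above maps directly onto the resolveTarget sketch, taking the length of 100 as the extent along x and the width of 40 as the extent along y (an assumption about the coordinate convention).

```kotlin
fun demoTargetResolution() {
    val newsItem = Component("news item", Rect(x = 60, y = 70, width = 100, height = 40))
    check(resolveTarget(80, 90, listOf(newsItem)) == TargetObject.SingleComponent(newsItem))  // inside the item's area
    check(resolveTarget(10, 20, listOf(newsItem)) == TargetObject.WholeContent)               // outside every component
}
```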
Step 705, when the change direction indicates that the change of the real-time distance relative to the reference distance is increased, amplifying and displaying the target object according to the display scale corresponding to the maximum value of the absolute difference value; and when the change direction indicates that the change of the real-time distance relative to the reference distance is reduced, reducing and displaying the target object according to the display scale corresponding to the maximum value of the absolute difference.
Based on the target object determined in step 704: if the real-time distance has increased relative to the reference distance, i.e., the face of user C is farther from the display screen than the reference distance, the corresponding display scale is determined from the preset enlargement-scale relationship using the maximum absolute difference in the change information, and the target object (the news item or the content of the entire news application interface) is enlarged according to that scale. If the real-time distance has decreased relative to the reference distance, i.e., the face of user C is closer to the display screen than the reference distance, the corresponding display scale is determined from the preset reduction-scale relationship using the maximum absolute difference, and the target object is reduced according to that scale.
In summary, in the display control method according to the embodiment of the present invention, first, a real-time distance between a face of an operator of the mobile terminal and a display screen of the mobile terminal is obtained; and then, based on the real-time distance, adjusting the display scale of the target object displayed on the display screen of the mobile terminal to complete the viewing required by the user. The display scale of the target object is adjusted mainly according to the real-time distance between the display screen and the face of the operator, so that the method can be suitable for more scenes, the limitation of manual zooming control operation in the existing mode is effectively overcome, and the use convenience of the mobile terminal is improved.
Fig. 8 is a block diagram of a mobile terminal of one embodiment of the present invention. The mobile terminal 800 shown in fig. 8 includes an acquisition module 801 and an adjustment module 802.
An obtaining module 801, configured to obtain a real-time distance between the display screen of the mobile terminal and the face of the operator operating the mobile terminal;
and an adjusting module 802, configured to adjust a display scale of a target object displayed on a display screen of the mobile terminal based on the real-time distance.
Optionally, the obtaining module 801 is further configured to:
and carrying out face recognition on an operator through the structured light module of the mobile terminal, and acquiring a real-time distance between the face of the operator and the display screen after the face is recognized.
On the basis of fig. 8, optionally, as shown in fig. 9, the adjusting module 802 includes:
a first detection submodule 8021, configured to detect change information of the real-time distance;
a determining submodule 8022, configured to determine, according to the change information, a scale adjustment parameter of the target object;
the first processing sub-module 8023 is configured to adjust a display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter.
Optionally, the first detection submodule 8021 includes:
an acquisition unit 80211 for acquiring an initial distance between the face of the operator and a display screen;
a determination unit 80212 for determining the initial distance as a reference distance;
a detecting unit 80213, configured to detect the change information of the real-time distance relative to the reference distance.
Optionally, the change information includes: the change direction of the real-time distance relative to the reference distance, the time from the increase of the absolute difference value of the real-time distance and the reference distance to the reduction of the absolute difference value to a preset threshold value, and the maximum value of the absolute difference value.
Optionally, the first processing sub-module 8023 is further configured to:
if the time is less than a preset time length and the change direction indicates that the change of the real-time distance relative to the reference distance is increased and then decreased, amplifying and displaying the target object according to a first display scale corresponding to the maximum value of the absolute difference value;
if the time is less than the preset time length and the change direction indicates that the change of the real-time distance relative to the reference distance is reduced first and then increased, the target object is displayed in a reduced mode according to a second display scale corresponding to the maximum value of the absolute difference.
Optionally, the adjusting module 802 further includes:
a second detection submodule 8024, configured to detect whether a zoom instruction is received;
a second processing sub-module 8025, configured to execute the step of determining the scale adjustment parameter of the target object according to the change information if a zoom instruction input by a user is received.
Optionally, the change information includes: the change direction of the real-time distance relative to the reference distance and the maximum value of the absolute difference value of the real-time distance and the reference distance.
Optionally, the first processing sub-module 8023 is further configured to:
when the change direction indicates that the change of the real-time distance relative to the reference distance is increased, amplifying and displaying the target object according to a third display scale corresponding to the maximum value of the absolute difference value;
and when the change direction shows that the change of the real-time distance relative to the reference distance is reduced, zooming out the target object according to a fourth display scale corresponding to the maximum value of the absolute difference value.
On the basis of fig. 8, optionally, as shown in fig. 10, the adjusting module 802 further includes:
the third detection submodule 8026 is configured to detect a touch operation of a user in the display screen;
an obtaining submodule 8027, configured to, if a touch operation is detected, obtain an operation position of the touch operation;
a fourth detection submodule 8028, configured to detect whether the operation position is located in a display area of the display assembly according to the current display content of the display screen;
a third processing submodule 8029, configured to determine the display assembly as a target object if the operation position is located in a display area of the display assembly;
a fourth processing sub-module 80210, configured to determine, if the operation position is not located in the display area of the display assembly, that the currently displayed content is the target object.
The mobile terminal 800 can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 7, and details are not repeated here to avoid repetition. In the display control method of the embodiment of the invention, the real-time distance between the face of the operator and the display screen of the mobile terminal is first acquired; the display scale of the target object displayed on the display screen is then adjusted based on the real-time distance, so that the user can view the content as required. Because the display scale of the target object is adjusted according to the real-time distance between the display screen and the operator's face, the method applies to more scenarios, effectively overcomes the limitations of manual zoom control, and improves the convenience of using the mobile terminal.
Fig. 11 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 1100 includes, but is not limited to: radio frequency unit 1101, network module 1102, audio output unit 1103, input unit 1104, sensor 1105, display unit 1106, user input unit 1107, interface unit 1108, memory 1109, processor 1110, and power supply 1111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 11 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 1110 is configured to obtain a real-time distance between a face of an operator of the mobile terminal and a display screen of the mobile terminal; and adjusting the display scale of the target object displayed on the display screen of the mobile terminal based on the real-time distance.
Therefore, the mobile terminal firstly acquires the real-time distance between the face of the operator of the mobile terminal and the display screen of the mobile terminal; and then, based on the real-time distance, adjusting the display scale of the target object displayed on the display screen of the mobile terminal to finish the viewing required by the user. The display scale of the target object is adjusted mainly according to the real-time distance between the display screen and the face of the operator, so that the method is suitable for more scenes, the limitation of manual zooming control operation in the conventional mode is effectively overcome, and the use convenience of the mobile terminal is improved.
It should be understood that, in this embodiment of the invention, the radio frequency unit 1101 may be used to receive and transmit signals during message transmission/reception or a call; specifically, it receives downlink data from a base station and delivers it to the processor 1110 for processing, and it transmits uplink data to the base station. Typically, the radio frequency unit 1101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1101 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides wireless broadband internet access to the user through the network module 1102, such as assisting the user in sending and receiving e-mails, browsing web pages, accessing streaming media, and the like.
The audio output unit 1103 may convert audio data received by the radio frequency unit 1101 or the network module 1102, or stored in the memory 1109, into an audio signal and output it as sound. The audio output unit 1103 may also provide audio output related to a specific function performed by the mobile terminal 1100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1104 is used to receive audio or video signals. The input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042. The graphics processor 11041 processes image data of still pictures or video obtained by an image capturing apparatus (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 1106. The image frames processed by the graphics processor 11041 may be stored in the memory 1109 (or other storage medium) or transmitted via the radio frequency unit 1101 or the network module 1102. The microphone 11042 can receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1101 and output.
The mobile terminal 1100 also includes at least one sensor 1105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 11061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 11061 and/or a backlight when the mobile terminal 1100 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer and tap); the sensors 1105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., and are not described in detail herein.
The display unit 1106 is used to display information input by a user or information provided to the user. The Display unit 1106 may include a Display panel 11061, and the Display panel 11061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1107 includes a touch panel 11071 and other input devices 11072. The touch panel 11071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 11071 (e.g., operations by a user on or near the touch panel 11071 using a finger, a stylus, or any suitable object or attachment). The touch panel 11071 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1110, and receives and executes commands sent from the processor 1110. In addition, the touch panel 11071 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 1107 may include other input devices 11072 in addition to the touch panel 11071. Specifically, the other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 11071 can be overlaid on the display panel 11061. When the touch panel 11071 detects a touch operation on or near it, the operation is transmitted to the processor 1110 to determine the type of the touch event, and the processor 1110 then provides a corresponding visual output on the display panel 11061 according to the type of the touch event. Although in fig. 11 the touch panel 11071 and the display panel 11061 are shown as two separate components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 11071 and the display panel 11061 may be integrated to implement the input and output functions, which is not limited here.
The interface unit 1108 is an interface through which an external device is connected to the mobile terminal 1100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 1100, or may be used to transmit data between the mobile terminal 1100 and external devices.
The memory 1109 may be used to store software programs as well as various data. The memory 1109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.). Further, the memory 1109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 1110 is the control center of the mobile terminal; it connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 1109 and calling data stored in the memory 1109, thereby monitoring the mobile terminal as a whole. The processor 1110 may include one or more processing units; preferably, the processor 1110 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 1110.
The mobile terminal 1100 may also include a power supply 1111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 1111 may be logically connected to the processor 1110 via a power management system, so as to manage charging, discharging, and power consumption through the power management system.
In addition, the mobile terminal 1100 includes some functional modules that are not shown, and thus will not be described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1110, a memory 1109, and a computer program stored in the memory 1109 and capable of running on the processor 1110, where the computer program, when executed by the processor 1110, implements each process of the foregoing display control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the display control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, but in many cases the former is the preferred implementation. Based on such an understanding, the technical solution of the present invention may be embodied substantially or in part in the form of a software product stored in a storage medium (e.g., ROM/RAM, a magnetic disk, or an optical disk) and including instructions for enabling a terminal (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, it is not limited to those embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (16)

1. A display control method, comprising:
acquiring a real-time distance between the face of a mobile terminal operator and a display screen of the mobile terminal;
adjusting the display scale of the target object displayed on the display screen of the mobile terminal based on the real-time distance;
the step of adjusting the display scale of the target object displayed on the display screen of the mobile terminal based on the real-time distance comprises the following steps:
detecting the change information of the real-time distance;
determining a scale adjustment parameter of the target object according to the change information;
adjusting the display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter;
the step of detecting the change information of the real-time distance includes:
acquiring an initial distance between the face of the operator and a display screen;
determining the initial distance as a reference distance;
detecting the change information of the real-time distance relative to the reference distance;
the change information includes: the direction of change of the real-time distance relative to the reference distance, the time taken for the absolute difference between the real-time distance and the reference distance to increase and then fall back to a preset threshold, and the maximum value of the absolute difference.
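As an illustration only (not part of the claims), the Kotlin sketch below shows one possible way to track the change information of claim 1: it measures the gesture from the moment the absolute difference between the real-time distance and the reference distance exceeds the preset threshold until it falls back to that threshold, recording the direction of change and the peak difference. All names (DistanceChangeTracker, ChangeInfo, Direction) and the sampling model are assumptions, not the patented implementation.

```kotlin
import kotlin.math.abs

// Hypothetical names; a minimal sketch of the change-information tracking described in claim 1.
enum class Direction { INCREASED_THEN_DECREASED, DECREASED_THEN_INCREASED }

data class ChangeInfo(
    val direction: Direction,      // direction of change of the real-time distance vs. the reference
    val durationMs: Long,          // time for |d - ref| to grow and fall back to the threshold
    val maxAbsDifference: Float    // maximum of |d - ref| during the gesture
)

class DistanceChangeTracker(
    private val referenceDistance: Float,   // initial face-to-screen distance, used as the reference
    private val threshold: Float            // preset threshold for "back to rest"
) {
    private var startTimeMs: Long = -1
    private var maxAbsDiff = 0f
    private var signAtPeak = 0

    /** Feed one real-time distance sample; returns a ChangeInfo once a full gesture completes. */
    fun onSample(distance: Float, timeMs: Long): ChangeInfo? {
        val diff = distance - referenceDistance
        val absDiff = abs(diff)

        if (startTimeMs < 0) {
            // Gesture starts once the absolute difference exceeds the threshold.
            if (absDiff > threshold) {
                startTimeMs = timeMs
                maxAbsDiff = absDiff
                signAtPeak = if (diff > 0) 1 else -1
            }
            return null
        }

        if (absDiff > maxAbsDiff) {
            maxAbsDiff = absDiff
            signAtPeak = if (diff > 0) 1 else -1
        }

        // Gesture ends when the absolute difference falls back to the threshold.
        if (absDiff <= threshold) {
            val info = ChangeInfo(
                direction = if (signAtPeak > 0) Direction.INCREASED_THEN_DECREASED
                            else Direction.DECREASED_THEN_INCREASED,
                durationMs = timeMs - startTimeMs,
                maxAbsDifference = maxAbsDiff
            )
            startTimeMs = -1
            maxAbsDiff = 0f
            return info
        }
        return null
    }
}
```

A caller would construct the tracker with the initial (reference) distance and feed it timestamped distance samples as they arrive from the distance measurement.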
2. The display control method according to claim 1, wherein the step of acquiring a real-time distance between the face of the operator of the mobile terminal and the display screen of the mobile terminal comprises:
performing face recognition on the operator through a structured light module of the mobile terminal, and acquiring the real-time distance between the face of the operator and the display screen after the face is recognized.
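Purely as a hedged illustration of how a structured-light depth output might be turned into a face-to-screen distance, the sketch below averages the valid depth samples inside a detected face region. FaceBox, the millimetre depth map, and estimateFaceDistanceMm are hypothetical names; the actual structured light module and face recognition pipeline are not specified here.

```kotlin
// Hypothetical helper; assumes the structured-light module already delivers a per-pixel
// depth map (in millimetres) and that face recognition provides a bounding box inside it.
data class FaceBox(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun estimateFaceDistanceMm(depthMap: Array<IntArray>, face: FaceBox): Float {
    var sum = 0L
    var count = 0
    for (y in face.top until face.bottom) {
        for (x in face.left until face.right) {
            val d = depthMap[y][x]
            if (d > 0) {            // skip invalid (zero) depth samples
                sum += d
                count++
            }
        }
    }
    // Returns -1 if no valid depth sample was found inside the face region.
    return if (count > 0) sum.toFloat() / count else -1f
}
```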
3. The display control method according to claim 1, wherein the step of adjusting the display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter comprises:
if the time is less than a preset duration and the change direction indicates that the real-time distance first increased and then decreased relative to the reference distance, displaying the target object enlarged at a first display scale corresponding to the maximum value of the absolute difference;
if the time is less than the preset duration and the change direction indicates that the real-time distance first decreased and then increased relative to the reference distance, displaying the target object reduced at a second display scale corresponding to the maximum value of the absolute difference.
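One way the mapping of claim 3 could be expressed is sketched below, reusing the ChangeInfo and Direction types from the sketch after claim 1. The linear relation between the peak absolute difference and the display scale (scalePerMm) is an assumption chosen for illustration; the claim only requires that the first and second display scales correspond to the maximum absolute difference.

```kotlin
// A minimal sketch of claim 3's mapping, with assumed names and an assumed linear mapping.
fun scaleFor(info: ChangeInfo, presetDurationMs: Long, scalePerMm: Float): Float? {
    if (info.durationMs >= presetDurationMs) return null   // too slow: not treated as a zoom gesture
    return when (info.direction) {
        // Face moved away and back: enlarge; the first display scale grows with the peak difference.
        Direction.INCREASED_THEN_DECREASED -> 1f + scalePerMm * info.maxAbsDifference
        // Face moved closer and back: reduce; the second display scale shrinks with the peak difference.
        Direction.DECREASED_THEN_INCREASED -> 1f / (1f + scalePerMm * info.maxAbsDifference)
    }
}
```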
4. The display control method according to claim 1, wherein before the step of determining the scale adjustment parameter of the target object according to the change information, the method further comprises:
detecting whether a zoom instruction is received;
if a zoom instruction input by a user is received, performing the step of determining the scale adjustment parameter of the target object according to the change information.
5. The display control method according to claim 4, wherein the change information includes: the direction of change of the real-time distance relative to the reference distance, and the maximum value of the absolute difference between the real-time distance and the reference distance.
6. The display control method according to claim 5, wherein the step of adjusting the display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter comprises:
when the change direction indicates that the real-time distance has increased relative to the reference distance, displaying the target object enlarged at a third display scale corresponding to the maximum value of the absolute difference;
when the change direction indicates that the real-time distance has decreased relative to the reference distance, displaying the target object reduced at a fourth display scale corresponding to the maximum value of the absolute difference.
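For the simpler mapping of claim 6, which applies once an explicit zoom instruction has been received, a minimal sketch might look like the following; the function name and the linear mapping are again assumptions.

```kotlin
// Sketch of claim 6's mapping: only the direction of change and the peak absolute difference matter.
fun scaleForZoomInstruction(increased: Boolean, maxAbsDifference: Float, scalePerMm: Float): Float =
    if (increased) 1f + scalePerMm * maxAbsDifference       // third display scale: enlarge
    else 1f / (1f + scalePerMm * maxAbsDifference)          // fourth display scale: reduce
```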
7. The display control method according to claim 1, wherein before the step of determining the scale adjustment parameter of the target object according to the change information, the method further comprises:
detecting a touch operation of a user on the display screen;
if the touch operation is detected, acquiring an operation position of the touch operation;
detecting, according to the current display content of the display screen, whether the operation position is located in a display area of a display component;
if the operation position is located in the display area of the display component, determining the display component as the target object;
if the operation position is not located in the display area of the display component, determining the current display content as the target object.
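A possible hit test for the target-object selection of claim 7 is sketched below. ScreenRect, ComponentTarget, and WholeContentTarget are hypothetical names; a real implementation would typically query the platform's own view hierarchy rather than a flat map of component areas.

```kotlin
// Hypothetical hit test: if the touch position falls inside a display component's area,
// that component becomes the target object; otherwise the whole displayed content does.
data class ScreenRect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

sealed interface TargetObject
data class ComponentTarget(val componentId: String) : TargetObject
object WholeContentTarget : TargetObject

fun resolveTargetObject(
    touchX: Float,
    touchY: Float,
    componentAreas: Map<String, ScreenRect>   // display areas of the components currently shown
): TargetObject {
    for ((id, area) in componentAreas) {
        if (area.contains(touchX, touchY)) return ComponentTarget(id)
    }
    return WholeContentTarget
}
```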
8. A mobile terminal, comprising:
an acquisition module, configured to acquire a real-time distance between the face of an operator of the mobile terminal and a display screen of the mobile terminal;
an adjustment module, configured to adjust the display scale of a target object displayed on the display screen of the mobile terminal based on the real-time distance;
the adjustment module includes:
a first detection submodule, configured to detect change information of the real-time distance;
a determination submodule, configured to determine a scale adjustment parameter of the target object according to the change information;
a first processing submodule, configured to adjust the display scale of the target object displayed on the display screen of the mobile terminal according to the scale adjustment parameter;
the first detection submodule includes:
an acquisition unit, configured to acquire an initial distance between the face of the operator and the display screen;
a determination unit, configured to determine the initial distance as a reference distance;
a detection unit, configured to detect change information of the real-time distance relative to the reference distance;
the change information includes: the direction of change of the real-time distance relative to the reference distance, the time taken for the absolute difference between the real-time distance and the reference distance to increase and then fall back to a preset threshold, and the maximum value of the absolute difference.
9. The mobile terminal of claim 8, wherein the acquisition module is further configured to:
perform face recognition on the operator through a structured light module of the mobile terminal, and acquire the real-time distance between the face of the operator and the display screen after the face is recognized.
10. The mobile terminal of claim 8, wherein the first processing submodule is further configured to:
if the time is less than a preset duration and the change direction indicates that the real-time distance first increased and then decreased relative to the reference distance, display the target object enlarged at a first display scale corresponding to the maximum value of the absolute difference;
if the time is less than the preset duration and the change direction indicates that the real-time distance first decreased and then increased relative to the reference distance, display the target object reduced at a second display scale corresponding to the maximum value of the absolute difference.
11. The mobile terminal of claim 8, wherein the adjustment module further comprises:
a second detection submodule, configured to detect whether a zoom instruction is received;
a second processing submodule, configured to perform the step of determining the scale adjustment parameter of the target object according to the change information if a zoom instruction input by a user is received.
12. The mobile terminal of claim 11, wherein the change information comprises: the direction of change of the real-time distance relative to the reference distance, and the maximum value of the absolute difference between the real-time distance and the reference distance.
13. The mobile terminal of claim 12, wherein the first processing submodule is further configured to:
when the change direction indicates that the real-time distance has increased relative to the reference distance, display the target object enlarged at a third display scale corresponding to the maximum value of the absolute difference;
when the change direction indicates that the real-time distance has decreased relative to the reference distance, display the target object reduced at a fourth display scale corresponding to the maximum value of the absolute difference.
14. The mobile terminal of claim 8, wherein the adjustment module further comprises:
a third detection submodule, configured to detect a touch operation of a user on the display screen;
an acquisition submodule, configured to acquire an operation position of the touch operation if the touch operation is detected;
a fourth detection submodule, configured to detect, according to the current display content of the display screen, whether the operation position is located in a display area of a display component;
a third processing submodule, configured to determine the display component as the target object if the operation position is located in the display area of the display component;
a fourth processing submodule, configured to determine the current display content as the target object if the operation position is not located in the display area of the display component.
15. A mobile terminal, characterized in that it comprises a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the display control method according to any one of claims 1 to 7.
16. A computer-readable storage medium, characterized in that a computer program is stored thereon, wherein the computer program, when executed by a processor, implements the steps of the display control method according to any one of claims 1 to 7.
CN201710959040.2A 2017-10-16 2017-10-16 Display control method and mobile terminal Active CN107741814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710959040.2A CN107741814B (en) 2017-10-16 2017-10-16 Display control method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710959040.2A CN107741814B (en) 2017-10-16 2017-10-16 Display control method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107741814A CN107741814A (en) 2018-02-27
CN107741814B true CN107741814B (en) 2020-03-03

Family

ID=61237608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710959040.2A Active CN107741814B (en) 2017-10-16 2017-10-16 Display control method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107741814B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108737645A (en) * 2018-04-26 2018-11-02 网易(杭州)网络有限公司 The reminding method and device and storage medium and terminal of message
CN108600647A (en) * 2018-07-24 2018-09-28 努比亚技术有限公司 Shooting preview method, mobile terminal and storage medium
CN110866071B (en) * 2018-08-14 2022-08-30 海能达通信股份有限公司 Map object display method, device, equipment and storage medium
CN111152732A (en) * 2018-11-07 2020-05-15 宝沃汽车(中国)有限公司 Adjusting method of display screen in vehicle, display screen rotating assembly in vehicle and vehicle
CN113535109B (en) * 2021-09-17 2021-12-10 南昌龙旗信息技术有限公司 Display terminal and display method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090023917A (en) * 2007-09-03 2009-03-06 엘지전자 주식회사 User interface
CN101931692A (en) * 2010-07-14 2010-12-29 康佳集团股份有限公司 Mobile terminal and method for automatically adjusting display size of image
CN102955664A (en) * 2011-08-19 2013-03-06 鸿富锦精密工业(深圳)有限公司 Method and system for electronic book display adjustment
CN103677581A (en) * 2012-09-19 2014-03-26 上海斐讯数据通信技术有限公司 Mobile terminal and method for automatically controlling terminal display
CN103793173A (en) * 2014-01-29 2014-05-14 宇龙计算机通信科技(深圳)有限公司 Displaying method and device
CN107203313A (en) * 2017-05-24 2017-09-26 维沃移动通信有限公司 Adjust desktop and show object method, mobile terminal and computer-readable recording medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI547858B (en) * 2011-12-30 2016-09-01 富智康(香港)有限公司 System and method for controlling document scaling and rotation on a touch screen
US20130321617A1 (en) * 2012-05-30 2013-12-05 Doron Lehmann Adaptive font size mechanism
KR20150047387A (en) * 2013-10-24 2015-05-04 엘지전자 주식회사 Terminal and control method thereof
CN106201414B (en) * 2016-06-28 2019-05-17 Oppo广东移动通信有限公司 Control method, control device and electronic device

Also Published As

Publication number Publication date
CN107741814A (en) 2018-02-27

Similar Documents

Publication Publication Date Title
CN107817939B (en) Image processing method and mobile terminal
CN107741814B (en) Display control method and mobile terminal
CN111142991A (en) Application function page display method and electronic equipment
CN109032445B (en) Screen display control method and terminal equipment
CN111142723B (en) Icon moving method and electronic equipment
CN109710349B (en) Screen capturing method and mobile terminal
WO2019184947A1 (en) Image viewing method and mobile terminal
CN110489045B (en) Object display method and terminal equipment
CN110908750B (en) Screen capturing method and electronic equipment
CN109144393B (en) Image display method and mobile terminal
CN107728923B (en) Operation processing method and mobile terminal
CN110752981B (en) Information control method and electronic equipment
CN109407949B (en) Display control method and terminal
CN109240577A (en) A kind of screenshotss method and terminal
CN108287655A (en) A kind of interface display method, interface display apparatus and mobile terminal
CN109710130B (en) Display method and terminal
CN110968229A (en) Wallpaper setting method and electronic equipment
CN110536005B (en) Object display adjustment method and terminal
CN109597546B (en) Icon processing method and terminal equipment
CN109104573B (en) Method for determining focusing point and terminal equipment
CN109189514B (en) Terminal device control method and terminal device
CN111061446A (en) Display method and electronic equipment
CN108021315B (en) Control method and mobile terminal
CN109933267A (en) The method and terminal device of controlling terminal equipment
CN111443860B (en) Touch control method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant