CN108418953B - Screen control method and device of terminal, readable storage medium and terminal - Google Patents
- Publication number
- CN108418953B CN201810113000.0A
- Authority
- CN
- China
- Prior art keywords
- terminal
- screen
- user
- acceleration
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72448—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
- H04M1/72454—User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Environmental & Geological Engineering (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application relates to a screen control method and device of a terminal, a computer-readable storage medium, and a terminal. The method comprises the following steps: when a terminal is in a call state, acquiring a screen state of the terminal; when the screen is in a screen-off state, detecting attitude information of the terminal based on a sensor built in the terminal; judging whether a display screen of the terminal faces a face area of a user according to the attitude information; and when the display screen of the terminal faces the face area, controlling the terminal to light up the screen. The method avoids the situation in which the distance sensor is blocked during a call by an object other than the ear so that the screen cannot be lit up in time; the screen-on operation can be performed timely and accurately based on the attitude information acquired by the terminal, which improves the user experience.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a screen control method and apparatus for a terminal, a computer-readable storage medium, and a terminal.
Background
As terminal devices enter the intelligent era, large-screen (especially touch-screen) terminal devices are increasingly popular and carry more and more application programs (apps), so the power consumption of a terminal is high. In particular, if the mobile terminal keeps its screen on throughout a call, a large amount of power is consumed, so the mobile terminal is generally controlled to turn the screen off during a call.
However, during a call, when the user needs to light up the screen to operate the terminal, a distance sensor can detect whether the screen is away from an obstruction, and if so, the screen is lit up. In practice, however, the distance sensor is often blocked during the call by something other than the ear, such as oil stains or sweat, so the screen cannot be lit up in time when the terminal is moved away from the user's ear.
Disclosure of Invention
The embodiments of the application provide a screen control method and device of a terminal, a computer-readable storage medium, and a terminal, which can perform the screen-on operation timely and accurately and improve the user experience.
A screen control method of a terminal includes:
when a terminal is in a call state, acquiring a screen state of the terminal;
when the screen is in a screen-off state, detecting attitude information of the terminal based on a sensor built in the terminal within a preset time;
judging whether a display screen of the terminal faces to a face area of a user or not according to the attitude information;
and when the display screen of the terminal faces the face area, controlling the terminal to light up the screen.
A screen control apparatus of a terminal, the apparatus comprising:
the acquiring module is used for acquiring the screen state of the terminal when the terminal is in a call state;
the acquisition module is used for detecting the attitude information of the terminal based on a sensor built in the terminal within a preset time when the screen is in a screen-off state;
the judging module is used for judging whether a display screen of the terminal faces to a face area of a user according to the posture information;
and the control module is used for controlling the terminal to light up the screen when the display screen of the terminal faces the face area.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of a screen control method of a terminal in various embodiments of the present application.
A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the screen control method of the terminal in the various embodiments of the present application when executing the computer program.
According to the screen control method and device of the terminal, the computer-readable storage medium, and the terminal, when the terminal is in a call state, whether the terminal currently faces the face area of the user can be judged based on the attitude information collected by a sensor built into the terminal, and when it does, the terminal is controlled to light up the screen. This avoids the situation in which the distance sensor is blocked during the call by an object other than the ear so that the screen cannot be lit up in time; the screen-on operation can be performed timely and accurately based on the attitude information acquired by the terminal, which improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a screen control method of a terminal in one embodiment;
FIG. 2 is a flowchart illustrating an embodiment of detecting attitude information of a terminal based on a sensor built in the terminal within a predetermined time period;
FIG. 3 is a flowchart illustrating an embodiment of determining whether a display screen of the terminal faces a face area of a user according to the pose information;
FIG. 4 is a flowchart illustrating controlling the terminal to light up a screen when a display screen of the terminal faces a face area according to an embodiment;
fig. 5 is a flowchart of a screen control method of a terminal in another embodiment;
FIG. 6 is a block diagram showing the configuration of a screen control apparatus of a terminal in one embodiment;
fig. 7 is a schematic diagram of the internal structure of the terminal in one embodiment;
fig. 8 is a block diagram of a partial structure of a mobile phone related to a terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, the first determination unit may be referred to as a second determination unit, and similarly, the second determination unit may be referred to as a first determination unit, without departing from the scope of the present invention. Both the first judgment unit and the second judgment unit are judgment units, but they are not the same judgment unit.
Fig. 1 is a flowchart of a screen control method of a terminal in one embodiment. The screen control method of the terminal in this embodiment is described by taking the operation on the terminal as an example. As shown in fig. 1, the screen control method of the terminal includes steps 102 to 106.
Step 102: when the terminal held by the user is in a call state, the screen state of the terminal is acquired.
The terminal can monitor its call state through the telephony manager (TelephonyManager). The telephony manager is used to manage the call state of the terminal, obtain terminal information (device information, SIM card information, and network information), monitor the phone state (call state, service state, signal strength, and the like), and call the phone dialer to place a call.
Optionally, the terminal may further obtain a current call state of the terminal by reading a log, or monitor the call state of the terminal through Android broadcasting, or the like. It should be noted that the above listed manners for detecting whether the terminal is in a call state are only examples and are not limited.
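For illustration only (this is not part of the claimed method), call-state monitoring on a stock Android terminal could be sketched with TelephonyManager and PhoneStateListener, which are existing Android APIs; the CallStateMonitor wrapper and its fields are assumptions made for the example, and the READ_PHONE_STATE permission may be required on recent Android versions.

import android.content.Context
import android.telephony.PhoneStateListener
import android.telephony.TelephonyManager

// Hypothetical helper that tracks whether the terminal is currently in a call.
class CallStateMonitor(context: Context) {
    var inCall = false
        private set

    private val telephonyManager =
        context.getSystemService(Context.TELEPHONY_SERVICE) as TelephonyManager

    private val listener = object : PhoneStateListener() {
        override fun onCallStateChanged(state: Int, phoneNumber: String?) {
            // OFFHOOK corresponds to an active call; RINGING and IDLE do not.
            inCall = (state == TelephonyManager.CALL_STATE_OFFHOOK)
        }
    }

    fun start() = telephonyManager.listen(listener, PhoneStateListener.LISTEN_CALL_STATE)
    fun stop() = telephonyManager.listen(listener, PhoneStateListener.LISTEN_NONE)
}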
The screen state comprises a screen-off state and a screen-on state. Specifically, whether the terminal is in the screen-on or screen-off state can be determined by having PowerManagerService detect whether the screen is lit. After PowerManagerService receives a screen-on or screen-off request, it first sets the wakefulness state of the phone and then sends a screen-on/screen-off broadcast to inform other applications that the phone is in the screen-on or screen-off state. It should be noted that the manner of determining the screen state listed above is an example rather than a limitation, and the screen state of the terminal may be detected in other manners.
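As a hedged sketch of such a screen-state check (not the patent's own implementation), an application-level query on Android could use PowerManager.isInteractive together with the system screen-on/screen-off broadcasts; both APIs exist in the Android framework, while the helper function names are assumptions:

import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import android.os.PowerManager

// One-shot query of the current screen state.
fun isScreenOn(context: Context): Boolean {
    val pm = context.getSystemService(Context.POWER_SERVICE) as PowerManager
    return pm.isInteractive   // true while the screen is on (interactive)
}

// Track the screen-on/screen-off broadcasts sent by the system.
fun registerScreenStateReceiver(context: Context, onChange: (Boolean) -> Unit): BroadcastReceiver {
    val receiver = object : BroadcastReceiver() {
        override fun onReceive(ctx: Context, intent: Intent) {
            onChange(intent.action == Intent.ACTION_SCREEN_ON)
        }
    }
    val filter = IntentFilter().apply {
        addAction(Intent.ACTION_SCREEN_ON)
        addAction(Intent.ACTION_SCREEN_OFF)
    }
    context.registerReceiver(receiver, filter)
    return receiver
}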
Step 104: and when the screen is in a screen-off state, detecting the attitude information of the terminal based on a sensor built in the terminal within a preset time.
When the user holds the terminal against the ear to answer a call, the screen is kept off during the call in order to prevent the call from being ended by accidental touches while answering.
The terminal has built-in sensors for acquiring attitude information or motion-state information of the terminal, such as an acceleration sensor, a magnetic sensor, an orientation sensor, a gyroscope, a gravity sensor, and a linear acceleration sensor. The terminal may also have built-in light sensors, pressure sensors, temperature sensors, proximity sensors, and the like, which assist in acquiring other information about the terminal. When the user holds the terminal against the ear to answer a call and the screen is in the screen-off state, the built-in acceleration sensor and gyroscope can collect the acceleration information and angular velocity information of the terminal.
It should be noted that the attitude information may be understood as position information, within the preset time, based on the standard coordinate system of the acceleration sensor and gyroscope in the terminal; this position information can be characterized by the accelerations of the X, Y, and Z axial components and the angular velocities of the X, Y, and Z axial components. The attitude information includes the three-axis accelerations and the three-axis rotation angles.
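As an illustrative sketch, assuming the built-in sensors are exposed through the standard Android SensorManager (the AttitudeCollector class and its field names are invented for the example), the raw acceleration and angular velocity behind the attitude information could be collected like this:

import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Hypothetical collector for the raw data behind the attitude information.
class AttitudeCollector(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    // Latest three-axis acceleration (m/s^2) and angular velocity (rad/s).
    val acceleration = FloatArray(3)
    val angularVelocity = FloatArray(3)

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> event.values.copyInto(acceleration)
            Sensor.TYPE_GYROSCOPE -> event.values.copyInto(angularVelocity)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}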
Step 106: and judging whether a display screen of the terminal faces to a face area of the user or not according to the posture information.
Whether the display screen of the terminal faces the face area of the user is judged according to the acquired attitude information. The face area can be understood as the area in which the terminal, if its front camera were opened, could capture the user's face. When the user holds the terminal against the ear to answer a call, the acceleration sensor and gyroscope in the terminal can detect the attitude information for that pose; when the user moves the terminal away from the ear, the terminal can likewise detect the attitude information of that action.
From the user's gestures during calls, historical statistics of the terminal's attitude information during calls can be obtained. From these historical statistics, the attitude information at any point along the track of the terminal moving from beside the user's ear to facing the user's face area can be obtained, and preset attitude information corresponding to the terminal facing the face area can be configured in advance. During the movement of the terminal, when the attitude information collected by the acceleration sensor and gyroscope matches the preset attitude information, it can be judged that the display screen of the terminal faces the face area.
Step 108: and when the display screen of the terminal faces the face area, controlling the terminal to light up the screen.
When the user needs the terminal to display information such as the time or the call duration during a call, the screen needs to be lit up for the user. Whether the screen needs to be lit up is judged based on the attitude information collected during the screen-off call, and the screen is lit up when the attitude information indicates that the display screen of the terminal faces the face area.
According to the screen control method of the terminal, when the terminal is in a call state, whether the terminal currently faces the face area of the user can be judged based on the attitude information collected by a sensor built into the terminal, and when it does, the terminal is controlled to light up the screen. This avoids the situation in which the distance sensor is blocked during the call by an object other than the ear so that the screen cannot be lit up in time; the screen-on operation can be performed timely and accurately based on the attitude information acquired by the terminal, which improves the user experience.
In one embodiment, the attitude information of the terminal is detected by sensors built into the terminal, mainly an acceleration sensor and a gyroscope. The acceleration sensor (accelerometer), also known as a gravity sensor (G-sensor), is a MEMS sensor that senses the magnitude of acceleration and resolves it into three axial components (X, Y, and Z), which are reported to upper-layer applications for corresponding processing.
Because of the earth's gravity, when the terminal lies flat on a desktop facing up, the three-axis components acquired by the acceleration sensor are by default X = 0, Y = 0, and Z = 9.81; when the terminal lies face down on the desktop, the defaults are X = 0, Y = 0, and Z = -9.81. When the terminal tilts to the left, the X axis is positive; when it tilts to the right, the X axis is negative; when it tilts upward, the Y axis is negative; when it tilts downward, the Y axis is positive. If the terminal is moved or tilted, the vertical gravitational acceleration is resolved onto the X, Y, and Z axes, and the X, Y, and Z values change continuously as the terminal rotates. In other words, the terminal can be regarded as sitting in a coordinate system, and whether it is moving or rotating can be determined from the current values of the X, Y, and Z components.
A gyroscope (gyro-sensor), also called an angular velocity sensor, measures a different physical quantity from the acceleration sensor: the angular velocity of rotation and deflection. On a terminal, complete 3D motion cannot be measured or reconstructed with the acceleration sensor alone, which only detects axial linear motion; the gyroscope, however, measures rotation and deflection well, so the user's actual motion can be analyzed and judged accurately. The gyroscope uses the same coordinate system as the acceleration sensor: when the terminal rotates clockwise horizontally, the Z axis is positive, and counterclockwise it is negative; when the terminal rotates to the left, the Y axis is negative, and to the right it is positive; when the terminal rotates upward, the X axis is negative, and downward it is positive.
Fig. 2 is a flowchart illustrating an embodiment of detecting attitude information of a terminal based on a sensor built in the terminal within a preset time.
In a preset time, detecting the attitude information of the terminal based on a sensor built in the terminal, wherein the method comprises the following steps:
step 202: and acquiring the preset time.
The preset time is related to factors of the use habit of the user and the age of the user.
Acquiring the preset time specifically comprises: when the raise-to-wake (lift-to-brighten) function of the terminal is turned on, the terminal prompts the user to demonstrate the movement action; when the movement action performed by the user is detected and meets the preset movement condition, the duration of the movement action is saved as the time basis for recognizing the user's next lifting or drawing-out action, and this saved duration is defined as a first duration. The movement action is then repeated to check whether the demonstrated action can be correctly recognized using the first duration; if it can, the first duration is used as the preset time, i.e., the movement action can be completed within the preset time. If it cannot be recognized, the user is prompted to demonstrate the movement action again and the first duration is adjusted until the movement action is recognized correctly, and the adjusted first duration is used as the preset time, as in the sketch below. The movement action can be understood as the action from the user holding the terminal against the ear to answer a call to the display screen of the terminal facing the user's face area.
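A minimal sketch of that calibration loop, assuming a placeholder recognizer recognizesMove(durationMs) that replays the demonstrated action and reports whether it is recognized within the given window (the names and the step/upper-bound values are assumptions, not taken from the patent):

// Hypothetical calibration of the preset time from a demonstrated movement action.
fun calibratePresetTime(
    demonstratedDurationMs: Long,
    recognizesMove: (Long) -> Boolean,
    stepMs: Long = 100L,
    maxDurationMs: Long = 3000L
): Long {
    var firstDuration = demonstratedDurationMs          // the saved "first duration"
    while (firstDuration <= maxDurationMs) {
        if (recognizesMove(firstDuration)) {
            return firstDuration                        // usable as the preset time
        }
        firstDuration += stepMs                         // adjust and try again
    }
    return maxDurationMs                                // fall back to an upper bound
}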
Step 204: and acquiring the acceleration and the rotation angle of X, Y, Z triaxial components acquired by the acceleration sensor and the gyroscope within a preset time.
The acceleration sensor and gyroscope can collect the accelerations and rotational angular velocities of the X, Y, and Z axial components of the terminal. When the terminal starts to move away from the ear, collection of the terminal's attitude information begins; the attitude information collected at that moment is taken as the first attitude information, that moment is taken as the start of the preset time, and the attitude information collected by the acceleration sensor and gyroscope is then acquired continuously within the preset time.
Further, the preset time may be set to 1 second. Within this 1 second, the accelerations of the X, Y, and Z axial components of the terminal are acquired at the acceleration sensor's sampling frequency. The gyroscope collects the current rotational angular velocities of the terminal's X, Y, and Z axial components, and the angular velocity is integrated from the start of the preset time up to any moment within the preset time, so the rotation angles of the X, Y, and Z axial components within the preset time are obtained continuously. In other words, within the preset time the gyroscope yields the rotation angles of the X, Y, and Z axial components from the measured angular velocities.
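A sketch of that integration step, assuming gyroscope samples arrive with nanosecond timestamps as in Android's SensorEvent; the RotationIntegrator class and its method names are illustrative assumptions:

// Integrates angular velocity (rad/s) into accumulated rotation angles over the preset time.
class RotationIntegrator {
    private val angleRad = FloatArray(3)   // accumulated rotation about X, Y, Z
    private var lastTimestampNs = 0L

    fun addSample(angularVelocity: FloatArray, timestampNs: Long) {
        if (lastTimestampNs != 0L) {
            val dt = (timestampNs - lastTimestampNs) * 1e-9f   // seconds between samples
            for (i in 0..2) angleRad[i] += angularVelocity[i] * dt
        }
        lastTimestampNs = timestampNs
    }

    // Rotation angle around the Y axis in degrees, used by the later threshold check.
    fun yRotationDegrees(): Float = Math.toDegrees(angleRad[1].toDouble()).toFloat()

    fun reset() {
        angleRad.fill(0f)
        lastTimestampNs = 0L
    }
}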
Step 206: and acquiring the attitude information according to the acceleration and the rotation angle.
The attitude information of the terminal within the preset time is obtained from the three-axis accelerations collected by the acceleration sensor and the three-axis rotation angles obtained from the gyroscope; the attitude information comprises the accelerations of the X, Y, and Z axial components and the rotation angles of the X, Y, and Z axial components. With these two dimensions of information, the attitude information of the terminal can be obtained accurately, and whether the terminal faces the user's face area within the preset time can be judged accurately and in time.
Specifically, judging whether a display screen of the terminal faces a face area of a user according to the posture information includes:
step 302: and judging whether the variation of the acceleration of the Y-axis component in the preset time is larger than a preset value.
During a normal call, the display screen of the terminal faces the user's ear; at this moment the initial position information of the terminal, i.e., the first attitude information, can be acquired. While the terminal moves, the three-axis acceleration data collected by its acceleration sensor changes on at least one axial component. When the terminal has moved so that it faces the user's face area, the final position information of the terminal, i.e., the second attitude information, can be acquired. The duration from the initial position (first attitude information) to the final position (second attitude information) is the preset time, which can be obtained by a machine-learning method.
In this embodiment, whether the variation of the acceleration of the Y-axis component is greater than a preset value in the three-axis acceleration data collected by the acceleration sensor within a preset time is determined. It is understood that the acceleration of the Y-axis component thereof is an acceleration in the Y-axis direction.
Further, the preset value may be 3 m/s²: when the terminal moves from the initial position to the final position, the variation of the acceleration of the Y-axis component is greater than 3 m/s². The preset value may be determined according to information such as the user's usage habits and the posture in which the phone is held during calls, but is not limited thereto.
When the variation of the acceleration of the Y-axis component within the preset time is greater than the preset value, step 304 is executed: judging whether the rotation angle of the Y-axis component within the preset time is greater than a preset angle. That is, when the variation of the Y-axis acceleration is greater than the preset value, whether the Y-axis rotation angle acquired by the gyroscope within the preset time is greater than the preset angle is determined. It is understood that the Y-axis rotation angle is the angle of rotation around the Y axis, and the gyroscope and the acceleration sensor share the same coordinate system.
Further, the preset angle may be 15 degrees. If, within the preset time, the variation of the Y-axis acceleration is greater than 3 m/s² and the rotation angle around the Y axis is also greater than 15 degrees, it can be determined that the terminal has moved from the initial position (with the display screen facing the user's ear) to the final position (with the display screen facing the user's face). That is, step 306 is reached: the display screen of the terminal faces the face area of the user. In other words, the first attitude information and the second attitude information may be analyzed, and when the variation of the Y-axis acceleration is greater than the preset value and the rotation angle of the Y-axis component is also greater than the preset angle, the display screen of the terminal is considered to face the user's face area.
When the variation of the Y-axis acceleration is smaller than the preset value, or the rotation angle of the Y-axis component is smaller than the preset angle, step 308 is reached: the display screen of the terminal does not face the face area of the user.
It should be noted that the order of step 302 and step 304 is not limited; they may be performed in the opposite order. That is, step 304 may be performed first, and when its result is yes, step 302 is performed; when the result of step 302 is then yes, step 306 is performed, and when it is no, step 308 is performed.
In one embodiment, in order to ensure accuracy of the display screen of the terminal facing the face area of the user, in step 306, before the display screen of the terminal facing the face area of the user, the method further includes:
acquiring the current terminal triaxial component acceleration, and judging whether the triaxial component acceleration is within a preset range;
when the three-axis component accelerations are all within the preset range, step 306 is executed: the display screen of the terminal faces the face area of the user.
It will be appreciated that the preset ranges of the three-axis component accelerations may be set as: -5 m/s² < X axis < 5 m/s², -2 m/s² < Y axis < 9.8 m/s², 0.5 m/s² < Z axis < 9.8 m/s². If the current three-axis component accelerations of the terminal are all within these preset ranges, the display screen of the terminal can be considered to face the face area.
The screen control method in this embodiment can accurately determine the current attitude information of the terminal based on the acceleration and angular velocity data collected by the acceleration sensor and gyroscope within the preset time, and can therefore accurately judge whether the display screen of the terminal currently faces the user's face area; when it does, the terminal is controlled to light up the screen.
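Pulling the checks of steps 302 to 306 and the optional range check together, a hedged decision helper might look like the sketch below; the thresholds are the example values given in this description, and the function and parameter names are assumptions:

// Returns true when the attitude data indicate that the display screen faces the face area.
fun facesFaceArea(
    deltaAccelY: Float,        // change of Y-axis acceleration over the preset time, m/s^2
    rotationYDeg: Float,       // rotation angle around the Y axis over the preset time, degrees
    currentAccel: FloatArray   // current X, Y, Z accelerations, m/s^2
): Boolean {
    if (deltaAccelY <= 3.0f) return false      // step 302: variation must exceed 3 m/s^2
    if (rotationYDeg <= 15.0f) return false    // step 304: rotation must exceed 15 degrees
    val (x, y, z) = currentAccel               // optional range check from the embodiment
    return x > -5.0f && x < 5.0f &&
           y > -2.0f && y < 9.8f &&
           z > 0.5f && z < 9.8f
}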
In one embodiment, before acquiring the acceleration of the three-axis component acquired by the acceleration sensor within a preset time period, the method further includes:
step 200: and detecting whether the terminal shakes or not based on the acceleration sensor.
The terminal is provided with a raise-to-wake switch, for example named 'raise to wake', which turns the raise-to-wake (lift-to-brighten) function on or off. The terminal can receive an operation instruction input by the user on a settings interface to turn this function on or off; the operation instruction may be at least one of a touch operation, a voice operation, and a key operation. The terminal can also switch the raise-to-wake function intelligently according to the scene, for example turning it off between 24:00 and 8:00 and turning it on during other periods.
When this function is turned on, the motion detection function of the acceleration sensor is enabled, i.e., the Acceleration Motion Detection (AMD) function of the acceleration sensor starts to run and collects the position information of the terminal, while the gyroscope used to collect angular velocity remains in a low-power state. Whether the terminal shakes is judged according to whether the acceleration sensor detects a change in the acceleration of any of the three axial components.
Step 201: and when the terminal shakes, the gyroscope is started to be in a working state.
When the acceleration sensor detects that the acceleration of any one of the X, Y, or Z axes changes, the terminal is shaking. The gyroscope is normally dormant; when the terminal shakes, the main chip turns the gyroscope on so that it enters the working state, opens the data transmission channel to the gyroscope, and obtains the rotation angle information collected by it.
In the screen control method of the terminal in this embodiment, the acceleration sensor and the gyroscope are not simultaneously turned on, and when the acceleration sensor detects that the three-axis acceleration of the terminal changes, the main chip and the gyroscope in the low power consumption state are restored to the operating state, so that power consumption can be saved.
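The power-saving pattern of this embodiment (gyroscope registered only after the accelerometer reports a change) could be sketched as follows; the shake threshold of 0.5 m/s^2 and the class name are assumptions made for illustration:

import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Keeps only the accelerometer active and turns the gyroscope on after a detected change.
class LazyGyroActivator(
    private val sensorManager: SensorManager,
    private val gyroListener: SensorEventListener
) : SensorEventListener {
    private var last: FloatArray? = null
    private var gyroOn = false

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    override fun onSensorChanged(event: SensorEvent) {
        val prev = last
        last = event.values.copyOf()
        if (prev == null || gyroOn) return
        // Any noticeable change on any axis is treated as a shake here.
        val changed = (0..2).any { kotlin.math.abs(event.values[it] - prev[it]) > 0.5f }
        if (changed) {
            sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
                sensorManager.registerListener(gyroListener, it, SensorManager.SENSOR_DELAY_GAME)
                gyroOn = true
            }
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}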
Fig. 4 is a flowchart for controlling the terminal to light up the screen when the display screen of the terminal faces the face area in one embodiment.
When the display screen of the terminal faces the face area, controlling the terminal to light the screen, including:
step 402: and acquiring the contact person attributes of the call, wherein the contact person attributes comprise a private contact person, a common contact person and a system contact person.
When the terminal is in a call state, the attribute of the contact currently on the call with the user can be acquired. The contact attributes include private contacts, common contacts, and system contacts.
The terminal can be pre-established with a private contact information list, a common contact information list and a system contact information list, so that when the terminal calls passively or actively, the contact attributes of the contacts communicating with the user can be determined through the private contact information list, the common contact information list and the system contact information list.
It should be noted that a private contact is a contact the user does not want others to know about; a common contact is an ordinary contact that may be shown openly; a system contact may be a contact generated by a service system such as a bank, an operator, or an insurance company.
Step 404: and determining a bright screen control strategy corresponding to the contact person attribute according to the mapping relation between the contact person attribute and the bright screen control strategy.
Contacts with different contact attributes correspond to different bright-screen control strategies, and the mapping relationship between contact attributes and bright-screen control strategies can be stored in the terminal in advance. The mapping between contact attributes and bright-screen control strategies may be one-to-one or many-to-one. The bright-screen control strategies include lighting the screen while disguising the contact information, lighting the screen and displaying the call interface, and lighting the screen and locking it.
Specifically, lighting the screen while disguising the contact information can be understood as displaying disguised information of the contact when the screen is lit.
Further, the screen-lighting disguised contact information includes:
and when the contact is a private contact, displaying the information of the contact as a strange number or a private number. The contact information includes the name, telephone number and other information of the contact. For example, the name of the private contact can be disguised as the name of the system contact such as "china unicom", "china mobile", "business recruitment bank", etc. The bright screen display of the call interface can be understood as normal display of the call interface, and at least one or more of information of the contact, passing time, recording, phone book, adding call, mute, hands-free or keyboard are displayed; the screen is lightened and locked, so that only the current time is displayed, and the screen is locked, thereby avoiding the call ending caused by misoperation.
Step 406: and carrying out screen lightening control processing on the terminal according to the determined screen lightening control strategy.
Screen-on control is performed on the terminal according to the determined bright-screen control strategy. If the attribute of the contact currently talking with the user is a private contact, the corresponding bright-screen control strategy is lighting the screen while disguising the contact information. That is, when the user holding the terminal is talking with private contact A with the terminal placed beside the ear (no earphone is used), and the user is disturbed by the environment or needs to use other functions of the terminal (checking information, the time, and so on) and therefore moves the terminal from beside the ear to face the face area, the screen of the terminal is lit automatically, and at the same time the information of the private contact is disguised while the screen is lit. For example, A may be replaced by '## bank', or the telephone number of the private contact may be replaced by '955', with the displayed name matching that number. Of course, the name and telephone number shown for A may also be customized according to the user's needs; the way of disguising the private contact's information is not further limited.
The screen control method in the embodiment can protect the information of the private contact, improves the flexibility of privacy protection, and has a good privacy protection effect.
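As a sketch of the mapping in steps 402 to 406 (the enum values and the mappings for common and system contacts are assumptions; the description only fixes the private-contact case), the strategy lookup could look like this:

// Contact attributes and bright-screen control strategies from this embodiment.
enum class ContactAttribute { PRIVATE, COMMON, SYSTEM }
enum class BrightScreenPolicy { DISGUISE_CONTACT_INFO, SHOW_CALL_INTERFACE, LOCK_SCREEN }

// Assumed mapping; only PRIVATE -> DISGUISE_CONTACT_INFO is taken from the description.
val policyMap = mapOf(
    ContactAttribute.PRIVATE to BrightScreenPolicy.DISGUISE_CONTACT_INFO,
    ContactAttribute.COMMON to BrightScreenPolicy.SHOW_CALL_INTERFACE,
    ContactAttribute.SYSTEM to BrightScreenPolicy.LOCK_SCREEN
)

// Returns the text to show on the lit screen for the current contact.
fun brightScreenDisplay(attribute: ContactAttribute, name: String, number: String): String =
    when (policyMap[attribute]) {
        BrightScreenPolicy.DISGUISE_CONTACT_INFO -> "## bank"     // disguised display name
        BrightScreenPolicy.SHOW_CALL_INTERFACE -> "$name $number" // normal call interface
        BrightScreenPolicy.LOCK_SCREEN, null -> ""                // only the lock screen/time
    }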
Fig. 5 is a flowchart of a screen control method of a terminal in one embodiment.
A screen control method of a terminal includes:
step 502: when a terminal is in a call state, acquiring a screen state of the terminal;
step 504: when the screen is in a screen-off state, detecting attitude information of the terminal based on a sensor built in the terminal within a preset time;
step 506: judging whether a display screen of the terminal faces to a face area of a user or not according to the attitude information;
step 508: and when the display screen of the terminal faces the face area, controlling the terminal to light up the screen.
Step 510: when the terminal is in a screen-on state, acquiring the current attitude information of the terminal.
Following steps 502 to 508, when the terminal is in the screen-on state, the current attitude information of the terminal is acquired. The attitude information is likewise collected based on the acceleration sensor and gyroscope built into the terminal.
Step 512: and when the attitude information accords with a preset screen-off condition, carrying out screen-off control processing on the terminal.
The preset screen-off condition is preset in the terminal and can be represented by the attitude information of the terminal.
In one embodiment, the preset screen-off condition is used to reflect that the terminal has been put down. For example, the preset screen-off condition may be that the rotation angle of the X-axis component is greater than or equal to 20 degrees and the Y-axis acceleration is less than 0.05 m/s². When the attitude information meets this preset screen-off condition, the terminal has changed from the raised state (initial position: the display screen facing the face) to the put-down state (the user's arm holding the terminal hanging down naturally), i.e., the user has hung up the phone and lowered the arm.
In one embodiment, the preset screen-off condition is used to reflect that the terminal is being put back into a pocket or backpack. The preset screen-off condition may also be that the rotation angles of the Y-axis and Z-axis components are both greater than or equal to 30 degrees and the Y-axis acceleration is less than 1 m/s²; more specifically, the rotation angle of the Y-axis component is greater than or equal to 30 degrees, the rotation angle of the Z-axis component is greater than or equal to 40 degrees, and the Y-axis acceleration is less than 1 m/s². When the attitude information meets this preset screen-off condition, the terminal has changed from the raised state (initial position: the display screen facing the face) to the put-away state (placed in a pocket), i.e., the user has hung up the phone and put the terminal in a pocket.
In one embodiment, the preset screen-off condition is used to reflect that the terminal is flipped. The preset screen-off condition may also be that the rotation angle of the X-axis component is greater than or equal to 30 degrees and the Z-axis acceleration is less than -9.81 m/s², or that the rotation angle of the Y-axis component is greater than or equal to 30 degrees and the Z-axis acceleration is less than -9.81 m/s².
When the terminal is in the screen-on state and the attitude information in that state meets the preset screen-off condition, the terminal's screen is turned off. Of course, the preset screen-off condition may also be set according to the user's habits and is not limited to the numerical values listed in the embodiments above.
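A compact sketch of the three example screen-off conditions above (put down, put in a pocket, flipped); the threshold values are the example numbers from this description, and the data class and field names are assumptions:

// Attitude summary over the preset time: rotation angles in degrees, accelerations in m/s^2.
data class Pose(
    val rotXDeg: Float, val rotYDeg: Float, val rotZDeg: Float,
    val accelY: Float, val accelZ: Float
)

fun meetsScreenOffCondition(p: Pose): Boolean {
    val putDown = p.rotXDeg >= 20f && p.accelY < 0.05f                        // arm lowered
    val pocketed = p.rotYDeg >= 30f && p.rotZDeg >= 40f && p.accelY < 1f      // put in a pocket
    val flipped = (p.rotXDeg >= 30f || p.rotYDeg >= 30f) && p.accelZ < -9.81f // flipped face down
    return putDown || pocketed || flipped
}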
With the method in this embodiment, the attitude information of the terminal can be acquired accurately by combining the three-axis accelerations and rotation angles collected by the acceleration sensor and gyroscope within the preset time, and, combined with the preset screen-off condition, the screen of the terminal can be turned off accurately and in time without the user pressing the power key, which is convenient, saves power, and improves the user experience.
It should be understood that although the steps in the flowcharts of figs. 1-5 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in figs. 1-5 may include multiple sub-steps or stages that are not necessarily completed at the same moment but may be performed at different times, and the order of performing these sub-steps or stages is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 6 is a block diagram of a screen control apparatus of a terminal according to an embodiment. A screen control apparatus of a terminal, the apparatus comprising:
an acquiring module 610, used for acquiring the screen state of the terminal when the terminal is in a call state;
the acquisition module 620 is used for detecting the attitude information of the terminal based on a sensor built in the terminal within a preset time when the screen is in a screen-off state;
a judging module 630, configured to judge whether a display screen of the terminal faces a face area of a user according to the posture information; and
and the control module 640 is configured to control the terminal to light up the screen when the display screen of the terminal faces the face area.
The screen control device of the terminal in the embodiment can avoid the situation that the distance sensor is shielded by other objects (non-human ears) in the conversation process and cannot be used for carrying out screen lightening processing on the terminal in time, can realize screen lightening operation timely and accurately based on the posture information acquired by the terminal, and improves the user experience.
In one embodiment, the acquisition module comprises an acceleration sensor and a gyroscope. The variations of the accelerations of the X, Y, and Z axial components collected by the acceleration sensor within the preset time and the rotation angles of the X, Y, and Z axial components collected by the gyroscope within the preset time are obtained, and the acquisition module 620 obtains the attitude information based on these acceleration variations and rotation angles.
In one embodiment, the acquisition module 620 further comprises:
the judging unit is used for detecting whether the terminal shakes or not based on the acceleration sensor;
and the starting unit is used for starting the gyroscope to enable the gyroscope to be in a working state when the terminal shakes.
In this embodiment, the acceleration sensor and the gyroscope are not turned on simultaneously; when the acceleration sensor detects that the three-axis accelerations of the terminal change, the main chip and the gyroscope in the low-power state are restored to the working state, which saves power.
In one embodiment, the determining module includes:
the first judging unit is used for judging whether the variation of the acceleration of the Y-axis component in the preset time is larger than a preset value or not;
the second judging unit is used for judging whether the rotating angle of the Y-axis component in the preset time is larger than a preset angle or not when the judging result of the first judging unit is yes;
the third judging unit is used for determining that the display screen of the terminal faces the face area of the user when the judgment result of the second judging unit is yes, and that the display screen of the terminal does not face the face area of the user when the judgment result of the second judging unit is no.
In one embodiment, a control module includes:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring the contact person attributes of the call, and the contact person attributes comprise a private contact person, a common contact person and a system contact person;
the determining unit is used for determining a bright screen control strategy corresponding to the contact person attribute according to the mapping relation between the contact person attribute and the bright screen control strategy; the bright screen control strategy comprises bright screen disguising contact information, bright screen display of a call interface and bright screen display of a virtual keyboard;
and the screen lightening control unit is used for carrying out screen lightening control processing on the terminal according to the determined screen lightening control strategy.
Specifically, the determining unit is further configured to display information of the contact as a strange number or a privacy number when the contact is a private contact.
The screen control device in the embodiment can protect the information of the private contact person, improves the flexibility of privacy protection, and has a good privacy protection effect.
In one embodiment, the control module in the screen control device of the terminal further includes:
and a screen-off control unit, used for controlling the terminal to turn off the screen when the attitude information meets the preset screen-off condition. The device in this embodiment can accurately obtain the attitude information of the terminal by combining the three-axis accelerations and rotation angles collected by the acceleration sensor and gyroscope within the preset time, and, combined with the preset screen-off condition, can turn off the screen of the terminal accurately and in time without the user using the power key, which is convenient, saves power, and improves the user experience.
The division of each module in the screen control device of the terminal is only used for illustration, and in other embodiments, the screen control device of the terminal may be divided into different modules as needed to complete all or part of the functions of the screen control device of the terminal.
For specific limitations of the screen control device of the terminal, reference may be made to the above limitations of the screen control method of the terminal, which are not described herein again. Each module in the screen control device of the terminal may be wholly or partially implemented by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
Fig. 7 is a schematic diagram of the internal structure of the terminal in one embodiment. As shown in fig. 7, the terminal includes a processor, a memory, and a network interface connected through a system bus. The processor provides computing and control capabilities and supports the operation of the entire electronic device. The memory is used to store data, programs, and the like; it stores at least one computer program that can be executed by the processor to implement the methods provided in the embodiments of the present application and applicable to the electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the screen control method of a terminal provided in the following embodiments. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The network interface may be an Ethernet card, a wireless network card, or the like, and is used to communicate with external electronic devices. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
The modules in the screen control device of the terminal provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server. The program modules constituting the computer program may be stored in the memory of the terminal or the server. When the computer program is executed by a processor, the steps of the method described in the embodiments of the present application are performed.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of a screen control method of a terminal.
A computer program product containing instructions which, when run on a computer, cause the computer to perform a screen control method of a terminal.
The embodiments of the application also provide a terminal. As shown in fig. 8, for convenience of explanation, only the parts related to the embodiments of the present application are shown; for specific technical details that are not disclosed, please refer to the method part of the embodiments of the present application. The terminal may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, a wearable device, and so on. The mobile phone is taken as an example below:
fig. 8 is a block diagram of a partial structure of a mobile phone related to a terminal according to an embodiment of the present application. Referring to fig. 8, the handset includes: radio Frequency (RF) circuitry 810, memory 820, input unit 830, display unit 840, sensor 850, audio circuitry 860, wireless fidelity (WiFi) module 870, processor 880, and power supply 890. Those skilled in the art will appreciate that the handset configuration shown in fig. 8 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The RF circuit 810 may be used for receiving and transmitting signals during information transmission and reception or during a call; it may receive downlink information from a base station and deliver it to the processor 880 for processing, and may also transmit uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 820 may be used to store software programs and modules, and the processor 880 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 820. The memory 820 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as an application program for a sound playing function, an application program for an image playing function, and the like), and the like; the data storage area may store data (such as audio data, an address book, etc.) created according to the use of the mobile phone, and the like. Further, the memory 820 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 830 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone 800. Specifically, the input unit 830 may include an operation panel 831 and other input devices 832. The operation panel 831, which may also be referred to as a touch screen, may collect touch operations of a user (e.g., operations of the user on the operation panel 831 or in the vicinity of the operation panel 831 using any suitable object or accessory such as a finger or a stylus) thereon or nearby, and drive the corresponding connection device according to a preset program. In one embodiment, the operation panel 831 may include two portions of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts it to touch point coordinates, and sends the touch point coordinates to the processor 880, and can receive and execute commands from the processor 880. In addition, the operation panel 831 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 830 may include other input devices 832 in addition to the operation panel 831. In particular, other input devices 832 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), and the like.
The display unit 840 may be used to display information input by the user or provided to the user, as well as various menus of the cellular phone. The display unit 840 may include a display panel 841. In one embodiment, the display panel 841 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. In one embodiment, the operation panel 831 may cover the display panel 841; when the operation panel 831 detects a touch operation on or near it, the operation is transmitted to the processor 880 to determine the type of touch event, and the processor 880 then provides a corresponding visual output on the display panel 841 according to the type of touch event. Although in fig. 8 the operation panel 831 and the display panel 841 are shown as two separate components implementing the input and output functions of the mobile phone, in some embodiments they may be integrated to implement these functions.
The cell phone 800 may also include at least one sensor 850, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor, which adjusts the brightness of the display panel 841 according to the brightness of ambient light, and a proximity sensor, which turns off the display panel 841 and/or the backlight when the mobile phone is moved to the ear. The motion sensor may include an acceleration sensor, which can detect the magnitude of acceleration in each direction and, when the phone is static, the magnitude and direction of gravity; it can be used in applications that recognize the posture of the mobile phone (such as switching between landscape and portrait orientation) and in vibration-recognition functions (such as a pedometer or tap detection). The mobile phone may also be provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor.
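As a rough illustration of how an accelerometer can support posture recognition such as landscape/portrait switching, the following Kotlin sketch infers a coarse orientation from the gravity vector read while the phone is static. The enum, the function name, and the 7.0 m/s² cut-off are illustrative assumptions and are not taken from the patent.

```kotlin
import kotlin.math.abs

// When the phone is static the accelerometer essentially reads the gravity
// vector, so the axis that carries most of it hints at how the device is held.
enum class Orientation { PORTRAIT, LANDSCAPE, FLAT }

fun orientationFromGravity(ax: Float, ay: Float, az: Float): Orientation = when {
    abs(ay) > 7.0f -> Orientation.PORTRAIT   // gravity mostly along the Y axis
    abs(ax) > 7.0f -> Orientation.LANDSCAPE  // gravity mostly along the X axis
    else           -> Orientation.FLAT       // lying flat, gravity on the Z axis
}

fun main() {
    // Held upright in the hand: the Y axis carries almost all of gravity.
    println(orientationFromGravity(0.3f, 9.6f, 1.1f)) // PORTRAIT
}
```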
The audio circuit 860, the speaker 861, and the microphone 862 may provide an audio interface between the user and the handset. On one hand, the audio circuit 860 can convert received audio data into an electrical signal and transmit it to the speaker 861, which converts the electrical signal into a sound signal and outputs it; on the other hand, the microphone 862 converts a collected sound signal into an electrical signal, which the audio circuit 860 receives and converts into audio data. The audio data is then output to the processor 880 for processing, after which it may be transmitted to another mobile phone through the RF circuit 810 or stored in the memory 820 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 870, the mobile phone can help the user send and receive e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 8 shows the WiFi module 870, it is understood that the module is not an essential component of the cell phone 800 and may be omitted as desired.
The processor 880 is the control center of the mobile phone. It connects the various parts of the entire phone using various interfaces and lines, and performs the phone's functions and processes data by running or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby monitoring the mobile phone as a whole. In one embodiment, the processor 880 may include one or more processing units. In one embodiment, the processor 880 may integrate an application processor and a modem, wherein the application processor mainly handles the operating system, user interface, applications, and the like, and the modem mainly handles wireless communication. It is to be appreciated that the modem need not be integrated into the processor 880. For example, the processor 880 may integrate an application processor and a baseband processor, and the baseband processor may constitute a modem together with other peripheral chips. The phone 800 also includes a power supply 890 (e.g., a battery) for powering the various components; the power supply may be logically coupled to the processor 880 through a power management system, which manages charging, discharging, and power consumption.
In one embodiment, the cell phone 800 may also include a camera, a bluetooth module, and the like.
In the embodiment of the present application, the processor included in the mobile phone implements the screen control method of the terminal described above when executing the computer program stored in the memory.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above examples express only several embodiments of the present application, and although their description is relatively specific and detailed, they should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (8)
1. A screen control method of a terminal, comprising:
when a terminal is in a call state, acquiring a screen state of the terminal;
when the screen is in a screen-off state, detecting attitude information of the terminal based on a sensor built in the terminal within a preset time; the built-in sensor of the terminal comprises an acceleration sensor and a gyroscope;
judging whether a display screen of the terminal faces a face area of a user according to the attitude information;
when the display screen of the terminal faces the face area, controlling the terminal to light up the screen;
detecting the attitude information of the terminal based on the sensor built in the terminal within the preset time comprises: acquiring the preset time; continuously acquiring, within the preset time, the acceleration and the rotation angle of the X, Y, and Z axis components collected by the acceleration sensor and the gyroscope; and acquiring the attitude information according to the acceleration and the rotation angle;
before continuously acquiring, within the preset time, the acceleration and the rotation angle of the X, Y, and Z axis components collected by the acceleration sensor and the gyroscope, the method further comprises: detecting whether the terminal shakes based on the acceleration sensor; and when the terminal shakes, turning on the gyroscope so that the gyroscope is in a working state.
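As a reading aid for claim 1, the following Kotlin sketch walks through the same data flow: the gyroscope is only brought into a working state after the acceleration sensor reports shaking, samples of three-axis acceleration and rotation angle are gathered over the preset time, and the attitude information is reduced from them. The type and function names, the 9.8 m/s² gravity reference, and the 1.5 m/s² shake threshold are illustrative assumptions rather than values taken from the claims.

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// One reading of the built-in sensors named in claim 1: three-axis acceleration
// (m/s^2) from the acceleration sensor and three-axis rotation angle (degrees)
// from the gyroscope.
data class SensorSample(val ax: Float, val ay: Float, val az: Float,
                        val rx: Float, val ry: Float, val rz: Float)

// Shake check that gates the gyroscope: it is only switched to a working state
// once the accelerometer magnitude departs noticeably from gravity (~9.8 m/s^2).
fun isShaking(samples: List<SensorSample>, shakeThreshold: Float = 1.5f): Boolean =
    samples.any { abs(sqrt(it.ax * it.ax + it.ay * it.ay + it.az * it.az) - 9.8f) > shakeThreshold }

// Reduce the samples gathered over the preset time to the two quantities that
// claim 2 later tests: the variation of the Y-axis acceleration and the
// accumulated Y-axis rotation angle.
fun postureOverWindow(samples: List<SensorSample>): Pair<Float, Float> {
    val deltaAy = samples.maxOf { it.ay } - samples.minOf { it.ay }
    val angleY = samples.map { it.ry }.sum()
    return deltaAy to angleY
}
```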
2. The method of claim 1, wherein judging whether the display screen of the terminal faces the face area of the user according to the attitude information comprises:
judging whether the variation of the acceleration of the Y-axis component within the preset time is greater than a preset value;
if so, judging whether the rotation angle of the Y-axis component within the preset time is greater than a preset angle;
if so, determining that the display screen of the terminal faces the face area of the user;
and if not, determining that the display screen of the terminal does not face the face area of the user.
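A minimal sketch of the two-step test in claim 2, consuming the two quantities produced by the sketch after claim 1. The concrete thresholds (2.0 m/s² and 30°) stand in for the claim's "preset value" and "preset angle" and are assumed for illustration only.

```kotlin
// Two-step test of claim 2: the Y-axis acceleration must have changed enough
// within the preset time, and only then is the Y-axis rotation angle compared
// with the preset angle.
fun facesUserFace(deltaAy: Float, angleY: Float,
                  presetValue: Float = 2.0f,
                  presetAngle: Float = 30f): Boolean {
    if (deltaAy <= presetValue) return false  // acceleration variation too small
    return angleY > presetAngle               // rotation toward the face large enough
}
```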
3. The method according to claim 1, wherein controlling the terminal to light up the screen when the display screen of the terminal faces the face area comprises:
acquiring the contact attribute of the call, wherein the contact attribute comprises a private contact, a common contact, and a system contact;
determining a bright screen control strategy corresponding to the contact attribute according to a mapping relation between contact attributes and bright screen control strategies, wherein the bright screen control strategy comprises lighting up the screen while disguising contact information, lighting up the screen to display the call interface, and lighting up the screen while keeping it locked;
and performing bright screen control processing on the terminal according to the determined bright screen control strategy.
4. The method of claim 3, wherein lighting up the screen while disguising contact information comprises:
when the contact is a private contact, displaying the contact's information as an unknown number or a private number.
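The mapping of claims 3 and 4 can be pictured as a lookup table followed by a disguise rule for private contacts. The Kotlin sketch below is illustrative only; the enum values, the map, and the "Private number" placeholder string are assumptions, not terminology from the claims.

```kotlin
// Contact attribute -> bright screen control strategy, with private contacts
// disguised as an unknown/private number when the screen is lit.
enum class ContactAttribute { PRIVATE, COMMON, SYSTEM }
enum class BrightScreenStrategy { DISGUISE_CONTACT_INFO, SHOW_CALL_INTERFACE, LIGHT_AND_LOCK }

val strategyByAttribute: Map<ContactAttribute, BrightScreenStrategy> = mapOf(
    ContactAttribute.PRIVATE to BrightScreenStrategy.DISGUISE_CONTACT_INFO,
    ContactAttribute.COMMON to BrightScreenStrategy.SHOW_CALL_INTERFACE,
    ContactAttribute.SYSTEM to BrightScreenStrategy.LIGHT_AND_LOCK
)

fun displayedCallerInfo(attr: ContactAttribute, realNumber: String): String =
    when (strategyByAttribute.getValue(attr)) {
        BrightScreenStrategy.DISGUISE_CONTACT_INFO -> "Private number"
        else -> realNumber
    }
```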
5. The method according to any one of claims 1-4, further comprising:
when the terminal is in a bright-screen state, acquiring current attitude information of the terminal;
and when the attitude information meets a preset screen-off condition, performing screen-off control processing on the terminal.
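Claim 5 mirrors the bright-screen test in the opposite direction: while the screen is lit during the call, the same attitude quantities are checked against a preset screen-off condition. The sketch below assumes, purely for illustration, that the condition is a pronounced Y-axis acceleration change combined with a negative rotation (the phone being lowered away from the ear); the actual condition is left open by the claim.

```kotlin
// Illustrative screen-off check for claim 5. Thresholds and the sign
// convention (negative Y rotation = phone lowered away from the ear) are
// assumptions, not values from the patent.
fun shouldTurnScreenOff(deltaAy: Float, angleY: Float,
                        offValue: Float = 2.0f,
                        offAngle: Float = -30f): Boolean =
    deltaAy > offValue && angleY < offAngle
```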
6. A screen control apparatus of a terminal, the apparatus comprising:
an acquisition module, a collection module, a judging module and a control module, wherein the acquisition module is used for acquiring the screen state of the terminal when the terminal is in a call state;
the collection module is used for detecting the attitude information of the terminal based on a sensor built in the terminal when the screen is in a screen-off state; the built-in sensor of the terminal comprises an acceleration sensor and a gyroscope; the acceleration sensor collects the variation of the acceleration of the X, Y, and Z axis components within a preset time, the gyroscope collects the rotation angle of the X, Y, and Z axis components within the preset time, and the collection module obtains the attitude information based on the variation of the acceleration and the rotation angle collected by the acceleration sensor and the gyroscope;
the judging module is used for judging whether a display screen of the terminal faces the face area of the user according to the attitude information;
the control module is used for controlling the terminal to light up the screen when the display screen of the terminal faces the face area;
the collection module further comprises a judging unit and a starting unit; the judging unit is used for detecting whether the terminal shakes based on the acceleration sensor; and the starting unit is used for, when the terminal shakes, turning on the gyroscope so that the gyroscope is in a working state.
7. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
8. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 5 are implemented by the processor when executing the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810113000.0A CN108418953B (en) | 2018-02-05 | 2018-02-05 | Screen control method and device of terminal, readable storage medium and terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108418953A CN108418953A (en) | 2018-08-17 |
CN108418953B true CN108418953B (en) | 2020-04-24 |
Family
ID=63127778
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810113000.0A Expired - Fee Related CN108418953B (en) | 2018-02-05 | 2018-02-05 | Screen control method and device of terminal, readable storage medium and terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108418953B (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI688882B (en) | 2019-03-29 | 2020-03-21 | 華碩電腦股份有限公司 | Electronic device and controlling method thereof |
CN111405110B (en) * | 2020-03-11 | 2021-08-03 | Tcl移动通信科技(宁波)有限公司 | Screen control method and device, storage medium and mobile terminal |
CN113596248B (en) * | 2020-04-30 | 2022-11-11 | 华为技术有限公司 | Display screen control method and device |
CN111668898A (en) * | 2020-06-03 | 2020-09-15 | 芯盟科技有限公司 | Novel small night lamp and application thereof |
CN114077468A (en) * | 2020-08-18 | 2022-02-22 | 华为技术有限公司 | Screen window redrawing method, electronic equipment and computer-readable storage medium |
CN114125143B (en) * | 2020-08-31 | 2023-04-07 | 华为技术有限公司 | Voice interaction method and electronic equipment |
CN112596600B (en) * | 2020-12-16 | 2024-08-20 | 惠州Tcl移动通信有限公司 | Screen unlocking method and device, storage medium and mobile terminal |
CN112799774A (en) * | 2021-04-06 | 2021-05-14 | 北京孵家科技股份有限公司 | Intelligent commodity sale method, device and system based on big data |
CN113472940B (en) * | 2021-06-08 | 2022-06-10 | Tcl通讯(宁波)有限公司 | Mobile terminal optimization processing method and device based on wearable device, mobile terminal and storage medium |
CN113674671A (en) * | 2021-08-18 | 2021-11-19 | 惠科股份有限公司 | Control method, peripheral controller and display device |
CN113900527A (en) * | 2021-10-29 | 2022-01-07 | 深圳Tcl数字技术有限公司 | Display screen control method and device, storage medium and display equipment |
CN114244731B (en) * | 2021-12-16 | 2024-02-27 | 湖南师范大学 | Terminal screen brightness detection method and device, server and electronic equipment |
CN116450067A (en) * | 2022-01-10 | 2023-07-18 | 荣耀终端有限公司 | Control method for screen-off display, electronic equipment and storage medium |
CN114125148B (en) * | 2022-01-11 | 2022-06-24 | 荣耀终端有限公司 | Control method of electronic equipment operation mode, electronic equipment and readable storage medium |
CN116578227B (en) * | 2023-07-10 | 2024-01-16 | 深圳市易赛通信技术有限公司 | Screen control method, device and equipment of intelligent watch and storage medium |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102377871A (en) * | 2010-08-24 | 2012-03-14 | 联想(北京)有限公司 | Information processing equipment and control method thereof |
CN105306710A (en) * | 2015-10-23 | 2016-02-03 | 上海斐讯数据通信技术有限公司 | Method and system for displaying a time based on screen locking state of smart phone |
Also Published As
Publication number | Publication date |
---|---|
CN108418953A (en) | 2018-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108418953B (en) | Screen control method and device of terminal, readable storage medium and terminal | |
CN108388414B (en) | Screen-off control method and device for terminal, computer-readable storage medium and terminal | |
CN108430100B (en) | Screen control method and device of terminal, readable storage medium and terminal | |
CN114741011B (en) | Terminal display method and electronic equipment | |
CN110168483B (en) | Shortcut menu for displaying different applications on different screens | |
US20170315777A1 (en) | Method, terminal, and storage medium for starting voice input function of terminal | |
US10951754B2 (en) | Method for responding to incoming call by means of fingerprint recognition, storage medium, and mobile terminal | |
CN110456911B (en) | Electronic equipment control method and device, electronic equipment and readable storage medium | |
CN108391001A (en) | The screen control method and device of terminal, readable storage medium storing program for executing, terminal | |
CN108777741B (en) | Antenna switching control method and related product | |
CN107957843B (en) | Control method and mobile terminal | |
WO2018099043A1 (en) | Terminal behavior triggering method and terminal | |
CN112566089A (en) | Power consumption saving method, intelligent wearable device and computer readable storage medium | |
CN108347758A (en) | screen awakening method and device, terminal, computer readable storage medium | |
CN108537025B (en) | Privacy protection method and device, computer readable storage medium and terminal | |
CN108573169A (en) | Nearest task list display methods and device, storage medium, electronic equipment | |
EP2996316B1 (en) | Methods and systems for communication management between an electronic device and a wearable electronic device | |
CN107046595A (en) | Announcement information processing method, device and mobile terminal | |
CN112805988B (en) | Call control method and device, computer readable storage medium and electronic equipment | |
CN107734153B (en) | Call control method, terminal and computer readable storage medium | |
CN110071866B (en) | Instant messaging application control method, wearable device and storage medium | |
CN109151184A (en) | A kind of method for controlling mobile terminal, mobile terminal and computer readable storage medium | |
CN109933187B (en) | Wearing equipment operation control method, wearing equipment and computer readable storage medium | |
CN111225105B (en) | Method for controlling screen work, mobile terminal and storage medium | |
CN109918014B (en) | Page display method, wearable device and computer-readable storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18; Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd. Address before: Changan town in Guangdong province Dongguan 523860 usha Beach Road No. 18; Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20200424