CN111651114A - Screen control method and device and storage medium

Screen control method and device and storage medium

Info

Publication number
CN111651114A
Authority
CN
China
Prior art keywords
data
matrix model
gesture
sensor
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010461144.2A
Other languages
Chinese (zh)
Inventor
陈朝喜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010461144.2A
Publication of CN111651114A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers

Abstract

The disclosure relates to a screen control method and device and a storage medium. The method is used in a terminal comprising a touch screen, and comprises the following steps: acquiring data collected by sensors of the terminal, wherein the collected data at least comprise acceleration data, angular velocity data and barometer data; processing the collected data based on a preset matrix model group to obtain the mean value of each type of data, the preset matrix model group comprising at least one matrix model; judging the user gesture according to the mean value of each type of data and a preset rule; and controlling the display of the touch screen according to the judgment result of the user gesture. With this method, gesture judgment can be performed using data collected by multiple types of sensors in the mobile terminal, and the screen can be controlled so as to realize the function of a proximity sensor. The proximity sensor can therefore be omitted, which improves the screen-to-body ratio of the mobile terminal, reduces cost, lowers design complexity and process requirements, and simplifies the calibration process.

Description

Screen control method and device and storage medium
Technical Field
The present disclosure relates to the field of electronic devices, and in particular, to a screen control method and apparatus, and a storage medium.
Background
With the continuous development of terminal technology, terminals bring ever more convenience to people's daily lives, and users' expectations for the appearance of terminal devices keep rising. In particular, the full screen has become a trend in mobile terminal development.
However, today's mobile terminals are not truly full-screen devices: to implement some functions based on built-in sensors, such as image capture sensors and proximity sensors, holes may need to be opened in the display screen, which keeps mobile terminals from becoming truly full-screen devices. How to implement a full screen without reducing the functions supported by the mobile terminal has therefore become an urgent problem.
Disclosure of Invention
The disclosure provides a screen control method and device and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a screen control method for a terminal including a touch screen, including: acquiring data acquired by a sensor of the terminal, wherein the data acquired by the sensor at least comprises acceleration data, angular velocity data and barometer data;
processing the data acquired by the sensor based on a preset matrix model group to obtain the mean value of each type of data; wherein the preset matrix model group comprises at least one matrix model;
judging the gesture of the user according to the average value of each type of data and a preset rule;
and controlling the display of the touch screen according to the judgment result of the user gesture.
Optionally, the processing the data acquired by the sensor based on the preset matrix model group to obtain the mean value of each type of the data includes:
according to a first matrix model in the preset matrix model group, constructing a first matrix for a plurality of data types acquired by the sensor and data of each data type at different acquisition moments;
and performing operation processing on the first matrix by using a second matrix model in the preset matrix model group to obtain the mean value of each type of data.
Optionally, the preset rules include a mean threshold and a number threshold, and the determining the user gesture according to the mean of each type of data and the preset rules includes:
if the number of the data types exceeding the mean threshold value in the mean value of the data of each type is larger than the number threshold value, determining that the user gesture is a gesture corresponding to a gesture label; the average threshold value and the number threshold value are determined by processing the historical data acquired by the sensor by using the first matrix model and the second matrix model and combining the gesture labels corresponding to the historical data.
Optionally, the determining the user gesture according to the average value of each type of data and a preset rule includes:
calculating the mean value of each type of data by using a third matrix model in the preset matrix model group to obtain a normalized mean value corresponding to the mean value of all types of data;
and judging the user gesture according to the normalized mean value and the preset rule.
Optionally, the preset rule includes a normalization threshold, and the determining the user gesture according to the normalization average and the preset rule includes:
if the normalized mean value is larger than the normalized threshold value, determining that the user gesture is a gesture corresponding to a gesture label; the normalization threshold is determined by processing historical data acquired by the sensor by using a preset matrix model group comprising the third matrix model and combining the historical data with the gesture label corresponding to the historical data.
Optionally, controlling the display of the touch screen according to the determination result of the user gesture includes:
if the user gesture is hand raising, controlling the touch screen to turn off;
and if the user gesture is hand lowering, controlling the touch screen to light up.
Optionally, the data collected by the sensor further includes:
light sensation data collected by the light sensor;
first direction data collected by an electronic compass;
second direction data characterizing a direction of movement determined based on the acceleration data and/or the angular velocity data.
According to a second aspect of the embodiments of the present disclosure, there is provided a screen control device for a terminal including a touch screen, including:
the acquisition module is configured to acquire data acquired by a sensor of the terminal, wherein the data acquired by the sensor at least comprises acceleration data, angular velocity data and barometer data;
the processing module is configured to process the data acquired by the sensor based on a preset matrix model group to obtain an average value of each type of data; wherein the preset matrix model group comprises at least one matrix model;
the judging module is configured to judge the user gesture according to the average value of each type of data and a preset rule;
and the control module is configured to control the display of the touch screen according to the judgment result of the user gesture.
Optionally, the processing module is specifically configured to construct a first matrix for the multiple data types acquired by the sensor and the data of each data type at different acquisition times according to a first matrix model in the preset matrix model group; and performing operation processing on the first matrix by using a second matrix model in the preset matrix model group to obtain the mean value of each type of data.
Optionally, the determining module is specifically configured to determine that the user gesture is a gesture corresponding to a gesture label if the number of data types exceeding the mean threshold in the mean of the data types is greater than the number threshold; the average threshold value and the number threshold value are determined by processing the historical data acquired by the sensor by using the first matrix model and the second matrix model and combining the gesture labels corresponding to the historical data.
Optionally, the determining module is specifically configured to perform operation processing on the mean value of each type of data by using a third matrix model in the preset matrix model group, so as to obtain a normalized mean value corresponding to the mean value of all types of data; and judging the user gesture according to the normalized mean value and the preset rule.
Optionally, the preset rule includes a normalization threshold, and the determination module is specifically configured to determine that the user gesture is a gesture corresponding to a gesture tag if the normalization average is greater than the normalization threshold; the normalization threshold is determined by processing historical data acquired by the sensor by using a preset matrix model group comprising the third matrix model and combining the historical data with the gesture label corresponding to the historical data.
Optionally, the control module is specifically configured to control the touch screen to turn off if the user gesture is hand raising, and to control the touch screen to light up if the user gesture is hand lowering.
Optionally, the data collected by the sensor further includes:
light sensation data collected by the light sensor;
first direction data collected by an electronic compass;
second direction data characterizing a direction of movement determined based on the acceleration data and/or the angular velocity data.
According to a third aspect of the embodiments of the present disclosure, there is provided a screen control device including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the screen control method as described in the first aspect above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a storage medium including:
the instructions in the storage medium, when executed by a processor of a computer, enable the computer to perform the screen control method as described in the above first aspect.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the method comprises the steps of processing multiple types of data at least including acceleration data, angular velocity data and barometer data, acquired by a sensor, based on a preset matrix model group to obtain the mean value of the data of each type, judging a user gesture according to the mean value of the data of each type and a preset rule, and controlling the display of a touch screen based on the determined user gesture to replace the function of a proximity sensor, so that the proximity sensor does not need to be additionally installed.
It can be understood that the above scheme realizes the function of a proximity sensor by means of multi-sensor data fusion. On one hand, no hole needs to be opened in the touch screen to install a proximity sensor, so the screen-to-body ratio of the mobile terminal can be improved; on the other hand, cost and design complexity can be reduced; in addition, because no proximity sensor needs to be installed, no proximity sensor needs to be calibrated, which simplifies the calibration process.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a screen control method according to an exemplary embodiment of the present disclosure.
Figs. 2a-2c show acceleration data collected by a three-axis accelerometer in an exemplary embodiment of the disclosure.
Figs. 3a-3c show angular velocity data collected by a three-axis gyroscope in an exemplary embodiment of the present disclosure.
Fig. 4 shows barometer data collected by a barometer in an exemplary embodiment of the disclosure.
Fig. 5 shows light sensation data collected by a light sensor in an exemplary embodiment of the present disclosure.
Figs. 6a-6c show magnetic field strength data collected by an electronic compass in an exemplary embodiment of the present disclosure.
Figs. 7a-7c show the first set of direction data obtained by fusing acceleration data and angular velocity data in an exemplary embodiment of the disclosure.
Figs. 8a-8c show the second set of direction data obtained by fusing acceleration data and angular velocity data in an exemplary embodiment of the disclosure.
Figs. 9a-9c show the third set of direction data obtained by fusing acceleration data and angular velocity data in an exemplary embodiment of the present disclosure.
Fig. 10 is a diagram illustrating a screen control device according to an exemplary embodiment of the present disclosure.
Fig. 11 is a block diagram of a mobile terminal shown in an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a screen control method according to an exemplary embodiment of the present disclosure. The method is used in a terminal including a touch screen and, as shown in Fig. 1, includes the following steps:
s11: acquiring data acquired by a sensor of the terminal, wherein the data acquired by the sensor at least comprises acceleration data, angular velocity data and barometer data;
s12: processing the data acquired by the sensor based on a preset matrix model group to obtain the mean value of each type of data; wherein the preset matrix model group comprises at least one matrix model;
s13: judging the gesture of the user according to the average value of each type of data and a preset rule;
s14: and controlling the display of the touch screen according to the judgment result of the user gesture.
The method is applicable to a terminal, where the terminal includes mobile terminals such as mobile phones, tablet computers, and smart wearable devices.
The mobile terminal comprises a display screen so as to display information to a user. Moreover, the display screen can also be a touch screen with a touch function, so that a user can conveniently touch the screen to operate the mobile terminal based on the displayed information.
The mobile terminal also has built-in sensors capable of acquiring motion data, which may include accelerometers and gyroscopes. In step S11, the mobile terminal acquires acceleration data collected by an accelerometer and angular velocity data collected by a gyroscope; the acquired data may also include motion direction data, motion trajectory data, and the like.
In addition, a barometer is further built in the mobile terminal under the touch screen, and in step S11, the mobile terminal further obtains barometer data by using the barometer, where the barometer data can represent the current height of the mobile terminal.
Figs. 2a-2c show acceleration data collected by a three-axis accelerometer in the embodiment of the present disclosure. As shown in the figures, the horizontal axis represents the sampling time period, and the vertical axis represents acceleration values collected by the accelerometer during the process of lifting and lowering the mobile terminal (i.e., data during hand-raising and hand-lowering actions).
Figs. 3a-3c show angular velocity data collected by a three-axis gyroscope in an embodiment of the present disclosure, where the horizontal axis represents the sampling time period and the vertical axis represents angular velocity values collected by the gyroscope during the process of lifting and lowering the mobile terminal.
Fig. 4 shows barometer data collected by the barometer in an embodiment of the disclosure, with the horizontal axis representing the sampling time period and the vertical axis representing barometric pressure values collected by the barometer during the process of lifting and lowering the mobile terminal.
In step S12, the mobile terminal processes the data collected by the sensors based on the preset matrix model group to obtain the mean value of each type of data, so that the user gesture can be determined based on these mean values and the preset rule in step S13.
It should be noted that, in the embodiment of the present disclosure, the preset matrix model group includes at least one matrix model, and the matrix model includes a way of matrix arrangement for multi-type data collected by the sensor, and/or a processing matrix for processing the multi-type data collected by the sensor.
In addition, the user gesture includes a hand-raising or hand-lowering gesture of the user, and represents a motion of the user carrying the mobile terminal that puts the mobile terminal into a certain movement trend. For example, if the user gesture is hand raising, the gesture indicates that the mobile terminal is gradually approaching the operating body (approaching the user's ear); if the user gesture is hand lowering, the gesture indicates that the mobile terminal is gradually moving away from the operating body (moving away from the user's ear).
The data acquired by the sensors are processed based on the preset matrix model group to obtain the mean value of each type of data. These mean values reflect the average behavior of the mobile terminal over the acquisition time, and include at least the mean of the acceleration data, the mean of the angular velocity data, and the mean of the barometer data.
For example, when the user raises a hand to bring the mobile phone closer to the ear, the acceleration, angular velocity, movement direction, and movement track of the mobile phone may show a certain regularity, and the collected barometer data may also decrease regularly from large to small as the phone rises. Accordingly, based on this regularity over the acquisition time, the mean values of the data may also present certain characteristics.
Therefore, the present embodiment can determine the user gesture based on the mean value of each type of data and the preset rule. The preset rule refers to the manner of determining the user gesture from these mean values; for example, the rule may be that a mean value exceeds a threshold, or that the magnitudes of the mean values of the various data types satisfy a functional relationship.
It should be noted that, in some embodiments of the present disclosure, as described above, data corresponding to a user gesture has a certain regularity in an acquisition time, and therefore, when determining the user gesture, the data of each type may also be determined according to a variation trend of the data of each type after being processed by using a preset matrix model group.
In step S14, the touch screen of the mobile terminal is controlled according to the determination result of the user gesture. Controlling the touch screen display according to the user gesture in this way realizes the function of a proximity sensor, as described above.
It should be noted that, in some embodiments of the present disclosure, the accelerometer and the gyroscope may detect a motion condition of the mobile terminal, and the barometer may obtain a barometric pressure change of the mobile terminal during the motion, where the barometric pressure change may be caused by a height change of the mobile terminal. Thus, embodiments of the present disclosure may utilize accelerometers, gyroscopes, and barometers to collect data to implement the functionality of a proximity sensor.
Generally, a proximity sensor is built into a mobile terminal to control the display screen. The proximity sensor transmits a signal outward, receives the signal reflected by an obstruction, and calculates the distance between the obstruction and the terminal from the time difference between transmission and reception. When the distance is smaller than a distance threshold, the terminal turns the screen off; when the distance is greater than the distance threshold, the terminal lights the screen up. However, because the proximity sensor needs to transmit and receive signals, a hole needs to be opened in the display screen of the mobile terminal, which reduces the screen-to-body ratio. In addition, the size of the hole may affect the performance of the mobile terminal, and the hole may admit dust and dirt that degrade performance, so the requirements on the structural design and the production environment of the mobile terminal are high, increasing the difficulty of design and production. Furthermore, to improve consistency, the proximity sensor must be calibrated before the mobile terminal ships, which adds extra cost.
According to the embodiment of the disclosure, the display of the touch screen is controlled by processing data acquired by multiple types of sensors with the preset matrix model group and judging the gesture, thereby realizing the function of a proximity sensor. On one hand, the hardware installation of a proximity sensor can be avoided and the screen-to-body ratio improved; on the other hand, cost and design complexity can be reduced; in addition, because no proximity sensor needs to be installed, no proximity sensor needs to be calibrated, which simplifies the calibration process.
In some embodiments, processing data collected by the sensor based on the preset matrix model group to obtain a mean value of each type of the data includes:
according to a first matrix model in a preset matrix model group, constructing a first matrix for a plurality of data types collected by a sensor and data of each data type at different collection moments;
and performing operation processing on the first matrix by using a second matrix model in the preset matrix model group to obtain the mean value of each type of data.
In this embodiment, the first matrix model includes an arrangement manner of data collected by the sensors, for example, multiple types of data collected by different sensors may be arranged in a row direction according to data types, and data collected at different times may be arranged in a column direction, so as to obtain the first matrix. The arrangement of the first matrix may be as follows in equation (1):
$$A_{i \times j} = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1j} \\ a_{21} & a_{22} & \cdots & a_{2j} \\ \vdots & \vdots & \ddots & \vdots \\ a_{i1} & a_{i2} & \cdots & a_{ij} \end{bmatrix} \tag{1}$$

where $a_{mn}$ denotes the data of the $m$-th data type at the $n$-th acquisition time, $i$ represents the number of data types, and $j$ represents the number of acquisition times.
It should be noted that, when the data collected at different times are arranged in the column direction, different types of data collected at the same time are located in the same column, and different rows include different types of data, so that the different types of data on the same column can reflect the movement trend of the mobile terminal (for example, the mobile terminal is far away from or close to the operation body based on the user gesture); in addition, the movement trend of the mobile terminal can be reflected according to the variation trend of the same type of data acquired at different moments.
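As an illustration only, the following Python sketch shows one way such a first matrix could be assembled, with rows as data types and columns as acquisition times; the sample values, variable names, and use of NumPy are assumptions for the sketch, not part of the disclosure.

```python
import numpy as np

# Hypothetical per-type sample sequences over j = 4 acquisition times
# (i = 7 data types: 3 accelerometer axes, 3 gyroscope axes, barometer).
samples_by_type = {
    "acc_x": [0.1, 0.3, 0.9, 1.4],
    "acc_y": [0.0, 0.2, 0.5, 0.8],
    "acc_z": [9.8, 9.5, 9.1, 8.7],
    "gyro_x": [0.01, 0.20, 0.45, 0.30],
    "gyro_y": [0.02, 0.15, 0.40, 0.25],
    "gyro_z": [0.00, 0.05, 0.10, 0.08],
    "baro": [1013.2, 1013.1, 1013.0, 1012.9],
}

# First matrix per formula (1): one row per data type, one column per
# acquisition time, so a single column holds all types sampled at the
# same moment.
A = np.array(list(samples_by_type.values()))  # shape (i, j) = (7, 4)
```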
Based on the constructed first matrix, the second matrix model in the preset matrix model group can be utilized to perform operation processing on the first matrix, so that the mean value of each type of data is obtained.
In this embodiment, the second matrix model may be a matrix having the same number of rows as the number of columns of the first matrix, but the number of columns is 1, and the sum of squares of the elements in the second matrix model is 1. It can be understood that the second matrix model is a unit basis vector, and the product of the unit basis vector and the first matrix is the mean value of each type of data.
For example, multiplying the i × j first matrix by the unit basis vector with j elements yields an i × 1 matrix, which is the matrix obtained by taking a weighted average of each type of data in the first matrix. The weights used in the weighted average are the element values in the second matrix model.
The second matrix model may be as follows in equation (2):
$$E_j = [\,e_1 \;\; e_2 \;\; \cdots \;\; e_j\,]^T, \quad \text{where } |e_1|^2 + |e_2|^2 + \cdots + |e_j|^2 = 1 \tag{2}$$
In the present disclosure, the first matrix of i × j and the second matrix model of j × 1 are multiplied to obtain a mean matrix of i × 1, which is shown in the following formula (3):
$$R_i = A_{i \times j}\, E_j = \begin{bmatrix} r_1 \\ r_2 \\ \vdots \\ r_i \end{bmatrix} \tag{3}$$

According to formula (3), $R_i$ is the $i \times 1$ mean matrix, and each element $r_k$ in the mean matrix represents the mean of the corresponding type of data.
It should be noted that, as described above, the element values in the second matrix model represent different weights, and thus the element values in the second matrix model can be preset according to the variation trend of the historical data under the same type. For example, when it is found from the historical data that the data acquired later is more important in the process of lifting the mobile terminal, the variation trend of each element in the second matrix model may be set to follow the following rule:
$$e_j > e_{j-1}$$

where $e_j$ is the $j$-th element in the second matrix model and $e_{j-1}$ is the $(j-1)$-th element. If the historical data show instead that data acquired earlier are more important in the process of putting down the mobile terminal, the variation trend of each element in the second matrix model can be set to follow the opposite rule:

$$e_j < e_{j-1}$$
of course, the values of the elements in the second matrix model may be set to be the same.
In one embodiment, the preset rule includes a mean threshold and a number threshold, and step S13 includes:
if the number of the data types exceeding the mean threshold value in the mean value of the data types is larger than the number threshold value, determining that the user gesture is a gesture corresponding to the gesture label; the average threshold value and the number threshold value are determined by processing the historical data acquired by the sensor by using the first matrix model and the second matrix model and combining the gesture labels corresponding to the historical data.
In this embodiment, the preset rule is a multi-threshold approach. The mobile terminal determines the number of data types exceeding the mean threshold value in the mean value of the data types, and determines the gesture of the user as the gesture corresponding to the gesture label when the number of the data types is greater than the number threshold value.
It should be noted that, in the embodiment of the present disclosure, both the mean threshold and the number threshold are determined by processing different types of historical data with the first matrix model and the second matrix model, in combination with the gesture labels corresponding to the historical data. When determining the two thresholds from the historical data, the number threshold may be characterized as the minimum number of data types required to identify the user's hand raising, and the mean threshold is determined from the historical data of the user raising a hand. Of course, the thresholds may also be determined from historical data of the user lowering a hand.
In this embodiment, when the number of data types whose mean value exceeds the mean threshold is greater than the number threshold, the user gesture is determined to be hand raising; otherwise, the user gesture is determined to be hand lowering.
Illustratively, taking the user's hand-raising gesture as an example: based on the 7 types of historical data shown in Figs. 2a to 4 collected by the sensors in the mobile phone, the mean matrix $R_i$ is obtained using the first matrix model and the second matrix model. If it is found that, whenever 5 elements of $R_i$ exceed the mean threshold $T$, the judgment that the mobile phone is being raised is 99% accurate, then the mean threshold corresponding to the hand-raising gesture is determined to be $T$ and the number threshold is determined to be 5.
Subsequently, based on the 7 types of data collected in the current preset time window, the user gesture is determined to be hand raising when the number of data types whose mean exceeds the threshold $T$ is greater than or equal to 5.
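A minimal sketch of this multi-threshold rule, reusing the mean matrix R from the earlier sketch; the threshold value T below is a placeholder for the historically calibrated value, and the function and label names are hypothetical.

```python
import numpy as np

# Placeholder for the mean threshold T calibrated from historical data.
T = 0.5

def judge_gesture(R: np.ndarray, mean_threshold: float, count_threshold: int) -> str:
    """Multi-threshold rule: count how many per-type means exceed the
    mean threshold and compare against the number threshold."""
    n_exceeding = int(np.sum(R.ravel() > mean_threshold))
    # Per the example in the text, hand raising is reported when at
    # least `count_threshold` (e.g. 5 of 7) type means exceed T.
    return "hand_raising" if n_exceeding >= count_threshold else "hand_lowering"

gesture = judge_gesture(R, mean_threshold=T, count_threshold=5)
```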
It can be understood that, in the embodiment of the disclosure, the average threshold value and the number threshold value are set based on the historical data, and different judgment rules can be formulated according to actual conditions, so that the accuracy of user screen control can be improved, and more accurate control of the display screen can be realized.
In an embodiment, the third matrix model in the preset matrix model group may be further used to perform operation processing on the mean value of each type of data to obtain a normalized mean value corresponding to the mean value of all types of data;
and judging the user gesture according to the normalized mean value and a preset rule.
In this embodiment, when the user gesture is determined according to the mean value of each type of data and the preset rule, the third matrix model in the preset matrix model group may be further utilized to perform operation on the mean value of each type of data to obtain a normalized mean value, and the user gesture is determined according to the normalized mean value and the preset rule.
The third matrix model may be a matrix whose number of columns equals the number of data types in the mean matrix but whose number of rows is 1, with the squares of its elements summing to 1. It can be understood that the third matrix model is also a unit basis vector. Multiplying the third matrix model by the matrix formed from the mean values of all types of data yields a single value, which is the normalized mean obtained by normalizing the mean values of all types of data. The user gesture can then be judged based on the normalized mean and the preset rule.
The third matrix model may be represented by the following formula (4):
$$E_i = [\,e_1 \;\; e_2 \;\; \cdots \;\; e_i\,], \quad \text{where } |e_1|^2 + |e_2|^2 + \cdots + |e_i|^2 = 1 \tag{4}$$
In the present disclosure, a normalized mean value can be obtained by multiplying the third matrix model of 1 × i and a matrix formed by the mean values of the data of each type of i × 1, as shown in the following formula (5):
$$R = E_i \cdot R_i = e_1 r_1 + e_2 r_2 + \cdots + e_i r_i \tag{5}$$

As formula (5) shows, $R$ is a single number.
It should be noted that, in this embodiment, the sizes of the elements in the third matrix model may also be preset, for example, different values may be set according to the importance of different data types. Of course, the elements may also be set to the same size, in which case

$$e_i = e_{i-1}$$

where $e_i$ is the $i$-th element in the third matrix model and $e_{i-1}$ is the $(i-1)$-th element; since the squared elements sum to 1, each element then equals $1/\sqrt{i}$.
In an embodiment, the preset rule includes a normalization threshold, and the determining the user gesture according to the normalization average and the preset rule includes:
if the normalized mean value is larger than the normalized threshold value, determining that the user gesture is a gesture corresponding to the gesture tag; the normalization threshold is determined by processing historical data acquired by the sensor by using a preset matrix model group comprising a third matrix model and combining gesture labels corresponding to the historical data.
In this embodiment, the normalized threshold is determined by processing the historical data acquired by the sensors with a preset matrix model group including the third matrix model, in combination with the gesture labels corresponding to the historical data. For example, the normalized threshold may be the ratio between the number threshold and the number of data types acquired by the mobile terminal, e.g. 5/7 in the example above. As previously described, the number threshold is determined from the historical data based on the first matrix model and the second matrix model, together with the gesture labels corresponding to the historical data.
In this embodiment, if the normalized mean value exceeds the normalized threshold value, it is determined that the user gesture is a gesture corresponding to the gesture tag.
For example, suppose the normalized threshold is determined, as described above, from the number threshold obtained from the historical data under the user's hand-raising gesture. Then, when the normalized mean exceeds the normalized threshold, the user gesture is determined to be hand raising; if the normalized mean does not exceed the normalized threshold, the user gesture is determined to be hand lowering.
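The normalized-mean variant could be sketched as follows, assuming equal third-matrix-model elements (each 1/sqrt(i), so their squares sum to 1) and the example 5/7 normalized threshold from the text; the function names are hypothetical.

```python
import numpy as np

def third_matrix_model(i: int) -> np.ndarray:
    """Row unit basis vector E_i (1 x i) with equal elements; the squared
    elements sum to 1, so each element is 1/sqrt(i)."""
    return np.full((1, i), 1.0 / np.sqrt(i))

def judge_gesture_normalized(R: np.ndarray, norm_threshold: float) -> str:
    """Collapse the i x 1 mean matrix R into a single number per
    formula (5) and compare it against the normalized threshold."""
    i = R.shape[0]
    r = (third_matrix_model(i) @ R).item()  # formula (5): a single number
    return "hand_raising" if r > norm_threshold else "hand_lowering"

# 5/7 follows the example normalized threshold given in the text.
gesture = judge_gesture_normalized(R, norm_threshold=5 / 7)
```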
In some embodiments, controlling the display of the touch screen according to the determination result of the user gesture includes:
if the user gesture is hand raising, controlling the touch screen to turn off;
and if the user gesture is hand lowering, controlling the touch screen to light up.
For example, when the method is applied during a user's voice call: if the user gesture is determined to be hand raising, the touch screen of the mobile phone is approaching the ear, so the touch screen can be controlled to turn off to avoid accidental operation by the ear and to save power; if the user gesture is determined to be hand lowering, the touch screen is moving away from the ear and the user may want to check information, so the touch screen can be controlled to light up to improve the user experience.
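Putting the pieces together, the screen control step might look like the sketch below; `set_screen` is a hypothetical stand-in for whatever display API the terminal exposes, not an API named in the disclosure.

```python
def set_screen(on: bool) -> None:
    """Stub for the terminal's display control API (assumed)."""
    print("screen on" if on else "screen off")

def control_screen(gesture: str) -> None:
    if gesture == "hand_raising":
        # Phone approaching the ear: turn the screen off to avoid
        # accidental touches and save power.
        set_screen(on=False)
    elif gesture == "hand_lowering":
        # Phone moving away from the ear: light the screen up so the
        # user can check information.
        set_screen(on=True)
```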
In the embodiment of the present disclosure, when the data collected by the multiple types of sensors is processed by using the preset matrix model group, the processing is not limited to the data collected by the accelerometer, the gyroscope, and the barometer.
In some embodiments, the data collected by the sensor may further include:
light sensation data collected by the light sensor;
first direction data collected by an electronic compass;
second direction data characterizing a direction of movement determined based on the acceleration data and/or the angular velocity data.
In this embodiment, the light around the mobile terminal may change as it moves under the user's gesture. Light sensation data can therefore be collected by a light sensor.
For example, when the user is looking at the mobile phone, the phone may be held in front of the user's chest with no obstruction around it, whereas when the phone is at the ear, the ear reduces the light intensity reaching the phone. Therefore, when the user gesture is hand raising, the light sensation data detected by the light sensor change from large to small.
In this embodiment, the orientation of the mobile terminal may change as it moves under the user's gesture, so the orientation change detected by the electronic compass can also reflect a first direction of the mobile terminal's movement; the first direction data is the orientation of the mobile terminal in the two-dimensional plane.
In addition, the mobile terminal may also obtain second direction data of the mobile terminal movement based on the acceleration data and/or the angular velocity data. The second direction data is a movement angle of the mobile terminal in a three-dimensional direction when moving.
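The disclosure does not specify how the acceleration and angular velocity data are fused into direction data; one common possibility is a complementary filter, sketched below under that assumption. This is illustrative, not the patented fusion method.

```python
import math

def complementary_filter_pitch(pitch: float, acc: tuple[float, float, float],
                               gyro_rate: float, dt: float,
                               alpha: float = 0.98) -> float:
    """One update step of a complementary filter for the pitch angle (radians).

    acc is an (ax, ay, az) accelerometer sample, gyro_rate is the angular
    velocity about the pitch axis, and alpha blends the integrated gyroscope
    estimate with the gravity-based estimate from the accelerometer.
    """
    ax, ay, az = acc
    pitch_from_gravity = math.atan2(-ax, math.hypot(ay, az))  # gravity-based pitch
    pitch_from_gyro = pitch + gyro_rate * dt                  # integrated gyro pitch
    return alpha * pitch_from_gyro + (1.0 - alpha) * pitch_from_gravity
```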
Fig. 5 shows light sensation data collected by the light sensor in the embodiment of the disclosure. As shown in Fig. 5, the horizontal axis represents the sampling time period, and the vertical axis represents light intensity values collected by the light sensor during the process of lifting and lowering the mobile terminal.
Figs. 6a-6c show magnetic field strength data acquired by the electronic compass in the embodiment of the present disclosure. As shown in the figures, the horizontal axis is the sampling time period, and the vertical axis represents magnetic field strength data acquired by the electronic compass during the process of lifting and lowering the mobile terminal; the magnetic field strength data, i.e., the first direction data, reflect the direction change of the mobile terminal while it is lifted and lowered.
Figs. 7a-7c show the first set of direction data obtained by fusing acceleration data and angular velocity data according to the embodiment of the present disclosure. As shown in the figures, the horizontal axis represents the sampling time period, and the vertical axis represents amplitude information characterizing the current direction of the mobile terminal while it is lifted and lowered.
Figs. 8a-8c show the second set of direction data obtained by fusing acceleration data and angular velocity data according to the embodiment of the present disclosure. As shown in the figures, the horizontal axis is the sampling time period, and the vertical axis represents amplitude information characterizing the magnitude of the direction change of the mobile terminal while it is lifted and lowered.
Figs. 9a-9c show the third set of direction data obtained by fusing acceleration data and angular velocity data according to the embodiment of the present disclosure. As shown in the figures, the horizontal axis represents the sampling time period, and the vertical axis represents amplitude information characterizing the current direction of the mobile terminal while it is lifted and lowered. The fusion method in Figs. 9a-9c differs from that in Figs. 7a-7c. The direction data in Figs. 7a-7c, 8a-8c, and 9a-9c all belong to the second direction data.
It can be understood that, in this embodiment, a plurality of sensor data which can reflect the motion condition of the mobile terminal are fused, so that the user gesture can be determined more accurately, and the display of the touch screen can be controlled more accurately.
FIG. 10 is a diagram illustrating a screen control device according to an exemplary embodiment. The screen control apparatus is applied to a terminal, and referring to fig. 10, the screen control apparatus includes:
an obtaining module 101 configured to obtain data collected by a sensor of the terminal, where the data collected by the sensor at least includes acceleration data, angular velocity data, and barometer data;
the processing module 102 is configured to process the data acquired by the sensor based on a preset matrix model group to obtain an average value of each type of data; wherein the preset matrix model group comprises at least one matrix model;
the judging module 103 is configured to judge the user gesture according to the average value of each type of data and a preset rule;
and the control module 104 is configured to control the display of the touch screen according to the judgment result of the user gesture.
In an embodiment, the processing module 102 is specifically configured to construct a first matrix for the plurality of data types acquired by the sensor and the data of each data type at different acquisition time according to a first matrix model in the preset matrix model group; and performing operation processing on the first matrix by using a second matrix model in the preset matrix model group to obtain the mean value of each type of data.
In an embodiment, the determining module 103 is specifically configured to determine that the user gesture is a gesture corresponding to a gesture tag if the number of data types exceeding the average threshold in the average of the data types is greater than the number threshold; the average threshold value and the number threshold value are determined by processing the historical data acquired by the sensor by using the first matrix model and the second matrix model and combining the gesture labels corresponding to the historical data.
In an embodiment, the determining module 103 is specifically configured to perform operation processing on the mean value of each type of data by using a third matrix model in the preset matrix model group, so as to obtain a normalized mean value corresponding to the mean value of all types of data; and judging the user gesture according to the normalized mean value and the preset rule.
In an embodiment, the preset rule includes a normalization threshold, and the determining module 103 is specifically configured to determine that the user gesture is a gesture corresponding to the gesture tag if the normalization average is greater than the normalization threshold; the normalization threshold is determined by processing historical data acquired by the sensor by using a preset matrix model group comprising the third matrix model and combining the historical data with the gesture label corresponding to the historical data.
In an embodiment, the control module 104 is specifically configured to control the touch screen to turn off if the user gesture is hand raising, and to control the touch screen to light up if the user gesture is hand lowering.
In one embodiment, the data collected by the sensor further comprises:
light sensation data collected by the light sensor;
first direction data collected by an electronic compass;
second direction data characterizing a direction of movement determined based on the acceleration data and/or the angular velocity data.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 11 is a block diagram illustrating a mobile terminal apparatus 800 according to an example embodiment. For example, the apparatus 800 may be a smartphone, a smart wearable device, or the like.
Referring to fig. 11, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed state of the device 800 and the relative positioning of components, such as the display and keypad of the apparatus 800. The sensor assembly 814 may also detect a change in position of the apparatus 800 or a component of the apparatus 800, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in temperature of the apparatus 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 and other devices. The device 800 may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A screen control method for use in a terminal including a touch screen, the method comprising:
acquiring data acquired by a sensor of the terminal, wherein the data acquired by the sensor at least comprises acceleration data, angular velocity data and barometer data;
processing the data acquired by the sensor based on a preset matrix model group to obtain the mean value of each type of data; wherein the preset matrix model group comprises at least one matrix model;
judging the gesture of the user according to the average value of each type of data and a preset rule;
and controlling the display of the touch screen according to the judgment result of the user gesture.
2. The method of claim 1, wherein the processing the data collected by the sensor based on the preset matrix model group to obtain a mean value of each type of the data comprises:
according to a first matrix model in the preset matrix model group, constructing a first matrix for a plurality of data types acquired by the sensor and data of each data type at different acquisition moments;
and performing operation processing on the first matrix by using a second matrix model in the preset matrix model group to obtain the mean value of each type of data.
3. The method according to claim 2, wherein the preset rules include a mean threshold and a number threshold, and the determining the user gesture according to the mean of each type of data and the preset rules includes:
if the number of the data types exceeding the mean threshold value in the mean value of the data of each type is larger than the number threshold value, determining that the user gesture is a gesture corresponding to a gesture label; the average threshold value and the number threshold value are determined by processing the historical data acquired by the sensor by using the first matrix model and the second matrix model and combining the gesture labels corresponding to the historical data.
4. The method of claim 1, wherein the determining the user gesture according to the mean value of each type of data and the preset rule comprises:
processing the mean value of each type of data by using a third matrix model in the preset matrix model group to obtain a normalized mean value corresponding to the mean values of all types of data;
and determining the user gesture according to the normalized mean value and the preset rule.
5. The method of claim 4, wherein the preset rule comprises a normalization threshold, and the determining the user gesture according to the normalized mean value and the preset rule comprises:
determining that the user gesture is the gesture corresponding to a gesture label if the normalized mean value is greater than the normalization threshold; wherein the normalization threshold is determined by processing historical data collected by the sensor with a preset matrix model group comprising the third matrix model, in combination with the gesture labels corresponding to the historical data.
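Claims 4 and 5 can be sketched together as follows. Modelling the third matrix model as a weight vector that collapses the per-type means into one score is an assumption, as are the weight values and the normalization threshold.

import numpy as np

def normalized_mean(type_means, weights):
    # "Third matrix model": a weighted combination of the per-type means,
    # with weights chosen so the result lands on a comparable scale.
    return float(np.dot(weights, type_means))

def is_labelled_gesture(score, normalization_threshold=0.6):
    # Claim 5: the gesture label applies when the score passes the threshold.
    return score > normalization_threshold

means = np.array([1.0, 0.2, 1012.96])     # acceleration, angular velocity, barometer
weights = np.array([0.5, 0.4, 0.0001])    # hypothetical, fitted from history
score = normalized_mean(means, weights)
print(score, is_labelled_gesture(score))  # 0.681296 True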
6. The method of claim 1, wherein the controlling the display of the touch screen according to the determination result of the user gesture comprises:
if the user gesture is raising the hand, controlling the touch screen to turn off;
and if the user gesture is putting the hand down, controlling the touch screen to light up.
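Claim 6 reduces to a two-way mapping from the determined gesture to a screen action. The print calls below are stand-ins for whatever platform screen-power API the terminal exposes, which the claim does not specify.

def apply_screen_action(gesture):
    # Per claim 6: raising the hand turns the screen off,
    # putting it down lights it up.
    if gesture == "raise":
        print("touch screen -> off")
    elif gesture == "put_down":
        print("touch screen -> on")

apply_screen_action("raise")  # touch screen -> off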
7. The method of claim 1, wherein the data collected by the sensor further comprises:
ambient light data collected by a light sensor;
first direction data collected by an electronic compass;
and second direction data characterizing a movement direction, the second direction data being determined based on the acceleration data and/or the angular velocity data.
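The second direction data of claim 7 is derived rather than directly sensed. One plausible derivation, offered purely as an assumption since the claim only says it is based on acceleration and/or angular velocity data, takes the dominant axis and sign of the mean acceleration over the window.

import numpy as np

def movement_direction(acc_window):
    # acc_window: rows = x/y/z axes, columns = acquisition moments.
    mean_acc = np.asarray(acc_window).mean(axis=1)
    axis = int(np.argmax(np.abs(mean_acc)))  # dominant axis of movement
    sign = "+" if mean_acc[axis] >= 0 else "-"
    return sign + "xyz"[axis]                # e.g. "+z" or "-y"

print(movement_direction([[0.1, 0.0], [0.0, 0.1], [-0.9, -1.1]]))  # -z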
8. A screen control apparatus for use in a terminal including a touch screen, the apparatus comprising:
the acquisition module is configured to acquire data collected by a sensor of the terminal, wherein the data collected by the sensor at least comprises acceleration data, angular velocity data and barometer data;
the processing module is configured to process the data collected by the sensor based on a preset matrix model group to obtain a mean value of each type of data, wherein the preset matrix model group comprises at least one matrix model;
the determination module is configured to determine a user gesture according to the mean value of each type of data and a preset rule;
and the control module is configured to control the display of the touch screen according to the determination result of the user gesture.
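Read as software, the apparatus of claim 8 mirrors the method of claim 1 as four cooperating modules. A minimal class sketch with hypothetical internals and threshold values:

import numpy as np

class ScreenControlApparatus:
    def acquire(self, sensors):
        # Acquisition module: at least acceleration, angular velocity, barometer.
        return np.array([sensors["acc"], sensors["gyro"], sensors["baro"]])

    def process(self, first_matrix):
        # Processing module: matrix-model processing down to per-type means.
        t = first_matrix.shape[1]
        return (first_matrix @ np.full((t, 1), 1.0 / t)).ravel()

    def determine(self, means, thresholds=(0.8, 0.15, 1012.9), count=2):
        # Determination module: preset rule on the per-type means.
        exceeded = np.count_nonzero(means > np.asarray(thresholds))
        return "raise" if exceeded > count else "put_down"

    def control(self, gesture):
        # Control module: raise -> screen off, put down -> screen on.
        return "off" if gesture == "raise" else "on"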
9. The apparatus of claim 8,
the processing module is specifically configured to construct, according to a first matrix model in the preset matrix model group, a first matrix from the plurality of data types collected by the sensor and the data of each data type at different acquisition moments, and to process the first matrix by using a second matrix model in the preset matrix model group to obtain the mean value of each type of data.
10. The apparatus of claim 9, wherein the preset rule comprises a mean threshold and a number threshold,
the determination module is specifically configured to determine that the user gesture is the gesture corresponding to a gesture label if the number of data types whose mean values exceed the mean threshold is greater than the number threshold; wherein the mean threshold and the number threshold are determined by processing historical data collected by the sensor with the first matrix model and the second matrix model, in combination with the gesture labels corresponding to the historical data.
11. The apparatus of claim 8,
the determination module is specifically configured to process the mean value of each type of data by using a third matrix model in the preset matrix model group to obtain a normalized mean value corresponding to the mean values of all types of data, and to determine the user gesture according to the normalized mean value and the preset rule.
12. The apparatus of claim 11, wherein the preset rule comprises a normalization threshold,
the determination module is specifically configured to determine that the user gesture is the gesture corresponding to a gesture label if the normalized mean value is greater than the normalization threshold; wherein the normalization threshold is determined by processing historical data collected by the sensor with a preset matrix model group comprising the third matrix model, in combination with the gesture labels corresponding to the historical data.
13. The apparatus of claim 8,
the control module is specifically configured to control the touch screen to turn off if the user gesture is raising the hand, and to control the touch screen to light up if the user gesture is putting the hand down.
14. The apparatus of claim 8, wherein the data collected by the sensor further comprises:
ambient light data collected by a light sensor;
first direction data collected by an electronic compass;
and second direction data characterizing a movement direction, the second direction data being determined based on the acceleration data and/or the angular velocity data.
15. A screen control apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the screen control method of any one of claims 1 to 7.
16. A non-transitory computer-readable storage medium, wherein instructions in the storage medium, when executed by a processor of a computer, enable the computer to perform the screen control method of any one of claims 1 to 7.
CN202010461144.2A 2020-05-27 2020-05-27 Screen control method and device and storage medium Pending CN111651114A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010461144.2A CN111651114A (en) 2020-05-27 2020-05-27 Screen control method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010461144.2A CN111651114A (en) 2020-05-27 2020-05-27 Screen control method and device and storage medium

Publications (1)

Publication Number Publication Date
CN111651114A (en) 2020-09-11

Family

ID=72350691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010461144.2A Pending CN111651114A (en) 2020-05-27 2020-05-27 Screen control method and device and storage medium

Country Status (1)

Country Link
CN (1) CN111651114A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130265225A1 (en) * 2007-01-05 2013-10-10 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
CN106817497A * 2017-03-31 2017-06-09 Nubia Technology Co., Ltd. Screen processing method and mobile terminal
CN108536377A * 2018-04-11 2018-09-14 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display control method and device, terminal, computer readable storage medium
CN108668015A * 2018-04-11 2018-10-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Screen control method and device of terminal, readable storage medium, terminal
CN108563387A * 2018-04-13 2018-09-21 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Display control method and device, terminal, computer readable storage medium
CN108803896A * 2018-05-28 2018-11-13 Oppo (Chongqing) Intelligent Technology Co., Ltd. Screen control method, apparatus, terminal and storage medium
CN108600557A * 2018-07-13 2018-09-28 Vivo Mobile Communication Co., Ltd. Screen on/off control method and mobile terminal
CN109582197A * 2018-11-30 2019-04-05 Beijing Xiaomi Mobile Software Co., Ltd. Screen control method, device and storage medium
CN109918006A * 2019-01-28 2019-06-21 Vivo Mobile Communication Co., Ltd. Screen control method and mobile terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI797494B * 2020-11-02 2023-04-01 ASUSTeK Computer Inc. Electronic device and control method thereof

Similar Documents

Publication Publication Date Title
US10498873B2 (en) Screen control method, apparatus, and non-transitory tangible computer readable storage medium
EP3110107A1 (en) Method and device for managing self-balanced vehicle
US20210314494A1 (en) Terminal, focusing method and apparatus, and computer readable storage medium
EP3575862B1 (en) Method and device for adjusting lens position
US20160127638A1 (en) Shooting parameter adjustment method and device
US10025393B2 (en) Button operation processing method in single-hand mode
US20170154604A1 (en) Method and apparatus for adjusting luminance
EP3163404A1 (en) Method and device for preventing accidental touch of terminal with touch screen
US10612918B2 (en) Mobile computing device and method for calculating a bending angle
EP3312702B1 (en) Method and device for identifying gesture
EP3156983A1 (en) Method and device for transmitting alert message
CN112202962B (en) Screen brightness adjusting method and device and storage medium
EP3001305A1 (en) Method and device for controlling display of video
CN110913133B (en) Shooting method and electronic equipment
CN111651114A (en) Screen control method and device and storage medium
US10739913B2 (en) Protective film detection method and apparatus, and storage medium
CN112187995A (en) Illumination compensation method, illumination compensation device, and storage medium
CN112148149A (en) Touch screen control method, touch screen control device and storage medium
CN108595930B (en) Terminal device control method and device
CN107329604B (en) Mobile terminal control method and device
CN105510939B (en) Obtain the method and device of motion path
CN109670432B (en) Action recognition method and device
CN109813295B (en) Orientation determination method and device and electronic equipment
CN115586469A (en) State detection method and device, electronic equipment and storage medium
CN115129281A (en) Method and device for adjusting backlight brightness, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination