CN109933196B - Screen control method and device and terminal equipment - Google Patents

Publication number
CN109933196B
Authority: CN (China)
Prior art keywords: display screen, holding posture, touch screen, posture information, terminal device
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201910179143.6A
Other languages: Chinese (zh)
Other versions: CN109933196A
Inventors: 闫维荣, 谢进耀
Current assignee: Vivo Mobile Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201910179143.6A
Publication of CN109933196A
Application granted; publication of CN109933196B

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a screen control method and device and a terminal device. The terminal device comprises a plurality of display screens, and the screen control method comprises the following steps: acquiring holding posture information of a user for the terminal device; determining, among the plurality of display screens of the terminal device, a target display screen matched with the holding posture information, wherein the matched target display screen faces the user when the terminal device presents the holding posture information; and determining the target display screen to be the main display screen of the terminal device. According to the scheme of the embodiments of the application, the target display screen facing the user can be determined from the user's holding posture on the terminal device and set as the main display screen. The main display screen is thus set quickly and intelligently without manual configuration by the user, which can improve the user's experience with multi-screen terminals.

Description

Screen control method and device and terminal equipment
Technical Field
The application relates to the technical field of terminal application, in particular to a screen control method and device and terminal equipment.
Background
As terminal devices become increasingly intelligent, users place more and more demands on them, including diversified display requirements. Currently, multi-screen display has become an important development direction for terminal devices.
In the prior art, the user of a multi-screen terminal device must configure the main display screen manually. For the user, this configuration process is cumbersome and does not allow the main display screen to be set quickly. In particular, in application scenarios where the main display screen and the auxiliary display screen need to be switched frequently, the user must configure the main display screen again and again, which seriously degrades the use experience.
Therefore, it is desirable to provide a screen control scheme that can quickly set the main display screen of a terminal device without manual configuration by the user.
Disclosure of Invention
The embodiments of the application aim to provide a screen control method, a screen control device and a terminal device that can quickly set the main display screen of the terminal device without manual configuration by the user.
In order to achieve the above purpose, the following technical solutions are adopted in the embodiments of the present application:
In a first aspect, a screen control method is provided, which is applied to a terminal device comprising a plurality of display screens. The method includes:
acquiring holding posture information of a user for the terminal device;
determining, among the plurality of display screens of the terminal device, a target display screen matched with the holding posture information, wherein the matched target display screen faces the user when the terminal device presents the holding posture information;
and determining the target display screen to be the main display screen of the terminal device.
In a second aspect, a screen control device is provided, which is applied to a terminal device comprising a plurality of display screens. The screen control device includes:
a holding posture determining module, configured to acquire holding posture information of a user for the terminal device;
a holding posture analysis module, configured to determine, among the plurality of display screens of the terminal device, a target display screen matched with the holding posture information, wherein the matched target display screen faces the user when the terminal device presents the holding posture information;
and a main display screen determining module, configured to determine that the target display screen is the main display screen of the terminal device.
In a third aspect, a terminal device is provided, which includes: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the screen control method of the first aspect.
According to the scheme of the embodiments of the application, the target display screen facing the user can be determined from the user's holding posture on the terminal device and set as the main display screen. The main display screen is thus set quickly and intelligently without manual configuration by the user, which can improve the user's experience with multi-screen terminals.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic step diagram of a screen control method provided in an embodiment of the present application.
Fig. 2 is a schematic diagram illustrating another step of the screen control method according to the embodiment of the present application.
Fig. 3 is a schematic logical structure diagram of a screen control device according to an embodiment of the present application.
Fig. 4 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As described above, in the prior art the user of a multi-screen terminal device must configure the main display screen manually. For the user, this configuration process is cumbersome and does not allow the main display screen to be set quickly. In particular, in application scenarios where the main display screen and the auxiliary display screen need to be switched frequently, the use experience is seriously degraded.
To solve this problem, the application provides a scheme that sets the main display screen intelligently based on the user's holding posture information for the terminal device.
On one hand, an embodiment of the present application provides a screen control method, which is applied to a terminal device, where the terminal device includes a plurality of display screens, as shown in fig. 1, the screen control method includes:
and step S102, acquiring holding posture information of the user aiming at the terminal equipment.
The manner of acquiring the holding posture information is not unique, and the embodiments of the application do not specifically limit it. By way of example:
This step may determine the user's holding posture information based on a sensor of the terminal device. For example, the current orientation of the terminal device can be determined from a gravity sensor, and the holding posture information of the user holding the terminal device can be derived from that orientation.
Alternatively, the holding posture information may be determined based on a touch module of the terminal device. For example, using the touch function of the touch module, the contact positions between the user and the touch detection area while the user holds the terminal device are determined, and the holding posture information of the user holding the terminal device is derived from those positions.
Step S104, determining, among the plurality of display screens of the terminal device, a target display screen matched with the holding posture information, wherein the matched target display screen faces the user when the terminal device presents the holding posture information.
Step S106, determining that the target display screen is the main display screen of the terminal device.
After the target display screen is determined to be the main display screen of the terminal device, the main display screen can be lit up, and/or the display screens other than the main display screen can be dimmed or turned off.
As can be seen from the screen control method shown in fig. 1, the scheme of the embodiment of the application can determine the target display screen facing the user based on the user's holding posture on the terminal device, and set that target display screen as the main display screen. On this basis, the main display screen is set quickly and intelligently without manual configuration by the user, which can improve the user's experience with the multi-screen terminal device.
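As a minimal sketch of the three steps above (all names are hypothetical; the patent does not prescribe an API), the pipeline of steps S102, S104 and S106 could look like:

```python
from dataclasses import dataclass

@dataclass
class Display:
    name: str
    brightness: float = 0.0  # 0.0 = off, 1.0 = fully lit

def acquire_holding_posture(sensor_reading: str) -> str:
    """Step S102: obtain holding posture information (modeled here as a label)."""
    return sensor_reading

def match_target_display(posture: str, displays: dict) -> Display:
    """Step S104: pick the display screen that faces the user in this posture."""
    # Hypothetical mapping from posture labels to display names.
    posture_to_display = {
        "first_screen_facing_user": "first",
        "second_screen_facing_user": "second",
    }
    return displays[posture_to_display[posture]]

def set_main_display(target: Display, displays: dict) -> None:
    """Step S106: promote the target to main display; turn off the others."""
    for d in displays.values():
        d.brightness = 1.0 if d is target else 0.0

displays = {"first": Display("first"), "second": Display("second")}
posture = acquire_holding_posture("second_screen_facing_user")
set_main_display(match_target_display(posture, displays), displays)
print(displays["second"].brightness)  # → 1.0
```

The real method would obtain the posture label from sensor or touch data, as described below, rather than take it as an argument.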
The following describes how to determine the grip posture information according to the embodiments of the present application.
The screen control method can acquire the holding posture information of the user aiming at the terminal equipment based on two implementation modes.
One implementation manner is to acquire holding posture information of a user for the terminal device based on a touch function on the terminal device.
For example, when step S102 is executed, touch screen detection is performed on the terminal device to obtain touch screen detection characteristics, and then the holding posture information matched with the touch screen detection characteristics is acquired.
The terminal device is preset with a correspondence between touch screen detection characteristics and holding posture information. In this implementation, after the touch screen detection characteristics are obtained, the matching holding posture information can be determined by comparison based on this correspondence.
In practical application, a large number of touch screen detection characteristics of users holding the terminal device can be collected in advance for the different screens of the terminal device.
It is assumed that the terminal device has a first display screen and a second display screen.
When the first display screen faces the user, touch screen detection can be performed for the postures of the user holding the terminal device with the left hand, with the right hand, and with both hands, so as to determine the touch screen detection characteristics corresponding to each possible holding posture when the first display screen faces the user.
Similarly, when the second display screen faces the user, touch screen detection can be performed for the postures of the user holding the terminal device with the left hand, with the right hand, and with both hands, so as to determine the touch screen detection characteristics corresponding to each possible holding posture when the second display screen faces the user.
The correspondence between the collected touch screen detection characteristics and the holding posture information can be stored in a holding posture sample database. It should be noted that the embodiments of the application do not specifically limit how the holding posture sample database is constructed. The correspondence between holding posture information and touch screen detection characteristics in the database can be determined by the manufacturer from extensive test data before the terminal device leaves the factory, or it can be extended by the user after the device leaves the factory.
The following provides an exemplary description of a method for determining grip posture information based on touch screen detection characteristics obtained by touch screen detection.
In the method of the embodiment of the application, the touch screen detection characteristics form a touch screen detection result matrix. Each matrix unit in the matrix corresponds to a touch screen detection area and represents the touch screen detection result obtained by performing touch screen detection on that area. The detection result of each matrix unit may be an electrical parameter obtained by touch detection on that unit, such as a pressure value or a capacitance value, that reflects whether the unit is touched by the user.
Assuming that the terminal device has the first display screen and the second display screen, the process of acquiring the holding posture information of the user for the terminal device can be divided into two stages.
Stage one: and performing touch screen detection on the terminal equipment to obtain touch screen detection characteristics.
In this stage, the touch screen detection result matrix obtained by touch screen detection is M = (i, A or B), where i denotes the ordered index of a matrix unit (the first display screen and the second display screen are divided into matrix units of the same specification), A denotes the touch screen detection result of matrix unit i on the first display screen, and B denotes the touch screen detection result of matrix unit i on the second display screen.
Stage two: query the holding posture sample database based on M = (i, A or B) and acquire the holding posture information matched with M = (i, A or B).
As described above, the holding posture sample database records different holding posture information together with a corresponding sample touch screen detection result matrix for each.
In this stage, the holding posture information corresponding to a sample touch screen detection result matrix M' = (i, A or B) in the database whose similarity to M satisfies a preset criterion can be taken as the user's current holding posture information for the terminal device.
For example, when the A and/or B values of more than a predetermined proportion (e.g. 80%) of the matrix units are identical, the similarity between M and M' is determined to satisfy the preset criterion.
Preferably, in order to increase the query hit rate, the A and B elements of both M and M' can be represented in binary form. For example, when the pressure value corresponding to matrix unit i is greater than a first threshold, A takes the value 1, and otherwise 0; similarly, when the pressure value corresponding to matrix unit i is greater than a second threshold, B takes the value 1, and otherwise 0.
In this binary representation each matrix unit of M and M' takes one of only two values, 0 or 1, which simplifies the similarity calculation and thereby improves the efficiency of the matching query.
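Under the binary representation, the database query of stage two reduces to counting agreeing matrix units. A sketch under stated assumptions (the threshold, the 80% criterion, the eight-unit matrices and the database contents are all illustrative):

```python
def binarize(pressures, threshold):
    """Convert raw per-unit pressure readings into the binary matrix form."""
    return [1 if p > threshold else 0 for p in pressures]

def similarity(m, m_sample):
    """Fraction of matrix units whose binary values agree between M and M'."""
    assert len(m) == len(m_sample)
    agree = sum(1 for a, b in zip(m, m_sample) if a == b)
    return agree / len(m)

def match_posture(m, sample_db, criterion=0.8):
    """Return the posture label of the best-matching sample whose similarity
    to M reaches the preset criterion, or None if no sample qualifies."""
    best_label, best_sim = None, criterion
    for label, m_sample in sample_db.items():
        s = similarity(m, m_sample)
        if s >= best_sim:
            best_label, best_sim = label, s
    return best_label

# Illustrative database: 8 matrix units spanning both screens.
sample_db = {
    "first_screen_facing_user":  [1, 1, 0, 0, 0, 0, 1, 1],
    "second_screen_facing_user": [0, 0, 1, 1, 1, 1, 0, 0],
}
m = binarize([0.9, 0.8, 0.1, 0.0, 0.2, 0.1, 0.7, 0.0], threshold=0.5)
# m agrees with the first sample on 7 of 8 units (87.5% >= 80%).
print(match_posture(m, sample_db))  # → first_screen_facing_user
```

A production version would likely store the binary matrices as packed bit strings and compare them with XOR and popcount, but the 80%-agreement logic is the same.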
Correspondingly, another implementation manner is to acquire holding posture information of the user for the terminal device based on the sensor function of the terminal device.
For example, when step S102 is executed, gravity characteristic data of the terminal device collected after first holding posture information has been determined are acquired from a gravity sensor and/or a gyroscope sensor of the terminal device; second holding posture information of the terminal device is then determined based on the first holding posture information and the gravity characteristic data.
The gravity characteristic data may include, but are not limited to, the gravitational acceleration of the terminal device. After the first holding posture information is determined, if the sign of the terminal device's gravitational acceleration changes from positive to negative or vice versa, it is determined that the terminal device has been turned over, and the second holding posture information can then be determined by updating the first holding posture information.
As an illustrative example, assume the terminal device has a first display screen and a second display screen arranged on opposite faces, where the first display screen corresponds to the first holding posture information when facing the user and the second display screen corresponds to the second holding posture information when facing the user. If the terminal device is turned over after the first holding posture information is determined, this indicates that the user wants to use the second display screen on the reverse side, and the second holding posture information can be determined accordingly.
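This sign-change heuristic can be sketched as follows (the axis choice, the posture labels, and the absence of sensor debouncing are simplifying assumptions; a real implementation would filter noisy readings near zero):

```python
def detect_flip(prev_gz: float, curr_gz: float) -> bool:
    """A flip is assumed when the gravity component along the screen normal
    changes sign: the screen that faced the user now faces away."""
    return prev_gz * curr_gz < 0

def update_posture(posture: str, prev_gz: float, curr_gz: float) -> str:
    """Toggle between the two posture labels when a flip is detected,
    otherwise keep the current posture."""
    other = {
        "first_screen_facing_user": "second_screen_facing_user",
        "second_screen_facing_user": "first_screen_facing_user",
    }
    return other[posture] if detect_flip(prev_gz, curr_gz) else posture

# Gravity along the screen normal goes from +9.8 to -9.8 m/s^2: a flip.
print(update_posture("first_screen_facing_user", 9.8, -9.8))
# → second_screen_facing_user
```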
The following describes the execution logic of the method according to the embodiment of the present application with reference to an actual application scenario.
In this application scenario, when the terminal device is in a sleep state, the user can trigger the steps of the screen control method through a double-click touch operation. Assuming the terminal device is provided with a first display screen and a second display screen, the screen control method, shown in fig. 2, includes the following steps:
step S201, the terminal equipment starts a vibration sensor when the first display screen and the second display screen are both in a screen-off state.
Step S202, wishing to wake up the first display screen as the main display screen, the user holds the terminal device with the first display screen facing himself and performs the first click of the double-click touch operation on the display area of the first display screen.
Step S203, the terminal device identifies a first touch operation based on the vibration sensor, and starts the touch modules of the first display screen and the second display screen.
Step S204, the user performs the second click of the double-click touch operation on the display area of the first display screen.
Step S205, the terminal device identifies a second touch operation based on the touch module of the first display screen, and performs touch screen detection based on the touch modules of the first display screen and the second display screen to obtain touch screen detection characteristics of the first display screen and the second display screen.
Step S206, the terminal device determines holding posture information of the user for the terminal device based on touch screen detection characteristics of the first display screen and the second display screen, wherein the holding posture information indicates that the current first display screen faces the user, and the first display screen is used as a target display screen.
Step S207, the terminal device judges whether the touch position of the second click touch operation is located in the display area of the target display screen; if so, the first display screen is taken as the main display screen and lit up; otherwise, the user's double-click touch operation is ignored, and the process returns to step S201.
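The area check in step S207 is essentially a point-in-rectangle test. A sketch (the coordinate system, screen dimensions and function names are assumptions):

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        """True if point (px, py) lies inside this rectangle."""
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def confirm_main_display(tap_xy, target_area: Rect) -> bool:
    """Step S207: the target becomes the main display only if the second tap
    of the double-click falls inside its display area; otherwise the whole
    gesture is ignored and detection restarts at step S201."""
    return target_area.contains(*tap_xy)

first_screen = Rect(0, 0, 1080, 2340)
print(confirm_main_display((540, 1200), first_screen))  # tap inside → True
print(confirm_main_display((-10, 1200), first_screen))  # tap outside → False
```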
Assuming the second click touch operation falls within the display area of the first display screen, the user starts to use the terminal device after the first display screen is lit up. The corresponding subsequent flow includes the following steps:
and step S208, in the process of using the terminal equipment by the user, the terminal equipment starts the gravity sensor and/or the gyroscope sensor.
Step S209, wishing to switch to the second display screen as the main display screen, the user flips the terminal device to orient the second display screen toward himself.
Step S210, when the terminal device recognizes that the terminal device is turned over based on the gravity sensor and/or the gyroscope sensor, touch screen detection is performed based on the touch modules of the first display screen and the second display screen, and touch screen detection characteristics of the first display screen and the second display screen are obtained again.
Step S211, the terminal device determines holding posture information of the user aiming at the terminal device at present based on the touch screen detection characteristics of the first display screen and the second display screen which are obtained again, the holding posture information indicates that the second display screen faces the user, and the second display screen is used as a target display screen.
In step S212, the terminal device lights up the second display screen as the main display screen and extinguishes the first display screen.
As the application scenario of fig. 2 shows, in the embodiment of the application the main display screen can be adaptively switched in real time according to the user's holding posture information while the user uses the terminal device.
Correspondingly, as shown in fig. 3, an embodiment of the present application further provides a screen control device 3000, including:
a holding posture determining module 3001, configured to obtain holding posture information of the user for the terminal device.
A holding posture analysis module 3002, configured to determine a target display screen, which is matched with the holding posture information, in the multiple display screens of the terminal device, where the matched target display screen faces the user when the terminal device presents the holding posture information.
A main display determination module 3003, configured to determine that the target display is the main display of the terminal device.
As can be seen from the screen control device shown in fig. 3, the scheme of the embodiment of the application can determine the target display screen facing the user based on the user's holding posture on the terminal device, and set that target display screen as the main display screen. On this basis, the main display screen is set and controlled quickly without manual configuration by the user, which can improve the user's experience with the multi-screen terminal device.
The screen control device of the embodiment of the application can determine the holding posture information of the user for the terminal equipment based on two implementation modes.
For one implementation, the holding posture determining module 3001 of the embodiment of the present application specifically includes:
the first holding posture determining unit is used for performing touch screen detection on the terminal equipment to obtain touch screen detection characteristics; acquiring holding posture information matched with the touch screen detection characteristics; the terminal device is preset with a corresponding relation between touch screen detection characteristics and holding posture information.
Optionally, the touch screen detection features form a touch screen detection result matrix, and a matrix unit in the touch screen detection result matrix corresponds to a touch screen detection area, wherein the matrix unit represents a touch screen detection result obtained by performing touch screen detection on the touch screen detection area.
Optionally, the first holding posture determining unit is specifically configured to start a vibration sensor of the terminal device to detect a first touch operation of a preset double-touch operation; when the first touch operation is detected, starting a touch module of the terminal equipment to detect a second touch operation of the double-touch operation; and when the second touch operation is detected, performing touch screen detection on the terminal equipment.
Optionally, the main display screen determining module 3003 is specifically configured to determine that the target display screen is the main display screen of the terminal device if the second touch operation falls into the display area of the target display screen.
For another implementation, the holding posture determining module 3001 of the embodiment of the present application specifically includes:
and the second holding posture determining unit is used for acquiring the gravity characteristic data of the terminal equipment after the first holding posture information is determined, and determining second holding posture information of the terminal equipment based on the first holding posture information and the gravity characteristic data.
In addition, the screen control device of the embodiment of the present application may further include:
the screen control module is used for lighting the main display screen after the target display screen is determined to be the main display screen of the terminal equipment; and/or dimming or extinguishing other display screens except the main display screen after the target display screen is determined to be the main display screen of the terminal equipment.
Obviously, the screen control device of the embodiment of the application can serve as the execution subject of the screen control method shown in fig. 1. Therefore, the functions of the screen control method that can be implemented in fig. 1 and fig. 2 can also be implemented by the screen control device of the embodiment of the application, and are not described again here.
In addition, as shown in fig. 4, an embodiment of the present application further provides a terminal device 400, where the terminal device 400 includes, but is not limited to: radio frequency unit 401, network module 402, audio output unit 403, input unit 404, sensor 405, display unit 406, user input unit 407, interface unit 408, memory 409, processor 410, and power supply 411. Those skilled in the art will appreciate that the terminal architecture shown in fig. 4 does not constitute a limitation of terminal device 400, and terminal device 400 may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present application, the terminal device 400 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 410 is configured to:
acquiring holding posture information of a user for the terminal device;
determining, among the plurality of display screens of the terminal device, a target display screen matched with the holding posture information, wherein the matched target display screen faces the user when the terminal device presents the holding posture information;
and determining the target display screen to be the main display screen of the terminal device.
According to the scheme of the embodiment of the application, the target display screen facing the user can be determined from the user's holding posture on the terminal device and set as the main display screen, so that the main display screen can be set and controlled quickly without manual configuration by the user, which can improve the user's experience with the multi-screen terminal device.
It should be understood that, in the embodiment of the present application, the radio frequency unit 401 may be configured to receive and transmit signals during message transmission or a call; specifically, it receives downlink data from a base station and passes it to the processor 410 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 401 can also communicate with a network and other devices through a wireless communication system.
The terminal device 400 provides the user with wireless broadband internet access via the network module 402, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output as sound. Also, the audio output unit 403 may also provide audio output related to a specific function performed by the terminal 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a Graphics Processing Unit (GPU) 4041 and a microphone 4042. The graphics processor 4041 processes image data of still pictures or video obtained by an image capture apparatus (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 406, stored in the memory 409 (or another storage medium), or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 can receive sound and process it into audio data; in phone call mode, the processed audio data can be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 401.
The terminal device 400 further comprises at least one sensor 405, such as light sensors, motion sensors and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 4061 and/or the backlight when the terminal apparatus 400 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
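The accelerometer-based posture recognition mentioned above (portrait/landscape switching from the direction of gravity) can be illustrated with a minimal sketch. This is a hypothetical Python illustration, not part of the disclosure: the axis convention (x to the right, y up in portrait, z out of the screen) and the thresholds are assumptions.

```python
import math

def detect_orientation(ax, ay, az):
    """Classify a coarse device orientation from a 3-axis accelerometer
    reading in m/s^2, as used for portrait/landscape switching.
    Assumed axes: x = right, y = up (portrait), z = out of the screen."""
    if abs(az) > 8.0:                 # gravity mostly along z: device lying flat
        return "flat"
    # angle of the gravity vector within the screen plane
    angle = math.degrees(math.atan2(ax, ay))
    if -45 <= angle <= 45:
        return "portrait"
    if 45 < angle <= 135:
        return "landscape-left"
    if -135 <= angle < -45:
        return "landscape-right"
    return "portrait-upside-down"
```

A reading of roughly (0, 9.8, 0) m/s^2 thus maps to portrait, while (9.8, 0, 0) maps to a landscape orientation.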
The display unit 406 is used to display information input by the user or information provided to the user. The Display unit 406 may include a Display panel 4061, and the Display panel 4061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 407 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the terminal device. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. The touch panel 4071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed on or near the touch panel 4071 using a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 410, and receives and executes commands from the processor 410. In addition, the touch panel 4071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 4071, the user input unit 407 may include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 4071 can be overlaid on the display panel 4061. When the touch panel 4071 detects a touch operation on or near it, the touch operation is transmitted to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in fig. 4 the touch panel 4071 and the display panel 4061 are two independent components implementing the input and output functions of the terminal, in some embodiments the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the terminal, which is not limited herein.
The interface unit 408 is an interface for connecting an external device to the terminal apparatus 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal apparatus 400 or may be used to transmit data between the terminal 400 and an external device.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 409 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 410 is the control center of the terminal; it connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby monitoring the terminal as a whole. The processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 410.
The terminal device 400 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 400 may also execute the screen control method shown in fig. 1, and implement the functions of the screen control apparatus in the embodiments shown in fig. 1 and fig. 2, which are not described again herein.
In addition, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the method embodiment in fig. 1, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be understood that when the computer program in the computer readable storage medium of the embodiment of the present application is executed by the processor, the functions of the screen control apparatus in the embodiments shown in fig. 1 and fig. 2 can be implemented, and the embodiments of the present application are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (8)

1. A screen control method is applied to a terminal device, the terminal device comprises a plurality of display screens, and the method is characterized by comprising the following steps:
acquiring holding posture information of a user aiming at the terminal equipment;
determining a target display screen matched with the holding posture information in a plurality of display screens of the terminal equipment, wherein the matched target display screen faces the user when the terminal equipment presents the holding posture information;
determining that the target display screen is a main display screen of the terminal equipment;
wherein, obtaining the holding posture information of the user aiming at the terminal equipment comprises:
performing touch screen detection on the terminal equipment to obtain touch screen detection characteristics;
acquiring holding posture information matched with the touch screen detection characteristics; the terminal equipment is preset with a corresponding relation between touch screen detection characteristics and holding posture information;
the touch screen detection features form a touch screen detection result matrix, and a matrix unit in the touch screen detection result matrix corresponds to a touch screen detection area, wherein the matrix unit represents a touch screen detection result obtained by performing touch screen detection on the touch screen detection area;
the plurality of display screens comprise a first display screen and a second display screen, and the first display screen and the second display screen are divided into matrix units with the same specification; the touch screen detection result matrix is M = (i, A or B), wherein i represents the ordered sequence number of the matrix unit, A represents the touch screen detection result of the matrix unit i in the first display screen, and B represents the touch screen detection result of the matrix unit i in the second display screen; values of the A and B elements in the touch screen detection result matrix M = (i, A or B) are binary values;
the acquiring of the holding posture information matched with the touch screen detection features comprises: querying a holding posture sample database based on M = (i, A or B), and acquiring the holding posture information matched with M = (i, A or B);
the acquiring of the holding posture information matched with M = (i, A or B) comprises: acquiring, from the holding posture sample database, the holding posture information corresponding to a sample touch screen detection result matrix whose similarity to M = (i, A or B) meets a preset standard.
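The matrix construction and database query recited in claim 1 can be sketched as follows. This is a hypothetical Python illustration: storing both screens' per-cell results in each entry, the cell-match similarity metric, and the 0.9 threshold are all assumptions, since the claim only requires similarity "meeting a preset standard".

```python
def build_touch_matrix(first_screen, second_screen):
    """Flatten per-cell binary touch detections on the two screens into an
    ordered matrix in the spirit of M = (i, A or B). Here each entry keeps
    both screens' results (i, A, B) for the cell i (an assumed layout).
    first_screen/second_screen: lists of 0/1 values, one per grid cell."""
    assert len(first_screen) == len(second_screen)
    return [(i, a, b) for i, (a, b) in enumerate(zip(first_screen, second_screen))]

def match_posture(m, sample_db, threshold=0.9):
    """Query a holding-posture sample database: return the posture whose
    sample matrix is most similar to m, if similarity meets the threshold.
    Similarity here is the fraction of cells with identical (A, B) results."""
    best_posture, best_score = None, 0.0
    for posture, sample in sample_db.items():
        hits = sum(1 for (i, a, b), (_, sa, sb) in zip(m, sample)
                   if a == sa and b == sb)
        score = hits / len(m)
        if score > best_score:
            best_posture, best_score = posture, score
    return best_posture if best_score >= threshold else None
```

With a database containing one sample matrix per known holding posture, a live detection matrix that closely matches a stored "left-hand" sample would return that posture; a matrix matching nothing well enough returns None.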
2. The screen control method according to claim 1,
and the touch screen detection is carried out on the terminal equipment, and the method comprises the following steps:
starting a vibration sensor of the terminal equipment to detect a first touch operation of preset double-touch operation;
when the first touch operation is detected, starting a touch module of the terminal equipment to detect a second touch operation of the double-touch operation;
and when the second touch operation is detected, performing touch screen detection on the terminal equipment.
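The two-stage detection of claim 2 (a low-power vibration sensor catches the first tap, and only then is the touch module woken to confirm the second tap) can be sketched as follows. The sensor interfaces and the 0.5 s confirmation window are assumptions for illustration, not part of the disclosure.

```python
import time

class DoubleTapDetector:
    """Two-stage double-tap detection sketch: the vibration sensor stays on
    at low power; the touch module is powered only after the first tap."""

    def __init__(self, vibration_sensor, touch_module, window=0.5):
        self.vibration_sensor = vibration_sensor
        self.touch_module = touch_module
        self.window = window          # assumed confirmation window (seconds)

    def wait_for_double_tap(self):
        # Stage 1: first touch operation detected by the vibration sensor.
        if not self.vibration_sensor.wait_for_tap():
            return None
        # Stage 2: wake the touch module to look for the second touch.
        self.touch_module.power_on()
        deadline = time.monotonic() + self.window
        while time.monotonic() < deadline:
            touch = self.touch_module.poll()   # returns (x, y) or None
            if touch is not None:
                return touch                   # second tap confirmed
        self.touch_module.power_off()          # no second tap: power down again
        return None
```

The design point of the two stages is power: the full touch-screen scan of claim 1 only runs after both taps are seen.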
3. The screen control method according to claim 2,
determining that the target display screen is a main display screen of the terminal device, including:
and if the touch position of the second touch operation is located in the display area of the target display screen, determining that the target display screen is the main display screen of the terminal device.
4. The screen control method according to claim 1,
acquiring holding posture information of a user for the terminal equipment, comprising:
acquiring gravity characteristic data of the terminal equipment after the first holding posture information is determined;
and determining second holding posture information of the terminal equipment based on the first holding posture information and the gravity characteristic data.
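The refinement of claim 4, combining the touch-derived first holding posture with gravity feature data, can be sketched as follows. The axis convention (gz measured along the outward normal of the first display screen) and the 5 m/s^2 thresholds are assumptions for illustration.

```python
def determine_second_posture(first_posture, gravity):
    """Combine the first holding posture with gravity feature data to decide
    which display screen currently faces the user (assumed convention:
    gz > 0 means the first screen is tilted upward, toward the user)."""
    gx, gy, gz = gravity
    if gz > 5.0:
        facing = "first"
    elif gz < -5.0:
        facing = "second"
    else:
        # near-vertical device: fall back on the touch-based posture alone
        facing = "first" if first_posture == "left-hand" else "second"
    return {"posture": first_posture, "user_facing_screen": facing}
```

For example, a device held left-handed and lying with its first screen up would report the first display screen as user-facing.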
5. The screen control method according to any one of claims 1 to 4,
after the determining that the target display screen is the main display screen of the terminal device, the method further comprises:
lighting up the main display screen; and/or
Dimming or extinguishing other displays than the main display.
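The screen switching of claim 5 can be sketched as follows; the display objects, the set_brightness method, and the 0-to-1 brightness scale are assumptions for illustration.

```python
def apply_main_screen(displays, target, others_level=0.0):
    """Light up the main display; dim (others_level between 0 and 1) or
    extinguish (others_level = 0.0) every other display.
    displays: mapping of display name -> object with set_brightness()."""
    for name, display in displays.items():
        display.set_brightness(1.0 if name == target else others_level)
```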
6. A screen control device is applied to a terminal device, the terminal device comprises a plurality of display screens, and the screen control device is characterized by comprising:
the holding posture determining module is used for acquiring holding posture information of a user for the terminal equipment;
the holding posture analysis module is used for determining a target display screen matched with the holding posture information in a plurality of display screens of the terminal equipment, wherein the matched target display screen faces the user when the terminal equipment presents the holding posture information;
a main display screen determining module, configured to determine that the target display screen is a main display screen of the terminal device;
the gripping posture determining module comprises:
the first holding posture determining unit is used for performing touch screen detection on the terminal equipment to obtain touch screen detection characteristics; acquiring holding posture information matched with the touch screen detection characteristics; the terminal equipment is preset with a corresponding relation between touch screen detection characteristics and holding posture information;
the touch screen detection features form a touch screen detection result matrix, and a matrix unit in the touch screen detection result matrix corresponds to a touch screen detection area, wherein the matrix unit represents a touch screen detection result obtained by performing touch screen detection on the touch screen detection area;
the plurality of display screens comprise a first display screen and a second display screen, and the first display screen and the second display screen are divided into matrix units with the same specification; the touch screen detection result matrix is M = (i, A or B), wherein i represents the ordered sequence number of the matrix unit, A represents the touch screen detection result of the matrix unit i in the first display screen, and B represents the touch screen detection result of the matrix unit i in the second display screen; values of the A and B elements in the touch screen detection result matrix M = (i, A or B) are binary values;
the first holding posture determining unit is specifically configured to, when acquiring the holding posture information matched with the touch screen detection features, query a holding posture sample database based on M = (i, A or B) and acquire the holding posture information matched with M = (i, A or B);
the first holding posture determining unit is specifically configured to, when acquiring the holding posture information matched with M = (i, A or B), acquire, from the holding posture sample database, the holding posture information corresponding to a sample touch screen detection result matrix whose similarity to M = (i, A or B) meets a preset standard.
7. The screen control device of claim 6, wherein the grip posture determination module further comprises: and the second holding posture determining unit is used for acquiring the gravity characteristic data of the terminal equipment after the first holding posture information is determined, and determining second holding posture information of the terminal equipment based on the first holding posture information and the gravity characteristic data.
8. A terminal device, comprising: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the screen control method according to any one of claims 1 to 5.
CN201910179143.6A 2019-03-11 2019-03-11 Screen control method and device and terminal equipment Active CN109933196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910179143.6A CN109933196B (en) 2019-03-11 2019-03-11 Screen control method and device and terminal equipment

Publications (2)

Publication Number Publication Date
CN109933196A (en) 2019-06-25
CN109933196B true CN109933196B (en) 2021-09-07

Family

ID=66986792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910179143.6A Active CN109933196B (en) 2019-03-11 2019-03-11 Screen control method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN109933196B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110673740B (en) * 2019-10-25 2023-08-15 合肥惠科金扬科技有限公司 Driving method, driving system, display device and readable storage medium
CN111081144A (en) * 2019-11-27 2020-04-28 咪咕互动娱乐有限公司 Electronic device, information display method and computer readable storage medium
CN111262975B (en) * 2020-01-08 2021-06-08 华为技术有限公司 Bright screen control method, electronic device, computer-readable storage medium, and program product

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106027793A (en) * 2016-06-30 2016-10-12 努比亚技术有限公司 Control device and method of double-screen mobile terminal
CN106502467A (en) * 2016-11-08 2017-03-15 广东小天才科技有限公司 A kind of screen awakening method of user terminal and device, user terminal
CN108170394A (en) * 2017-12-28 2018-06-15 努比亚技术有限公司 A kind of double-sided screen terminal and its control method
CN109005289A (en) * 2018-07-25 2018-12-14 努比亚技术有限公司 Screen lighting method, mobile terminal and readable storage medium storing program for executing
CN109214155A (en) * 2018-07-26 2019-01-15 努比亚技术有限公司 Screen lights method, apparatus, terminal and storage medium

Also Published As

Publication number Publication date
CN109933196A (en) 2019-06-25

Similar Documents

Publication Publication Date Title
CN109343759B (en) Screen-turning display control method and terminal
CN108227996B (en) Display control method and mobile terminal
CN109151367B (en) Video call method and terminal equipment
CN109710150B (en) Key control method and terminal
CN108958593B (en) Method for determining communication object and mobile terminal
CN110531915B (en) Screen operation method and terminal equipment
CN109343788B (en) Operation control method of mobile terminal and mobile terminal
CN107748640B (en) Screen-off display method and mobile terminal
CN108984066B (en) Application icon display method and mobile terminal
CN107728923B (en) Operation processing method and mobile terminal
CN109933196B (en) Screen control method and device and terminal equipment
CN109710130B (en) Display method and terminal
CN109634438B (en) Input method control method and terminal equipment
CN111142679A (en) Display processing method and electronic equipment
CN107967418B (en) Face recognition method and mobile terminal
CN110995924B (en) Account management method and electronic equipment
CN110780751B (en) Information processing method and electronic equipment
CN109521937B (en) Screen display control method and mobile terminal
CN108021315B (en) Control method and mobile terminal
CN108089935B (en) Application program management method and mobile terminal
JP2021524203A (en) Object recognition method and mobile terminal
CN111475066B (en) Background switching method of application program and electronic equipment
CN111261128B (en) Screen brightness adjusting method and electronic equipment
CN110673761B (en) Touch key detection method and terminal equipment thereof
CN109739430B (en) Display method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant