CN109557999B - Bright screen control method and device and storage medium - Google Patents


Info

Publication number
CN109557999B
CN109557999B (application CN201710875651.9A)
Authority
CN
China
Prior art keywords
image
screen
terminal
face
processing
Prior art date
Legal status
Active
Application number
CN201710875651.9A
Other languages
Chinese (zh)
Other versions
CN109557999A (en
Inventor
陈朝喜
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201710875651.9A
Publication of CN109557999A
Application granted
Publication of CN109557999B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31: User authentication
    • G06F21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00: Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21: Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2125: Just-in-time application of countermeasures, e.g., on-the-fly decryption, just-in-time obfuscation or de-obfuscation
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00: Reducing energy consumption in communication networks
    • Y02D30/70: Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure relates to a bright screen control method and apparatus and a storage medium, in the field of terminal technology. The method includes: when the terminal is in a screen-off state, and a first obstruction is detected through a configured distance sensor with the distance between the first obstruction and the screen of the terminal smaller than a first preset distance, starting a front camera of the terminal; capturing a first image through the front camera, and performing image processing on the first image; and when it is determined through the image processing that the first image includes a first face image, switching from the screen-off state to the screen-on state. Screen-on control is thus performed automatically, without manual operation by the user, which improves the convenience of operation and therefore the efficiency of screen-on control.

Description

Bright screen control method, device and storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to a method and an apparatus for controlling a bright screen, and a storage medium.
Background
Currently, terminals such as mobile phones and tablet computers are widely used. In practice, when the terminal is in a screen-off state and the user wants to view the screen, for example to check the time, the terminal must light up the screen.
In the related art, the user generally has to light up the screen manually; that is, the terminal is provided with a screen-on key, such as the HOME key. When the user wants to light up the screen, the user presses this key to trigger a screen-on instruction, and the terminal lights up the screen after receiving the instruction.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a bright screen control method, apparatus, and storage medium.
In a first aspect, a bright screen control method is provided, and is applied to a terminal, where the method includes:
when the terminal is in a screen-off state, and a first obstruction is detected through a configured distance sensor with the distance between the first obstruction and the screen of the terminal smaller than a first preset distance, starting a front camera of the terminal;
capturing a first image through the front camera, and performing image processing on the first image;
and when it is determined through the image processing that the first image includes a first face image, switching from the screen-off state to the screen-on state.
Optionally, after the switching from the screen-off state to the screen-on state when it is determined through the image processing that the first image includes the first face image, the method further includes:
determining a similarity between the first face image and a pre-stored face verification image;
and when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity, performing an unlocking operation.
Optionally, when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity, after the unlocking operation is performed, the method further includes:
in a specified application scenario, when a second obstruction is detected through the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance, capturing a second image through the front camera and performing image processing on the second image;
and when it is determined through the image processing that the second image includes a second face image, blurring the content currently displayed on the screen, where the second face image is any face image different from the first face image.
Optionally, after the blurring of the content currently displayed on the screen when it is determined through the image processing that the second image includes the second face image, the method further includes:
and when the second obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the second face image, canceling the blurring of the content currently displayed on the screen.
Optionally, when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity, after the unlocking operation is performed, the method further includes:
when the first obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the first face image, performing a screen-locking operation and switching from the screen-on state to the screen-off state.
In a second aspect, a bright screen control device is provided, configured in a terminal, the device including:
a starting module, configured to start a front camera of the terminal when the terminal is in a screen-off state, a first obstruction is detected through a configured distance sensor, and the distance between the first obstruction and the screen of the terminal is smaller than a first preset distance;
a first acquisition and processing module, configured to capture a first image through the front camera and perform image processing on the first image;
and a switching module, configured to switch from the screen-off state to the screen-on state when it is determined through the image processing that the first image includes a first face image.
Optionally, the apparatus further includes:
a determining module, configured to determine a similarity between the first face image and a pre-stored face verification image;
and an unlocking module, configured to perform an unlocking operation when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity.
Optionally, the apparatus further comprises:
a second acquisition and processing module, configured to, in a specified application scenario, capture a second image through the front camera and perform image processing on the second image when a second obstruction is detected through the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance;
and a blurring module, configured to blur the content currently displayed on the screen when it is determined through the image processing that the second image includes a second face image, where the second face image is any face image different from the first face image.
Optionally, the apparatus further includes:
a canceling module, configured to cancel the blurring of the content currently displayed on the screen when the second obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the second face image.
Optionally, the switching module is further configured to:
perform a screen-locking operation and switch from the screen-on state to the screen-off state when the first obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the first face image.
In a third aspect, a bright screen control device is provided, the device comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the bright screen control method of the first aspect.
In a fourth aspect, a computer-readable storage medium is provided, where instructions are stored on the computer-readable storage medium, and when executed by a processor, the instructions implement the bright screen control method according to the first aspect.
In a fifth aspect, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method for controlling a bright screen according to the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
When the terminal is in the screen-off state, and a first obstruction is detected through the configured distance sensor with the distance between the first obstruction and the screen of the terminal smaller than a first preset distance, this indicates that the user may be close to the screen of the terminal. To further determine whether the user needs to view the screen, the front camera of the terminal is started and a first image is captured through it. Image processing is then performed on the first image; if it is determined that the first image includes a first face image, the user is facing the terminal, that is, the user needs to view the screen, and the terminal switches from the screen-off state to the screen-on state.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a bright screen control method according to an exemplary embodiment.
Fig. 2A is a flowchart illustrating a bright screen control method according to another exemplary embodiment.
Fig. 2B is a schematic plan view of a terminal according to the embodiment of fig. 2A.
Fig. 3A is a block diagram illustrating a bright screen control apparatus according to an exemplary embodiment.
Fig. 3B is a block diagram illustrating another bright screen control device according to an exemplary embodiment.
Fig. 3C is a block diagram illustrating another bright screen control device according to an exemplary embodiment.
Fig. 3D is a block diagram illustrating another bright screen control device according to an example embodiment.
Fig. 4 is a block diagram illustrating a bright screen control apparatus 400 according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Before the embodiments of the present disclosure are described in detail, terms, application scenarios, and execution subjects related to the embodiments of the present disclosure will be briefly described.
First, terms related to the embodiments of the present disclosure will be briefly described.
A distance sensor: a sensor disposed in the terminal, typically at its front end. The distance sensor can be used to detect an obstruction and to determine the distance between the obstruction and the screen of the terminal. In practical implementations, the distance sensor may be a P-sensor (proximity sensor), or a sensor based on structured light, TOF (Time of Flight), or other principles.
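The threshold check the distance sensor enables can be sketched as follows. This is an illustrative sketch, not code from the patent; the names (FIRST_PRESET_DISTANCE_CM, should_start_front_camera) and the threshold value are invented for the example.

```python
# Hypothetical proximity gate: the front camera is started only when an
# obstruction is reported closer than the first preset distance.
FIRST_PRESET_DISTANCE_CM = 5.0  # user- or terminal-configurable (assumed value)

def should_start_front_camera(obstruction_detected: bool,
                              distance_cm: float) -> bool:
    """Return True when a detected obstruction is closer than the preset distance."""
    return obstruction_detected and distance_cm < FIRST_PRESET_DISTANCE_CM
```

On a real device this predicate would be driven by proximity-sensor callbacks rather than polled values.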
Secondly, the application scenarios related to the embodiments of the present disclosure are briefly introduced.
In daily life, the terminal provides the user with much convenience; for example, the user can check the date and time through it. When the user needs to view the screen of the terminal, the terminal must light up the screen. In the related art, however, the user has to light up the screen manually, which makes the operation cumbersome, lowers the efficiency of screen-on control, and degrades the user experience. Therefore, the embodiments of the present disclosure provide a screen-on control method that requires no manual control by the user and improves the convenience of operation, thereby improving the efficiency of screen-on control and the user experience. For specific implementations, refer to the embodiments shown in Fig. 1 and Fig. 2A.
Next, the execution subject of the embodiments of the present disclosure is briefly described.
The bright screen control method provided by the embodiments of the present disclosure may be executed by a terminal in which a distance sensor and a front camera are disposed; the distance sensor is usually disposed near the front camera. Through the distance sensor, the terminal can detect an obstruction at a certain distance from its screen, and through the front camera it can capture images. The terminal also has image processing capability, so that it can determine whether a captured image includes a face image. In practical application scenarios, the terminal may include, but is not limited to, a mobile phone, a tablet computer, or a computer.
Having introduced the terms, application scenarios, and execution subject, the bright screen control method of the embodiments of the present disclosure is described in detail below with reference to the embodiments shown in Fig. 1 and Fig. 2A.
Fig. 1 is a flowchart illustrating a bright screen control method according to an exemplary embodiment. The method is used in a terminal and, as shown in Fig. 1, may include the following steps:
In step 101, when the terminal is in a screen-off state, a first obstruction is detected through a configured distance sensor, and the distance between the first obstruction and the screen of the terminal is smaller than a first preset distance, a front camera of the terminal is started.
In step 102, a first image is captured through the front camera and subjected to image processing.
In step 103, when it is determined through the image processing that the first image includes a first face image, the terminal switches from the screen-off state to the screen-on state.
In the embodiment of the present disclosure, when the terminal is in the screen-off state, a first obstruction is detected through the configured distance sensor, and the distance between the first obstruction and the screen of the terminal is smaller than the first preset distance, this indicates that the user may be close to the screen of the terminal. To further determine whether the user needs to view the screen, the front camera of the terminal is started and a first image is captured through it. Image processing is then performed on the first image; if it is determined that the first image includes a first face image, the user is facing the terminal, that is, the user needs to view the screen, and the terminal switches from the screen-off state to the screen-on state.
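Steps 101 to 103 can be sketched as a single state-transition function. This is a hedged illustration, not the patent's implementation; the capture_image and contains_face callables stand in for platform camera and face-detection APIs that the patent does not specify.

```python
# Hypothetical sketch of the screen-on flow: a close obstruction while the
# screen is off triggers a camera capture, and a detected face switches the
# screen on. All names are illustrative.
def bright_screen_step(state: str,
                       obstruction_close: bool,
                       capture_image,
                       contains_face) -> str:
    """Return the new screen state, either 'off' or 'on'."""
    if state == "off" and obstruction_close:
        image = capture_image()      # step 102: capture through the front camera
        if contains_face(image):     # step 102: image processing for a face
            return "on"              # step 103: switch to the screen-on state
    return state
```

Note that when no face is found the terminal simply stays in the screen-off state, which matches the method's behavior of lighting the screen only for a facing user.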
Optionally, after the switching from the screen-off state to the screen-on state when it is determined through the image processing that the first image includes the first face image, the method further includes:
determining a similarity between the first face image and a pre-stored face verification image;
and when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity, performing an unlocking operation.
Optionally, when the similarity between the first face image and the face verification image is greater than or equal to the preset similarity, after the unlocking operation is performed, the method further includes:
in a specified application scenario, when a second obstruction is detected through the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance, capturing a second image through the front camera and performing image processing on the second image;
and when it is determined through the image processing that the second image includes a second face image, blurring the content currently displayed on the screen, where the second face image is any face image different from the first face image.
Optionally, after the blurring of the content currently displayed on the screen when it is determined through the image processing that the second image includes the second face image, the method further includes:
and when the second obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the second face image, canceling the blurring of the content currently displayed on the screen.
Optionally, when the similarity between the first face image and the face verification image is greater than or equal to the preset similarity, after the unlocking operation is performed, the method further includes:
when the first obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the first face image, performing a screen-locking operation and switching from the screen-on state to the screen-off state.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present disclosure, which are not described in detail here.
Fig. 2A is a flowchart illustrating a bright screen control method according to an exemplary embodiment. As shown in Fig. 2A, the method is used in a terminal and may include the following steps:
In step 201, when the terminal is in a screen-off state, a first obstruction is detected through a configured distance sensor, and the distance between the first obstruction and the screen of the terminal is smaller than a first preset distance, a front camera of the terminal is started.
The first preset distance may be customized by the user according to actual needs, or set by default by the terminal, which is not limited in the embodiments of the present disclosure.
When the terminal is in the screen-off state and a first obstruction is detected through the configured distance sensor at a certain distance from the screen of the terminal, the user's face may be close to the screen, that is, the first obstruction may be a face. In a practical application scenario, however, the user may also block the distance sensor with a hand while holding the terminal, so the first obstruction detected by the distance sensor may also be the user's hand. Therefore, to accurately determine whether the user's face is close to the screen of the terminal, that is, whether the user needs to view the screen, the terminal starts the configured front camera.
In step 202, a first image is captured through the front camera and subjected to image processing.
After starting the front camera, the terminal captures an image through it to obtain the first image. If the user wants to view the screen, the user's face usually faces the screen; therefore, to determine whether the user is viewing the screen, the terminal performs image processing on the captured first image to determine whether it includes a face image.
It should be noted that for a specific implementation of the image processing performed on the first image by the terminal, reference may be made to the related art; details are not repeated in the embodiments of the present disclosure.
In step 203, when it is determined through the image processing that the first image includes a first face image, the terminal switches from the screen-off state to the screen-on state.
If it is determined that the first image includes the first face image, the user's face is facing the screen of the terminal, that is, the user may need to view the screen. Therefore, the terminal switches from the screen-off state to the screen-on state, realizing screen-on control, and the user can view the content displayed on the screen, for example the date and time.
The bright screen control method provided by the embodiment of the present disclosure is thus realized. In a practical application scenario, moreover, the user may need not only to view the screen but also to use the terminal, while the terminal is usually locked in the screen-off state. Therefore, to further improve the user experience and enable the user to use the terminal quickly, the embodiment of the present disclosure also provides an automatic unlocking function; for a specific implementation, see steps 204 to 205 below.
In step 204, a similarity between the first face image and a pre-stored face verification image is determined.
In a specific implementation, the terminal stores a face verification image in advance. The face verification image is typically a face image of a user who is authorized to use the terminal; for example, it may be a face image of the owner of the terminal (the description below uses this example). That is, beforehand, the owner of the terminal may capture his or her face image with the camera of the terminal and store it in the terminal as the face verification image.
Of course, the above merely takes as an example the owner capturing his or her face image with the camera of the terminal. In practical implementations, the owner may also capture the face image with another terminal and then transfer the captured face image from that terminal to this one for storage.
Therefore, during unlocking, the first face image of the unlocking user can be verified against the face verification image. Since the first face image may not be the face image of the owner of the terminal, directly unlocking the terminal at this point could pose a potential security risk. For this reason, the terminal determines the similarity between the first face image and the face verification image to determine whether the first face image is the face image of the owner of the terminal.
It should be noted that for a specific implementation of determining the similarity between the first face image and the pre-stored face verification image, reference may be made to image processing techniques in the related art; details are not repeated in the embodiments of the present disclosure.
In step 205, when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity, an unlocking operation is performed.
When the similarity between the first face image and the face verification image is greater than or equal to the preset similarity, the first face image is nearly identical to the face verification image, which indicates that the first face image is the face image of the owner of the terminal; that is, the user corresponding to the first face image can be considered authorized to use the terminal. Therefore, the terminal can automatically perform the unlocking operation.
In this way, when the similarity between the first face image and the face verification image is determined to be greater than or equal to the preset similarity, the unlocking operation is performed automatically, so that the user does not have to unlock the terminal manually, which improves the convenience of operation and the unlocking efficiency.
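The similarity check in steps 204 and 205 can be sketched as follows, assuming a similarity function that returns a score in [0, 1]; the threshold value and all names are illustrative, not details from the patent.

```python
# Hypothetical unlock decision: unlock only when the similarity between the
# captured face and the stored verification image reaches the preset value.
PRESET_SIMILARITY = 0.9  # assumed threshold; the patent leaves it configurable

def try_unlock(first_face, verification_face, similarity) -> bool:
    """Perform the unlocking operation when similarity >= the preset similarity."""
    return similarity(first_face, verification_face) >= PRESET_SIMILARITY
```

The similarity function itself would be a device face-matching routine; here it is injected so the decision rule can be shown in isolation.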
Further, when the first obstruction is no longer detected through the distance sensor and the image captured through the front camera does not include the first face image, a screen-locking operation is performed and the terminal switches from the screen-on state to the screen-off state.
In a practical application scenario, the absence of the first obstruction at the distance sensor does not by itself establish that the user is no longer using the terminal. For example, when the user places the terminal on a desktop, the distance sensor may not detect the user even though the user may still be viewing the screen. Therefore, when the first obstruction is not detected through the distance sensor, it can only be said that the user may not be using the terminal. To determine this accurately, the terminal also captures an image through the front camera and performs image processing on it to determine whether the image includes the first face image.
That is, when the first obstruction is not detected through the distance sensor and the image captured through the front camera does not include the first face image, it is determined that the user is indeed not currently using the terminal. At this point, to save power, the terminal automatically performs the screen-locking operation and switches from the screen-on state to the screen-off state.
In this way, whether the user is currently using the terminal is determined both by detecting the first obstruction through the distance sensor and by capturing an image through the front camera, which improves the accuracy of the determination. Moreover, when it is detected that no one is using the terminal, the terminal automatically locks and turns off the screen without manual operation by the user, which improves the convenience of operation, saves power, and improves the user experience.
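The two-signal lock condition described above can be sketched as a simple predicate; the names are illustrative stand-ins for the sensor and face-detection results.

```python
# Hypothetical auto-lock rule: lock and switch to screen-off only when BOTH
# signals agree that the user is gone, i.e. no obstruction at the distance
# sensor AND no owner face in the current front-camera frame.
def should_lock_and_dim(obstruction_detected: bool,
                        owner_face_in_frame: bool) -> bool:
    """Both conditions must hold before the terminal locks and turns off the screen."""
    return (not obstruction_detected) and (not owner_face_in_frame)
```

Requiring both signals avoids the desktop false positive the text describes, where the sensor sees nothing but the user is still viewing the screen.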
The above implements the terminal's bright screen control and automatic unlocking functions. In addition, in some specific application scenarios the user of the terminal may not want other users to see the content currently displayed on the screen. For this reason, the embodiments of the present disclosure further provide a function of blurring the displayed content, described in the following steps.
In step 206, in a specified application scenario, when a second obstruction is detected by the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance, a second image is acquired through the front camera and subjected to image processing.
The specified application scenario is one requiring privacy protection; for example, it may include, but is not limited to, a network video chat, viewing private information, or using a specific application, where the specific application may be designated by the user in advance. In such a scenario, the user of the terminal may not want other users to see the content currently displayed on the screen. Therefore, when the terminal detects that another user has approached and is viewing the screen, the displayed content may be blurred, as follows.
The second preset distance may be customized by the user according to actual needs or set by the terminal as a default; this is not limited by the embodiments of the present disclosure.
That is, when the terminal detects the second obstruction through the distance sensor and the second obstruction is within the second preset distance of the screen, the terminal acquires the second image through the front camera. If the front camera is already running in the background, it may be brought to the foreground to acquire the second image.
Similarly, to determine whether the second obstruction is the face of another user, the terminal performs image processing on the acquired second image to determine whether it includes a second face image, where the second face image is any face image different from the first face image.
For example, referring to fig. 2B, the detection distance d in fig. 2B may correspond to the second preset distance, and the area a is the image area that can be captured by the front camera 21. When the terminal detects through the distance sensor 22 that the second obstruction has entered within the distance d, the front camera 21 captures an image of the area a, and the terminal then performs image processing on the captured image to determine whether it includes the second face image.
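As a sketch, the blur trigger of step 206 combines the proximity check against the preset distance d with face detection on the captured frame; the function name, units, and example values below are hypothetical, not taken from the patent:

```python
def should_blur(distance_mm: float, second_preset_distance_mm: float,
                contains_other_face: bool) -> bool:
    """Blur the screen only when a second obstruction is inside the preset
    detection distance AND the captured frame contains a face other than
    the owner's (the 'second face image')."""
    return distance_mm < second_preset_distance_mm and contains_other_face

# Example: a bystander's face detected 250 mm from the screen, with the
# preset distance d set to 400 mm, triggers blurring.
print(should_blur(250.0, 400.0, True))   # within d and another face -> True
print(should_blur(250.0, 400.0, False))  # close object but no face -> False
```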
In step 207, when it is determined after image processing that the second image includes the second face image, the content currently displayed on the screen is blurred.
If the second image is determined to include the second face image after image processing, another user is viewing the screen of the terminal. To protect the user's privacy, the terminal blurs the content currently displayed on the screen, so that the other user cannot see it clearly and the privacy of the terminal's user is protected.
In step 208, when the second obstruction is not detected by the distance sensor and the second face image is not included in the image acquired by the front camera, the blurring of the content currently displayed on the screen is cancelled.
As above, the absence of the second obstruction from the distance sensor does not conclusively show that the other user has stopped viewing the display content of the screen. To determine this accurately, the terminal further acquires an image through the front camera and performs image processing on it. If the acquired image does not include the second face image after image processing, it is determined that no other user is viewing the display content of the screen; at this point, so as not to hinder the user's normal use of the terminal, the blurring of the currently displayed content may be cancelled.
It should be noted that cancelling the blurring when the second obstruction is not detected by the distance sensor and the second face image is not included in the acquired image is only one example. In another embodiment, since the user may not mind other users viewing the currently displayed content, the user may also manually cancel the blurring so as not to hinder normal use of the terminal.
For example, the terminal may provide a cancel-blurring option. When the user does not need the terminal to blur the currently displayed content, clicking this option triggers a cancel-blurring instruction; upon receiving the instruction, the terminal cancels the blurring of the content currently displayed on the screen.
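Steps 207, 208, and the manual cancel option amount to a small state update. The following sketch assumes boolean sensor summaries and an illustrative cancel-blur option; it is not the patent's implementation:

```python
def update_blur(blurred: bool, obstruction: bool, other_face: bool,
                cancel_clicked: bool) -> bool:
    """Return the next blur state per steps 207-208.

    - A click on the (hypothetical) cancel-blur option always removes the blur.
    - While blurred, the blur is cancelled only when the distance sensor no
      longer detects the second obstruction AND no second face is in frame.
    - While not blurred, blurring starts when both conditions of step 207 hold.
    """
    if cancel_clicked:
        return False
    if blurred:
        return obstruction or other_face  # cancel only when both are absent
    return obstruction and other_face     # start blurring per step 207

print(update_blur(True, False, False, False))  # bystander gone -> False (unblur)
print(update_blur(True, True, True, True))     # manual cancel -> False (unblur)
```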
In the embodiments of the present disclosure, when the terminal is in the screen-off state and the configured distance sensor detects a first obstruction whose distance from the screen of the terminal is smaller than the first preset distance, the user may have come close to the screen. To further determine whether the user needs to view the screen, the terminal's front camera is started and a first image is acquired through it. The first image is then subjected to image processing; if it is determined to include the first face image, the user is facing the terminal, that is, needs to view the screen, and the terminal switches from the screen-off state to the screen-on state.
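The overall screen-state flow of the embodiment can be summarized as a small state machine; the enum, transition function, and boolean inputs below are an illustrative sketch under stated assumptions, not the disclosed implementation:

```python
from enum import Enum

class Screen(Enum):
    OFF = "screen-off"
    ON_LOCKED = "screen-on, locked"
    ON_UNLOCKED = "screen-on, unlocked"

def transition(state: Screen, obstruction_near: bool,
               face_detected: bool, face_matches: bool) -> Screen:
    """One evaluation cycle of the bright screen control method."""
    if state is Screen.OFF:
        # Proximity starts the front camera; a detected face lights the
        # screen, and a face matching the stored verification image with at
        # least the preset similarity also unlocks it.
        if obstruction_near and face_detected:
            return Screen.ON_UNLOCKED if face_matches else Screen.ON_LOCKED
        return Screen.OFF
    # Screen is on: lock and turn off only when both signals indicate absence.
    if not obstruction_near and not face_detected:
        return Screen.OFF
    return state

print(transition(Screen.OFF, True, True, True).value)             # screen-on, unlocked
print(transition(Screen.ON_UNLOCKED, False, False, False).value)  # screen-off
```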
Fig. 3A is a block diagram illustrating a bright screen control device according to an exemplary embodiment. Referring to fig. 3A, the apparatus includes an activation module 310, a first acquisition processing module 312, and a switching module 314.
The starting module 310 is configured to start a front camera of the terminal when the terminal is in a screen-off state, a first obstruction is detected by a configured distance sensor, and the distance between the first obstruction and the screen of the terminal is smaller than a first preset distance;
the first acquisition processing module 312 is configured to acquire a first image through the front camera and perform image processing on the first image;
the switching module 314 is configured to switch from the screen-off state to the screen-on state when it is determined after image processing that the first image includes a first face image.
Optionally, referring to fig. 3B, the apparatus further includes:
a determining module 316, configured to determine a similarity between the first face image and a pre-stored face verification image;
an unlocking module 318, configured to execute an unlocking operation when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity.
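The patent does not specify how the similarity to the face verification image is computed; one common choice is cosine similarity between face feature vectors, sketched here with a hypothetical metric, toy feature values, and an assumed threshold:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face feature vectors (assumed metric;
    the patent only requires *some* similarity measure)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def should_unlock(captured: list[float], stored: list[float],
                  preset_similarity: float = 0.9) -> bool:
    """Unlock when the similarity to the stored face verification image
    meets or exceeds the preset similarity."""
    return cosine_similarity(captured, stored) >= preset_similarity

# Toy 3-dimensional "features"; real systems use embeddings of 128+ dims.
print(should_unlock([1.0, 0.0, 0.2], [1.0, 0.1, 0.2]))  # near-identical -> True
print(should_unlock([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # orthogonal -> False
```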
Optionally, referring to fig. 3C, the apparatus further includes:
the second acquisition processing module 320 is configured to, in a specified application scenario, acquire a second image through the front camera and perform image processing on it when a second obstruction is detected by the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance;
the blurring processing module 322 is configured to blur the content currently displayed on the screen when it is determined after image processing that the second image includes a second face image, where the second face image is any face image different from the first face image.
Optionally, referring to fig. 3D, the apparatus further includes:
a canceling module 324, configured to cancel the blurring of the content currently displayed on the screen when the second obstruction is not detected by the distance sensor and the second face image is not included in the image acquired by the front camera.
Optionally, the switching module 314 is further configured to:
when the first obstruction is not detected by the distance sensor and the first face image is not included in the image acquired by the front camera, a screen-locking operation is performed and the terminal switches from the screen-on state to the screen-off state.
In the embodiments of the present disclosure, when the terminal is in the screen-off state and the configured distance sensor detects a first obstruction whose distance from the screen of the terminal is smaller than the first preset distance, the user may have come close to the screen. To further determine whether the user needs to view the screen, the terminal's front camera is started and a first image is acquired through it. The first image is then subjected to image processing; if it is determined to include the first face image, the user is facing the terminal, that is, needs to view the screen, and the terminal switches from the screen-off state to the screen-on state.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 4 is a block diagram illustrating a bright screen control apparatus 400 according to an exemplary embodiment. For example, the apparatus 400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 4, the apparatus 400 may include one or more of the following components: processing components 402, memory 404, power components 406, multimedia components 408, audio components 410, input/output (I/O) interfaces 412, sensor components 414, and communication components 416.
The processing component 402 generally controls overall operation of the apparatus 400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 402 may include one or more processors 420 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 402 can include one or more modules that facilitate interaction between the processing component 402 and other components. For example, the processing component 402 can include a multimedia module to facilitate interaction between the multimedia component 408 and the processing component 402.
The memory 404 is configured to store various types of data to support operations at the apparatus 400. Examples of such data include instructions for any application or method operating on the device 400, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 404 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply components 406 provide power to the various components of device 400. The power components 406 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 400.
The multimedia component 408 includes a screen that provides an output interface between the device 400 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 408 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 400 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 410 is configured to output and/or input audio signals. For example, audio component 410 includes a Microphone (MIC) configured to receive external audio signals when apparatus 400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 404 or transmitted via the communication component 416. In some embodiments, audio component 410 also includes a speaker for outputting audio signals.
The I/O interface 412 provides an interface between the processing component 402 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 414 includes one or more sensors for providing various aspects of status assessment for the apparatus 400. For example, the sensor assembly 414 may detect the open/closed state of the apparatus 400 and the relative positioning of components such as its display and keypad, and may also detect a change in the position of the apparatus 400 or of one of its components, the presence or absence of user contact with the apparatus 400, the orientation or acceleration/deceleration of the apparatus 400, and a change in its temperature. The sensor assembly 414 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. It may also include a light sensor, such as a CMOS or CCD image sensor, for imaging applications. In some embodiments, the sensor assembly 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 416 is configured to facilitate wired or wireless communication between the apparatus 400 and other devices. The apparatus 400 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 416 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 416 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the bright screen control method provided by the embodiment shown in fig. 1 or fig. 2A described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 404 comprising instructions, executable by the processor 420 of the apparatus 400 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium, wherein instructions of the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the bright screen control method provided in the embodiment of fig. 1 or fig. 2A described above.
A computer program product containing instructions which, when run on a computer, cause the computer to perform the bright screen control method provided in the embodiment of fig. 1 or fig. 2A described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (4)

1. A bright screen control method, applied to a terminal, the method comprising:
when the terminal is in a screen-off state, when a first obstruction is detected by a distance sensor disposed at the front end of the terminal and the distance between the first obstruction and a screen of the terminal is smaller than a first preset distance, starting a front camera of the terminal;
acquiring a first image through the front camera, and performing image processing on the first image; when it is determined after image processing that the first image comprises a first face image, switching from the screen-off state to a screen-on state; determining a similarity between the first face image and a pre-stored face verification image, wherein the face verification image is a face image of a user having usage rights for the terminal, and the face verification image is captured by the user using a camera device of the terminal or using a terminal other than the terminal; when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity, performing an unlocking operation;
after the unlocking operation is performed when the similarity between the first face image and the face verification image is greater than or equal to the preset similarity, the method further comprises:
when the first obstruction is not detected by the distance sensor and the first face image is not included in the image acquired by the front camera, performing a screen-locking operation and switching from the screen-on state to the screen-off state;
the method further comprises:
in a specified application scenario, when a second obstruction is detected by the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance, acquiring a second image through the front camera and performing image processing on the second image, wherein the specified application scenario is an application scenario requiring privacy protection; when it is determined after image processing that the second image comprises a second face image, blurring the content currently displayed on the screen, wherein the second face image is any face image different from the first face image;
when the second obstruction is not detected by the distance sensor and the second face image is not included in the image acquired by the front camera, cancelling the blurring of the content currently displayed on the screen; and, upon receiving a cancel-blurring instruction, cancelling, by the terminal, the blurring of the content currently displayed on the screen, wherein the cancel-blurring instruction is triggered by the user's click operation on a cancel-blurring option when the user does not need the terminal to blur the currently displayed content.
2. A bright screen control apparatus, disposed in a terminal, the apparatus comprising:
a starting module, configured to start a front camera of the terminal when the terminal is in a screen-off state, a first obstruction is detected by a distance sensor disposed at the front end of the terminal, and the distance between the first obstruction and a screen of the terminal is smaller than a first preset distance;
a first acquisition processing module, configured to acquire a first image through the front camera and perform image processing on the first image;
a switching module, configured to switch from the screen-off state to a screen-on state when it is determined after image processing that the first image comprises a first face image;
a determining module, configured to determine a similarity between the first face image and a pre-stored face verification image, wherein the face verification image is a face image of a user having usage rights for the terminal, and the face verification image is captured by the user using a camera device of the terminal or using a terminal other than the terminal;
an unlocking module, configured to perform an unlocking operation when the similarity between the first face image and the face verification image is greater than or equal to a preset similarity;
the switching module is further configured to:
when the first obstruction is not detected by the distance sensor and the first face image is not included in the image acquired by the front camera, perform a screen-locking operation and switch from the screen-on state to the screen-off state;
the apparatus further comprises:
a second acquisition processing module, configured to, in a specified application scenario, acquire a second image through the front camera and perform image processing on the second image when a second obstruction is detected by the distance sensor and the distance between the second obstruction and the screen of the terminal is smaller than a second preset distance, wherein the specified application scenario is an application scenario requiring privacy protection;
a blurring processing module, configured to blur the content currently displayed on the screen when it is determined after image processing that the second image comprises a second face image, wherein the second face image is any face image different from the first face image;
a canceling module, configured to cancel the blurring of the content currently displayed on the screen when the second obstruction is not detected by the distance sensor and the second face image is not included in the image acquired by the front camera, and further configured to, upon receiving a cancel-blurring instruction, cancel the blurring of the content currently displayed on the screen, wherein the cancel-blurring instruction is triggered by the user's click operation on a cancel-blurring option when the user does not need the terminal to blur the currently displayed content.
3. A bright screen control apparatus, disposed in a terminal, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the method of claim 1.
4. A computer readable storage medium having instructions stored thereon, wherein the instructions, when executed by a processor, implement the steps of the method of claim 1.
CN201710875651.9A 2017-09-25 2017-09-25 Bright screen control method and device and storage medium Active CN109557999B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710875651.9A CN109557999B (en) 2017-09-25 2017-09-25 Bright screen control method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710875651.9A CN109557999B (en) 2017-09-25 2017-09-25 Bright screen control method and device and storage medium

Publications (2)

Publication Number Publication Date
CN109557999A CN109557999A (en) 2019-04-02
CN109557999B true CN109557999B (en) 2022-08-26

Family

ID=65862587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710875651.9A Active CN109557999B (en) 2017-09-25 2017-09-25 Bright screen control method and device and storage medium

Country Status (1)

Country Link
CN (1) CN109557999B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110597561A (en) * 2019-09-09 2019-12-20 上海赛连信息科技有限公司 Method, medium and device for displaying identification and computing equipment
WO2021092811A1 (en) * 2019-11-13 2021-05-20 Oppo广东移动通信有限公司 Proximity detection method, terminal and storage medium
CN111368670A (en) * 2020-02-26 2020-07-03 上海依图信息技术有限公司 Face recognition method and device, readable medium and system thereof
CN111383597B (en) * 2020-03-25 2021-07-06 武汉华星光电半导体显示技术有限公司 Pixel circuit and full-screen display equipment
CN111736725A (en) * 2020-06-10 2020-10-02 京东方科技集团股份有限公司 Intelligent mirror and intelligent mirror awakening method
CN114125143B (en) * 2020-08-31 2023-04-07 华为技术有限公司 Voice interaction method and electronic equipment
CN116112597B (en) * 2020-09-03 2023-10-20 荣耀终端有限公司 Electronic equipment with off-screen display function, method for displaying off-screen interface of electronic equipment and storage medium
CN113190119A (en) * 2021-05-06 2021-07-30 Tcl通讯(宁波)有限公司 Mobile terminal screen lighting control method and device, mobile terminal and storage medium
CN113674671A (en) * 2021-08-18 2021-11-19 惠科股份有限公司 Control method, peripheral controller and display device
CN116450067A (en) * 2022-01-10 2023-07-18 荣耀终端有限公司 Control method for screen-off display, electronic equipment and storage medium
CN114125148B (en) * 2022-01-11 2022-06-24 荣耀终端有限公司 Control method of electronic equipment operation mode, electronic equipment and readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104202483A (en) * 2014-08-20 2014-12-10 厦门美图移动科技有限公司 Display screen switch controller of mobile terminal
WO2015188516A1 (en) * 2014-06-13 2015-12-17 中兴通讯股份有限公司 Terminal application self-adaptive display method and device
CN105549739A (en) * 2015-12-10 2016-05-04 魅族科技(中国)有限公司 Screen lighting method and terminal
CN106295596A (en) * 2016-08-17 2017-01-04 深圳市金立通信设备有限公司 A kind of unlocking method based on recognition of face and terminal
CN106326867A (en) * 2016-08-26 2017-01-11 维沃移动通信有限公司 Face recognition method and mobile terminal
CN106599716A (en) * 2016-11-30 2017-04-26 广东欧珀移动通信有限公司 Message content protection method and device, and mobile terminal
CN107015745A (en) * 2017-05-19 2017-08-04 广东小天才科技有限公司 Screen operating method, device, terminal device and computer-readable recording medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015188516A1 (en) * 2014-06-13 2015-12-17 中兴通讯股份有限公司 Terminal application self-adaptive display method and device
CN104202483A (en) * 2014-08-20 2014-12-10 厦门美图移动科技有限公司 Display screen switch controller of mobile terminal
CN105549739A (en) * 2015-12-10 2016-05-04 魅族科技(中国)有限公司 Screen lighting method and terminal
CN106295596A (en) * 2016-08-17 2017-01-04 深圳市金立通信设备有限公司 A kind of unlocking method based on recognition of face and terminal
CN106326867A (en) * 2016-08-26 2017-01-11 维沃移动通信有限公司 Face recognition method and mobile terminal
CN106599716A (en) * 2016-11-30 2017-04-26 广东欧珀移动通信有限公司 Message content protection method and device, and mobile terminal
CN107015745A (en) * 2017-05-19 2017-08-04 广东小天才科技有限公司 Screen operating method, device, terminal device and computer-readable recording medium

Also Published As

Publication number Publication date
CN109557999A (en) 2019-04-02

Similar Documents

Publication Publication Date Title
CN109557999B (en) Bright screen control method and device and storage medium
CN106797416B (en) Screen control method and device
CN104615920B (en) Notification information display method and device
CN105159640B (en) Display interface rotating method and device and mobile terminal
US20170178289A1 (en) Method, device and computer-readable storage medium for video display
CN110662095B (en) Screen projection processing method and device, terminal and storage medium
US10292004B2 (en) Method, device and medium for acquiring location information
EP3179711B1 (en) Method and apparatus for preventing photograph from being shielded
EP3113466A1 (en) Method and device for warning
CN107798231B (en) Display method and device of operation interface, terminal and storage medium
CN106357934B (en) Screen locking control method and device
CN106527682B (en) Method and device for switching environment pictures
CN105868709B (en) Method and device for closing fingerprint identification function
CN111031177A (en) Screen recording method, device and readable storage medium
CN107132769B (en) Intelligent equipment control method and device
CN112929561B (en) Multimedia data processing method and device, electronic equipment and storage medium
CN111385456A (en) Photographing preview method and device and storage medium
CN107450950B (en) Method and device for processing starting instruction
CN112434338A (en) Picture sharing method and device, electronic equipment and storage medium
CN107656616B (en) Input interface display method and device and electronic equipment
CN106570381B (en) Fingerprint unlocking method and device
CN105786561B (en) Method and device for calling process
CN108647074B (en) Method, device, hardware device and medium for displaying dynamic information in screen locking state
CN107948876B (en) Method, device and medium for controlling sound box equipment
CN107832377B (en) Image information display method, device and system, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant