CN112286411A - Display mode control method and device, storage medium and electronic equipment - Google Patents

Display mode control method and device, storage medium and electronic equipment

Info

Publication number
CN112286411A
CN112286411A
Authority
CN
China
Prior art keywords
terminal
display mode
interaction
mode
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011069358.1A
Other languages
Chinese (zh)
Inventor
程驰
周佳
包英泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dami Technology Co Ltd
Original Assignee
Beijing Dami Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dami Technology Co Ltd filed Critical Beijing Dami Technology Co Ltd
Priority to CN202011069358.1A priority Critical patent/CN112286411A/en
Publication of CN112286411A publication Critical patent/CN112286411A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/60 - Analysis of geometric attributes
    • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107 - Static hand or arm
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation
    • G06V 40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 5/00 - Electrically-operated educational appliances
    • G09B 5/02 - Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geometry (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a display mode control method and device, a storage medium and an electronic device. The method collects an image through a camera of a first terminal, identifies the face region in the image, calculates the proportion of the image occupied by the face region, and sets the display mode of the first terminal to a first display mode when that proportion exceeds a proportion threshold. In this way, whether the proportion of the image occupied by the user's face exceeds the threshold is monitored in real time; when it does, the terminal can determine that the user is too close to the display screen and automatically switch to the eye-protection mode. This protects the user's eyesight, in particular the eyesight of students in online teaching, and gives students a better online learning experience.

Description

Display mode control method and device, storage medium and electronic equipment
Technical Field
The present application relates to the field of online education, and in particular, to a method and an apparatus for controlling a display mode, a storage medium, and an electronic device.
Background
With the development of the internet, smart devices are used more and more in daily life, and for young people in particular they are involved in nearly every daily activity. On the one hand, smart devices bring great convenience to daily life; on the other hand, prolonged use inevitably affects the user's eyesight. With the popularization of online education in particular, more and more students study online with smart devices for long periods of time, so how to protect students' eyesight while they study online is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the application provide a display mode control method and device, a computer storage medium and an electronic device, aiming to better protect students' eyesight during online teaching. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a method for controlling a display mode, where the method includes:
acquiring an image through a camera of a first terminal, and identifying a face area in the image;
calculating the proportion value of the face area in the image;
and when the proportion value is larger than a proportion threshold value, setting the display mode of the first terminal to be a first display mode.
Optionally, the display mode of the first terminal includes a first display mode and a second display mode;
the method further comprises the following steps:
and when the proportion value is in the proportion threshold range, setting the display mode of the first terminal to be a second display mode.
Optionally, the method further comprises:
detecting whether a second terminal and the first terminal are in a connection state;
when the second terminal is connected with the first terminal, starting a third display mode; the third display mode is used for displaying the display content of the first terminal on the second terminal.
Optionally, the starting a third display mode when the second terminal is in a connected state with the first terminal, further includes:
switching the interaction mode of the first terminal from a default interaction mode to a gesture interaction mode; the gesture interaction mode is an interaction mode in which an interaction gesture made by the user in the air is recognized through the camera of the first terminal and a target event corresponding to the interaction gesture is executed.
Optionally, after the third display mode is turned on, the method further includes:
and when the proportion value is larger than the proportion threshold value, closing the third display mode, switching the display mode of the first terminal from the second display mode to the first display mode, and switching the interaction mode of the first terminal from the gesture interaction mode to the default interaction mode.
Optionally, the method further comprises:
when the first terminal is in the gesture interaction mode, recognizing an interaction gesture in the image;
and when the interaction gesture is a preset interaction gesture, generating an interaction result picture based on the preset interaction gesture.
Optionally, when the interaction gesture is a preset interaction gesture, generating an interaction result screen based on the preset interaction gesture includes:
when the interaction gesture is a question answering interaction gesture, generating a question interaction result picture based on the question answering interaction gesture;
and displaying the question interaction result picture through the first terminal and the second terminal.
Optionally, the method further comprises:
when the proportion value is larger than the proportion threshold value, an eye protection prompt message is displayed through a display unit; the eye protection prompting message is used for reminding a user of keeping a proper distance from the first terminal.
Optionally, the first display mode is to adjust the brightness and/or the color tone of the display screen of the first terminal to a threshold range.
In a second aspect, an embodiment of the present application provides a device for controlling display modes, where the device includes:
the image acquisition module is used for acquiring an image through a camera of a first terminal and identifying a face area in the image;
the proportion calculation module is used for calculating the proportion value of the face area in the image;
and the display mode setting module is used for setting the display mode of the first terminal to be a first display mode when the proportion value is larger than a proportion threshold value.
In a third aspect, embodiments of the present application provide a computer storage medium having a plurality of instructions adapted to be loaded by a processor and to perform the above-mentioned method steps.
In a fourth aspect, an embodiment of the present application provides an electronic device, which may include a memory and a processor, wherein the memory stores a computer program adapted to be loaded by the processor to perform the above-mentioned method steps.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
when the scheme of the embodiment of the application is executed, the camera of the first terminal is used for collecting the image, the face area in the image is identified, the proportion value occupied by the face area in the image is calculated, and when the proportion value is larger than the proportion threshold value, the display mode of the first terminal is set to be the first display mode. Through the method, whether the proportion value occupied by the face area of the user in the image is larger than the proportion threshold value or not is monitored in real time, when the proportion value is detected to be larger than the proportion threshold value, the fact that the distance between the user and the terminal display screen is short can be determined, the display mode of the terminal is automatically set to the eye protection mode, accordingly, the eyesight of the user is protected, particularly, the eyesight of students in online teaching is well protected, and the students can learn online and have better experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of a system architecture of a control method for a display mode of the present application;
fig. 2 is a schematic flowchart of a method for controlling a display mode according to an embodiment of the present disclosure;
fig. 3 is a flowchart illustrating a method for controlling a display mode according to an embodiment of the present application;
fig. 4 is a schematic display interface diagram of a control method of a display mode according to an embodiment of the present disclosure;
fig. 5 is a display interface schematic diagram of a control method of a display mode according to an embodiment of the present application;
fig. 6 is a display interface schematic diagram of a control method of a display mode according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a control device for a display mode according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the embodiments of the present application more obvious and understandable, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art.
Referring to fig. 1, a schematic diagram of an exemplary system architecture 100 to which a display mode control method or a display mode control apparatus according to an embodiment of the present application may be applied is shown.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or transmit messages and the like. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to smart phones, tablets, portable computers, desktop computers, televisions, and the like.
The terminal devices 101, 102, 103 in the present application may be terminal devices that provide various services. For example, the terminal device 103 (which may also be the terminal device 101 or 102) acquires an image through its camera, recognizes the face region in the image, calculates the proportion of the image occupied by the face region, and sets the display mode of the first terminal to the first display mode when the proportion is greater than the proportion threshold.
It should be noted that the control method of the display mode provided in the embodiments of the present application may be executed by one or more of the terminal devices 101, 102, and 103, and/or the server 105, and accordingly, the control device of the display mode provided in the embodiments of the present application is generally disposed in the corresponding terminal device, and/or the server 105, but the present application is not limited thereto.
In the following method embodiments, for convenience of description, only the main execution subject of each step is described as an electronic device.
Please refer to fig. 2, which is a flowchart illustrating a method for controlling a display mode according to an embodiment of the present disclosure. As shown in fig. 2, the method of the embodiment of the present application may include the steps of:
s201, collecting an image through a camera of a first terminal, and identifying a face area in the image.
The first terminal can be an intelligent device such as a mobile phone, a tablet, a notebook computer or a desktop computer.
Generally, an image containing a human face is acquired in real time through the front-facing camera of the first terminal, and the face region in the image can be identified by face detection. Face detection means detecting a face in an image and marking its position, so it completes two tasks: judging whether the image contains a face region, and, if a face exists, predicting its position. Face detection techniques can be divided into a pre-deep-learning period and a deep-learning period. In the pre-deep-learning period, traditional computer vision algorithms were applied to face detection, relying on hand-crafted features that were then used to train a detector. In the deep-learning period, convolutional neural networks are applied to face detection in two general ways: one applies a general-purpose object detection network to the face detection task, the other designs a dedicated face detection network.
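For illustration, a minimal Python sketch of this step is shown below. It assumes OpenCV's bundled Haar-cascade frontal-face detector as a stand-in for the "traditional computer vision" branch described above; the application itself does not prescribe a particular detector, and a CNN-based one could be substituted without changing the rest of the flow.

```python
# Illustrative sketch only; the detector choice (Haar cascade) is an assumption.
import cv2

def detect_face_region(frame):
    """Return the largest detected face as (x, y, w, h), or None if no face is found."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    # Assume the largest face belongs to the user sitting in front of the screen.
    return max(faces, key=lambda f: f[2] * f[3])
```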
S202, calculating the proportion value of the face area in the image.
Generally, after the position of the face region in the image is determined by face detection, the area of the face region can be calculated, and then the ratio between the area of the face region and the area of the whole image can be calculated; this area ratio is the proportion value of the face region in the image.
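Continuing the sketch above, the proportion value can be computed as the ratio of the face bounding-box area to the full image area; treating the bounding box as the face region is an assumption made here for simplicity.

```python
def face_area_ratio(frame, face_box):
    """Proportion of the image occupied by the face bounding box (0.0 if no face)."""
    if face_box is None:
        return 0.0
    x, y, w, h = face_box
    image_area = frame.shape[0] * frame.shape[1]  # height * width in pixels
    return (w * h) / image_area
```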
And S203, when the proportion value is larger than the proportion threshold value, setting the display mode of the first terminal as a first display mode.
The proportion threshold is preset as the proportion of the captured image occupied by the student's face region when the student keeps a proper distance from the screen of the smart terminal. The first display mode is one display mode of the terminal; it can be understood that the display modes of the terminal can be divided into two types, the first display mode and a default display mode. The first display mode may adjust the display screen brightness of the first terminal into a brightness threshold range and the color tone into a tone threshold range, where brightness and tone values within those ranges protect the student's eyesight without affecting the student's use of the terminal for learning. The default display mode adjusts the brightness and tone of the display screen according to the brightness and tone values set by the student on the first terminal and displays the picture accordingly.
For example, the proportion threshold may be set to 50%, the preset brightness value in the eye-protection mode may be set to 40%, and the preset color tone may be set to a warm tone of 20%. When the proportion of the image occupied by the face region is detected to be greater than 50%, it can be determined that the student is currently too close to the first terminal, so the brightness of the display screen of the first terminal is adjusted to 40% and its color tone is adjusted to a warm tone of 20%.
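A hedged sketch of this threshold check, using the example values above; set_brightness and set_warm_tone are hypothetical hooks standing in for whatever display API the terminal platform actually exposes.

```python
RATIO_THRESHOLD = 0.50      # example threshold from the description
EYE_CARE_BRIGHTNESS = 0.40  # example eye-protection brightness (40%)
EYE_CARE_WARM_TONE = 0.20   # example warm-tone level (20%)

def apply_display_mode(ratio, set_brightness, set_warm_tone):
    """Switch to the first (eye-protection) display mode when the user is too close."""
    if ratio > RATIO_THRESHOLD:
        set_brightness(EYE_CARE_BRIGHTNESS)   # platform-specific call, assumed here
        set_warm_tone(EYE_CARE_WARM_TONE)     # platform-specific call, assumed here
        return "first_display_mode"
    return "default_display_mode"
```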
When the scheme of the embodiment of the application is executed, an image is collected through the camera of the first terminal, the face region in the image is identified, the proportion of the image occupied by the face region is calculated, and when that proportion exceeds a proportion threshold, the display mode of the first terminal is set to the first display mode. In this way, whether the proportion of the image occupied by the user's face exceeds the threshold is monitored in real time; when it does, the terminal can determine that the user is too close to the display screen and automatically switch to the eye-protection mode. This protects the user's eyesight, in particular the eyesight of students in online teaching, and gives students a better online learning experience.
Please refer to fig. 3, which is a flowchart illustrating a method for controlling a display mode according to an embodiment of the present disclosure.
The first terminal related to the embodiment of the application may have two display modes and two interaction modes, the display modes may include a first display mode and a second display mode, and the interaction modes may include a default interaction mode and a gesture interaction mode.
Specifically, the first display mode may adjust the brightness value and/or the color tone of the display screen into corresponding threshold ranges; the first display mode may also be referred to as the eye-protection mode, that is, a display screen brightness or tone within the threshold range helps protect the user's vision. The second display mode adjusts the brightness and/or tone of the display screen of the first terminal to the brightness or tone value set by the user; the second display mode may also be referred to as the default display mode, that is, the brightness or tone of the display screen of the first terminal is adjusted according to the user's settings.
Specifically, the gesture interaction mode is an interaction mode in which an interaction gesture made by the user in the air is recognized through the camera and a target event corresponding to the interaction gesture is executed; the default interaction mode is an interaction mode in which an interaction operation of the user is detected through the touch screen and a target event corresponding to the interaction operation is executed.
As shown in fig. 3, the method of the embodiment of the present application may include the steps of:
s301, acquiring an image through a camera of the first terminal, and identifying a face area in the image.
Specifically, see S201 in fig. 2, which is not described herein again.
S302, calculating a proportion value of the face area in the image, and judging whether the proportion value is larger than a proportion threshold value.
Specifically, see S202 in fig. 2, which is not described herein again.
And S303, when the proportion value is larger than the proportion threshold value, setting the display mode of the first terminal as a first display mode.
The first display mode is to adjust the brightness of the display screen of the first terminal to a brightness threshold range and adjust the color tone to a color tone threshold range.
When the display mode of the first terminal is the first display mode, the interaction mode of the first terminal is the default interaction mode. In the default interaction mode, the user may perform touch operations with a finger on the touch screen of the terminal, and the first terminal recognizes the touch operations to carry out the interaction; this applies when the first terminal is an electronic device with a touch screen, such as a mobile phone or a tablet. Alternatively, when the first terminal is an electronic device without a touch screen, such as a notebook or desktop computer, the user operates the first terminal with a mouse, and the first terminal recognizes the mouse click operations to carry out the interaction.
For example, the proportion threshold may be set to 50%, the brightness threshold range in the eye-protection mode may be set to 30%-40%, and the tone threshold range may be set to a warm tone of 20%-30%. When the proportion of the image occupied by the face region is detected to be greater than 50%, it can be determined that the student is currently too close to the first terminal, so the brightness of the display screen of the first terminal is adjusted to 40% and its color tone is adjusted to a warm tone of 20%.
And S304, displaying an eye protection prompt message through a display unit of the first terminal.
The eye-protection prompt message is used to remind the user to keep a proper distance from the first terminal.
It can be understood that whether the user is too close to the display screen of the first terminal is judged by calculating the proportion of the image occupied by the face region. When the proportion exceeds the threshold, it can be determined that the user is too close to the display screen, which affects the user's eyesight. After the display mode is set to the eye-protection mode, an eye-protection prompt message is further displayed through the display unit of the first terminal. The message reminds the user that they are too close to the display screen, and after seeing it the user can adjust to a proper distance from the display screen of the first terminal. Referring to the schematic display screen of the first terminal shown in fig. 4, 410 is the teaching video display interface and 420 is the eye-protection prompt message; when the first terminal is in the eye-protection mode, the prompt message is also shown on the display screen.
It can be understood that the eye-protection prompt message may be displayed through the display unit when the proportion is detected to exceed the threshold within a short period of time, for example within 5 minutes or 10 minutes or any other period; the period may be set freely according to the actual situation and is not limited in any way by the embodiment of the present application.
It can be understood that, in addition to the eye-protection prompt message displayed through the display unit, a voice eye-protection prompt may be played through a voice unit to remind the user to keep a proper distance from the first terminal; the voice prompt may likewise be played when the proportion detected within a short period exceeds the threshold. For example, the voice prompt may be "please keep your distance from the screen; staying too close to the screen for a long time may affect your eyesight".
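One possible reading of the timing rule above is that the prompt (on screen or by voice) fires only once the proportion has stayed above the threshold for a configurable window; the sketch below encodes that assumption, with illustrative names and a 5-minute default.

```python
import time

class EyeCarePromptTimer:
    """Fire the eye-protection prompt once the proportion has stayed above the
    threshold for a sustained window (assumed interpretation of the timing rule)."""

    def __init__(self, window_seconds=5 * 60, threshold=0.50):
        self.window_seconds = window_seconds
        self.threshold = threshold
        self._exceeded_since = None

    def should_prompt(self, ratio):
        if ratio > self.threshold:
            if self._exceeded_since is None:
                self._exceeded_since = time.monotonic()
            return time.monotonic() - self._exceeded_since >= self.window_seconds
        self._exceeded_since = None   # reset once the user moves back
        return False
```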
S305, when the proportion value is in the proportion threshold range, setting the display mode of the first terminal to be a second display mode, and detecting whether the second terminal and the first terminal are in a connection state.
The proportion threshold range is the range of proportions that the user's face region occupies in the image when the user is at a proper distance from the screen. The second display mode is the default display mode, that is, the brightness or tone of the display screen of the first terminal can be adjusted according to the user's settings, and the display screen of the first terminal displays the picture with that brightness or tone. The display screen of the second terminal may be larger than that of the first terminal; the second terminal may be a smart device such as a computer, a tablet or a smart TV, and the second terminal and the first terminal may be connected through a wired network, a wireless network or Bluetooth. For example, when the first terminal is a mobile phone or a tablet, the second terminal may be a large-screen smart device such as a computer or a smart TV; when the first terminal is a computer, the second terminal may be a large-screen smart device such as a smart TV.
S306, when the second terminal and the first terminal are in a connection state, the third display mode is started, and the interaction mode of the first terminal is switched from the default interaction mode to the gesture interaction mode.
The third display mode is used for displaying the display content of the first terminal on the second terminal. The gesture interaction mode is an interaction mode in which an interaction gesture made by the user in the air is recognized through the camera of the first terminal and a target event corresponding to the interaction gesture is executed. The default interaction mode is an interaction mode in which an interaction operation of the user is detected through the touch screen and a target event corresponding to the interaction operation is executed.
It can be understood that the interaction gesture may be a preset interaction gesture, and when the user makes the preset interaction gesture, the terminal recognizes the preset interaction gesture and executes a corresponding target event according to the preset interaction gesture.
For example, the application scene of the present application is online education, and the gesture interaction mode can be enabled in a game session or a question-answering session. In the question-answering session, a different interaction gesture is designed in advance for each option of a multiple-choice question; when a student answers, the student makes the gesture for an option, the first terminal recognizes the option gesture, and the display unit shows the answer result interface.
It is understood that when the proportion value is within the proportion threshold range and the second terminal is in a connected state with the first terminal, the display mode of the first terminal is set to the second display mode, the interaction mode of the first terminal is set to the gesture interaction mode, and the third display mode is turned on.
S307, when the second terminal and the first terminal are not in a connection state, the display mode of the first terminal is a second display mode, and the interaction mode of the first terminal is a default interaction mode.
It can be understood that, when the proportion value is within the proportion threshold range and the second terminal is not in a connected state with the first terminal, the display mode of the first terminal is set to the second display mode, and the interaction mode of the first terminal is set to the default interaction mode.
And S308, recognizing the interaction gesture in the image, and generating a question interaction result picture based on the question-answering interaction gesture.
Generally, an image is collected through the camera and recognized by a pre-trained interaction gesture recognition network to identify the user's gesture in the image. The gesture is compared with the preset interaction gestures: a similarity value between the gesture and each preset interaction gesture is calculated, and the preset interaction gesture with the largest similarity value is selected as the gesture made by the user. When the option corresponding to that preset interaction gesture is the correct option, a correct-answer display screen is generated, which can show the text "correct answer" and may also show an analysis of each option; when the option corresponding to the preset interaction gesture is a wrong option, a wrong-answer display screen is generated, which can show the text "wrong answer" and may also show an analysis of each option.
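The application does not specify the gesture representation or the similarity measure; the sketch below assumes the recognition network outputs a feature embedding per gesture and uses cosine similarity, purely as one plausible instantiation with illustrative names.

```python
import numpy as np

def match_option_gesture(gesture_embedding, preset_embeddings):
    """preset_embeddings: {option label, e.g. "A": reference embedding}.
    Returns the option whose preset gesture is most similar to the user's gesture."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(preset_embeddings,
               key=lambda label: cosine(gesture_embedding, preset_embeddings[label]))

def build_answer_screen(selected_option, correct_option, option_analysis):
    """Assemble the data shown on the question interaction result picture."""
    return {
        "verdict": "correct answer" if selected_option == correct_option else "wrong answer",
        "selected": selected_option,
        "analysis": option_analysis,   # per-option explanation text
    }
```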
S309, displaying the question interaction result picture through the first terminal and the second terminal.
Generally, based on the question interaction result picture generated in S308, and since the gesture interaction mode is used together with the large-screen (third display) mode rather than only the default display mode, the first terminal and the second terminal may be connected through wireless communication such as a wireless network or Bluetooth; when the large-screen mode is on, the display content of the first terminal is synchronously displayed on the second terminal in real time.
For example, a multiple-choice question has four options A, B, C and D, and each option corresponds to a preset interaction gesture. When the first terminal recognizes the user's gesture, it determines the preset interaction gesture corresponding to that gesture. When the option corresponding to the preset interaction gesture is the correct option, a correct-answer display screen is shown, which can display the text "correct answer" and may also display an analysis of each option, as illustrated in the schematic diagram of the display screen shown in fig. 5. When the option corresponding to the preset interaction gesture is a wrong option, a wrong-answer display screen is shown, which can display the text "wrong answer" and may also display an analysis of each option, as illustrated in the schematic diagram of the display screen shown in fig. 6.
It can be understood that, in the method described in this embodiment of the present application, the proportion of the image occupied by the face region is detected in real time. In S305 to S309 the proportion is smaller than the threshold, and both the display mode and the interaction mode of the first terminal are adjusted accordingly; when the proportion is later detected to be greater than the threshold, the third display mode is turned off, the display mode of the first terminal is switched from the second display mode to the first display mode, and the interaction mode of the first terminal is switched from the gesture interaction mode to the default interaction mode.
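The switching rule in this paragraph can be summarised as a small state controller; the sketch below is a schematic rendering with illustrative attribute names, not an implementation taken from the application.

```python
class TerminalModeController:
    """Tracks display mode, interaction mode and the 'third display mode' (mirroring)."""

    def __init__(self):
        self.display_mode = "second"       # default display mode
        self.interaction_mode = "default"  # touch / mouse interaction
        self.mirroring_on = False          # third display mode: cast to second terminal

    def on_ratio_update(self, ratio, threshold=0.50, second_terminal_connected=False):
        if ratio > threshold:
            # User too close: eye-protection mode, mirroring off, back to default input.
            self.mirroring_on = False
            self.display_mode = "first"
            self.interaction_mode = "default"
        else:
            self.display_mode = "second"
            self.mirroring_on = second_terminal_connected
            self.interaction_mode = "gesture" if second_terminal_connected else "default"
```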
When the scheme of the embodiment of the application is executed, an image is collected through the camera of the first terminal, the face region in the image is identified, the proportion of the image occupied by the face region is calculated, and when that proportion exceeds a proportion threshold, the display mode of the first terminal is set to the first display mode. In this way, whether the proportion of the image occupied by the user's face exceeds the threshold is monitored in real time; when it does, the terminal can determine that the user is too close to the display screen and automatically switch to the eye-protection mode. This protects the user's eyesight, in particular the eyesight of students in online teaching, and gives students a better online learning experience.
Please refer to fig. 7, which is a schematic structural diagram of a display mode control device according to an embodiment of the present application. The display mode control device may be implemented as all or part of a terminal by software, hardware, or a combination of both. The apparatus 700 comprises:
the image acquisition module 710 is configured to acquire an image through a camera of a first terminal and identify a face region in the image;
a proportion calculating module 720, configured to calculate a proportion value of the face region in the image;
a display mode setting module 730, configured to set the display mode of the first terminal to the first display mode when the proportion value is greater than the proportion threshold.
Optionally, the apparatus 700 further comprises:
and the second display mode setting module is used for setting the display mode of the first terminal to be the second display mode when the proportion value is in the proportion threshold range.
Optionally, the second display mode setting module further includes:
a connection state detection unit, configured to detect whether a second terminal and the first terminal are in a connection state;
the connection unit is used for starting a third display mode when the second terminal and the first terminal are in a connection state; the third display mode is used for displaying the display content of the first terminal on the second terminal.
Optionally, the second display mode setting module further includes:
the mode switching first unit is used for switching the interaction mode of the first terminal from a default interaction mode to a gesture interaction mode; the gesture interaction mode is an interaction mode in which an interaction gesture made by the user in the air is recognized through the camera of the first terminal and a target event corresponding to the interaction gesture is executed.
Optionally, the apparatus 700 further comprises:
and the mode switching second unit is used for closing the third display mode, switching the display mode of the first terminal from the second display mode to the first display mode, and switching the interaction mode of the first terminal from the gesture interaction mode to the default interaction mode when the proportion value is larger than the proportion threshold value.
Optionally, the second display mode setting module further includes:
a gesture interaction first unit, configured to recognize an interaction gesture in the image when the first terminal is in the gesture interaction mode;
and the gesture interaction second unit is used for generating an interaction result picture based on the preset interaction gesture when the interaction gesture is the preset interaction gesture.
Optionally, the second display mode setting module further includes:
a gesture interaction third unit, configured to generate a question interaction result screen based on the question-answering interaction gesture when the interaction gesture is a question-answering interaction gesture;
and the gesture interaction fourth unit is used for displaying the question interaction result picture through the first terminal and the second terminal.
Optionally, the apparatus 700 further comprises:
the eye protection prompting unit is used for displaying an eye protection prompting message through the display unit when the proportion value is larger than the proportion threshold value; the eye protection prompting message is used for reminding a user of keeping a proper distance from the first terminal.
Optionally, the apparatus 700 further comprises:
and the first display mode setting unit is used for adjusting the brightness and/or the tone of the display screen of the first terminal to a threshold range in the first display mode.
When the scheme of the embodiment of the application is executed, an image is collected through the camera of the first terminal, the face region in the image is identified, the proportion of the image occupied by the face region is calculated, and when that proportion exceeds a proportion threshold, the display mode of the first terminal is set to the first display mode. In this way, whether the proportion of the image occupied by the user's face exceeds the threshold is monitored in real time; when it does, the terminal can determine that the user is too close to the display screen and automatically switch to the eye-protection mode. This protects the user's eyesight, in particular the eyesight of students in online teaching, and gives students a better online learning experience.
Referring to fig. 8, a schematic structural diagram of an electronic device according to an embodiment of the present application is shown, where the electronic device may be used to implement the control method of the display mode in the foregoing embodiment. Specifically, the method comprises the following steps:
the memory 820 may be used to store software programs and modules, and the processor 890 performs various functional applications and data processing by operating the software programs and modules stored in the memory 820. The memory 820 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to use of the terminal device (such as audio data, a phone book, etc.)
And the like. Further, storage 820 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 820 may also include a memory controller to provide the processor 590 and the input unit 530 access to the memory 820.
The input unit 830 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 830 may include a touch-sensitive surface 831 (e.g., a touch screen, touchpad, or touch frame). The touch-sensitive surface 831, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 831 (e.g., operations by a user on or near the touch-sensitive surface 831 using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predefined program. Alternatively, the touch-sensitive surface 831 can include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 890, and receives and executes commands from the processor 890. In addition, the touch-sensitive surface 831 can be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves.
The display unit 840 may be used to display information input by or provided to a user and various graphical user interfaces of the terminal device, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 840 may include a Display panel 841, and the Display panel 841 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like, as an option. Further, the touch-sensitive surface 831 can overlay the display panel 841 such that when a touch operation is detected at or near the touch-sensitive surface 831, it can communicate to the processor 890 to determine the type of touch event, and the processor 890 can then provide a corresponding visual output on the display panel 841 in accordance with the type of touch event. Although in FIG. 8, touch-sensitive surface 831 and display panel 841 are implemented as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 831 may be integrated with display panel 841 to implement input and output functions.
The processor 890 is the control center of the terminal device, connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby monitoring the terminal device as a whole. Optionally, the processor 890 may include one or more processing cores; the processor 890 may integrate an application processor, which primarily handles the operating system, user interfaces and applications, and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor may not be integrated into the processor 890.
Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, the terminal device further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include steps of implementing a control method of the display mode.
An embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a plurality of instructions, where the instructions are suitable for being loaded by a processor and executing the above method steps, and a specific execution process may refer to specific descriptions of the embodiments shown in fig. 2 and fig. 3, which are not described herein again.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory or a random access memory.
The above disclosure is only for the purpose of illustrating the preferred embodiments of the present application and is not intended to limit the scope of the present application; the present application is not limited thereto, and all equivalent variations and modifications remain within the scope of the present application.

Claims (12)

1. A method for controlling a display mode, the method comprising:
acquiring an image through a camera of a first terminal, and identifying a face area in the image;
calculating the proportion value of the face area in the image;
and when the proportion value is larger than a proportion threshold value, setting the display mode of the first terminal to be a first display mode.
2. The method according to claim 1, wherein the display mode of the first terminal comprises a first display mode and a second display mode;
the method further comprises the following steps:
and when the proportion value is in the proportion threshold range, setting the display mode of the first terminal to be a second display mode.
3. The method of claim 2, further comprising:
detecting whether a second terminal and the first terminal are in a connection state;
when the second terminal is connected with the first terminal, starting a third display mode; the third display mode is used for displaying the display content of the first terminal on the second terminal.
4. The method of claim 3, wherein the turning on a third display mode while the second terminal is in a connected state with the first terminal further comprises:
switching the interaction mode of the first terminal from a default interaction mode to a gesture interaction mode; the gesture interaction mode is an interaction mode in which an interaction gesture made by the user in the air is recognized through the camera of the first terminal and a target event corresponding to the interaction gesture is executed.
5. The method of claim 4, wherein after the turning on the third display mode, further comprising:
and when the proportion value is larger than the proportion threshold value, closing the third display mode, switching the display mode of the first terminal from the second display mode to the first display mode, and switching the interaction mode of the first terminal from the gesture interaction mode to the default interaction mode.
6. The method of claim 4, further comprising:
when the first terminal is in the gesture interaction mode, recognizing an interaction gesture in the image;
and when the interaction gesture is a preset interaction gesture, generating an interaction result picture based on the preset interaction gesture.
7. The method according to claim 6, wherein when the interaction gesture is a preset interaction gesture, generating an interaction result screen based on the preset interaction gesture comprises:
when the interaction gesture is a question answering interaction gesture, generating a question interaction result picture based on the question answering interaction gesture;
and displaying the question interaction result picture through the first terminal and the second terminal.
8. The method of claim 1, further comprising:
when the proportion value is larger than the proportion threshold value, an eye protection prompt message is displayed through a display unit; the eye protection prompting message is used for reminding a user of keeping a proper distance from the first terminal.
9. The method of claim 1, wherein the first display mode is adjusting a display screen brightness and/or a color tone of the first terminal to a threshold range.
10. An apparatus for controlling a display mode, the apparatus comprising:
the image acquisition module is used for acquiring an image through a camera of a first terminal and identifying a face area in the image;
the proportion calculation module is used for calculating the proportion value of the face area in the image;
and the display mode setting module is used for setting the display mode of the first terminal to be a first display mode when the proportion value is larger than a proportion threshold value.
11. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to perform the method steps according to any of claims 1 to 10.
12. An electronic device, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 10.
CN202011069358.1A 2020-09-30 2020-09-30 Display mode control method and device, storage medium and electronic equipment Pending CN112286411A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011069358.1A CN112286411A (en) 2020-09-30 2020-09-30 Display mode control method and device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011069358.1A CN112286411A (en) 2020-09-30 2020-09-30 Display mode control method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN112286411A true CN112286411A (en) 2021-01-29

Family

ID=74422092

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011069358.1A Pending CN112286411A (en) 2020-09-30 2020-09-30 Display mode control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112286411A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113253837A (en) * 2021-04-01 2021-08-13 作业帮教育科技(北京)有限公司 Air writing method and device, online live broadcast system and computer equipment
CN113485619A (en) * 2021-07-13 2021-10-08 腾讯科技(深圳)有限公司 Information collection table processing method and device, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092343A (en) * 2013-01-06 2013-05-08 深圳创维数字技术股份有限公司 Control method based on camera and mobile terminal
CN106603667A (en) * 2016-12-16 2017-04-26 北京小米移动软件有限公司 Screen information sharing method and device
US20180204111A1 (en) * 2013-02-28 2018-07-19 Z Advanced Computing, Inc. System and Method for Extremely Efficient Image and Pattern Recognition and Artificial Intelligence Platform
CN109191802A (en) * 2018-07-20 2019-01-11 北京旷视科技有限公司 Method, apparatus, system and storage medium for sight protectio prompt
CN109451164A (en) * 2018-11-21 2019-03-08 惠州Tcl移动通信有限公司 Intelligent terminal and its eye care method, the device with store function
CN109474738A (en) * 2018-10-30 2019-03-15 努比亚技术有限公司 Terminal and its eyeshield mode control method, computer readable storage medium
CN109683703A (en) * 2018-10-30 2019-04-26 努比亚技术有限公司 A kind of display control method, terminal and computer readable storage medium
CN109816718A (en) * 2017-11-22 2019-05-28 腾讯科技(深圳)有限公司 A kind of method, apparatus and storage medium of play cuing

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092343A (en) * 2013-01-06 2013-05-08 深圳创维数字技术股份有限公司 Control method based on camera and mobile terminal
US20180204111A1 (en) * 2013-02-28 2018-07-19 Z Advanced Computing, Inc. System and Method for Extremely Efficient Image and Pattern Recognition and Artificial Intelligence Platform
CN106603667A (en) * 2016-12-16 2017-04-26 北京小米移动软件有限公司 Screen information sharing method and device
CN109816718A (en) * 2017-11-22 2019-05-28 腾讯科技(深圳)有限公司 A kind of method, apparatus and storage medium of play cuing
CN109191802A (en) * 2018-07-20 2019-01-11 北京旷视科技有限公司 Method, apparatus, system and storage medium for sight protectio prompt
CN109474738A (en) * 2018-10-30 2019-03-15 努比亚技术有限公司 Terminal and its eyeshield mode control method, computer readable storage medium
CN109683703A (en) * 2018-10-30 2019-04-26 努比亚技术有限公司 A kind of display control method, terminal and computer readable storage medium
CN109451164A (en) * 2018-11-21 2019-03-08 惠州Tcl移动通信有限公司 Intelligent terminal and its eye care method, the device with store function

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113253837A (en) * 2021-04-01 2021-08-13 作业帮教育科技(北京)有限公司 Air writing method and device, online live broadcast system and computer equipment
CN113485619A (en) * 2021-07-13 2021-10-08 腾讯科技(深圳)有限公司 Information collection table processing method and device, electronic equipment and storage medium
CN113485619B (en) * 2021-07-13 2024-03-19 腾讯科技(深圳)有限公司 Information collection table processing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US11227599B2 (en) Methods and user interfaces for voice-based control of electronic devices
WO2020057257A1 (en) Application interface switching method and mobile terminal
WO2013133618A1 (en) Method of controlling at least one function of device by using eye action and device for performing the method
CN109613958A (en) A kind of terminal equipment control method and terminal device
CN112286347A (en) Eyesight protection method, device, storage medium and terminal
CN112286411A (en) Display mode control method and device, storage medium and electronic equipment
US20230015943A1 (en) Scratchpad creation method and electronic device
CN111857484A (en) Screen brightness adjusting method and device, electronic equipment and readable storage medium
CN112287767A (en) Interaction control method, device, storage medium and electronic equipment
CN112698895A (en) Display method, device, equipment and medium of electronic equipment
US20170357568A1 (en) Device, Method, and Graphical User Interface for Debugging Accessibility Information of an Application
US10007418B2 (en) Device, method, and graphical user interface for enabling generation of contact-intensity-dependent interface responses
US20210382736A1 (en) User interfaces for calibrations and/or synchronizations
CN111930971A (en) Online teaching interaction method and device, storage medium and electronic equipment
CN112150777B (en) Intelligent operation reminding device and method
CN111651102B (en) Online teaching interaction method and device, storage medium and electronic equipment
CN111369848B (en) Courseware content interaction based method and device, storage medium and electronic equipment
CN106990843B (en) Parameter calibration method of eye tracking system and electronic equipment
CN111638918A (en) Method and apparatus for presenting information
JP2014224877A (en) Learning support system, display control method, program, and information storage medium
CN110209242A (en) Push button function binding, push button function call method, device and projection control equipment
Vidal Jr et al. Extending Smartphone-Based Hand Gesture Recognition for Augmented Reality Applications with Two-Finger-Pinch and Thumb-Orientation Gestures
US20230396854A1 (en) Multilingual captions
CN110363161B (en) Reading assisting method and system
CN110661919B (en) Multi-user display method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210129

RJ01 Rejection of invention patent application after publication