CN112540649A - Rotation control method of display screen and notebook computer


Info

Publication number
CN112540649A
CN112540649A (application CN202011437080.9A; granted as CN112540649B)
Authority
CN
China
Prior art keywords
coordinate
display screen
user
eye
current
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011437080.9A
Other languages
Chinese (zh)
Other versions
CN112540649B (en)
Inventor
黄鹣
徐建平
叶继丰
杨国军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chuangzhicheng Technology Co ltd
Original Assignee
Shenzhen Chuangzhicheng Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chuangzhicheng Technology Co ltd filed Critical Shenzhen Chuangzhicheng Technology Co ltd
Priority to CN202011437080.9A priority Critical patent/CN112540649B/en
Publication of CN112540649A publication Critical patent/CN112540649A/en
Application granted granted Critical
Publication of CN112540649B publication Critical patent/CN112540649B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1654Details related to the display arrangement, including those related to the mounting of the display in the housing the display being detachable, e.g. for remote use
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1656Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories
    • G06F1/166Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories related to integrated arrangements for adjusting the position of the main body with respect to the supporting surface, e.g. legs for adjusting the tilt angle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/171Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the technical field of notebook computers, and discloses a rotation control method of a display screen and a notebook computer. The method comprises the following steps: acquiring a face image of a user in front of a display screen, wherein the face image is provided with a coordinate system; extracting a face contour from the face image; judging whether the face contour deviates from a preset coordinate range; if the face contour deviates, judging whether the face contour meets the front-view screen condition; if yes, determining the current coordinates, in the coordinate system, of the two-eye area in the face image; and controlling the display screen to rotate according to the current coordinates and reference coordinates so as to keep the face contour of the user within the preset coordinate range. The method can therefore intelligently track changes in the position of the user's head and automatically adjust the angle of the display screen to meet the user's screen-viewing needs, without the user having to adjust the angle manually, thereby improving the user experience.

Description

Rotation control method of display screen and notebook computer
Technical Field
The invention relates to the technical field of notebook computers, in particular to a rotation control method of a display screen and a notebook computer.
Background
The display screen of an existing notebook computer is rotatable, and its angle can be adjusted according to the viewing needs of the user so that the user can view the display screen more clearly.
However, in many working scenarios the user cannot always maintain a fixed sitting posture, especially a fixed head posture. When the user changes his head posture in search of a more comfortable position, he often needs to manually adjust the angle of the display screen in order to view it clearly again. The existing approaches are therefore not intelligent enough and are troublesome.
Disclosure of Invention
An object of an embodiment of the present invention is to provide a rotation control method for a display screen and a notebook computer, which can automatically adjust the display screen to meet the viewing needs that arise when the user's head moves.
In a first aspect, an embodiment of the present invention provides a rotation control method for a display screen, which is applied to a notebook computer, and the method includes:
acquiring a face image of a user in front of the display screen, wherein the face image is provided with a coordinate system;
extracting a face contour in the face image;
judging whether the face contour deviates from a preset coordinate range;
if the human face contour deviates, judging whether the human face contour meets the front-view screen condition;
if yes, determining the current coordinates of the two-eye area in the face image in the coordinate system;
and controlling the display screen to rotate according to the current coordinate and the reference coordinate so as to keep the face contour of the user within the preset coordinate range.
Optionally, the determining whether the face contour meets the front-view screen condition includes:
and judging whether the face contour meets the front-view screen condition or not according to a deep learning algorithm.
Optionally, the determining whether the face contour meets the front-view screen condition according to a deep learning algorithm includes:
inputting the image of the face contour into an SVM classifier to obtain the probability that the face contour meets the front-view screen condition;
judging whether the probability is greater than or equal to a preset threshold value;
if so, the face contour meets the front-view screen condition;
if not, the face contour does not meet the front-view screen condition.
Optionally, the controlling the display screen to rotate according to the current coordinate and the reference coordinate to keep the face contour of the user within the preset coordinate range includes:
judging whether the current left-eye coordinate accords with the reference range of the left-eye reference coordinate or not, or whether the current right-eye coordinate accords with the reference range of the right-eye reference coordinate or not;
if so, maintaining the position state of the display screen;
if not, controlling the display screen to rotate so as to keep the face contour of the user within the preset coordinate range.
Optionally, the controlling the display screen to rotate so as to keep the face contour of the user within the preset coordinate range includes:
determining the offset direction of the current line segment relative to the reference line segment and the target rotation direction corresponding to the offset direction;
and controlling the display screen to gradually rotate in the target rotating direction according to the unit stepping rotating amount so as to keep the human face contour of the user within the preset coordinate range.
Optionally, the controlling the display screen to gradually rotate according to a unit stepping rotation amount in the target rotation direction so as to keep the face contour of the user within the preset coordinate range includes:
acquiring an i-th face image after the display screen rotates by the unit stepping rotation amount each time in the target rotation direction, wherein i is a positive integer;
judging whether the current coordinate in the ith human face image conforms to the reference range of the reference coordinate;
if so, controlling the display screen to stop rotating;
and if not, assigning i+1 to i, and returning to the step of acquiring the i-th face image after the display screen rotates by the unit stepping rotation amount each time in the target rotation direction.
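The step-and-check loop of this optional refinement can be sketched as follows; `rotate_one_step`, `capture_face_image`, and `coords_in_reference_range` are hypothetical stand-ins for the motor, camera, and coordinate-matching routines that the method assumes.

```python
def rotate_until_aligned(rotate_one_step, capture_face_image,
                         coords_in_reference_range, max_steps=50):
    """Rotate the screen by one unit stepping rotation amount at a time,
    re-capturing the i-th face image after each step, until the current
    coordinates fall back inside the reference range."""
    for i in range(1, max_steps + 1):
        rotate_one_step()                     # one unit stepping rotation
        image = capture_face_image()          # i-th face image
        if coords_in_reference_range(image):  # back in the reference range?
            return i                          # stop rotating
    return None                               # safety bound (not in the text)
```

The `max_steps` bound is a safeguard added here so the loop terminates even if the face never re-enters the reference range.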
Optionally, the origin of the coordinate system is at the top left corner of the face image, the positive X-axis direction faces the right of the face image, the positive Y-axis direction faces the bottom of the face image, the offset direction includes an upper offset direction and a lower offset direction, and determining the offset direction of the current line segment with respect to the reference line segment includes:
calculating the difference value between the vertical coordinate in the current left-eye coordinate/the current right-eye coordinate and the vertical coordinate in the left-eye reference coordinate/the right-eye reference coordinate;
judging whether the difference value is larger than zero;
if so, determining the offset direction of the current line segment relative to the reference line segment as the lower offset direction;
if not, determining that the offset direction of the current line segment relative to the reference line segment is the upper offset direction.
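In the coordinate system just described (origin at the top-left, positive Y pointing down), the two judgments above reduce to a sign test on the vertical difference. A minimal sketch, with the mapping to a rotation direction included; the function names are illustrative:

```python
def offset_direction(current_eye_y, reference_eye_y):
    """Origin at the top-left, Y pointing down: a positive difference
    means the current eye line sits below its reference position."""
    difference = current_eye_y - reference_eye_y
    return "lower" if difference > 0 else "upper"

def target_rotation_direction(direction):
    """Map the offset direction to the screen's rotation direction
    relative to the user."""
    return "counterclockwise" if direction == "upper" else "clockwise"
```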
Optionally, the rotating direction includes a counter-clockwise rotating direction and a clockwise rotating direction relative to the user, and the determining the target rotating direction corresponding to the offset direction includes:
if the offset direction is the upper offset direction, determining that the target rotation direction is the counterclockwise rotation direction;
and if the offset direction is the lower offset direction, determining that the target rotation direction is a clockwise rotation direction.
Optionally, the reference coordinates include a left-eye reference coordinate and a right-eye reference coordinate, and the method further includes:
acquiring a first face image of a user in front of the display screen;
processing the first face image by using a face analysis algorithm to determine whether the user is a legal user of the notebook computer;
if not, returning to the step of acquiring the first face image of the user in front of the display screen;
if yes, acquiring a second face image of the user when the user is detected to continuously operate the notebook computer within a preset time;
respectively calculating a first coincidence degree of a specified range of a left eye current coordinate in the first face image and a specified range of a left eye current coordinate in the second face image, and a second coincidence degree of a specified range of a right eye current coordinate in the first face image and a specified range of a right eye current coordinate in the second face image;
judging whether the first coincidence degree and the second coincidence degree are both larger than a preset coincidence threshold value;
if so, taking the current coordinates of the left eye and the current coordinates of the right eye in the second face image as the reference coordinates of the left eye and the reference coordinates of the right eye respectively;
and if not, returning to the step of acquiring the first face image of the user in front of the display screen.
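One plausible reading of the calibration steps above, treating the "specified range" around each eye coordinate as a fixed box and the "coincidence degree" as the fraction of that box that overlaps; both the box size and the overlap formula are assumptions, since the text fixes neither.

```python
def box_around(center, half=20):
    """Specified range around an eye coordinate, as an axis-aligned box."""
    x, y = center
    return (x - half, y - half, x + half, y + half)

def coincidence(box_a, box_b):
    """Overlap area divided by the area of the first box (one plausible
    reading of 'coincidence degree')."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    w = max(0, min(ax2, bx2) - max(ax1, bx1))
    h = max(0, min(ay2, by2) - max(ay1, by1))
    area_a = (ax2 - ax1) * (ay2 - ay1)
    return (w * h) / area_a if area_a else 0.0

def accept_reference(left1, right1, left2, right2, threshold=0.8):
    """Adopt the second image's eye coordinates as the reference
    coordinates only if both coincidence degrees exceed the threshold."""
    c1 = coincidence(box_around(left1), box_around(left2))
    c2 = coincidence(box_around(right1), box_around(right2))
    return c1 > threshold and c2 > threshold
```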
In a second aspect, an embodiment of the present invention provides a notebook computer, including:
the upper shell comprises a rotating part, and the rotating part is provided with a through groove;
a rotating shaft;
the lower shell is provided with two opposite pivoting parts, and the rotating shaft is rotatably arranged between the two pivoting parts after penetrating through the through groove;
a keyboard mounted to the lower case;
the display screen is embedded in the surface, facing the keyboard, of the upper shell;
the front camera is mounted on the surface, facing the keyboard, of the upper shell;
the rotating mechanism is connected with the rotating shaft and used for driving the rotating shaft to drive the display screen to rotate;
a main board accommodated in the lower shell, wherein the main board includes:
at least one processor electrically connected to the rotation mechanism;
a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor, the instructions being executable by the at least one processor to enable the at least one processor to perform the method for controlling rotation of the display screen.
In the rotation control method of the display screen provided by the embodiment of the invention, firstly, a face image of a user positioned in front of the display screen is obtained, the face image being configured with a coordinate system; secondly, whether the face contour of the user deviates from a preset coordinate range is judged according to the face image; thirdly, if it deviates, whether the face contour in the face image meets the front-view screen condition is judged; fourthly, if the condition is met, the current coordinates of the two-eye area in the face image in the coordinate system are determined; and finally, the display screen is controlled to rotate according to the current coordinates and the reference coordinates so as to keep the head of the user within the preset coordinate range. Therefore, the method can intelligently track changes in the position of the user's head and automatically adjust the angle of the display screen to meet the user's screen-viewing needs, without the user having to adjust the angle manually, thereby improving the user experience.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals refer to similar elements and in which the figures are not drawn to scale unless otherwise specified.
Fig. 1 is a schematic structural diagram of a notebook computer according to an embodiment of the present invention;
fig. 2 is a schematic circuit diagram of a notebook computer according to an embodiment of the present invention;
fig. 3 is a schematic circuit diagram of a controller according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a method for controlling rotation of a display screen according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of S46 shown in fig. 4;
fig. 6 is a flowchart of S463 shown in fig. 5;
fig. 7 is a flowchart illustrating a method for controlling rotation of a display screen according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, if they do not conflict, the various features of the embodiments of the invention may be combined with each other within the protection scope of the invention. Additionally, although functional blocks are divided in the apparatus schematics and logical sequences are shown in the flowcharts, in some cases the steps shown or described may be performed in sequences different from the block divisions in the apparatus or the order in the flowcharts. The terms "first", "second", "third", and the like used in the present invention do not limit the data or the execution order, but merely distinguish identical or similar items having substantially the same function and effect.
Referring to fig. 1 and 2 together, a notebook computer 100 includes an upper casing 11, a rotating shaft (not shown), a lower casing 12, a keyboard 13, a display 14, a front camera 15, a rotating mechanism 16, and a main board 17.
The upper shell 11 comprises a rotating part 110, the rotating part 110 is provided with a through groove, the lower shell 12 is provided with two opposite pivoting parts 120, the rotating shaft penetrates through the through groove and then is rotatably installed between the two pivoting parts 120, and a user can manually push the upper shell 11 to rotate around the rotating shaft relative to the lower shell 12.
The keyboard 13 is installed on the lower shell 12, and the keyboard 13 is used for receiving key operations of a user and completing input of corresponding key values.
The display 14 is embedded in the surface of the upper shell 11 facing the keyboard 13, and the display 14 is used for providing a display picture. When the upper case 11 is rotated, the upper case 11 may be rotated with the display screen 14.
The front camera 15 is mounted on the surface of the upper case 11 facing the keyboard 13, and the front camera 15 can capture a face image of a user positioned in front of the display screen 14.
The rotating mechanism 16 is connected to the rotating shaft and is used for driving the rotating shaft to rotate the display screen 14.
In some embodiments, the rotating mechanism 16 includes a rotating gear, a transmission gear, a transmission shaft and a stepping motor. The rotating gear is sleeved on and fixed to the rotating shaft and is engaged with the transmission gear, and the center line of the rotating gear, the center line of the rotating shaft and the center line of the transmission gear are parallel to one another. One end of the transmission shaft is fixedly arranged in the shaft hole of the transmission gear, and the other end is connected with the stepping motor, which is controlled by the main board 17. When the stepping motor works, it drives the transmission shaft, which rotates the transmission gear circumferentially around the transmission gear's center line; the transmission gear in turn pushes the rotating gear to rotate circumferentially around its own center line, the rotating gear drives the rotating shaft to rotate around the rotating shaft's center line, and the upper shell 11 then drives the display screen 14 to rotate.
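As an illustration of how the main board might translate a desired screen rotation into stepping-motor pulses through such a gear train: the gear ratio, the 1.8° full-step angle, and the 16× microstepping below are hypothetical driver values, not figures from the patent.

```python
def motor_steps_for_screen_angle(screen_deg, gear_ratio=3.0,
                                 motor_step_deg=1.8, microsteps=16):
    """Convert a desired screen rotation angle into stepper pulses.
    gear_ratio is rotating-gear teeth / transmission-gear teeth; the
    1.8-degree full step and 16x microstepping are typical driver
    values assumed for this sketch."""
    motor_deg = screen_deg * gear_ratio           # motor must turn further
    steps = motor_deg / motor_step_deg * microsteps
    return round(steps)
```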
The main board 17 is housed in the lower case 12. As shown in fig. 2, the main board 17 includes a wireless communication unit 171, an audio/video input unit 172, a user input unit 173, a sensing unit 174, an output unit 175, an interface unit 176, a memory 177, and a controller 178.
The wireless communication unit 171 may include at least one module capable of enabling wireless communication between the terminal and a wireless communication system or between the terminal and a network in which the electronic device is located. For example, the wireless communication unit 171 includes a broadcast receiving module, a mobile communication module, a wireless internet module, a short-range communication module, and a location information module.
The audio and video input unit 172 is used to input an audio signal or a video signal, and may include a microphone. In a call mode, a recording mode, or a voice recognition mode, the microphone receives an external sound signal and processes it into electrical voice data. In the call mode, the processed sound data may be converted into a format that can be transmitted to a mobile communication base station through the mobile communication module and then output. Various noise removal algorithms for removing noise generated during reception of the external sound signal may be implemented in the microphone.
The user input unit 173 generates input data for the user to control the operation of the terminal. The user input unit may include, for example, a fingerprint module, a dome switch, a touch pad (constant voltage/constant current), a jog wheel, or a jog switch. The user input unit 173 may include an identification module selection switch for generating a selection signal for selecting a specific identification module among a plurality of identification modules. The fingerprint module may be an under-screen fingerprint module, an on-screen fingerprint module, or a side fingerprint module.
The sensing unit 174 may detect a current state of the electronic device, such as an open/close state of the electronic device, a position of the terminal, whether contact is made with a user, a direction of the terminal, or acceleration/deceleration of the electronic device, to generate a sensing signal for controlling an operation of the terminal. For example, when the terminal is a slide phone type, whether the slide phone is opened or closed may be sensed. In addition, whether the power supply unit supplies power or whether an external device is connected to the interface unit may be sensed. The sensing unit may include, for example, a touch sensor and a proximity sensor. The touch sensor is a sensor for detecting a touch operation. For example, the touch sensor may have the form of a touch film, a touch sheet, or a touch unit.
The output unit 175 is used to generate an output related to a visual sense, an auditory sense, or a touch, and may include a sound output module, an alarm unit, and a haptic module.
The interface unit 176 serves as a path connecting all external devices to the terminal. The interface unit receives data from an external device, receives power and transfers it to each element within the terminal, or transmits data from within the terminal to the external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting to a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, and an earphone port may be included in the interface unit.
The memory 177 may store programs for operating the controller, and may temporarily store input/output data (e.g., address book, message, still image, video, etc.). The memory may also store data related to various patterns of vibration and sound output when a touch input is applied to the touch screen.
The controller 178 controls the overall operation of the electronic device. For example, the controller 178 may perform control and processing related to a voice call, data communication, or video call. The controller 178 may include a multimedia module for playing multimedia. The multimedia module may be implemented within the controller and may be implemented separately from the controller 178.
Referring to fig. 3, fig. 3 is a circuit diagram of a controller according to an embodiment of the present invention, and as shown in fig. 3, the controller 178 includes one or more processors 179 and a memory 180. In fig. 3, one processor 179 is taken as an example.
The processor 179 and the memory 180 may be connected by a bus or other means, such as the bus connection shown in FIG. 3.
The memory 180, which is a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules that, when executed by the one or more processors 179, perform the method of rotation control of the display screen in the various embodiments below.
Embodiments of the present invention also provide a non-transitory computer storage medium storing computer-executable instructions, which are executed by one or more processors, such as the processor 179 in fig. 3, so that the one or more processors can execute the rotation control method of the display screen in the following embodiments.
Embodiments of the present invention also provide a computer program product, which includes a computer program stored on a non-volatile computer-readable storage medium, where the computer program includes program instructions, and when the program instructions are executed by a notebook computer, the notebook computer executes the rotation control method of the display screen in each of the following embodiments.
As another aspect of the embodiments of the present invention, an embodiment of the present invention provides a rotation control method of a display screen, where the rotation control method of the display screen is applied to a notebook computer. Referring to fig. 4, a method S400 for controlling rotation of a display screen includes:
s41, acquiring a face image of a user in front of the display screen, wherein the face image is provided with a coordinate system;
in this embodiment, the notebook computer controls the front camera to shoot the user in front of the display screen to obtain the face image, and the notebook computer configures the coordinate system for the face image according to the preset rule, for example, the origin of the coordinate system is at the top left corner of the face image, the positive direction of the X axis is towards the right side of the face image, and the positive direction of the Y axis is towards the lower side of the face image.
In some embodiments, the notebook computer controls the front-facing camera to capture an environment image in front of the display screen at a preset frequency, extracts the user's head portrait from the environment image using a face analysis algorithm, and judges whether the head portrait belongs to a legitimate user. If so, the environment image is used as the final face image; if not, the front-facing camera continues to capture environment images at the preset frequency. In this way, the display screen is guaranteed to follow only the head-position changes of a legitimate user, which reduces noise and improves the product experience.
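The capture-until-legitimate loop described above can be sketched as follows; the three callables are hypothetical stand-ins for the camera, the face analysis algorithm, and the legitimate-user check.

```python
def capture_legal_user_image(shoot_environment, extract_head, is_legal_user,
                             max_attempts=10):
    """Keep shooting at the preset frequency until the extracted head
    portrait belongs to a legitimate user, then use that frame as the
    final face image."""
    for _ in range(max_attempts):
        frame = shoot_environment()           # one environment image
        head = extract_head(frame)            # face analysis algorithm
        if head is not None and is_legal_user(head):
            return frame                      # final face image
    return None                               # bound added for this sketch
```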
S42, extracting the face contour in the face image;
In this embodiment, the notebook computer extracts the face contour from the face image according to a face analysis algorithm, wherein any suitable face analysis algorithm may be selected.
S43, judging whether the face contour deviates from the preset coordinate range, if so, executing a step S44, otherwise, returning to the step S41;
generally, the position where the user clearly views the display screen is fixed, the relative position between the head of the user and the display screen is also fixed, and if the head of the user deviates from the fixed position, it means that the user cannot clearly view the display screen at the new position, and therefore, the display screen needs to be adjusted at this time.
In the present embodiment, according to the fixed position from which an ordinary user clearly views the display screen, a designer defines a preset coordinate range in the coordinate system to describe that fixed position; for example, the preset coordinate range is the image area enclosed by four coordinate points (x1, y1), (x2, y2), (x3, y3) and (x4, y4). When the face contour of the user moves completely or partially out of this image area, the face contour deviates from the preset coordinate range; when the face contour is entirely within the image area, it does not deviate.
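Treating the four preset coordinate points as an axis-aligned rectangle (a simplifying assumption), the deviation judgment of step S43 reduces to a point-in-rectangle test over the contour points:

```python
def deviates_from_range(contour_points, x_min, y_min, x_max, y_max):
    """The contour deviates as soon as any of its points leaves the
    rectangular image area spanned by the preset coordinate points."""
    return any(not (x_min <= x <= x_max and y_min <= y <= y_max)
               for x, y in contour_points)
```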
S44, if the human face contour deviates, judging whether the human face contour meets the front-view screen condition, if so, executing S45, otherwise, returning to the step S41;
generally, there are many factors that urge the face contour of the user to deviate from the preset coordinate range, such as the user has a rest on a desk with his head down, or the user leaves from a seat, or the user and a colleague discuss a problem in front of the display screen and the head part deviates from the preset coordinate range, and even if the face contour of the user deviates from the preset coordinate range, the notebook computer needs to determine whether the face contour meets the front-view screen condition in order to adjust the display screen reliably and intelligently.
The front-view screen condition is used to judge whether the user's eyes are looking at the display screen. If they are, the user has merely changed head posture to find a more comfortable position and still needs to watch the display screen.
In some embodiments, the front-view screen condition may be customized by a designer according to design requirements, for example, the total area of both eyes in the current face image is calculated, if the total area of both eyes is greater than or equal to a preset area threshold, the front-view screen condition is satisfied, otherwise, the front-view screen condition is not satisfied.
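The area-based variant of the condition can be illustrated as below. This is a hedged sketch: the eye regions are represented as invented bounding boxes (x, y, w, h), and the threshold value is arbitrary.

```python
# Illustrative check of the area-based front-view condition: the total
# area of both eye regions must reach a preset area threshold.

def satisfies_front_view(left_eye, right_eye, area_threshold):
    """Eye regions are (x, y, w, h) boxes; compare total area to threshold."""
    total = left_eye[2] * left_eye[3] + right_eye[2] * right_eye[3]
    return total >= area_threshold

left, right = (40, 60, 12, 6), (70, 60, 12, 6)   # two 12x6 eye boxes
print(satisfies_front_view(left, right, 100))     # total area 144 suffices
print(satisfies_front_view(left, right, 200))     # threshold not reached
```

The intuition is that eyes turned away from the screen present a smaller visible area in the camera image.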
In some embodiments, the notebook computer may determine whether the face contour meets the front-view screen condition according to a deep learning algorithm, for example, first, the notebook computer inputs an image of the face contour into the SVM classifier to obtain a probability that the face contour meets the front-view screen condition. Secondly, the notebook computer judges whether the probability is greater than or equal to a preset threshold value, if so, the face contour meets the front-view screen condition; if not, the face contour does not meet the front-view screen condition.
In some embodiments, a user may operate a laptop to train the SVM classifier, for example, first, the laptop configures a positive sample image belonging to the front-view display screen as a first label, and a negative sample image not belonging to the front-view display screen as a second label, where the first label is 1 and the second label is 0. Secondly, the notebook computer respectively extracts the binocular emmetropia feature of the positive sample image and the binocular non-emmetropia feature of the negative sample image. And finally, the notebook computer trains the SVM classifier according to the binocular emmetropic feature and the first label as well as the binocular non-emmetropic feature and the second label.
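The SVM-based variant can be sketched with a minimal linear SVM trained by sub-gradient descent on the hinge loss, standing in for the classifier described above. Everything here is illustrative: the patent does not specify features or training details, so a made-up one-dimensional "binocular emmetropia" score is used, and the probability is obtained by squashing the SVM score through a sigmoid (a rough stand-in for Platt scaling).

```python
# Minimal linear SVM sketch for the front-view probability. Features and
# hyperparameters are invented for illustration only.
import math
import random

def train_linear_svm(samples, labels, lr=0.05, lam=0.01, epochs=200):
    """labels: +1 for front-view positives (label 1), -1 for negatives (label 0)."""
    w, b = 0.0, 0.0
    rng = random.Random(0)
    data = list(zip(samples, labels))
    for _ in range(epochs):
        rng.shuffle(data)
        for x, y in data:
            if y * (w * x + b) < 1:
                # sub-gradient of hinge loss plus L2 penalty
                w += lr * (y * x - lam * w)
                b += lr * y
            else:
                w += lr * (-lam * w)
    return w, b

def front_view_probability(w, b, x):
    """Squash the SVM score through a sigmoid as a rough probability."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

# Positive samples: eyes facing the screen (high feature value); negatives low.
xs = [0.9, 0.8, 0.85, 0.1, 0.2, 0.15]
ys = [+1, +1, +1, -1, -1, -1]
w, b = train_linear_svm(xs, ys)
p = front_view_probability(w, b, 0.88)
print(p > 0.5)   # high-feature sample classified as front-view
```

The probability would then be compared against the preset threshold exactly as described, with the two labels 1 and 0 assigned to the positive and negative sample images.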
S45, if yes, determining the current coordinates of the two-eye area in the face image in the coordinate system;
in this embodiment, the current coordinates include a left-eye current coordinate and a right-eye current coordinate, and the notebook computer extracts a left-eye region and a right-eye region from the face image according to a face analysis algorithm, and calculates a left-eye current coordinate of the left-eye region in the coordinate system and a right-eye current coordinate of the right-eye region in the coordinate system.
And S46, controlling the display screen to rotate according to the current coordinate and the reference coordinate so as to keep the head of the user within a preset coordinate range.
In this embodiment, the reference coordinates are used to assist in determining the offset of the user's two eyes. The reference coordinates include a left-eye reference coordinate, used to determine the offset of the left eye, and a right-eye reference coordinate, used to determine the offset of the right eye. By comparing the current coordinate with the reference coordinate, the display screen can be controlled to rotate so that the user's head is kept within the preset coordinate range. For example, when the user's head deviates from the preset coordinate range but the user is still looking at the display screen, the notebook computer compares the current coordinate with the reference coordinate and sends a control instruction to the stepping motor according to the comparison result. The stepping motor drives the transmission shaft, which turns the transmission gear; the transmission gear pushes the rotating gear, and the rotating gear drives the rotating shaft to carry the display screen to rotate, so that the user's head is kept within the preset coordinate range.
Therefore, the method can intelligently track the position change of the head of the user, automatically adjust the angle of the display screen, and meet the screen viewing requirement of the user without manually adjusting the angle of the display screen by the user, thereby improving the user experience.
In some embodiments, referring to fig. 5, S46 includes:
S461, judging whether the left-eye current coordinate conforms to the reference range of the left-eye reference coordinate, or whether the right-eye current coordinate conforms to the reference range of the right-eye reference coordinate;
S462, if yes, keeping the position state of the display screen;
S463, if not, controlling the display screen to rotate so as to keep the face contour of the user within the preset coordinate range.
In this embodiment, even when the user's head is within the preset coordinate range and the user's eyes are looking at the display screen, the user's sitting posture and head inevitably deviate slightly, and the head may occasionally shake a little. In such cases the head can still be considered to be within the preset coordinate range with the eyes looking at the display screen. Therefore, to handle this situation and improve the robustness of the method, corresponding reference ranges are configured for the left-eye reference coordinate and the right-eye reference coordinate. For example, the reference range of the left-eye reference coordinate is: the allowable fluctuation range of the abscissa x is x11 to x22 (for example, x11 = 8, x22 = 12), and the allowable fluctuation range of the ordinate y is y11 to y22 (for example, y11 = 10, y22 = 14). The reference range of the right-eye reference coordinate is: the allowable fluctuation range of the abscissa x is x33 to x44 (for example, x33 = 14, x44 = 18), and the allowable fluctuation range of the ordinate y is y33 to y44 (for example, y33 = 10, y44 = 14).
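The decision in S461-S463 can be sketched as follows. This is illustrative only; the band values mirror the example fluctuation ranges given above, and the function names are invented.

```python
# Sketch of the reference-range check: each eye's current coordinate must
# stay inside the fluctuation band around its reference coordinate.

def within_reference_range(coord, x_lo, x_hi, y_lo, y_hi):
    x, y = coord
    return x_lo <= x <= x_hi and y_lo <= y <= y_hi

LEFT_RANGE = (8, 12, 10, 14)    # x11..x22, y11..y22 from the example
RIGHT_RANGE = (14, 18, 10, 14)  # x33..x44, y33..y44 from the example

def keep_screen_position(left_current, right_current):
    """True: keep the display position (S462); False: rotate (S463)."""
    return (within_reference_range(left_current, *LEFT_RANGE) or
            within_reference_range(right_current, *RIGHT_RANGE))

print(keep_screen_position((10, 12), (16, 12)))  # small jitter tolerated
print(keep_screen_position((10, 20), (16, 22)))  # both eyes out -> rotate
```

Note the check is an "or", matching S461: as long as either eye stays within its band, the small fluctuation is tolerated and the screen is not moved.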
By adopting the method, the sitting posture change of the user can be reliably monitored to reliably control the display screen to rotate, the requirement of the user for clearly viewing the screen can be met, and the situation of adjusting the angle of the display screen in a disordered way is avoided.
In some embodiments, the current left-eye coordinate and the current right-eye coordinate form a current line segment, and the left-eye reference coordinate and the right-eye reference coordinate form a reference line segment, please refer to fig. 6, S463 includes:
S4631, determining the offset direction of the current line segment relative to the reference line segment and the target rotation direction corresponding to the offset direction;
S4632, controlling the display screen to rotate step by step in the target rotation direction according to the unit stepping rotation amount, so that the face contour of the user is kept within the preset coordinate range.
In this embodiment, the offset direction indicates the moving direction of the user's head relative to the display screen. Generally, if the user is looking at the display screen, the line segment formed by the two eyes moves either upward (the user sits up straight) or downward (the user leans back), so in some embodiments the offset direction includes an upper offset direction and a lower offset direction.
In some embodiments, when determining the offset direction of the current line segment relative to the reference line segment, the notebook computer calculates the difference between the ordinate of the left-eye current coordinate and the ordinate of the left-eye reference coordinate or, alternatively, between the ordinate of the right-eye current coordinate and the ordinate of the right-eye reference coordinate. The notebook computer then judges whether the difference is greater than zero: if so, the offset direction of the current line segment relative to the reference line segment is determined to be the lower offset direction; if not, the upper offset direction. With this method, the offset direction can be determined from simple image data without using sensors.
In this embodiment, the target rotation direction is used to indicate a direction in which the display screen rotates, where the rotation direction includes a counterclockwise rotation direction and a clockwise rotation direction relative to the user, and if the offset direction is an upward offset direction, it is determined that the target rotation direction is the counterclockwise rotation direction, and then the stepping motor may control the display screen to rotate in the counterclockwise rotation direction, so as to keep the face contour of the user within the preset coordinate range. If the offset direction is a lower offset direction, the target rotation direction is determined to be a clockwise rotation direction, and then the stepping motor can control the display screen to rotate in the clockwise rotation direction, so that the face contour of the user is kept in a preset coordinate range.
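The direction logic of S4631 can be sketched as below. Recall that the image coordinate system has y growing downward (claim 7), so a positive ordinate difference means the head moved down in the image. The function names and string labels are illustrative.

```python
# Sketch of S4631: offset direction from the ordinate difference, then the
# mapping from offset direction to the stepping motor's rotation direction.

def offset_direction(current_y, reference_y):
    """Image y grows downward: positive difference means the head moved down."""
    return "down" if (current_y - reference_y) > 0 else "up"

def target_rotation(direction):
    # upper offset -> counterclockwise; lower offset -> clockwise
    # (both taken relative to the user, as in the description above)
    return "counterclockwise" if direction == "up" else "clockwise"

d = offset_direction(current_y=9, reference_y=12)   # 9 - 12 < 0: upper offset
print(d, target_rotation(d))
```

The display screen would then be rotated step by step in the returned direction, by the unit stepping rotation amount, until the face contour re-enters the preset coordinate range.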
In some embodiments, the reference coordinates of different users are different, so that the method can be more intelligently adapted to different users and different scenes, and the reference coordinates can be automatically constructed for different users.
Referring to fig. 7, a method S400 for controlling rotation of a display screen includes:
S47, acquiring a first face image of a user in front of the display screen;
S48, processing the first face image by using a face analysis algorithm to determine whether the user is a legal user of the notebook computer; if so, executing S49, otherwise returning to step S47;
S49, if yes, acquiring a second face image of the user when the user is detected to have continuously operated the notebook computer for a preset duration, and executing S50;
S50, respectively calculating a first coincidence degree between the specified range of the left-eye current coordinate in the first face image and the specified range of the left-eye current coordinate in the second face image, and a second coincidence degree between the specified range of the right-eye current coordinate in the first face image and the specified range of the right-eye current coordinate in the second face image;
S51, judging whether the first coincidence degree and the second coincidence degree are both greater than a preset coincidence threshold; if so, executing S52, otherwise returning to S47;
S52, if yes, taking the left-eye current coordinate and the right-eye current coordinate in the second face image as the left-eye reference coordinate and the right-eye reference coordinate, respectively.
In this embodiment, when the notebook computer is initially powered on, it controls the camera to photograph the user in front of the display screen to obtain the first face image. The notebook computer then extracts face features using a face analysis algorithm and judges whether they match reference features stored in advance in the notebook computer. If they match, the user is legal and the subsequent steps can proceed; if not, the user is illegal, and the process returns to continue acquiring face images of the user in front of the display screen.
When the user is legal, if the notebook computer detects that the user has continuously operated the notebook computer for a preset duration, for example 15 minutes, it indicates that the user has entered a working state. In the working state, the user's normal sitting posture is relatively fixed over the corresponding period, so the notebook computer can acquire the second face image of the user. This avoids frequently controlling the camera to capture face images while the sitting posture is still unstable.
If the notebook computer does not detect that the user continuously operates the notebook computer within the preset duration, the timer is restarted, the preset duration is waited again, and the process returns to step S50.
When the notebook computer obtains the second face image of the user, it respectively calculates the specified range of the left-eye current coordinate and the specified range of the right-eye current coordinate in each face image.
Generally, as described above, even when the user's head is within the preset coordinate range and the user's eyes are looking at the display screen, the user's sitting posture and head inevitably deviate slightly, and the head occasionally shakes a little; in such cases the head can still be considered to be within the preset coordinate range with the eyes looking at the display screen. As long as the user fluctuates only in this way, the left and right eyes fluctuate within the preset range without excessive shift.
In order to evaluate this situation and improve the robustness of the method, corresponding specified ranges are configured for both the left-eye current coordinate and the right-eye current coordinate. For example, the specified range of the left-eye current coordinate is: abscissa x_left ± Δx1, ordinate y_left ± Δy1; the specified range of the right-eye current coordinate is: abscissa x_right ± Δx2, ordinate y_right ± Δy2.
Here, the specified range of the left-eye current coordinate in the first face image is (x_left1 ± Δx1, y_left1 ± Δy1), and the specified range of the right-eye current coordinate is (x_right1 ± Δx2, y_right1 ± Δy2). The specified range of the left-eye current coordinate in the second face image is (x_left2 ± Δx1, y_left2 ± Δy1), and the specified range of the right-eye current coordinate is (x_right2 ± Δx2, y_right2 ± Δy2).
Then, the notebook computer calculates the first coincidence degree and the second coincidence degree respectively, for example:
x_left1 = 8, Δx1 = 2, y_left1 = 12, Δy1 = 2;
x_right1 = 12, Δx2 = 2, y_right1 = 12, Δy2 = 2;
x_left2 = 9, Δx1 = 2, y_left2 = 13, Δy1 = 2;
x_right2 = 13, Δx2 = 2, y_right2 = 13, Δy2 = 2.
The first coincidence degree and the second coincidence degree are then both calculated to be 9/16, i.e. 56.25%.
Then, assuming the preset coincidence threshold is 50%, since the first coincidence degree and the second coincidence degree are both greater than the preset coincidence threshold, the left-eye current coordinate (x_left2 ± Δx1, y_left2 ± Δy1) and the right-eye current coordinate (x_right2 ± Δx2, y_right2 ± Δy2) in the second face image are taken as the left-eye reference coordinate and the right-eye reference coordinate, respectively.
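The coincidence-degree computation in the worked example can be sketched as follows. This is an illustrative reading of the description: each specified range is treated as an axis-aligned rectangle (center ± delta), and the degree of coincidence is taken as the overlap area divided by the area of one rectangle, which reproduces the 9/16 (56.25%) figure above.

```python
# Sketch of the coincidence degree: rectangle overlap ratio between the
# specified ranges in the first and second face images.

def rect(cx, cy, dx, dy):
    """Specified range as (x_min, y_min, x_max, y_max) from center +/- delta."""
    return (cx - dx, cy - dy, cx + dx, cy + dy)

def coincidence(r1, r2):
    ox = max(0, min(r1[2], r2[2]) - max(r1[0], r2[0]))   # x-overlap
    oy = max(0, min(r1[3], r2[3]) - max(r1[1], r2[1]))   # y-overlap
    area1 = (r1[2] - r1[0]) * (r1[3] - r1[1])
    return (ox * oy) / area1

# Example values from the description: deltas of 2, second image shifted by 1.
left1, left2 = rect(8, 12, 2, 2), rect(9, 13, 2, 2)
right1, right2 = rect(12, 12, 2, 2), rect(13, 13, 2, 2)
first, second = coincidence(left1, left2), coincidence(right1, right2)
print(first, second)   # both 9/16 = 0.5625

threshold = 0.5
print(first > threshold and second > threshold)  # adopt second image as reference
```

An intersection-over-union ratio would be an equally plausible reading; the patent only requires a coincidence measure comparable against the preset threshold.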
By adopting the method, the reference coordinate can be automatically, reliably and intelligently determined, and a foundation is made for subsequently adjusting the angle of the display screen.
It should be noted that, in the foregoing embodiments, a certain order does not necessarily exist between the foregoing steps, and those skilled in the art can understand, according to the description of the embodiments of the present invention, that in different embodiments, the foregoing steps may have different execution orders, that is, may be executed in parallel, may also be executed interchangeably, and the like.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a general hardware platform, and certainly can also be implemented by hardware. Based on such understanding, the above technical solutions substantially or contributing to the related art may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; within the idea of the invention, also technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A rotation control method of a display screen is applied to a notebook computer, and is characterized by comprising the following steps:
acquiring a face image of a user in front of the display screen, wherein the face image is provided with a coordinate system;
extracting a face contour in the face image;
judging whether the face contour deviates from a preset coordinate range;
if the human face contour deviates, judging whether the human face contour meets the front-view screen condition;
if yes, determining the current coordinates of the two-eye area in the face image in the coordinate system;
and controlling the display screen to rotate according to the current coordinate and the reference coordinate so as to keep the face contour of the user within the preset coordinate range.
2. The method of claim 1, wherein the determining whether the face contour satisfies a front-view screen condition comprises:
and judging whether the face contour meets the front-view screen condition or not according to a deep learning algorithm.
3. The method of claim 2, wherein determining whether the face contour satisfies an orthographic screen condition according to a deep learning algorithm comprises:
inputting the image of the face contour into an SVM classifier to obtain the probability that the face contour meets the front-view screen condition;
judging whether the probability is greater than or equal to a preset threshold value;
if so, the face contour meets the front-view screen condition;
if not, the face contour does not meet the front-view screen condition.
4. The method according to any one of claims 1 to 3, wherein the current coordinates comprise a left-eye current coordinate and a right-eye current coordinate, the reference coordinates comprise a left-eye reference coordinate and a right-eye reference coordinate, and the controlling the display screen to rotate according to the current coordinates and the reference coordinates so that the face contour of the user is kept within the preset coordinate range comprises:
judging whether the current left-eye coordinate accords with the reference range of the left-eye reference coordinate or not, or whether the current right-eye coordinate accords with the reference range of the right-eye reference coordinate or not;
if so, maintaining the position state of the display screen;
if not, controlling the display screen to rotate so as to keep the face contour of the user within the preset coordinate range.
5. The method of claim 4, wherein the current left-eye coordinates and the current right-eye coordinates form a current line segment, the left-eye reference coordinates and the right-eye reference coordinates form a reference line segment, and the controlling the display screen to rotate to keep the face contour of the user within the preset coordinate range comprises:
determining the offset direction of the current line segment relative to the reference line segment and the target rotation direction corresponding to the offset direction;
and controlling the display screen to gradually rotate in the target rotating direction according to the unit stepping rotating amount so as to keep the human face contour of the user within the preset coordinate range.
6. The method of claim 5, wherein the controlling the display screen to rotate stepwise by a unit step rotation amount in the target rotation direction to keep the face contour of the user within the preset coordinate range comprises:
acquiring an ith human face image of the display screen in the target rotation direction and rotating each time according to the unit stepping rotation amount, wherein i is a positive integer;
judging whether the current coordinate in the ith human face image conforms to the reference range of the reference coordinate;
if so, controlling the display screen to stop rotating;
and if not, assigning i to i +1, and returning to the step of obtaining the ith face image of the display screen in the target rotation direction and rotating each time according to the unit stepping rotation amount.
7. The method of claim 5, wherein the coordinate system has an origin at the top left-most corner of the face image, a positive X-direction toward the right of the face image, and a positive Y-direction toward the bottom of the face image, wherein the shift directions include an up shift direction and a down shift direction, and wherein determining the shift direction of the current line segment relative to the reference line segment comprises:
calculating the difference value between the vertical coordinate in the current left-eye coordinate/the current right-eye coordinate and the vertical coordinate in the left-eye reference coordinate/the right-eye reference coordinate;
judging whether the difference value is larger than zero;
if so, determining the offset direction of the current line segment relative to the reference line segment as the lower offset direction;
if not, determining that the offset direction of the current line segment relative to the reference line segment is the upper offset direction.
8. The method of claim 7, wherein the rotational direction comprises a counterclockwise rotational direction and a clockwise rotational direction relative to the user, and wherein determining the target rotational direction corresponding to the offset direction comprises:
if the offset direction is the upward offset direction, determining that the target rotation direction is a counterclockwise rotation direction;
and if the offset direction is the lower offset direction, determining that the target rotation direction is a clockwise rotation direction.
9. The method of any of claims 1 to 3, wherein the reference coordinates comprise left-eye reference coordinates and right-eye reference coordinates, the method further comprising:
acquiring a first face image of a user in front of the display screen;
processing the first face image by using a face analysis algorithm to determine whether the user is a legal user of the notebook computer;
if not, returning to the step of obtaining the initial face image of the user in front of the display screen;
if yes, acquiring a second face image of the user when the user is detected to continuously operate the notebook computer within a preset time;
respectively calculating a first coincidence degree of a specified range of a left eye current coordinate in the first face image and a specified range of a left eye current coordinate in the second face image, and a second coincidence degree of a specified range of a right eye current coordinate in the first face image and a specified range of a right eye current coordinate in the second face image;
judging whether the first coincidence degree and the second coincidence degree are both larger than a preset coincidence threshold value;
if so, taking the current coordinates of the left eye and the current coordinates of the right eye in the second face image as the reference coordinates of the left eye and the reference coordinates of the right eye respectively;
and if not, returning to the step of acquiring the first face image of the user in front of the display screen.
10. A notebook computer, comprising:
the upper shell comprises a rotating part, and the rotating part is provided with a through groove;
a rotating shaft;
the lower shell is provided with two opposite pivoting parts, and the rotating shaft is rotatably arranged between the two pivoting parts after penetrating through the through groove;
a keyboard mounted to the lower case;
the display screen is embedded in the surface, facing the keyboard, of the upper shell;
the front camera is mounted on the surface, facing the keyboard, of the upper shell;
the rotating mechanism is connected with the rotating shaft and used for driving the rotating shaft to drive the display screen to rotate;
a main board accommodated in the lower case, wherein the main board includes:
at least one processor electrically connected to the rotation mechanism;
a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform a method of controlling rotation of a display screen as claimed in any one of claims 1 to 9.
CN202011437080.9A 2020-12-11 2020-12-11 Rotation control method of display screen and notebook computer Active CN112540649B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011437080.9A CN112540649B (en) 2020-12-11 2020-12-11 Rotation control method of display screen and notebook computer


Publications (2)

Publication Number Publication Date
CN112540649A true CN112540649A (en) 2021-03-23
CN112540649B CN112540649B (en) 2024-02-20

Family

ID=75019974

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011437080.9A Active CN112540649B (en) 2020-12-11 2020-12-11 Rotation control method of display screen and notebook computer

Country Status (1)

Country Link
CN (1) CN112540649B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113900525A (en) * 2021-10-29 2022-01-07 深圳Tcl数字技术有限公司 Digital human display method and device and display equipment

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101615033A (en) * 2008-06-25 2009-12-30 和硕联合科技股份有限公司 The angular adjustment apparatus of display module and method
CN101859176A (en) * 2009-04-08 2010-10-13 群康科技(深圳)有限公司 Screen control device and method thereof
CN101930238A (en) * 2008-12-26 2010-12-29 宏碁股份有限公司 Electronic device with display and method for adjusting display elevation
CN102033549A (en) * 2009-09-30 2011-04-27 三星电子(中国)研发中心 Viewing angle adjusting device of display device
CN102117074A (en) * 2009-12-31 2011-07-06 鸿富锦精密工业(深圳)有限公司 System for regulating angle of display and using method thereof
CN102270012A (en) * 2010-06-01 2011-12-07 鸿富锦精密工业(深圳)有限公司 Electronic device and automatic adjusting method for opening angle thereof
CN102270021A (en) * 2011-08-04 2011-12-07 浙江大学 Method for automatically adjusting display based on face identification and bracket thereof
CN103024191A (en) * 2012-12-21 2013-04-03 广东欧珀移动通信有限公司 Screen rotating method, screen rotating device and mobile terminal
CN103645749A (en) * 2013-12-23 2014-03-19 张志增 Automatic adjusting type display device and adjusting method thereof
CN103901901A (en) * 2014-03-21 2014-07-02 小米科技有限责任公司 Method and device for rotating screen of video terminal
CN104683786A (en) * 2015-02-28 2015-06-03 上海玮舟微电子科技有限公司 Human eye tracking method and device of naked eye 3D equipment
CN104881263A (en) * 2015-05-27 2015-09-02 天津三星电子有限公司 Display control method and display equipment
JP2017188766A (en) * 2016-04-05 2017-10-12 レノボ・シンガポール・プライベート・リミテッド Electronic apparatus with camera, correction method for picked-up video image, and storage medium
CN110727316A (en) * 2019-10-12 2020-01-24 北京硬壳科技有限公司 Method and device for automatically adjusting position of display and display
CN210860320U (en) * 2018-12-17 2020-06-26 淄博职业学院 But rotation angle regulation's computer screen



Also Published As

Publication number Publication date
CN112540649B (en) 2024-02-20

Similar Documents

Publication Publication Date Title
US20230252779A1 (en) Personal computing device control using face detection and recognition
EP2685704B1 (en) Unlocking a mobile terminal using face recognition
US9594945B2 (en) Method and apparatus for protecting eyesight
US8432357B2 (en) Tracking object selection apparatus, method, program and circuit
WO2018106327A1 (en) Information privacy in a virtual meeting
US20120304067A1 (en) Apparatus and method for controlling user interface using sound recognition
TW201214299A (en) Selecting view orientation in portable device via image analysis
US20210200307A1 (en) User recognition and gaze tracking in a video system
US11163995B2 (en) User recognition and gaze tracking in a video system
CN112907725B (en) Image generation, training of image processing model and image processing method and device
CN104484858B (en) Character image processing method and processing device
KR102415552B1 (en) Display device
CN106980840A (en) Shape of face matching process, device and storage medium
JP7077305B2 (en) Methods, devices and products that use biometric sensors to control display orientation
CN110096865B (en) Method, device and equipment for issuing verification mode and storage medium
US10705615B2 (en) Mobile communications device with adaptive friction of the housing
CN111242090A (en) Human face recognition method, device, equipment and medium based on artificial intelligence
CN104077585A (en) Image correction method and device and terminal
US11422609B2 (en) Electronic device and method for controlling operation of display in same
CN107958223A (en) Face identification method and device, mobile equipment, computer-readable recording medium
CN107102801A (en) Terminal screen spinning solution and device
CN108154466A (en) Image processing method and device
CN112540649A (en) Rotation control method of display screen and notebook computer
KR20200023858A (en) Electronic device and methodfor providing information in virtual reality
CN105892884A (en) Screen direction determining method and device, and mobile device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant