CN108153422B - Display object control method and mobile terminal - Google Patents

Display object control method and mobile terminal

Info

Publication number
CN108153422B
CN108153422B
Authority
CN
China
Prior art keywords
mobile terminal
face
attitude
user
face image
Prior art date
Legal status
Active
Application number
CN201810014594.XA
Other languages
Chinese (zh)
Other versions
CN108153422A (en)
Inventor
麦碧权
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810014594.XA
Publication of CN108153422A
Application granted
Publication of CN108153422B

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Abstract

The embodiment of the invention provides a display object control method and a mobile terminal, wherein the method comprises the following steps: detecting an inclination angle of a face of a user, wherein the inclination angle is the included angle between the center direction of the face of the user and the vertical direction of a three-dimensional coordinate system corresponding to an attitude sensor of the mobile terminal; correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data; and controlling a display object of the mobile terminal according to the corrected attitude data. The embodiment of the invention can reduce the display direction error rate in the process of controlling the display object.

Description

Display object control method and mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a display object control method and a mobile terminal.
Background
With the development of mobile terminal technology, the functions of mobile terminals have steadily improved. Detecting the spatial attitude of a mobile terminal through an attitude sensor is an important technical innovation of recent years, where the attitude sensor may be a gravity sensor, a three-axis gyroscope, a three-axis accelerometer, a three-axis electronic compass, or the like. The mobile terminal can control its display object according to the detected spatial attitude, such as controlling the advancing direction of the display object, the display angle of the display object, or the color of the display object, so as to simplify operation and improve the user experience. In practice, however, it is found that when the display object is controlled only according to the spatial attitude of the mobile terminal, the display direction error rate is relatively high in the process of controlling the display object.
Disclosure of Invention
The embodiment of the invention provides a display object control method and a mobile terminal, and aims to solve the problem that the display direction error rate is high when a display object is controlled only according to the spatial attitude of the mobile terminal.
In order to solve the technical problem, the invention is realized as follows: a display object control method is applied to a mobile terminal comprising an attitude sensor, and comprises the following steps:
detecting an inclination angle of a user face, wherein the inclination angle is an included angle between the center direction of the user face and the vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor;
correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data;
and controlling a display object of the mobile terminal according to the corrected attitude data.
In a first aspect, an embodiment of the present invention provides a display object control method, which is applied to a mobile terminal including an attitude sensor, and includes:
detecting an inclination angle of a user face, wherein the inclination angle is an included angle between the center direction of the user face and the vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor;
correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data;
and controlling a display object of the mobile terminal according to the corrected attitude data.
In a second aspect, an embodiment of the present invention provides a mobile terminal including an attitude sensor, the mobile terminal comprising:
the detection module is used for detecting the inclination angle of the face of the user, wherein the inclination angle is the included angle between the central direction of the face of the user and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
the correction module is used for correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data;
and the control module is used for controlling the display object of the mobile terminal according to the corrected attitude data.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the display object control method provided by the embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and the computer program, when executed by a processor, implements the steps of the display object control method provided by the embodiment of the present invention.
In the embodiment of the invention, the attitude data is corrected according to the inclination angle of the face of the user, and the display object of the mobile terminal is controlled according to the corrected attitude data, so that the display direction error rate in the process of controlling the display object can be reduced.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a three-dimensional coordinate system according to an embodiment of the present invention;
Fig. 2 is a flowchart of a display object control method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the center direction of a user's face according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the inclination angle of a user's face according to an embodiment of the present invention;
Fig. 5 is a flowchart of another display object control method according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the inclination angle of a user's face according to another embodiment of the present invention;
Fig. 7 is a block diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 8 is a block diagram of another mobile terminal according to an embodiment of the present invention;
Fig. 9 is a block diagram of another mobile terminal according to an embodiment of the present invention;
Fig. 10 is a block diagram of another mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the embodiment of the present invention, the mobile terminal includes an attitude sensor, and the attitude sensor may detect attitude data of the mobile terminal, for example: coordinate data of the mobile terminal in a three-dimensional coordinate system, which may include a horizontal coordinate axis (X coordinate axis), a vertical coordinate axis (Y coordinate axis), and a Z coordinate axis perpendicular to both, as shown in fig. 1, where the Z coordinate axis may also be referred to as the depth direction coordinate axis. In addition, the attitude data detected by the attitude sensor may also be the tilt angle of the mobile terminal, for example: the angle of inclination with respect to the horizontal, etc.
In the embodiment of the present invention, the attitude sensor may be a gravity sensor, a three-axis gyroscope, a three-axis accelerometer, a three-axis electronic compass, or another sensor that can detect attitude data of the mobile terminal, which is not limited in the embodiment of the present invention.
In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted mobile terminal, a wearable device, a pedometer, and the like.
Referring to fig. 2, fig. 2 is a flowchart of a display object control method according to an embodiment of the present invention, where the method is applied to a mobile terminal including an attitude sensor, and as shown in fig. 2, the method includes the following steps:
Step 201, detecting an inclination angle of a face of a user, wherein the inclination angle is the included angle between the center direction of the face of the user and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
The center direction of the face of the user may be the direction of the center line of the face, that is, the line about which the face is symmetric left and right. The center line may be the line connecting the center of the nose and the center of the mouth, the center line of the outline of the face, or the line perpendicular to the line connecting the two eyes, etc. Referring to fig. 3, fig. 3 shows an image of a user's face, in which 301 is the center line of the face.
The vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor may be the direction of the Y coordinate axis shown in fig. 1, and the included angle may be an included angle α shown in fig. 4, where X, Y, and Z respectively represent an X coordinate axis, a Y coordinate axis, and a Z coordinate axis in the three-dimensional coordinate system, and Y' represents the center direction of the face image of the user.
The inclination angle of the face of the user may be detected by capturing a face image through a camera, calculating the center direction of the face from the face image, and then calculating the included angle. Alternatively, the spatial position of the user's face may be detected by a wireless sensor such as an ultrasonic or infrared sensor, and the center direction of the face calculated from the spatial position. For example: the spatial positions of the eyes, the ears, the face contour, and the like are detected by an ultrasonic or infrared sensor, from which the center direction of the face can be calculated; for instance, when the spatial positions of the two eyes are determined, the line connecting them can be computed, and the direction perpendicular to that line at its center can be taken as the center direction. It should be noted that the embodiment of the present invention does not limit the manner of detecting the inclination angle of the face of the user.
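By way of illustration only (not part of the original disclosure; function names and coordinate conventions are hypothetical), a minimal Python sketch of the eye-based variant above, which takes two detected eye positions in a frame aligned with the attitude sensor's X (horizontal) and Y (vertical) axes:

```python
import math

def face_tilt_from_eyes(left_eye, right_eye):
    # The face center direction is perpendicular to the line connecting the
    # two eyes, so its angle to the vertical Y axis equals the angle of the
    # eye line to the horizontal X axis.
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

print(face_tilt_from_eyes((-30.0, 0.0), (30.0, 0.0)))     # 0.0: face upright
print(face_tilt_from_eyes((-30.0, 10.0), (30.0, -10.0)))  # ~-18.4: head tilted
```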
Step 202, correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data.
Wherein, the correcting of the attitude data detected by the attitude sensor by using the tilt angle may be shifting the attitude data detected by the attitude sensor in the three-dimensional coordinate system according to the tilt angle, so as to obtain corrected attitude data; alternatively, the Y coordinate axis data detected by the attitude sensor may be converted according to the tilt angle, for example: as shown in fig. 4, the Y coordinate axis data detected by the attitude sensor is converted into coordinate data taking the direction represented by Y' as the coordinate axis, thereby obtaining corrected attitude data; or the tilt angle of the face of the user may be subtracted from the tilt angle of the mobile terminal detected by the attitude sensor, so as to obtain a corrected tilt angle.
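As a minimal sketch of two of the corrections named above (Python; names are illustrative, assuming scalar angles in degrees): subtracting the face inclination angle from the device tilt, and rotating sensor coordinate data into the Y' frame.

```python
import math

def correct_tilt(device_tilt_deg, face_tilt_deg):
    # Scalar correction: subtract the user's face inclination angle from
    # the tilt angle reported by the attitude sensor.
    return device_tilt_deg - face_tilt_deg

def rotate_into_face_frame(x, y, face_tilt_deg):
    # Coordinate correction: re-express sensor (x, y) data in a frame whose
    # vertical axis is the face center direction Y', i.e. rotate the data
    # by -face_tilt_deg about the Z axis.
    a = math.radians(-face_tilt_deg)
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

# User lies on one side (face tilted 25 degrees) and tilts the phone by 55
# degrees; the intended control input is the 30-degree difference.
print(correct_tilt(55.0, 25.0))  # 30.0
```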
Step 203, controlling a display object of the mobile terminal according to the corrected attitude data.
The controlling of the display object of the mobile terminal according to the corrected attitude data may be, without limitation, controlling the display object to execute a specific action, controlling the advancing direction of the display object, or controlling the advancing speed of the display object according to the corrected attitude data. For example: in a game application program, the advancing direction or advancing speed of a display object (such as a game character object) is controlled according to the attitude data, or the display object is controlled to execute a specific action or release a specific skill, and the like; as another example: in three-dimensional image display, the display angle of a display object (for example, a three-dimensional image) can be controlled according to the attitude data.
It should be noted that, in practical applications, the posture of the user during use of the mobile terminal may not be vertical. The user often aligns the posture of the mobile terminal with his or her own posture in order to preserve the viewing angle, for example: when the user lies on one side while using the mobile terminal, the mobile terminal is tilted so that its screen faces the user's eyes as directly as possible. If the mobile terminal controls the display object directly with its own attitude data, the control may be erroneous, because the user manipulates the mobile terminal on the basis of the tilted posture. For example: the user needs to turn a game object by 30 degrees, and a 30-degree turn corresponds to a 30-degree tilt of the mobile terminal; the user therefore tilts the mobile terminal by 30 degrees out of operating habit, but the tilt angle detected by the attitude sensor is 30 degrees plus the already tilted angle, thereby causing a control error.
In this way, according to this embodiment, the attitude data detected by the attitude sensor is corrected by the inclination angle of the face of the user, and the display object of the mobile terminal is controlled according to the corrected attitude data, so that display direction errors caused by controlling the display object directly with the attitude data of the mobile terminal can be avoided, reducing the display direction error rate in the process of controlling the display object. For example: the user lies on one side while using the mobile terminal, the inclination angle of the face is α, and the mobile terminal is held at the same inclination α, as shown in fig. 4. If the user needs to turn a game object by 30 degrees, which corresponds to a 30-degree tilt of the mobile terminal, the user tilts the mobile terminal by 30 degrees out of operating habit; the attitude sensor then detects 30 degrees + α, but the inclination angle of the face is detected as α, so the game object is controlled according to 30 degrees + α - α = 30 degrees, and the display direction error is avoided.
In addition, step 201 may be executed before the mobile terminal controls the display object, for example: before controlling a game object. Step 202 and step 203 may be performed during the process of controlling the display object, and may be performed multiple times; specifically, the attitude data detected by the attitude sensor is corrected each time, and the corresponding control is performed. For example: during a game, the attitude data detected by the attitude sensor is corrected in real time, and the game object is controlled according to the corrected attitude data.
In this embodiment, the display direction error rate in the process of controlling the display object can be reduced by correcting the posture data according to the inclination angle of the face of the user and controlling the display object of the mobile terminal according to the corrected posture data.
Referring to fig. 5, fig. 5 is a flowchart of another display object control method according to an embodiment of the present invention, where the method is applied to a mobile terminal including an attitude sensor, and as shown in fig. 5, the method includes the following steps:
Step 501, acquiring a face image of the face of the user through a camera, calculating the center direction of the face image, and calculating the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, wherein the included angle is the inclination angle of the face of the user.
The camera may be a front camera of the mobile terminal. After the face image is collected, the center direction of the face image can be calculated according to the image features of the face image, for example: the position information of both eyes is recognized, the direction of the line connecting the two eyes is determined, and the direction perpendicular to that connecting line is then calculated as the above-mentioned center direction, etc. The included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor may be calculated by mapping the center direction into the three-dimensional coordinate system and computing the angle, which is the inclination angle of the face of the user.
Since the inclination angle of the face of the user is calculated from a captured face image, it can be calculated accurately, which improves the accuracy of controlling the display object.
In addition, before step 501, the mobile terminal may receive a start instruction input by a user for starting an application program, start the application program in response to the instruction, and determine whether the application program needs to use the attitude data of the mobile terminal; if so, step 501 may be executed, and if not, step 501 may be skipped, so that the power consumption wasted by detecting the inclination angle of the face of the user in an application program that does not use attitude data can be avoided. Alternatively, after the application program is started, it may be determined whether the user uses the operation mode corresponding to the attitude data (for example, the gravity sensing operation mode); if so, step 501 may be executed, and if not, step 501 may be skipped. Of course, in this embodiment, step 501 may also be executed directly without such a determination, which is not limited here.
Optionally, the acquiring, by a camera, a face image of the face of the user, calculating a center direction of the face image, and calculating an included angle between the center direction and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor includes:
acquiring a face image of the face of the user through a camera, and identifying position information and a contour shape of at least one local feature of the face image;
determining the center direction of the face image according to the position information and the contour shape of the at least one local feature, wherein the face image is symmetrical left and right in the center direction;
and calculating an included angle between the central direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
Wherein the at least one local feature may include at least one of the two eyes, the two ears, the nose, the mouth, and the like. After the position information and contour shape of at least one local feature of the face image are recognized, and since the face is bilaterally symmetric, the mobile terminal can identify the center line of the face image. For example: from the position information of the two eyes, the midpoint between them can be determined; from the contour shape, the center point of the contour can be determined; and the line connecting these two points is the center line of the face image, as shown by Y' in fig. 4.
In the embodiment, the center direction of the face image can be accurately calculated through the position information and the outline shape of at least one local feature of the face image, so that the accuracy of controlling the display object is improved.
Of course, in this embodiment, the center direction of the face image is not limited to being calculated from the position information and contour shape of at least one local feature. For example: the center direction of the face image may be determined from the position information of the two eyes; specifically, the line connecting the two eyes may be determined from their position information, and the direction perpendicular to that connecting line is taken as the center direction of the face image. Alternatively, the center direction of the face image may be determined from the position information of the two eyes and of the nose; specifically, the midpoint between the two eyes may be determined from the position information of the two eyes, the center point of the nose from the position information of the nose, and the line connecting these two points is the center line of the face image.
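A hypothetical sketch of the two-landmark variant just described (Python; illustrative names, image coordinates with the y axis pointing downward):

```python
import math

def midline_tilt(eye_left, eye_right, nose_center):
    # The face midline runs from the midpoint between the two eyes to the
    # center of the nose; its angle to the image's vertical axis is taken
    # as the inclination angle of the face image.
    mid_x = (eye_left[0] + eye_right[0]) / 2.0
    mid_y = (eye_left[1] + eye_right[1]) / 2.0
    dx = nose_center[0] - mid_x
    dy = nose_center[1] - mid_y  # positive when the nose is below the eyes
    return math.degrees(math.atan2(dx, dy))

print(midline_tilt((40.0, 50.0), (80.0, 50.0), (60.0, 90.0)))  # 0.0: upright
print(midline_tilt((40.0, 50.0), (80.0, 42.0), (66.0, 86.0)))  # ~8.5: tilted
```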
In addition, the above-mentioned calculating of the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor may take the included angle between the vertical direction of the screen of the mobile terminal and the vertical direction of the three-dimensional coordinate system as that angle, since the center direction of the face image and the vertical direction of the screen may be the same, for example: as shown in fig. 4. Here, the vertical direction of the screen of the mobile terminal may be the direction along the vertical axis of the screen.
Of course, in some cases, the face of the user may not be aligned with the screen of the mobile terminal, so that the center direction of the face image is not the same as the vertical direction of the screen. For example: the face of the user is inclined, but the mobile phone is not inclined, or the inclined angle of the face of the user is not consistent with the inclined angle of the mobile phone. In this case, the calculating an angle between the center direction and a vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor may include:
and calculating a first included angle of the central direction and the vertical direction of the screen of the mobile terminal, calculating a second included angle of the vertical direction of the screen of the mobile terminal and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, and taking the sum of the first included angle and the second included angle as the included angle of the central direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
This embodiment is illustrated in fig. 6, where T represents the vertical direction of the screen, Y' represents the center direction of the face image, α represents the above-mentioned second included angle, and β represents the above-mentioned first included angle.
The included angle between the center direction of the face image and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor can be accurately calculated through the first included angle and the second included angle. Preferably, in this embodiment, a sum of the first included angle and the second included angle may be between 0 and 90 degrees, that is, an included angle between the center direction and a vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor is between 0 and 90 degrees.
In the embodiment, because the first included angle between the center direction of the face image and the vertical direction of the screen of the mobile terminal is considered, the posture data detected by the posture sensor can be corrected under the condition that the face of the user is not aligned with the screen of the mobile terminal, so that the display direction error in the process of controlling the display object is further reduced.
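Composing the two angles is then a simple sum; a one-function sketch using the Fig. 6 naming (illustrative only, not part of the original disclosure):

```python
def total_inclination(beta_deg, alpha_deg):
    # beta: first included angle, face center direction Y' vs. screen
    # vertical T (Fig. 6); alpha: second included angle, screen vertical T
    # vs. coordinate-system vertical Y. Preferably the sum lies in [0, 90].
    return beta_deg + alpha_deg

print(total_inclination(12.0, 25.0))  # 37.0 degrees
```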
Step 502, correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data.
Optionally, the attitude data includes the inclination angle of the mobile terminal, and the correcting of the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data includes:
and subtracting the inclination angle of the face of the user from the inclination angle of the mobile terminal detected by the attitude sensor to obtain a corrected inclination angle of the mobile terminal.
Wherein, the tilt angle of the mobile terminal may be the tilt angle of the mobile terminal relative to the ground level, for example: the tilt angle of the mobile terminal relative to the X coordinate axis of the three-dimensional coordinate system. Likewise, the inclination angle of the face of the user may be the inclination angle of the face relative to the ground level.
In this embodiment, the inclination angle of the face of the user may be subtracted from the inclination angle of the mobile terminal to obtain the corrected inclination angle of the mobile terminal, and the corrected inclination angle is used to control the display object, which can improve the accuracy of controlling the display object, for example: the accuracy of games operated by gravity sensing can be improved. Preferably, in this embodiment, the attitude sensor is a gravity sensor.
Optionally, the collecting the face image of the user face through the camera, and calculating the central direction of the face image, and calculating the included angle between the central direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, includes:
acquiring N face images of the face of the user through a camera, wherein N is an integer greater than or equal to 2;
calculating the inclination angle of each face image, wherein the inclination angle of each face image is the included angle between the center direction of the face image and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
calculating the variance of the inclination angles of the N face images;
the correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data comprises the following steps:
and if the variance is smaller than a preset threshold value, correcting the attitude data detected by the attitude sensor by using the inclination angle of one face image in the N face images to obtain corrected attitude data, or correcting the attitude data detected by the attitude sensor by using the mean value of the inclination angles of the N face images to obtain corrected attitude data.
The variance of the inclination angles of the N face images may be obtained by calculating an average inclination angle of the N face images, calculating a square of a difference between the inclination angle of each face image and the average inclination angle, and calculating a mean of all the squares. In addition, the preset threshold may be preset by a user or preset by the mobile terminal, for example: 0.1, 0.3, 0.5, 1, etc., without limiting the embodiments of the present invention.
In addition, the above-mentioned correcting of the attitude data detected by the attitude sensor using the inclination angle of one of the N face images may be performed using the inclination angle of any one of the N face images.
In this embodiment, a variance smaller than the preset threshold indicates that the posture of the user is stable, so the attitude data detected by the attitude sensor can be corrected based on the inclination angle of one of the N face images; erroneous control caused by correcting the attitude data with a face-image inclination angle while the variance is not smaller than the preset threshold (that is, while the posture of the user is not stable) can thus be avoided.
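A hypothetical end-to-end sketch of this variance check (Python; the threshold value is illustrative and not specified by the embodiment), using the mean of the N angles as the correction, which is one of the two options named above:

```python
def corrected_tilt(device_tilt, face_tilts, threshold=0.5):
    # face_tilts holds the inclination angles of N face images (N >= 2).
    # If their variance is below the threshold (the user's posture is
    # judged stable), correct the sensor tilt with the mean face angle;
    # otherwise leave the sensor tilt uncorrected.
    n = len(face_tilts)
    mean = sum(face_tilts) / n
    variance = sum((t - mean) ** 2 for t in face_tilts) / n
    if variance < threshold:
        return device_tilt - mean
    return device_tilt

print(corrected_tilt(55.0, [24.8, 25.1, 25.0]))  # ~30.0: posture stable
print(corrected_tilt(55.0, [5.0, 40.0, 25.0]))   # 55.0: posture unstable
```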
Step 503, controlling the display object of the mobile terminal according to the corrected attitude data.
In this embodiment, the manner of controlling the display object of the mobile terminal according to the corrected attitude data is not limited, and the display object may be operated according to the relevant settings of each application program. For example: in a game application program, the advancing direction or advancing speed of a display object (such as a game character object) is controlled according to the attitude data, or the display object is controlled to execute a specific action or release a specific skill, and the like; as another example: in three-dimensional image display, the display angle of a display object (e.g., a three-dimensional image) can be controlled according to the attitude data.
In this embodiment, various optional embodiments are added to the embodiment shown in fig. 2, and the accuracy of controlling the display object can be further improved.
Referring to fig. 7, fig. 7 is a structural diagram of a mobile terminal according to an embodiment of the present invention, where the mobile terminal includes an attitude sensor, and as shown in fig. 7, a mobile terminal 700 includes:
the detection module 701 is used for detecting an inclination angle of a face of a user, wherein the inclination angle is an included angle between the center direction of the face of the user and the vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor;
a correction module 702, configured to correct the attitude data detected by the attitude sensor by using the tilt angle, so as to obtain corrected attitude data;
and a control module 703, configured to control a display object of the mobile terminal according to the corrected attitude data.
Optionally, the detection module 701 is configured to collect, through a camera, a face image of the face of the user, calculate a center direction of the face image, and calculate an included angle between the center direction and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor.
Optionally, as shown in fig. 8, the detecting module 701 includes:
the recognition unit 7011 is configured to collect a face image of the face of the user through a camera, and recognize position information and a contour shape of at least one local feature of the face image;
a determining unit 7012, configured to determine a center direction of the face image according to the position information and the contour shape of the at least one local feature, where the face image is symmetric left and right in the center direction;
and the first calculating unit 7013 is configured to calculate an included angle between the center direction and a vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
Optionally, the first calculating unit 7013 is configured to calculate a first included angle between the center direction and the vertical direction of the screen of the mobile terminal, and calculate a second included angle between the vertical direction of the screen of the mobile terminal and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, and use a sum of the first included angle and the second included angle as an included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
Optionally, as shown in fig. 9, the detecting module 701 includes:
an acquisition unit 7014, configured to acquire N face images of the user face through a camera, where N is an integer greater than or equal to 2;
a second calculating unit 7015, configured to calculate an inclination angle of each face image, where the inclination angle of each face image is an included angle between a center direction of the face image and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor;
a third calculating unit 7016, configured to calculate variances of the inclination angles of the N face images;
the correction module 702 is configured to, if the variance is smaller than a preset threshold, correct the pose data detected by the pose sensor by using an inclination angle of one of the N face images to obtain corrected pose data, or correct the pose data detected by the pose sensor by using an average value of the inclination angles of the N face images to obtain corrected pose data.
Optionally, the attitude data includes the inclination angle of the mobile terminal, and the correction module 702 is configured to subtract the inclination angle of the face of the user from the inclination angle of the mobile terminal detected by the attitude sensor, so as to obtain the corrected inclination angle of the mobile terminal.
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 2 to fig. 5, and is not described herein again to avoid repetition. And the display direction error rate of the process of controlling the display object can be reduced.
Fig. 10 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention.
the mobile terminal 1000 includes, but is not limited to: radio frequency unit 1001, network module 1002, audio output unit 1003, input unit 1004, sensor 1005, display unit 1006, user input unit 1007, interface unit 1008, memory 1009, processor 1010, and power supply 1011. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 10 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted mobile terminal, a wearable device, a pedometer, and the like.
The processor 1010 is configured to detect an inclination angle of a face of a user, where the inclination angle is an included angle between a center direction of the face of the user and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor;
correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data;
and controlling a display object of the mobile terminal according to the corrected attitude data.
Optionally, the detecting the tilt angle of the face of the user performed by the processor 1010 includes:
the method comprises the steps of collecting a face image of a face of a user through a camera, calculating the center direction of the face image, and calculating the included angle between the center direction and the vertical direction of a three-dimensional coordinate system corresponding to an attitude sensor.
Optionally, the processor 1010 executes the acquiring, by the camera, the face image of the face of the user, calculating a central direction of the face image, and calculating an included angle between the central direction and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor, including:
acquiring a face image of the face of the user through a camera, and identifying position information and a contour shape of at least one local feature of the face image;
determining the center direction of the face image according to the position information and the contour shape of the at least one local feature, wherein the face image is symmetrical left and right in the center direction;
and calculating an included angle between the central direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
Optionally, the calculating, performed by the processor 1010, an included angle between the central direction and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor includes:
and calculating a first included angle of the central direction and the vertical direction of the screen of the mobile terminal, calculating a second included angle of the vertical direction of the screen of the mobile terminal and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, and taking the sum of the first included angle and the second included angle as the included angle of the central direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
Optionally, the acquiring, by the processor 1010, the face image of the face of the user through the camera, calculating a center direction of the face image, and calculating an included angle between the center direction and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor, includes:
acquiring N face images of the face of the user through a camera, wherein N is an integer greater than or equal to 2;
calculating the inclination angle of each face image, wherein the inclination angle of each face image is the included angle between the center direction of the face image and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
calculating the variance of the inclination angles of the N human face images;
The processor 1010 performs the correcting of the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data, including:
and if the variance is smaller than a preset threshold value, correcting the attitude data detected by the attitude sensor by using the inclination angle of one face image in the N face images to obtain corrected attitude data, or correcting the attitude data detected by the attitude sensor by using the mean value of the inclination angles of the N face images to obtain corrected attitude data.
Optionally, the attitude data includes the inclination angle of the mobile terminal, and the processor 1010 performs the correcting of the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data, including:
and subtracting the inclination angle of the face of the user from the inclination angle of the mobile terminal detected by the attitude sensor to obtain a corrected inclination angle of the mobile terminal.
The mobile terminal can reduce the display direction error rate of the process of controlling the display object.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1001 may be used for receiving and sending signals during message transmission or a call; specifically, it receives downlink data from a base station and forwards it to the processor 1010 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 1001 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 1001 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 1002, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 1003 may convert audio data received by the radio frequency unit 1001 or the network module 1002 or stored in the memory 1009 into an audio signal and output as sound. Also, the audio output unit 1003 may also provide audio output related to a specific function performed by the mobile terminal 1000 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1003 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1004 is used to receive audio or video signals. The input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042; the graphics processor 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1006, and the image frames processed by the graphics processor 10041 may be stored in the memory 1009 (or other storage medium) or transmitted via the radio frequency unit 1001 or the network module 1002. The microphone 10042 can receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1001 and output.
The mobile terminal 1000 may also include at least one sensor 1005, where the sensor 1005 includes an attitude sensor and may also include a light sensor and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 10061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 10061 and/or the backlight when the mobile terminal 1000 moves to the ear. As one kind of attitude sensor, the gravity sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the attitude of the mobile terminal (such as horizontal and vertical screen switching, related games, and magnetometer attitude calibration) and for vibration identification related functions (such as a pedometer and tapping); the sensor 1005 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein again.
The display unit 1006 is used to display information input by the user or information provided to the user. The Display unit 1006 may include a Display panel 10061, and the Display panel 10061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1007 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1007 includes a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 10071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 10071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1010, and receives and executes commands sent by the processor 1010. In addition, the touch panel 10071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 10071, the user input unit 1007 may include other input devices 10072. Specifically, the other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 10071 can be overlaid on the display panel 10061, and when the touch panel 10071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 1010 to determine the type of the touch event, and then the processor 1010 provides a corresponding visual output on the display panel 10061 according to the type of the touch event. Although in fig. 10, the touch panel 10071 and the display panel 10061 are two independent components for implementing the input and output functions of the mobile terminal, in some embodiments, the touch panel 10071 and the display panel 10061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 1008 is an interface through which external devices are connected to the mobile terminal 1000. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1008 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 1000 or may be used to transmit data between the mobile terminal 1000 and external devices.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 1009 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 1010 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 1009 and calling data stored in the memory 1009, thereby integrally monitoring the mobile terminal. Processor 1010 may include one or more processing units; preferably, the processor 1010 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The mobile terminal 1000 may also include a power supply 1011 (e.g., a battery) for supplying power to various components, and preferably, the power supply 1011 may be logically connected to the processor 1010 through a power management system that may be used for managing charging, discharging, and power consumption.
In addition, the mobile terminal 1000 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1010, a memory 1009, and a computer program stored in the memory 1009 and capable of running on the processor 1010, where the computer program is executed by the processor 1010 to implement each process of the above display object control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the display object control method, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of another like element in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and including instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of the present invention.
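As a concrete illustration of such a software implementation, the following minimal Python sketch walks through the three steps of the method: composing the inclination angle of the face, correcting the sensor-reported attitude data, and controlling a display object from the corrected value. The DisplayObject class and the dead-zone threshold are hypothetical scaffolding invented for this example; neither appears in the patent.

```python
from dataclasses import dataclass

@dataclass
class DisplayObject:
    """Hypothetical stand-in for a rendered game object (e.g., a car)."""
    heading: str = "straight"

    def steer(self, direction: str) -> None:
        self.heading = direction

DEAD_ZONE_DEG = 5.0  # illustrative threshold, not specified by the patent

def face_inclination_angle(first_angle_deg: float, second_angle_deg: float) -> float:
    # Face inclination = angle(face center direction, screen vertical)
    #                  + angle(screen vertical, sensor coordinate vertical).
    return first_angle_deg + second_angle_deg

def correct_attitude(device_angle_deg: float, face_angle_deg: float) -> float:
    # Corrected device inclination = sensor-reported inclination
    # minus the inclination of the user's face.
    return device_angle_deg - face_angle_deg

def control_display_object(obj: DisplayObject, corrected_deg: float) -> None:
    # Map the corrected inclination onto a command; the same value could
    # instead drive the advancing speed or trigger a specific action.
    if corrected_deg > DEAD_ZONE_DEG:
        obj.steer("right")
    elif corrected_deg < -DEAD_ZONE_DEG:
        obj.steer("left")
    else:
        obj.steer("straight")

# The device reads 20 degrees of tilt, but 15 degrees of that comes from the
# user's own head tilt (10 + 5), so the game object is not steered.
car = DisplayObject()
corrected = correct_attitude(20.0, face_inclination_angle(10.0, 5.0))
control_display_object(car, corrected)
print(car.heading)  # -> straight
```

The design point of the scheme is visible in the example: without the face-angle correction, a user playing while reclining would steer the car unintentionally merely by holding the phone at the same angle as their head.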
While the present invention has been described with reference to particular illustrative embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but is intended to cover the various modifications and equivalent arrangements that may be made by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. A display object control method applied to a mobile terminal including an attitude sensor, comprising:
detecting an inclination angle of a face of a user, wherein the inclination angle is the included angle between the center direction of the face of the user and the vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor;
correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data;
controlling a display object of the mobile terminal according to the corrected attitude data;
wherein the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor is obtained in the following manner:
calculating a first included angle between the center direction and the vertical direction of the screen of the mobile terminal, calculating a second included angle between the vertical direction of the screen of the mobile terminal and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, and taking the sum of the first included angle and the second included angle as the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
wherein controlling the display object of the mobile terminal according to the corrected attitude data comprises:
controlling the display object to execute a specific action, controlling the advancing direction of the display object, or controlling the advancing speed of the display object according to the corrected attitude data.
2. The method of claim 1, wherein detecting the inclination angle of the face of the user comprises:
the method comprises the steps of collecting a face image of a face of a user through a camera, calculating the center direction of the face image, and calculating the included angle between the center direction and the vertical direction of a three-dimensional coordinate system corresponding to an attitude sensor.
3. The method of claim 2, wherein acquiring a face image of the face of the user through a camera, calculating the center direction of the face image, and calculating the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor comprises:
acquiring a face image of the face of the user through a camera, and identifying position information and a contour shape of at least one local feature of the face image;
determining the center direction of the face image according to the position information and the contour shape of the at least one local feature, wherein the face image is left-right symmetrical about the center direction;
and calculating an included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
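Purely as an illustration of claims 2 and 3, with the eye positions standing in for the at-least-one local feature (in practice they would come from a face landmark detector, which is assumed here rather than shown), the center direction can be recovered from the left-right symmetry of the face and its tilt measured against the image vertical:

```python
import math

def center_direction_tilt_deg(left_eye: tuple, right_eye: tuple) -> float:
    """Tilt of the face's symmetry axis from the image vertical, in degrees.

    The center direction is perpendicular to the line joining the eyes, so
    its tilt from vertical equals the tilt of the inter-eye line from
    horizontal (image coordinates, y pointing down).
    """
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

# Example: the right eye sits 20 px lower than the left across a 100 px
# eye span, i.e. the head is rolled about 11.3 degrees.
print(round(center_direction_tilt_deg((40, 50), (140, 70)), 1))  # 11.3
```

The result corresponds to the first included angle of claim 1; adding the angle between the screen vertical and the sensor's vertical axis then yields the inclination angle used for the correction.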
4. The method of claim 2, wherein acquiring a face image of the face of the user through a camera, calculating the center direction of the face image, and calculating the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor comprises:
acquiring N face images of the face of the user through a camera, wherein N is an integer greater than or equal to 2;
calculating the inclination angle of each face image, wherein the inclination angle of each face image is the included angle between the center direction of the face image and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
calculating the variance of the inclination angles of the N face images;
the correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data comprises the following steps:
and if the variance is smaller than a preset threshold value, correcting the attitude data detected by the attitude sensor by using the inclination angle of one of the N face images to obtain corrected attitude data, or correcting the attitude data detected by the attitude sensor by using the mean value of the inclination angles of the N face images to obtain corrected attitude data.
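A minimal sketch of the claim-4 stability check follows. The five angle values stand in for the per-image results, and both the preset threshold and the behavior when the variance is too large are choices made for the example; the claim itself fixes neither.

```python
from statistics import fmean, pvariance

# Illustrative values; in practice each angle would come from one of the
# N camera frames via the per-image tilt computation sketched above.
tilt_angles_deg = [14.2, 15.1, 14.7, 15.4, 14.9]  # N = 5 face images
VARIANCE_THRESHOLD = 1.0  # preset threshold; no value is given in the claim

def corrected_tilt(device_tilt_deg: float, angles: list) -> float:
    """Correct only when the N face-tilt estimates agree (low variance).

    The claim is silent on the high-variance case; here the sensor data
    simply passes through uncorrected.
    """
    if pvariance(angles) < VARIANCE_THRESHOLD:
        # Either a single sample or the mean may serve as the face tilt.
        return device_tilt_deg - fmean(angles)
    return device_tilt_deg

print(round(corrected_tilt(25.0, tilt_angles_deg), 2))  # 25 - 14.86 = 10.14
```

Gating on the variance keeps a momentary mis-detection in one frame from producing a jarring one-off correction.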
5. The method of any one of claims 1 to 4, wherein the attitude data includes an inclination angle of the mobile terminal, and wherein the correcting of the attitude data detected by the attitude sensor by using the inclination angle of the face of the user to obtain corrected attitude data comprises:
and subtracting the inclination angle of the face of the user from the inclination angle of the mobile terminal detected by the attitude sensor to obtain a corrected inclination angle of the mobile terminal.
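As a worked instance with invented numbers: if the attitude sensor reports that the mobile terminal is inclined by 30 degrees while the detected inclination angle of the face of the user is 20 degrees (for example, because the user is reclining), the corrected inclination angle is 30 - 20 = 10 degrees, so only the deliberate 10-degree rotation is applied to the display object.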
6. A mobile terminal including an attitude sensor, comprising:
the detection module is used for detecting the inclination angle of the face of the user, wherein the inclination angle is the included angle between the center direction of the face of the user and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
the correction module is used for correcting the attitude data detected by the attitude sensor by using the inclination angle to obtain corrected attitude data;
the control module is used for controlling a display object of the mobile terminal according to the corrected attitude data;
wherein the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor is obtained in the following manner:
calculating a first included angle between the center direction and the vertical direction of the screen of the mobile terminal, calculating a second included angle between the vertical direction of the screen of the mobile terminal and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor, and taking the sum of the first included angle and the second included angle as the included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
the control module is used for controlling the display object to execute a specific action, controlling the advancing direction of the display object, or controlling the advancing speed of the display object according to the corrected attitude data.
7. The mobile terminal of claim 6, wherein the detection module is configured to collect a face image of the face of the user through a camera, calculate a center direction of the face image, and calculate an included angle between the center direction and a vertical direction of a three-dimensional coordinate system corresponding to the attitude sensor.
8. The mobile terminal of claim 7, wherein the detection module comprises:
the identification unit is used for acquiring a face image of the face of the user through a camera and identifying the position information and the outline shape of at least one local feature of the face image;
the determining unit is used for determining the center direction of the face image according to the position information and the contour shape of the at least one local feature, wherein the face image is left-right symmetrical about the center direction;
and the first calculation unit is used for calculating an included angle between the center direction and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor.
9. The mobile terminal of claim 7, wherein the detection module comprises:
the acquisition unit is used for acquiring N face images of the face of the user through a camera, wherein N is an integer greater than or equal to 2;
the second calculation unit is used for calculating the inclination angle of each face image, and the inclination angle of each face image is the included angle between the center direction of the face image and the vertical direction of the three-dimensional coordinate system corresponding to the attitude sensor;
a third calculation unit, configured to calculate the variance of the inclination angles of the N face images;
and the correction module is used for correcting the attitude data detected by the attitude sensor by using the inclination angle of one face image in the N face images to obtain corrected attitude data if the variance is smaller than a preset threshold value, or correcting the attitude data detected by the attitude sensor by using the mean value of the inclination angles of the N face images to obtain the corrected attitude data.
10. The mobile terminal according to any of claims 6 to 9, wherein the attitude data includes an inclination angle of the mobile terminal, and the correction module is configured to subtract the inclination angle of the face of the user from the inclination angle of the mobile terminal detected by the attitude sensor to obtain a corrected inclination angle of the mobile terminal.
11. A mobile terminal, comprising: memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the display object control method according to any one of claims 1 to 5 when executing the computer program.
CN201810014594.XA 2018-01-08 2018-01-08 Display object control method and mobile terminal Active CN108153422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810014594.XA CN108153422B (en) 2018-01-08 2018-01-08 Display object control method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810014594.XA CN108153422B (en) 2018-01-08 2018-01-08 Display object control method and mobile terminal

Publications (2)

Publication Number Publication Date
CN108153422A CN108153422A (en) 2018-06-12
CN108153422B true CN108153422B (en) 2023-02-17

Family

ID=62461161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810014594.XA Active CN108153422B (en) 2018-01-08 2018-01-08 Display object control method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108153422B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109224437A (en) 2018-08-28 2019-01-18 腾讯科技(深圳)有限公司 Interaction method, terminal, and storage medium for an application scenario
JP2020129263A (en) * 2019-02-08 2020-08-27 セイコーエプソン株式会社 Display system, program for controlling information processing device, and method of controlling information processing device
CN110456923B (en) * 2019-07-31 2023-08-11 维沃移动通信有限公司 Gesture sensing data processing method and electronic equipment
CN110412993B (en) * 2019-09-04 2023-03-21 上海飞科电器股份有限公司 Autonomous charging method and mobile robot
CN110941998B (en) * 2019-11-07 2024-02-02 咪咕互动娱乐有限公司 Gesture adjustment method, gesture adjustment device and computer storage medium
CN111429519B (en) * 2020-03-27 2021-07-16 贝壳找房(北京)科技有限公司 Three-dimensional scene display method and device, readable storage medium and electronic equipment
CN112158143B (en) * 2020-10-23 2022-05-17 浙江吉利控股集团有限公司 Vehicle control method and system and vehicle
CN117440212A (en) * 2023-12-20 2024-01-23 深圳市亿莱顿科技有限公司 Screen-casting display device and control method and apparatus thereof


Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003280785A (en) * 2002-03-26 2003-10-02 Sony Corp Image display processor, image display processing method and computer program
CN105807902B * 2014-12-30 2019-01-18 Tcl集团股份有限公司 Method for stabilizing the display of mobile terminal screen content and mobile terminal
CN106603802A (en) * 2015-10-20 2017-04-26 中兴通讯股份有限公司 Method and device for controlling mobile intelligent device
CN105956518A (en) * 2016-04-21 2016-09-21 腾讯科技(深圳)有限公司 Face identification method, device and system
US10254123B2 (en) * 2016-05-24 2019-04-09 Telenav, Inc. Navigation system with vision augmentation mechanism and method of operation thereof
CN106125933A * 2016-06-28 2016-11-16 维沃移动通信有限公司 Display interface rotation method and mobile terminal
CN107273823A * 2017-05-26 2017-10-20 西安理工大学 Neck attitude monitoring method based on sensor and image processing fusion

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103455256A * 2013-08-21 2013-12-18 小米科技有限责任公司 Method and terminal for rotating the display picture of a screen
CN107483709A * 2016-09-29 2017-12-15 维沃移动通信有限公司 Horizontal/vertical screen switching method and mobile terminal

Also Published As

Publication number Publication date
CN108153422A (en) 2018-06-12

Similar Documents

Publication Publication Date Title
CN108153422B (en) Display object control method and mobile terminal
CN107943551B (en) Screen display method and mobile terminal
CN108182043B (en) Information display method and mobile terminal
CN108459797B (en) Control method of folding screen and mobile terminal
CN109381165B (en) Skin detection method and mobile terminal
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
CN107707817B (en) video shooting method and mobile terminal
CN108628515B (en) Multimedia content operation method and mobile terminal
CN111031234B (en) Image processing method and electronic equipment
CN109241832B (en) Face living body detection method and terminal equipment
CN109343788B (en) Operation control method of mobile terminal and mobile terminal
CN109618218B (en) Video processing method and mobile terminal
CN110933494A (en) Picture sharing method and electronic equipment
CN111314616A (en) Image acquisition method, electronic device, medium and wearable device
CN110312070B (en) Image processing method and terminal
CN109618055B (en) Position sharing method and mobile terminal
CN109859718B (en) Screen brightness adjusting method and terminal equipment
CN109784234B (en) Right-angled bend identification method based on forward fisheye lens and vehicle-mounted equipment
CN109443261B (en) Method for acquiring folding angle of folding screen mobile terminal and mobile terminal
CN108345780B (en) Unlocking control method and mobile terminal
CN110740265B (en) Image processing method and terminal equipment
CN109819331B (en) Video call method, device and mobile terminal
CN109257504B (en) Audio processing method and terminal equipment
CN109189517B (en) Display switching method and mobile terminal
CN108965701B (en) Jitter correction method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant