CN106406530B - Screen display method and mobile terminal thereof - Google Patents


Info

Publication number
CN106406530B
CN106406530B
Authority
CN
China
Prior art keywords
mobile terminal
image
target user
screen
grating
Prior art date
Application number
CN201610835214.XA
Other languages
Chinese (zh)
Other versions
CN106406530A (en)
Inventor
刘铁峰 (Liu Tiefeng)
Original Assignee
宇龙计算机通信科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 宇龙计算机通信科技(深圳)有限公司
Priority to CN201610835214.XA
Publication of CN106406530A
Application granted granted Critical
Publication of CN106406530B publication Critical patent/CN106406530B/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Abstract

The embodiment of the invention discloses a screen display method and a mobile terminal thereof, which enable a user to see the information displayed on the screen at any angle while the mobile terminal is in an anti-peeping mode. The method provided by the embodiment of the invention comprises the following steps: the mobile terminal acquires a current face image of a target user through a front camera; the mobile terminal determines the face movement information of the target user by comparing the current face image with the previously acquired face image; the mobile terminal performs segmentation processing on the interface displayed on the display screen according to the face movement information of the target user to obtain a segmentation result; and the mobile terminal adjusts the grating on the grating screen according to the segmentation result and determines a display center corresponding to the face of the target user. Thus, when the mobile terminal is in the anti-peeping mode, the user can see the information displayed on the screen at any angle.

Description

Screen display method and mobile terminal thereof

Technical Field

The invention relates to the technical field of mobile terminals, in particular to a screen display method and a mobile terminal thereof.

Background

Nowadays, portable mobile terminals such as notebook computers, tablet computers and smart phones have become essential electronic devices in people's lives. Users often use mobile terminals in public places such as fast-food restaurants, train stations and airports to browse web pages, watch movies, send and receive e-mail, and video-chat. To protect the private information a user is viewing in a public place from being leaked, it is necessary to prevent people nearby from peeping at the contents on the display screen of the mobile terminal.

In the prior art, anti-peeping adhesive films manufactured with the louver (shutter) microstructure technology make the contents displayed on a screen unreadable once the viewing angle exceeds 30 degrees. This technology cannot intelligently adjust the display area according to the angle between the user's eyes and the screen, so the region from which screen information is visible is limited to 30 degrees about the direction perpendicular to the screen.

Because the region from which screen information is visible is limited to 30 degrees about the perpendicular to the screen, the user cannot see the information displayed on the screen when looking at the phone screen from the side.

Disclosure of Invention

The embodiment of the invention provides a screen display method and a mobile terminal thereof, in which the display center changes with the movement of the target user's eye position, so that the user can see the information displayed on the screen at any angle while the mobile terminal is in the anti-peeping mode.

A first aspect of the embodiments of the present invention provides a screen display method, which specifically includes:

the method is applied to a mobile terminal whose display screen is fitted with a grating screen, and comprises the following steps:

the mobile terminal acquires a current face image of a target user through a front camera;

the mobile terminal determines the face movement information of the target user through the comparison of the current face image and the acquired previous face image;

the mobile terminal carries out segmentation processing on the interface displayed on the display screen according to the face movement information of the target user to obtain a segmentation result;

and the mobile terminal adjusts the grating on the grating screen according to the segmentation result and determines a display center corresponding to the face of the target user.
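The four steps above can be sketched as a minimal Python loop. The function names and the representation of a "face image" as a single centre point are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch of the four claimed steps; all names and data
# shapes here are assumptions for exposition only.

def face_movement(prev_center, curr_center):
    """Step 2: compare the current face image with the previous one and
    report the face movement as a pixel displacement (dx, dy)."""
    return (curr_center[0] - prev_center[0], curr_center[1] - prev_center[1])

def shift_display_center(display_center, movement):
    """Steps 3-4 at the coarsest level: move the display centre so that
    it keeps facing the target user's face."""
    dx, dy = movement
    return (display_center[0] + dx, display_center[1] + dy)

# Example: the face moved 4 px right and 2 px down between two frames.
move = face_movement((100, 80), (104, 82))
center = shift_display_center((360, 640), move)
```

In this reading, the segmentation and grating adjustment (steps 3 and 4) are driven entirely by the displacement computed in step 2.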

A second aspect of the embodiments of the present invention provides a mobile terminal, which specifically includes:

the acquisition unit is used for acquiring a current face image of a target user through the front camera;

the first determining unit is used for determining the face movement information of the target user through the comparison between the current face image acquired by the acquiring unit and the acquired previous face image;

the first segmentation unit is used for carrying out segmentation processing on the interface displayed on the display screen according to the face movement information of the target user determined by the first determination unit to obtain a segmentation result;

and the adjusting unit is used for adjusting the grating on the grating screen according to the segmentation result of the first segmentation unit and determining a display center corresponding to the face of the target user.

A third aspect of the embodiments of the present invention provides a mobile terminal, which specifically includes:

an input device, an output device, a processor, and a memory;

the processor is used for executing the following steps by calling the operation instruction stored in the memory:

acquiring a current face image of a target user through a front-facing camera;

determining the face movement information of the target user through the comparison of the current face image and the acquired previous face image;

performing segmentation processing on an interface displayed on the display screen according to the face movement information of the target user to obtain a segmentation result;

and adjusting the grating on the grating screen according to the segmentation result, and determining a display center corresponding to the face of the target user.

According to the technical scheme, the embodiment of the invention has the following advantages:

In the embodiment of the invention, the mobile terminal acquires a current face image of a target user through a front camera and determines the face movement information of the target user by comparing the current face image with the previously acquired face image. The mobile terminal then performs segmentation processing on the interface displayed on the display screen according to the face movement information of the target user to obtain a segmentation result, and adjusts the grating on the grating screen according to the segmentation result to determine a display center corresponding to the face of the target user. The display center thus changes with the movement of the target user's eye position, so the user can see the information displayed on the screen at any angle while the mobile terminal is in the anti-peeping mode.

Drawings

FIG. 1 is a diagram illustrating a grating screen and a mobile terminal screen according to an embodiment of the present invention;

FIG. 2 is a flowchart illustrating an embodiment of a screen display method according to an embodiment of the present invention;

FIG. 3 is a flowchart illustrating a screen display method according to another embodiment of the present invention;

FIG. 4 is a flowchart illustrating a screen display method according to another embodiment of the present invention;

FIG. 5 is a diagram of an embodiment of a screen display method according to an embodiment of the present invention;

FIG. 6 is a diagram of another embodiment of a screen display method according to an embodiment of the present invention;

FIG. 7 is a diagram of another embodiment of a screen display method according to an embodiment of the present invention;

FIG. 8 is a diagram of another embodiment of a screen display method according to an embodiment of the present invention;

FIG. 9 is a diagram of another embodiment of a screen display method according to an embodiment of the present invention;

FIG. 10 is a diagram showing another example of a screen display method according to an embodiment of the present invention;

FIG. 11 is a schematic structural diagram of an embodiment of a mobile terminal according to the present invention;

FIG. 12 is a schematic structural diagram of another embodiment of a mobile terminal according to the present invention;

FIG. 13 is a schematic structural diagram of another embodiment of a mobile terminal according to the present invention;

FIG. 14 is a schematic structural diagram of another embodiment of a mobile terminal according to the present invention;

FIG. 15 is a schematic structural diagram of another embodiment of a mobile terminal according to the present invention;

FIG. 16 is a schematic structural diagram of another embodiment of a mobile terminal according to an embodiment of the present invention.

Detailed Description

The embodiment of the invention provides a screen display method and a mobile terminal thereof, which are used for enabling a user to see information displayed on a screen at any angle in an anti-peeping mode of the mobile terminal.

In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.

The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," or "having," and any variations thereof, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.

In the embodiment of the invention, a grating screen is designed that is fully laminated onto the display screen of the mobile terminal. When not energized, the grating screen is completely transparent and does not affect the normal display of the screen. When the grating screen is needed, it is energized and displays a grating, much as a screen displays an image; the spacing and width of the grating bars are adjustable, and wherever a bar is present, the image directly beneath it is shielded. FIG. 1 shows a schematic diagram of the grating screen and the display screen.

Referring to fig. 2, an embodiment of a screen display method according to the embodiment of the present invention includes:

201. the mobile terminal obtains a current face image of a target user through the front-facing camera.

In this embodiment, when the user turns on the screen of the mobile terminal, the front-facing camera and the grating screen of the mobile terminal are started at the same time, and the mobile terminal acquires the current face image of the target user through the front-facing camera.

It should be noted that when the user wants to exit the anti-peeping mode while using the mobile terminal, the grating screen can be turned off manually.

202. The mobile terminal determines face movement information of a target user.

In this embodiment, the mobile terminal determines the face movement information of the target user by comparing the current face image with the previously obtained face image, and then determines the movement direction of the face image from that information.

It should be noted that the front camera of the mobile terminal periodically takes pictures after the screen is turned on.
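One hedged way to "compare the current face image with the previous one" is to locate the face (or eye) region in each frame and difference the positions. The sketch below uses the centroid of foreground pixels in a binary mask; the mask representation and function names are assumptions, not the patent's method.

```python
# Hypothetical frame comparison: centroid displacement between two
# binary masks (lists of 0/1 rows) standing in for detected face regions.

def centroid(mask):
    """Centroid (x, y) of the 1-pixels in a binary mask; None if empty."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def movement(prev_mask, curr_mask):
    """Face movement information as the centroid displacement (dx, dy)."""
    p, c = centroid(prev_mask), centroid(curr_mask)
    if p is None or c is None:
        return None
    return (c[0] - p[0], c[1] - p[1])
```

Because the camera shoots periodically, running `movement` on each consecutive pair of frames would yield a stream of displacement vectors to drive the segmentation step.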

203. And the mobile terminal performs segmentation processing on the interface displayed on the display screen according to the face movement information of the target user.

In this embodiment, when the mobile terminal determines from the face movement information of the target user that the angle of the target user's face relative to the screen has changed, it performs segmentation processing on the interface displayed on the display screen according to that information to obtain a segmentation result. If the mobile terminal determines that the angle has not changed, the display interface keeps its original segmentation.

204. And the mobile terminal adjusts the grating on the grating screen according to the segmentation result.

In this embodiment, the grating on the grating screen is adjustable. After obtaining the segmentation result for the display interface, the mobile terminal adjusts the grating according to it, so that the display center of the display interface changes with the movement of the target user's face image and faces that image in real time. Even if the target user's face image is off to one side of the display interface, a peeping user looking at the display interface head-on still cannot clearly see the displayed content.

It should be noted that, in all the above embodiments, the face image of the target user may be an eye image of the target user, and may also be other part images of the face, such as a nose or a cheek, which is not limited herein.

In the embodiment of the invention, the mobile terminal acquires the current face image of the target user through the front-facing camera, the mobile terminal determines the face movement information of the target user through the comparison of the current face image and the acquired previous face image, the mobile terminal performs segmentation processing on an interface displayed on a display screen according to the face movement information of the target user to obtain a segmentation result, and then the mobile terminal adjusts the grating on the grating screen according to the segmentation result to determine the display center corresponding to the face of the target user. The display center can be changed according to the movement of the position of the face image of the target user, so that the user can also see the information displayed on the screen at any angle in the peep-proof mode of the mobile terminal.

Referring to fig. 3, an embodiment of a screen display method according to the embodiment of the present invention includes:

301. the mobile terminal determines a facial image of the target user.

In this embodiment, because the mobile terminal has the anti-peeping function, the target user must be distinguished from peeping users; the mobile terminal determines the face image of the user that is first captured by the front camera after the screen is turned on as the face image of the target user.

It should be noted that, there are various methods for the mobile terminal to determine the face image of the target user, for example, the mobile terminal determines the face image of the user closest to the mobile terminal as the face image of the target user, or the mobile terminal compares each face image of the user captured by the front camera with a face image preset by the mobile terminal, and determines the face image of the user corresponding to the preset face image as the face image of the target user, which is not limited herein.
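The first selection rule mentioned above (the face "closest to the mobile terminal") can be sketched by treating the face with the largest bounding box as the closest one. The box format and the area-as-distance proxy are assumptions for illustration.

```python
# Hypothetical target-user selection: among detected face bounding boxes
# (x, y, w, h), pick the largest by area as a proxy for "closest".

def target_face(face_boxes):
    """Return the bounding box of the assumed target user, or None."""
    if not face_boxes:
        return None
    return max(face_boxes, key=lambda b: b[2] * b[3])
```

The preset-face-comparison variant would instead match each detected face against a stored template, which this sketch does not attempt.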

302. The mobile terminal obtains a current face image of a target user through the front-facing camera.

303. The mobile terminal determines face movement information of a target user.

In this embodiment, steps 302 and 303 are similar to steps 201 and 202 in fig. 2, and are not described herein again.

304. And the mobile terminal performs segmentation processing on the interface displayed on the display screen according to the face movement information of the target user.

In this embodiment, when the mobile terminal determines from the face movement information of the target user that the angle of the target user's face relative to the screen has changed, it performs segmentation processing on the interface displayed on the display screen according to that information to obtain a segmentation result. If the mobile terminal determines that the angle has not changed, the display interface keeps its original segmentation.

It should be noted that, according to the face movement information of the target user, the mobile terminal divides each frame of the interface displayed on the display screen into A regions and B regions along the Y axis; the A regions and B regions alternate, and the B regions carry the segmented picture.

305. And the mobile terminal adjusts the grating on the grating screen according to the segmentation result.

In this embodiment, the grating on the grating screen is adjustable. After obtaining the segmentation result for the display interface, the mobile terminal adjusts the grating according to it, so that the display center of the display interface changes with the movement of the target user's face image and faces that image in real time. Even if the target user's face image is off to one side of the display interface, a peeping user looking at the display interface head-on still cannot clearly see the displayed content.

It should be noted that the mobile terminal adjusts the width of the grating bars on the grating screen and the spacing between them according to the segmentation result, so that the grating bars correspond to the A regions and the segmented picture corresponds to the B regions.

It should be noted that the mobile terminal changes the width of the grating bars and the spacing between them by rotating the bars on the grating screen.
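The adjustment rule in the two notes above can be sketched as a small parameter computation: the bar width tracks the A-strip width, the gap tracks the B-strip width, and the pattern is phase-shifted so the bars follow the segmentation as the face moves. The parameter names are assumptions.

```python
# Hypothetical grating parameters derived from the A/B segmentation.

def grating_params(a_strip_width, b_strip_width, face_dx):
    """Bar width covers A regions, gap exposes B regions; the phase
    shifts the whole bar pattern with the face displacement face_dx."""
    period = a_strip_width + b_strip_width
    return {
        "bar_width": a_strip_width,   # bars sit over the A regions
        "gap_width": b_strip_width,   # gaps leave the B regions visible
        "phase": face_dx % period,    # wrap the shift to one period
    }
```

Wrapping the phase to one bar-plus-gap period reflects that shifting the pattern by a whole period reproduces the same geometry.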

It should be noted that, in all the above embodiments, the face image of the target user may be an eye image of the target user, and may also be other part images of the face, such as a nose or a cheek, which is not limited herein.

In the embodiment of the invention, the mobile terminal acquires a current face image of a target user through a front camera and determines the face movement information of the target user by comparing the current face image with the previously acquired face image. The mobile terminal then performs segmentation processing on the interface displayed on the display screen according to the face movement information to obtain a segmentation result; specifically, it divides each frame of the displayed interface into A regions and B regions along the Y axis, where the A regions and B regions alternate and the B regions carry the segmented picture. The mobile terminal then adjusts the grating on the grating screen according to the segmentation result and determines a display center corresponding to the face of the target user; specifically, it adjusts the width of the grating bars and the spacing between them so that the bars correspond to the A regions and the segmented picture corresponds to the B regions. The display center thus changes with the movement of the position of the target user's face image, so the user can see the information displayed on the screen at any angle while the mobile terminal is in the anti-peeping mode.

Referring to fig. 4, an embodiment of a screen display method according to the embodiment of the present invention includes:

401. the mobile terminal determines a facial image of the target user.

402. The mobile terminal obtains a current face image of a target user through the front-facing camera.

403. The mobile terminal determines face movement information of a target user.

404. And the mobile terminal performs segmentation processing on the interface displayed on the display screen according to the face movement information of the target user.

In this embodiment, steps 401 to 404 are similar to steps 301 to 304 in fig. 3, and are not described herein again.

405. The mobile terminal selects an interference picture from a preset interference picture library.

In this embodiment, the mobile terminal selects an interference picture from a preset interference picture library, segments the interference picture along the Y axis, and places the segmented picture in the A regions.

It should be noted that there is no fixed order between step 405 and step 404; step 405 may be executed after step 404 or at the same time as it, which is not limited herein.

It should be noted that, in practical applications, step 405 may or may not be executed, and is not limited herein.
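Steps 404-405 combined amount to interleaving real and interference content on each pixel row: real pixels fill the B strips, decoy pixels fill the A strips that the grating hides from the target user's angle. The row representation and the even-strips-are-A convention are assumptions for illustration.

```python
# Hypothetical composition of one pixel row from the real picture and
# a decoy (interference) picture, using alternating strips.

def compose_row(real_row, decoy_row, strip_width):
    """Interleave: A strips (even-numbered) take decoy pixels, B strips
    (odd-numbered) take real pixels."""
    out = []
    for x in range(len(real_row)):
        in_a = (x // strip_width) % 2 == 0
        out.append(decoy_row[x] if in_a else real_row[x])
    return out
```

From the target user's angle the grating bars mask the decoy strips, leaving the real strips; from an oblique angle the bars mask part of the real strips instead, so a peeper mostly sees the decoy content.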

406. And the mobile terminal adjusts the grating on the grating screen according to the segmentation result.

In this embodiment, step 406 is similar to step 305 in fig. 3, and is not described herein again.

In all the above embodiments, the image of the face of the target user may be an image of the eyes of the target user, and may also be an image of other parts of the face, such as the nose or the cheek, which is not limited herein.

In the embodiment of the invention, the mobile terminal acquires a current face image of a target user through a front camera and determines the face movement information of the target user by comparing the current face image with the previously acquired face image. The mobile terminal then performs segmentation processing on the interface displayed on the display screen according to the face movement information to obtain a segmentation result; specifically, it divides each frame of the displayed interface into A regions and B regions along the Y axis, where the A regions and B regions alternate and the B regions carry the segmented picture. The mobile terminal then adjusts the grating on the grating screen according to the segmentation result and determines a display center corresponding to the face of the target user; specifically, it adjusts the width of the grating bars and the spacing between them so that the bars correspond to the A regions and the segmented picture corresponds to the B regions. The display center thus changes with the movement of the position of the target user's face image, so the user can see the information displayed on the screen at any angle while the mobile terminal is in the anti-peeping mode.

For ease of understanding, the present embodiment is described below with reference to specific application scenarios:

When the user Alice turns on her phone screen, the grating screen on her phone is automatically powered on and forms a grating, and the phone enters the anti-peeping mode. Through the front camera, the phone takes the eye image closest to the screen as the target user's eye image; since Alice's eyes are closest to the screen, her eye image is determined to be the target user's. The phone then acquires Alice's eye image and obtains the real-time position of her eyes by comparing the currently acquired eye image with the previous one.

The viewing angle from the position of the human eyes to the phone screen is taken as the viewing angle of the displayed picture; the screen information at this angle is shown in FIG. 5, where the parts shielded by the grating are the black areas in the picture and the unshielded parts are the white areas. If the original image of the display interface of Alice's phone is as shown in FIG. 6, the phone segments the picture in FIG. 6 along the Y axis according to the movement information of Alice's eyes and fills the segmented strips into the white areas; the filled effect is shown in FIG. 7. The phone may further select an interference picture from the interference picture library and segment it along the Y axis, placing the segmented interference strips in the black areas; for example, the sun in FIG. 8 is the interference picture. The phone then adjusts the width of the grating bars and the spacing between them according to the segmentation, so that the display center of the screen always faces Alice's eyes. In this way, once the anti-peeping mode is on, Alice can see the whole original image of the display interface at any angle, as shown in FIG. 7. When the screen is viewed from other directions, part of the real image is blocked by the grating because of the angle, and most of what a peeper sees is the image information in the black areas, so the peeper sees only an interface in which parts of the real image are fused with parts of the interference image. FIG. 9 shows the view of a peeper looking from the left, and FIG. 10 the view from the right.

The screen display method in the embodiment of the present invention is described above, and referring to fig. 11, the mobile terminal in the embodiment of the present invention is described below, where the mobile terminal in the embodiment of the present invention includes:

an obtaining unit 1101, configured to obtain a current face image of a target user through a front-facing camera;

a first determining unit 1102, configured to determine face movement information of a target user through comparison between the current face image acquired by the acquiring unit and an acquired previous face image;

a first dividing unit 1103, configured to perform division processing on the interface displayed on the display screen according to the face movement information of the target user determined by the first determining unit, so as to obtain a division result;

and an adjusting unit 1104, configured to adjust a raster on the raster screen according to the segmentation result of the first segmentation unit, and determine a display center corresponding to the face of the target user.

In the embodiment of the present invention, an obtaining unit 1101 obtains a current face image of a target user through a front-facing camera, a first determining unit 1102 determines face movement information of the target user through comparison between the current face image and an obtained previous face image, a first dividing unit 1103 performs division processing on an interface displayed on a display screen according to the face movement information of the target user to obtain a division result, and an adjusting unit 1104 adjusts a raster on the raster screen according to the division result to determine a display center corresponding to the face of the target user. The display center can be changed according to the movement of the eye position of the target user, so that the user can see the information displayed on the screen at any angle in the peep-proof mode of the mobile terminal.

Referring to fig. 12, another embodiment of the mobile terminal according to the embodiment of the present invention includes:

a second determining unit 1201, configured to determine a face image of the user, which is first captured by the front-facing camera after the screen is opened, as a face image of the target user;

an obtaining unit 1202, configured to obtain a current face image of a target user through a front-facing camera;

a first determining unit 1203, configured to determine face movement information of a target user through comparison between the current face image acquired by the acquiring unit and an acquired previous face image;

a first dividing unit 1204, configured to perform dividing processing on the interface displayed on the display screen according to the face movement information of the target user determined by the first determining unit, so as to obtain a dividing result;

wherein the first dividing unit 1204 includes:

the dividing subunit 12041 is configured to divide each frame of picture of the interface displayed on the display screen into an a region and a B region with the Y axis as a coordinate according to the face movement information of the target user, where the a region and the B region are arranged at intervals, and the B region corresponds to the divided picture.

A selecting unit 1205, configured to select an interference picture from a preset interference picture library;

and the second dividing unit 1206 is configured to divide the interference picture selected by the selecting unit by using the Y axis as a coordinate, and place the divided picture in the area a.

It should be noted that, in an actual apparatus, the selecting unit 1205 and the second dividing unit 1206 may be omitted; this is not limited herein.

An adjusting unit 1207, configured to adjust the grating on the grating screen according to the segmentation result of the first segmentation unit, and determine a display center corresponding to the face of the target user.

The adjusting unit 1207 includes:

an adjusting subunit 12071, configured to adjust, according to the segmentation result, a width of a grating on the grating screen and a distance between the gratings, so that the grating corresponds to the area a, and the segmented picture corresponds to the area B.

In this embodiment of the present invention, the second determining unit 1201 determines the face image of the user that is first captured by the front-facing camera after the screen is turned on as the face image of the target user; the obtaining unit 1202 obtains the current face image of the target user through the front-facing camera; the first determining unit 1203 determines the face movement information of the target user by comparing the current face image with the previously obtained face image; the first dividing unit 1204 divides the interface displayed on the display screen according to the face movement information of the target user to obtain a division result; and the adjusting subunit 12071 adjusts the width of the gratings on the grating screen and the spacing between the gratings according to the division result, and determines a display center corresponding to the face of the target user. Because the display center changes as the position of the target user's eyes moves, the user can see the information displayed on the screen at any angle while the mobile terminal is in the peep-proof mode.
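How far the grating must move for a given head movement follows from barrier geometry. The formula below is a rough first-order sketch under assumed thin-barrier geometry; the embodiment itself specifies no numbers:

```python
def barrier_shift(head_dx_mm, view_dist_mm, gap_mm):
    """Approximate lateral shift of the grating slits needed when the target
    user's head moves sideways by head_dx_mm, viewed from view_dist_mm away,
    with the grating mounted gap_mm in front of the display panel.
    First-order geometry: the slit shift scales the head movement by the
    gap-to-viewing-distance ratio (valid when gap_mm << view_dist_mm)."""
    return head_dx_mm * gap_mm / view_dist_mm
```

For example, with a 2 mm panel-to-grating gap and a 500 mm viewing distance, a 100 mm head movement calls for a slit shift of about 0.4 mm; this is the sense in which the display center follows the face.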

Referring to fig. 13, another embodiment of the mobile terminal according to the embodiment of the present invention includes:

a third determining unit 1301, configured to determine, through the front-facing camera, a distance between each user face image and the mobile terminal, and determine a user face image closest to the mobile terminal as a face image of the target user;

an obtaining unit 1302, configured to obtain a current face image of a target user through a front-facing camera;

a first determining unit 1303, configured to determine face movement information of the target user through comparison between the current face image acquired by the acquiring unit and an acquired previous face image;

a first dividing unit 1304, configured to perform division processing on the interface displayed on the display screen according to the face movement information of the target user determined by the first determining unit, so as to obtain a division result;

wherein the first dividing unit 1304 includes:

a dividing subunit 13041, configured to divide each frame of picture of the interface displayed on the display screen into an a region and a B region by taking the Y axis as a coordinate according to the face movement information of the target user, where the a region and the B region are arranged at intervals, and the B region corresponds to the divided picture.

A selecting unit 1305, configured to select an interference picture from a preset interference picture library;

a second dividing unit 1306, configured to divide the interference picture selected by the selecting unit by using the Y axis as a coordinate, and place the divided picture in the area a.

It should be noted that, in an actual apparatus, the selecting unit 1305 and the second dividing unit 1306 may be omitted; the specific embodiment is not limited herein.

An adjusting unit 1307, configured to adjust the grating on the grating screen according to the segmentation result of the first segmentation unit, and determine a display center corresponding to the face of the target user.

Wherein, adjusting unit 1307 includes:

and an adjusting subunit 13071, configured to adjust, according to the segmentation result, a width of a grating on the grating screen and an interval between the gratings, so that the grating corresponds to the a region, and the segmented picture corresponds to the B region.

In this embodiment of the present invention, the third determining unit 1301 determines the distance between each user face image and the mobile terminal through the front-facing camera, and determines the user face image closest to the mobile terminal as the face image of the target user; the obtaining unit 1302 obtains the current face image of the target user through the front-facing camera; the first determining unit 1303 determines the face movement information of the target user by comparing the current face image with the previously obtained face image; the first dividing unit 1304 divides the interface displayed on the display screen according to the face movement information of the target user to obtain a division result; and the adjusting subunit 13071 adjusts the width of the gratings on the grating screen and the spacing between the gratings according to the division result to determine a display center corresponding to the face of the target user. Because the display center changes as the position of the target user's eyes moves, the user can see the information displayed on the screen at any angle while the mobile terminal is in the peep-proof mode.
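Since a single front-facing camera has no depth sensor, the distance judgment made by the third determining unit 1301 is commonly approximated by apparent face size: the largest detected face is taken as the nearest. This heuristic is an illustrative assumption, not a limitation of the embodiment:

```python
def pick_target_face(faces):
    """Pick the target user's face as the one nearest the terminal,
    approximating 'nearest' by the largest detected face width, since
    apparent size is roughly inversely proportional to distance.
    `faces` is a list of (user_id, face_width_px) pairs."""
    if not faces:
        return None  # no face in view; keep the previous target or wait
    return max(faces, key=lambda f: f[1])[0]
```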

Referring to fig. 14, another embodiment of the mobile terminal according to the embodiment of the present invention includes:

a comparison unit 1401, configured to compare the face images of the users captured by the front-facing camera with face images preset by the mobile terminal;

a fourth determination unit 1402 for determining the user face image corresponding to the preset face image obtained by the comparison unit as the face image of the target user.

An obtaining unit 1403, configured to obtain a current face image of the target user through the front-facing camera;

a first determining unit 1404, configured to determine face movement information of a target user by comparing the current face image acquired by the acquiring unit with an acquired previous face image;

a first dividing unit 1405, configured to perform dividing processing on the interface displayed on the display screen according to the face movement information of the target user determined by the first determining unit, so as to obtain a dividing result;

wherein the first dividing unit 1405 includes:

the dividing subunit 14051 is configured to divide each frame of picture of the interface displayed on the display screen into an a region and a B region by taking the Y axis as a coordinate according to the face movement information of the target user, where the a region and the B region are arranged at intervals, and the B region corresponds to the divided picture.

A selecting unit 1406, configured to select an interference picture from a preset interference picture library;

a second dividing unit 1407, configured to divide the interference picture selected by the selecting unit by using the Y axis as a coordinate.

It should be noted that, in an actual apparatus, the selecting unit 1406 and the second dividing unit 1407 may be omitted; the specific embodiment is not limited herein.

An adjusting unit 1408, configured to adjust the grating on the grating screen according to the segmentation result, and determine a display center corresponding to the face of the target user.

The adjusting unit 1408 includes:

an adjusting subunit 14081, configured to adjust a width of a grating on the grating screen and a distance between the gratings according to the segmentation result, so that the grating corresponds to the a area, and the segmented picture corresponds to the B area.

In this embodiment of the present invention, the comparison unit 1401 compares each user face image captured by the front-facing camera with the face image preset by the mobile terminal; the fourth determination unit 1402 determines the user face image corresponding to the preset face image as the face image of the target user; the obtaining unit 1403 obtains the current face image of the target user through the front-facing camera; the first determining unit 1404 determines the face movement information of the target user by comparing the current face image with the previously obtained face image; the first dividing unit 1405 divides the interface displayed on the display screen according to the face movement information of the target user to obtain a division result; and the adjusting subunit 14081 adjusts the width of the gratings on the grating screen and the spacing between the gratings according to the division result, and determines a display center corresponding to the face of the target user. Because the display center changes as the position of the target user's eyes moves, the user can see the information displayed on the screen at any angle while the mobile terminal is in the peep-proof mode.
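The comparison against the preset face image can be sketched as a nearest-descriptor match with an acceptance threshold. The descriptor format, Euclidean metric, and threshold value are illustrative assumptions; the embodiment only requires that captured faces be compared with the preset image:

```python
def match_preset(candidates, preset_desc, threshold=0.6):
    """Return the user id whose face descriptor is closest to the preset
    descriptor, provided the distance is within the threshold; otherwise
    None (no target user). Descriptors are plain tuples of floats here."""
    def dist(a, b):
        # Euclidean distance between two face descriptors
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    best = min(candidates, key=lambda c: dist(c[1], preset_desc), default=None)
    if best is not None and dist(best[1], preset_desc) <= threshold:
        return best[0]
    return None
```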

As shown in fig. 15, for convenience of description, only the parts related to this embodiment of the present invention are shown; for undisclosed technical details, refer to the method part of the embodiments of the present invention. The terminal may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sale) terminal, a vehicle-mounted computer, and the like. The following description takes a mobile phone as an example:

fig. 15 is a block diagram showing a partial structure of a cellular phone related to a terminal provided by an embodiment of the present invention. Referring to fig. 15, the cellular phone includes: radio Frequency (RF) circuitry 1510, memory 1520, input unit 1530, display unit 1540, sensor 1550, audio circuitry 1560, wireless fidelity (WiFi) module 1570, processor 1580, and power supply 1590. Those skilled in the art will appreciate that the handset configuration shown in fig. 15 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.

The following describes each component of the mobile phone in detail with reference to fig. 15:

the RF circuit 1510 may be configured to receive and transmit signals during information transmission and reception or during a call, and in particular, receive downlink information of a base station and then process the received downlink information to the processor 1580; in addition, the data for designing uplink is transmitted to the base station. In general, RF circuit 1510 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, RF circuit 1510 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to global system for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division multiple Access (Code Division multiple Access 1615609le Access, CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), etc.

The memory 1520 may be used to store software programs and modules, and the processor 1580 executes various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 1520. The memory 1520 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to use of the mobile phone (such as audio data and a phone book), and the like. Further, the memory 1520 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.

The input unit 1530 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile phone. Specifically, the input unit 1530 may include a touch panel 1531 and other input devices 1532. The touch panel 1531, also referred to as a touch screen, can collect touch operations by the user on or near it (for example, operations performed on or near the touch panel 1531 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection devices according to a preset program. Optionally, the touch panel 1531 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1580, and can receive and execute commands sent by the processor 1580. In addition, the touch panel 1531 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1531, the input unit 1530 may include other input devices 1532. Specifically, the other input devices 1532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a power switch key), a trackball, a mouse, a joystick, and the like.

The display unit 1540 may be used to display information input by the user or information provided to the user and various menus of the mobile phone. The Display unit 1540 may include a Display panel 1541, and optionally, the Display panel 1541 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch panel 1531 may cover the display panel 1541, and when the touch panel 1531 detects a touch operation on or near the touch panel 1531, the touch operation is transmitted to the processor 1580 to determine the type of the touch event, and then the processor 1580 provides a corresponding visual output on the display panel 1541 according to the type of the touch event. Although in fig. 15, the touch panel 1531 and the display panel 1541 are two separate components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1531 and the display panel 1541 may be integrated to implement the input and output functions of the mobile phone.

The handset can also include at least one sensor 1550, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 1541 according to the brightness of ambient light and a proximity sensor that turns off the display panel 1541 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.

The audio circuit 1560, the speaker 1561, and the microphone 1562 may provide an audio interface between the user and the mobile phone. The audio circuit 1560 may transmit an electrical signal converted from received audio data to the speaker 1561, which converts the electrical signal into a sound signal for output; on the other hand, the microphone 1562 converts a collected sound signal into an electrical signal, which the audio circuit 1560 receives and converts into audio data; after being processed by the processor 1580, the audio data is sent through the RF circuit 1510 to, for example, another mobile phone, or output to the memory 1520 for further processing.

WiFi is a short-range wireless transmission technology. Through the WiFi module 1570, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 15 shows the WiFi module 1570, it is understood that it is not an essential component of the handset and may be omitted as needed without changing the essence of the invention.

The processor 1580 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1520 and calling data stored in the memory 1520, thereby integrally monitoring the mobile phone. Optionally, the processor 1580 may include one or more processing units; preferably, the processor 1580 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, and the like, and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor may not be integrated into the processor 1580.

The handset also includes a power supply 1590 (e.g., a battery) for powering the various components. Preferably, the power supply is logically coupled to the processor 1580 through a power management system, so that charging, discharging, and power consumption are managed through the power management system.

Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.

In this embodiment of the present invention, the processor 1580 included in the terminal further has the following functions:

acquiring a current face image of a target user through a front-facing camera;

determining the face movement information of the target user by comparing the current face image with the acquired previous face image;

performing segmentation processing on an interface displayed on a display screen according to the face movement information of the target user to obtain a segmentation result;

and adjusting the grating on the grating screen according to the segmentation result, and determining a display center corresponding to the face of the target user.
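Combining the four processor steps above with the interference-picture variant described in the apparatus embodiments, the final frame interleaves interference strips (A regions, blocked by the grating for the target user) with content strips (B regions, visible through the slits). A minimal sketch, with strips represented abstractly:

```python
def compose_frame(content_strips, interference_strips):
    """Assemble the displayed frame by alternating A and B regions along the
    Y axis: each A region holds an interference strip and each B region
    holds a strip of the real interface content."""
    frame = []
    for a_strip, b_strip in zip(interference_strips, content_strips):
        frame.append(a_strip)  # A region: interference picture strip
        frame.append(b_strip)  # B region: real content strip
    return frame
```

From the display center, the grating hides the A strips and the target user sees only the content; from other angles the slits expose the interference strips instead, which is the peep-proof effect.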

Referring to fig. 16, fig. 16 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. The server 1600 may vary considerably with configuration or performance, and may include one or more Central Processing Units (CPUs) 1622 (e.g., one or more processors), a memory 1632, and one or more storage media 1630 (e.g., one or more mass storage devices) storing an application program 1642 or data 1644. The memory 1632 and the storage medium 1630 may be transient or persistent storage. The program stored on the storage medium 1630 may include one or more modules (not shown), and each module may include a series of instruction operations on the server. Further, the central processing unit 1622 may be configured to communicate with the storage medium 1630 and execute, on the server 1600, the series of instruction operations stored in the storage medium 1630.

The server 1600 may also include one or more power supplies 1626, one or more wired or wireless network interfaces 1650, one or more input/output interfaces 1658, and/or one or more operating systems 1641, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.

The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 16.

It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.

In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.

The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.

In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.

The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention may be embodied in the form of a software product that is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.

The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (15)

1. A screen display method, applied to a mobile terminal, wherein a grating screen is mounted on a display screen of the mobile terminal, and the method comprises the following steps:
the mobile terminal acquires a current face image of a target user through a front camera;
the mobile terminal determines the face movement information of the target user through the comparison of the current face image and the acquired previous face image of the target user;
the mobile terminal carries out segmentation processing on the interface displayed on the display screen according to the face movement information of the target user to obtain a segmentation result;
the mobile terminal adjusts the grating on the grating screen according to the segmentation result and determines a display center corresponding to the face of the target user;
the mobile terminal performs segmentation processing on the interface displayed on the display screen according to the face movement information of the target user, and the segmentation processing comprises the following steps:
and the mobile terminal divides each frame of picture of the interface displayed on the display screen into an area A and an area B by taking the Y axis as a coordinate according to the face movement information of the target user, wherein the area A and the area B are arranged at intervals, and the area B corresponds to the divided picture.
2. The screen display method of claim 1, wherein before the mobile terminal obtains the current facial image of the target user through a front-facing camera, the method further comprises:
and the mobile terminal determines the face image of the user shot by the front camera for the first time after the screen is opened as the face image of the target user.
3. The screen display method of claim 1, wherein before the mobile terminal obtains the current facial image of the target user through a front-facing camera, the method further comprises:
and the mobile terminal judges the distance between each user face image and the mobile terminal through the front camera, and determines the user face image closest to the mobile terminal as the face image of the target user.
4. The screen display method of claim 1, wherein before the mobile terminal obtains the current facial image of the target user through a front-facing camera, the method further comprises:
the mobile terminal compares the face image of each user shot by the front camera with a face image preset by the mobile terminal;
and the mobile terminal determines the face image of the user corresponding to the preset face image as the face image of the target user.
5. The screen display method according to any one of claims 1 to 4, wherein the adjusting, by the mobile terminal, the raster on the raster screen according to the division result includes:
and the mobile terminal adjusts the width of the grating on the grating screen and the distance between the gratings according to the segmentation result, so that the grating corresponds to the area A, and the segmented picture corresponds to the area B.
6. The screen display method according to any one of claims 1 to 4, wherein after the mobile terminal performs the segmentation processing on the interface displayed on the display screen according to the face movement information of the target user, the method further comprises:
the mobile terminal selects an interference picture from a preset interference picture library;
and the mobile terminal divides the interference picture by taking the Y axis as a coordinate, and places the divided picture in the area A.
7. The screen display method according to any one of claims 1 to 4, wherein the face image of the target user is an eye image of the target user.
8. A mobile terminal, wherein the mobile terminal comprises a front-facing camera, a display screen, and a grating screen, the grating screen being mounted on the display screen, and the mobile terminal comprises:
the acquisition unit is used for acquiring a current face image of a target user through the front camera;
the first determining unit is used for determining the face movement information of the target user through the comparison between the current face image acquired by the acquiring unit and the acquired previous face image of the target user;
the first segmentation unit is used for carrying out segmentation processing on the interface displayed on the display screen according to the face movement information of the target user determined by the first determination unit to obtain a segmentation result;
the adjusting unit is used for adjusting the grating on the grating screen according to the segmentation result of the first segmentation unit and determining a display center corresponding to the face of the target user;
the first dividing unit includes:
and the dividing subunit is used for dividing each frame of picture of the interface displayed on the display screen into an area A and an area B by taking the Y axis as a coordinate according to the face movement information of the target user, wherein the area A and the area B are arranged at intervals, and the area B corresponds to the divided picture.
9. The mobile terminal of claim 8, wherein the mobile terminal further comprises:
and the second determining unit is used for determining the face image of the user, which is firstly shot by the front camera after the screen is opened, as the face image of the target user.
10. The mobile terminal of claim 8, wherein the mobile terminal further comprises:
and the third determining unit is used for judging the distance between each user face image and the mobile terminal through the front camera and determining the user face image closest to the mobile terminal as the face image of the target user.
11. The mobile terminal of claim 8, wherein the mobile terminal further comprises:
the comparison unit is used for comparing the face images of the users shot by the front camera with the face images preset by the mobile terminal;
a fourth determination unit configured to determine the user face image corresponding to the preset face image obtained by the comparison unit as the face image of the target user.
12. The mobile terminal according to any of claims 8 to 11, wherein the adjusting unit comprises:
and the adjusting subunit is used for adjusting the width of the grating on the grating screen and the distance between the gratings according to the segmentation result, so that the grating corresponds to the area A, and the segmented picture corresponds to the area B.
13. The mobile terminal according to any of claims 8 to 11, characterized in that the mobile terminal further comprises:
the device comprises a selecting unit, a processing unit and a processing unit, wherein the selecting unit is used for selecting an interference picture from a preset interference picture library;
and the second segmentation unit is used for segmenting the interference picture selected by the selection unit by taking the Y axis as a coordinate, and placing the segmented picture in the area A.
14. The mobile terminal according to any of claims 8 to 11, wherein the image of the face of the target user is an image of the eyes of the target user.
15. A mobile terminal, the mobile terminal comprising a front-facing camera, a display screen, and a grating screen, the grating screen being mounted on the display screen, characterized by comprising:
an input device, an output device, a processor, and a memory;
the processor is used for executing the following steps by calling the operation instruction stored in the memory:
acquiring a current face image of a target user through a front-facing camera;
determining the face movement information of the target user by comparing the current face image with the acquired previous face image of the target user;
performing segmentation processing on an interface displayed on the display screen according to the face movement information of the target user to obtain a segmentation result;
adjusting the grating on the grating screen according to the segmentation result, and determining a display center corresponding to the face of the target user;
wherein the segmentation processing performed by the mobile terminal on the interface displayed on the display screen according to the face movement information of the target user comprises:
dividing, by the mobile terminal, each frame of the interface displayed on the display screen into an area A and an area B along the Y axis according to the face movement information of the target user, wherein area A and area B are arranged alternately and area B corresponds to the segmented picture.
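The segmentation step in claims 13 and 15 can be sketched in code. This is an illustrative model only, not text from the patent: each frame and the interference picture are treated as lists of pixel columns, cut into vertical strips along the Y axis, and interleaved so that interference strips occupy area A (behind the grating bars) and content strips occupy area B (visible through the slits). The function name, the column representation, and the strip width are all hypothetical choices for the sketch.

```python
def segment_frame(frame_cols, interference_cols, strip_width):
    """Interleave vertical strips of the displayed frame with strips of an
    interference picture: area A = interference, area B = content."""
    out = []
    for x in range(0, len(frame_cols), strip_width):
        # Area A: interference strip, hidden behind a grating bar.
        out.extend(interference_cols[x:x + strip_width])
        # Area B: content strip, visible through a grating slit.
        out.extend(frame_cols[x:x + strip_width])
    return out

# Toy frame of 8 columns, segmented into strips 2 columns wide.
content = ["C%d" % i for i in range(8)]   # columns of the real interface
noise = ["N%d" % i for i in range(8)]     # columns of the interference picture
print(segment_frame(content, noise, 2))
```

Viewed head-on through a grating whose bar width and pitch match `strip_width` (the adjustment described in claim 12), only the `C` strips reach the target user's eyes, while an off-axis observer sees mostly the `N` strips.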
CN201610835214.XA 2016-09-20 2016-09-20 Screen display method and mobile terminal thereof CN106406530B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610835214.XA CN106406530B (en) 2016-09-20 2016-09-20 Screen display method and mobile terminal thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610835214.XA CN106406530B (en) 2016-09-20 2016-09-20 Screen display method and mobile terminal thereof

Publications (2)

Publication Number Publication Date
CN106406530A CN106406530A (en) 2017-02-15
CN106406530B true CN106406530B (en) 2020-04-07

Family

ID=57997280

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610835214.XA CN106406530B (en) 2016-09-20 2016-09-20 Screen display method and mobile terminal thereof

Country Status (1)

Country Link
CN (1) CN106406530B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106910837B (en) * 2017-03-29 2019-01-15 京东方科技集团股份有限公司 Organic electroluminescence device and preparation method thereof, display device
CN109145655A (en) * 2017-06-19 2019-01-04 上海中兴软件有限责任公司 A kind of mobile terminal display methods, mobile terminal and storage medium
CN110140127A (en) * 2017-11-16 2019-08-16 华为技术有限公司 A kind of display methods, device and terminal
CN110096972A (en) * 2019-04-12 2019-08-06 重庆科芮智能科技有限公司 Data guard method, apparatus and system
CN110264967A (en) * 2019-05-09 2019-09-20 京东方科技集团股份有限公司 Display device and its control method
CN110765433A (en) * 2019-10-21 2020-02-07 珠海格力电器股份有限公司 Terminal control method, device, terminal and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103077361A (en) * 2012-12-28 2013-05-01 东莞宇龙通信科技有限公司 Mobile terminal and anti-spy method thereof
CN103108085A (en) * 2013-01-31 2013-05-15 广东欧珀移动通信有限公司 Glance prevention method of mobile terminal
CN103218579A (en) * 2013-03-28 2013-07-24 东莞宇龙通信科技有限公司 Method for preventing content on screen from being peeped, and mobile terminal thereof
CN103885574A (en) * 2012-12-19 2014-06-25 联想(北京)有限公司 Status switching method, status switching device and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140078318A1 (en) * 2009-05-22 2014-03-20 Motorola Mobility Llc Electronic Device with Sensing Assembly and Method for Interpreting Consecutive Gestures
KR20140110551A (en) * 2013-03-08 2014-09-17 삼성전자주식회사 Method for controlling pattern and an electronic device thereof

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103885574A (en) * 2012-12-19 2014-06-25 联想(北京)有限公司 Status switching method, status switching device and electronic equipment
CN103077361A (en) * 2012-12-28 2013-05-01 东莞宇龙通信科技有限公司 Mobile terminal and anti-spy method thereof
CN103108085A (en) * 2013-01-31 2013-05-15 广东欧珀移动通信有限公司 Glance prevention method of mobile terminal
CN103218579A (en) * 2013-03-28 2013-07-24 东莞宇龙通信科技有限公司 Method for preventing content on screen from being peeped, and mobile terminal thereof

Also Published As

Publication number Publication date
CN106406530A (en) 2017-02-15

Similar Documents

Publication Publication Date Title
US9549264B2 (en) Portable terminal for controlling hearing aid and method therefor
US10114514B2 (en) Electronic device, method for controlling the electronic device, and recording medium
EP3035656A1 (en) Method and apparatus for controlling an electronic device
CN107957839B (en) Display control method and mobile terminal
CN106415510B (en) Data processing method and electronic equipment thereof
US9906406B2 (en) Alerting method and mobile terminal
US10445482B2 (en) Identity authentication method, identity authentication device, and terminal
CN107613131B (en) Application program disturbance-free method, mobile terminal and computer-readable storage medium
CN103473494A (en) Application running method, device and terminal device
CN107665697B (en) A kind of adjusting method and mobile terminal of screen intensity
EP3454240A1 (en) Unlocking methods and related products
CN108182019B (en) Suspension control display processing method and mobile terminal
WO2014086218A1 (en) Interface adjustment method, device and terminal
CN107767839B (en) Brightness adjusting method and related product
CN104383681A (en) Game process control method and device as well as mobile terminal
EP3637289B1 (en) Permission control method and related product
CN107682634A (en) A kind of facial image acquisition methods and mobile terminal
WO2016173427A1 (en) Method, device and computer readable medium for creating motion blur effect
CN106326773B (en) A kind of method, apparatus and terminal of photo encryption handling
CN108111675B (en) Notification message processing method and device and mobile terminal
CN108762954A (en) A kind of object sharing method and mobile terminal
WO2019052418A1 (en) Facial recognition method and related product
CN108073343A (en) A kind of display interface method of adjustment and mobile terminal
CN107783887B (en) Control method of mobile terminal and mobile terminal
CN106959761B (en) A kind of terminal photographic method, device and terminal

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant