CN108388849B - Method and device for adjusting display image of terminal, electronic equipment and storage medium - Google Patents

Method and device for adjusting display image of terminal, electronic equipment and storage medium

Info

Publication number
CN108388849B
CN108388849B (application CN201810125075.0A)
Authority
CN
China
Prior art keywords
image
display image
human eyes
display screen
eyes
Prior art date
Legal status
Expired - Fee Related
Application number
CN201810125075.0A
Other languages
Chinese (zh)
Other versions
CN108388849A (en)
Inventor
陈骏
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810125075.0A
Publication of CN108388849A
Application granted
Publication of CN108388849B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention provide a method and a device for adjusting a display image of a terminal, an electronic device, and a computer-readable storage medium. An image within the shooting range is acquired through a camera unit; human eyes are identified from the image, and the change of the human eyes in the image over time is acquired; the display image of the terminal is then dynamically adjusted according to that change, so that the display image changes correspondingly when the human eyes move relative to the display screen. The manner of dynamically adjusting the display image comprises at least one of rotation, translation and zooming. As a result, the display image remains stable relative to the human eyes at all times from the viewer's perspective, the terminal does not need to be adjusted manually, and the shaking of the display image caused by manually adjusting the terminal is reduced.

Description

Method and device for adjusting display image of terminal, electronic equipment and storage medium
Technical Field
The present application relates to the field of display technologies of terminals, and in particular, to a method and an apparatus for adjusting a display image of a terminal, an electronic device, and a computer-readable storage medium.
Background
Nowadays, with the development of science and technology, more and more intelligent terminals are entering the electronic market, such as mobile phones, tablet computers, MP4 players, electronic books, and the like. When viewing content on a terminal, for example when the user views the display image of the terminal at an oblique angle, the viewing experience is poor, and the terminal is usually adjusted manually to improve it. However, adjusting the terminal manually causes the display image to shake, as does the instability of holding the terminal by hand; a shaking display image is hard for the user to see clearly, and over a long time this can affect the user's eyesight.
Disclosure of Invention
Embodiments of the application provide a method and a device for adjusting a display image of a terminal, an electronic device, and a computer-readable storage medium, which can reduce jitter in the terminal's display image.
A method of adjusting a display image of a terminal provided with a camera unit, the method comprising:
acquiring a captured image through the camera unit;
identifying human eyes from the image and acquiring the change of the human eyes in the image along with time;
and adjusting the display image of the terminal according to the change of the human eyes in the image along with the time, so that the display image shows corresponding change when the human eyes move relative to the display screen.
An apparatus for adjusting a display image of a terminal provided with a camera unit, the apparatus comprising:
the image capturing module is used for acquiring a captured image through the camera unit;
the human eye identification module is used for identifying human eyes from the image and acquiring the change of the human eyes in the image along with time;
and the display image adjusting module is used for adjusting the display image of the terminal according to the change of the human eyes in the image along with the time, so that the display image shows corresponding change when the human eyes move relative to the display screen.
An electronic device comprises a memory and a processor, wherein the memory stores a computer program, and the computer program is executed by the processor to enable the processor to execute the steps of the method for adjusting the display image of the terminal.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the above-mentioned steps of the method of adjusting a display image of a terminal.
According to the method and device for adjusting the display image of the terminal, the electronic device and the computer-readable storage medium, an image within the shooting range is acquired in real time by the terminal's front-facing camera unit. If the human eyes change in the image over time, indicating that the eyes have moved relative to the display screen, the display image of the terminal is adjusted according to that change, so that the display image changes correspondingly to adapt to the movement of the eyes. In terms of viewing experience, the display image is thus displayed stably for the human eyes and remains suitable for reading at all times. The terminal does not need to be adjusted manually, which reduces shaking of the display image caused by manual adjustment, reduces shaking caused by the instability of holding the terminal by hand, and reduces blurring of the display image caused by shaking.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in their description are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a block diagram of a terminal in one embodiment;
FIG. 2 is a flowchart illustrating a method for adjusting a display image of a terminal according to an embodiment;
FIG. 3 is a flow chart illustrating step 202 in one embodiment;
FIG. 4 is a flowchart illustrating a method for adjusting a display image of a terminal according to another embodiment;
FIG. 5 is a diagram illustrating standard parameters of a human eye at a set position in one embodiment;
FIG. 6 is a diagram illustrating dynamic parameters of a human eye in an exemplary embodiment;
FIG. 7 is a diagram illustrating an adjusted display image in accordance with an exemplary embodiment;
FIG. 8 is a flowchart illustrating a method for adjusting a display image of a terminal according to an exemplary embodiment;
FIG. 9 is a schematic view of the display parameters of the display image corresponding to FIG. 5, with the human eye in the set position relative to the display screen;
FIG. 10 is a flowchart illustrating a method for adjusting a display image of a terminal according to another embodiment;
FIG. 11 is a block diagram of an apparatus for adjusting a display image of a terminal according to an embodiment;
FIG. 12 is a block diagram of a partial structure of a mobile phone serving as the electronic device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
The method of adjusting the display image of the terminal may be applied to the terminal. FIG. 1 is a schematic diagram of the internal structure of a terminal 110 in one embodiment. As shown in FIG. 1, the terminal 110 includes a processor, a memory, a network interface, a display screen, and a camera unit connected through a system bus. The processor provides computing and control capability and supports the operation of the whole electronic device. The memory is used for storing data, programs and the like; it stores at least one computer program that can be executed by the processor to implement the method for adjusting a display image of a terminal provided in the following embodiments. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides a cached execution environment for the operating system and the computer programs in the non-volatile storage medium. The network interface may be an Ethernet card or a wireless network card, for communicating with an external electronic device. The display screen of the terminal 110 may be a liquid crystal display, an electronic ink display, or the like. The camera unit of the terminal 110 may specifically be a front-facing or rear-facing camera unit, and the terminal 110 may be a handheld terminal, such as a mobile phone, a tablet computer, a PDA (personal digital assistant), or a wearable device. Those skilled in the art will appreciate that the configuration shown in FIG. 1 is a block diagram of only the portion of the structure relevant to the present application and does not limit the terminal 110 to which the present application applies; a particular terminal 110 may include more or fewer components than those shown, combine certain components, or arrange the components differently.
FIG. 2 is a flow diagram of a method for adjusting a display image of a terminal in one embodiment. In this embodiment the terminal is provided with a camera unit, for which a camera with a large shooting angle, such as a fisheye lens, may be selected. The terminal in FIG. 1 is taken as an example. To adjust a display image shown on a display screen on the front of the terminal, a front-facing camera unit can be used to acquire the image, from which human eyes are then identified; for a display screen on the back of the terminal, a rear-facing camera unit can be used in the same way. As shown in FIG. 2, the method for adjusting the display image of the terminal includes steps 202 to 206. The display image is used for displaying media content, and may specifically be the interface of an application program shown on the display screen, an unlock screen or desktop, a photo, a video, and the like.
In step 202, a captured image is acquired by the camera unit.
In this step, the captured image may be a face image of the user; the camera unit may capture the user's face in real time while the user views the display image of the terminal. Images may be acquired at set time intervals, for example every 0.01 s, 0.05 s, 0.1 s or 0.5 s.
In one embodiment, as shown in fig. 3, step 202 includes steps 302 to 306:
step 302, acquiring the starting information of the camera unit.
And step 304, starting the camera shooting unit according to the starting information.
In step 306, an image captured by the image capturing unit is acquired.
The start information of the camera unit may be generated in response to a user's instruction to start the adjustment function: when the user turns on the adjustment function switch, the camera is started and the adjustment processing function is enabled.
And step 204, identifying human eyes from the image, and acquiring the change of the human eyes in the image along with time.
The camera unit of a terminal is generally fixed to the terminal and has a limited shooting angle, so multiple camera units may be arranged on the terminal. Camera units can be placed on the upper, lower, left and right frames of the terminal to enlarge the shooting range as much as possible, reducing the chance that the eyes cannot be captured because the shooting range is too small. When the terminal is provided with multiple camera units, the camera unit at the corresponding position can be triggered to capture images according to the position of the human eyes relative to the display screen. For example, when the human eyes are at a certain position relative to the display screen, the camera unit at the corresponding position is triggered to acquire images and recognize the eyes; if the images captured by that camera unit do not contain human eyes, the current position of the eyes is predicted from their historical motion track, and another corresponding camera unit is triggered to capture images and recognize the eyes according to the predicted position. Alternatively, if the acquired image does not contain human eyes, the other camera units can be activated directly, so that all the camera units collect images of the eyes together.
When the user uses the terminal device, the image within the shooting range may contain human eyes; if it contains both eyes, both eyes in the image are recognized. If the eyes move relative to the display screen over time, they also change in the image over time. When no eyes are recognized in the image captured by the camera unit of the terminal, other facial features of the user can be identified from the image, and the change of the eyes in the image over time can then be predicted from the change of those features over time. Other facial features may be the forehead, eyebrows, cheeks, ears, nose, mouth, chin, and so on.
The change of the human eyes in the image over time is reflected, as the eyes move relative to the display screen, in changes of the tilt angle, the position and/or the length between the two eyes in the image. Specifically, the change of the tilt angle of the eyes in the image is reflected in the angle by which the straight line through the two eyes deviates from the axis of the display screen. The change of the position of the eyes in the image is reflected in the midpoint coordinate between the two eyes. The length between the two eyes is measured between the two pupils. On one hand, this length changes because the eyes move farther from or closer to the display screen: the farther the distance, the shorter the length appears, and the closer the distance, the longer it appears. On the other hand, the length also changes when the eyes move relative to each other; for example, vertical misalignment of the two eyes lengthens it, while inward convergence of the eyes (a cross-eyed state) shortens it.
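The three quantities described above (tilt angle of the eye line, midpoint position, inter-pupil length) can be computed directly from two pupil coordinates detected in a frame. A minimal sketch, not taken from the patent; the function and variable names are illustrative:

```python
import math

def eye_parameters(left_pupil, right_pupil):
    """Compute the tilt angle (degrees, relative to the screen's horizontal
    axis), the midpoint between the eyes, and the inter-pupil length from
    two pupil coordinates in image space."""
    (x1, y1), (x2, y2) = left_pupil, right_pupil
    dx, dy = x2 - x1, y2 - y1
    tilt = math.degrees(math.atan2(dy, dx))        # angle of the eye line
    midpoint = ((x1 + x2) / 2.0, (y1 + y2) / 2.0)  # position of the eyes
    length = math.hypot(dx, dy)                    # inter-pupil distance
    return tilt, midpoint, length

# Level eyes, 60 px apart:
t, p, l = eye_parameters((100, 200), (160, 200))
```

Tracking these three values across successive frames gives exactly the per-frame change the method reacts to.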
In one embodiment, the human eye is identified from the image in step 204 by using a human eye identification algorithm.
Specifically, the human eye recognition algorithm may be an Adaboost-based or neural network-based human eye recognition algorithm, and may also be other human eye recognition algorithms commonly used in the art.
If the magnification of the camera unit changes, the position of the human eyes in the image can change over time even when the eyes do not move relative to the display screen. The magnification used for successive captures can therefore be kept the same, so that the change of the human eyes in the image over time is acquired accurately.
And step 206, adjusting the display image of the terminal according to the change of the human eyes in the image along with the time, so that the display image shows corresponding change to adapt to the change of the human eyes when the human eyes move relative to the display screen.
Specifically, if the change of the human eyes in the image over time is greater than a preset change value, the display image is adjusted; otherwise no adjustment need be performed. This saves terminal system resources and reduces unnecessary resource consumption.
The change of the human eyes in the image over time specifically refers to the change of the eyes across two or more images acquired in succession. For example, if image A and image B are acquired in succession and the position of the eyes differs between them, the eyes have changed in the image over time, and the display image is adjusted according to the change from image A to image B, so that the display image suits the eyes in their new position. Image B is the image captured by the camera unit at the current time, and image A may be the image captured at the previous time. If the eyes did not move, or moved only slightly, relative to the display screen during a period before image B was captured, and the camera unit captured images during that period, image A may be any image captured during that period, for example the first one.
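The "adjust only when the change exceeds a preset value" rule described above can be expressed as a per-parameter comparison between the eye parameters extracted from image A and image B. A sketch; the dictionary keys and threshold values are illustrative choices, not specified by the patent:

```python
def needs_adjustment(params_a, params_b, thresholds):
    """params_* are dicts with 'tilt' (degrees), 'midpoint' (x, y) and
    'length' (pixels); return True only if some change between the two
    frames exceeds its preset threshold."""
    if abs(params_b["tilt"] - params_a["tilt"]) > thresholds["tilt"]:
        return True
    dx = params_b["midpoint"][0] - params_a["midpoint"][0]
    dy = params_b["midpoint"][1] - params_a["midpoint"][1]
    if (dx * dx + dy * dy) ** 0.5 > thresholds["midpoint"]:
        return True
    return abs(params_b["length"] - params_a["length"]) > thresholds["length"]

th = {"tilt": 2.0, "midpoint": 5.0, "length": 3.0}
a = {"tilt": 0.0, "midpoint": (130, 200), "length": 60.0}
b = {"tilt": 0.5, "midpoint": (132, 201), "length": 60.5}  # small change
changed = needs_adjustment(a, b, th)
```

Here the small change between frames stays under every threshold, so no adjustment is triggered and the display image is left alone.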
Specifically, the manner of adjusting the display image of the terminal includes, but is not limited to, at least one of rotation, translation and zooming. Since the position of the human eyes in the image changes over time, keeping the display image unchanged relative to the eyes means the display image should follow the position of the eyes in the image. For example, if the tilt angle of the eyes in the image increases over time, the display image is rotated correspondingly; when the eyes move up, down, left or right in the image over time, the display image is translated correspondingly; and when the eyes move closer to or farther from the display screen over time, or the two eyes move relative to each other so that the length between them becomes longer or shorter, the display image is enlarged or reduced correspondingly.
In one embodiment, as shown in fig. 4, the method for adjusting the display image of the terminal further includes steps 402 to 406:
step 402, providing standard parameters when the human eyes are in a set position relative to the display screen.
Specifically, the set position of the human eye in the image relative to the display screen is the position where the human eye can best read the displayed image, the standard parameters of the set position specifically include an inclination angle, the position of the human eye and the length between the two eyes, and the inclination angle is specifically the angle formed by the straight line of the two eyes and the longitudinal axis or the transverse axis of the display screen. The eye position can be characterized by the midpoint coordinate between the two eyes.
FIG. 5 is a diagram of the standard parameters of the human eye at the set position in an embodiment, including the standard tilt angle θ_O, the standard midpoint coordinate P(xo, yo) between the two eyes, and the standard length L_O between the two eyes. In FIG. 5, the human eyes are at the middle-upper position of the display image.
Step 404, acquiring dynamic parameters of the human eyes when the human eyes move relative to the display screen.
Specifically, the dynamic parameters of the movement of the human eye relative to the display screen include an inclination angle, a position of the human eye and a length between the two eyes, wherein the inclination angle of the human eye in the image is specifically an angle formed by a straight line in which the two eyes are located and a longitudinal axis or a transverse axis of the display screen. The eye position can be characterized by the midpoint coordinate between the two eyes.
As shown in fig. 6, fig. 6 is a schematic diagram of dynamic parameters of a human eye in an embodiment, which includes a dynamic tilt angle θ (t), a dynamic midpoint coordinate P (x (t), y (t)) between two eyes, and a dynamic length l (t) between two eyes.
The step of adjusting the display image of the terminal according to the change of the human eye in the image over time includes the step 406 of: and comparing the dynamic parameters with the standard parameters, and adjusting the display image of the terminal.
Specifically, if the dynamic parameter changes with time, which indicates that the human eye changes in the image with time, the display image is adjusted according to the change of the dynamic parameter. The adjustment manner includes at least one of rotating the display image to adjust a tilt angle of the display image, translating the display image to adjust a position of the display image, and zooming the display image to adjust a size of the display image. Specifically, the display image is shifted to adjust the center position of the display image. The angle of inclination of the displayed image may be the angle formed by the transverse axis of the displayed image and the longitudinal/transverse axis of the display screen.
The reference display parameters of the display image can be calculated according to the dynamic parameters of human eyes, and the display image is adjusted to be the reference display image with the reference display parameters. The reference display parameters include a reference center coordinate of the display image, a reference diagonal length, and a reference tilt angle.
In one embodiment, the step of rotating the display image of the terminal includes:
and comparing the standard inclination angle of the human eyes when the human eyes are at the set position relative to the display screen with the dynamic inclination angle of the human eyes when the human eyes move relative to the display screen, and rotating the display image according to the change of the inclination angle of the human eyes in the image along with the time when the human eyes move relative to the display screen, wherein the inclination angle of the rotated display image is equal to the dynamic inclination angle. I.e. the reference tilt angle is equal to the dynamic tilt angle of the human eye when moving relative to the display screen.
The rotation direction of the display image is kept consistent with the tilt direction of the human eyes: if the tilt angle of the eyes changes clockwise over time, the display image is rotated clockwise, and if it changes counterclockwise, the display image is rotated counterclockwise. The tilt angle θ′(t) of the rotated display image is equal to the dynamic tilt angle θ(t) of the eyes as they move relative to the display screen. FIG. 7 is a diagram illustrating an adjusted display image in an exemplary embodiment.
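Making the displayed image's tilt track the eyes' dynamic tilt θ(t) amounts to rotating the image by the difference between its current tilt and θ(t); the sign of that difference gives the clockwise or counterclockwise direction. A sketch under those assumptions (the sign convention is a choice made here, not stated in the patent):

```python
def rotation_to_apply(image_tilt_deg, eye_tilt_deg):
    """Return the signed rotation (degrees) that makes the display image's
    tilt equal to the eyes' dynamic tilt theta(t). Positive means
    counterclockwise, negative means clockwise."""
    delta = eye_tilt_deg - image_tilt_deg
    # Normalize to (-180, 180] so the image takes the short way around.
    delta = (delta + 180.0) % 360.0 - 180.0
    return delta

# Eyes tilted 10 degrees counterclockwise, image currently level:
r = rotation_to_apply(0.0, 10.0)
```

The normalization step keeps the rotation small even when the two angles straddle the 0/360 boundary.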
In one embodiment, the step of translating the display image of the terminal includes: and comparing the position of the human eye relative to the display screen when the human eye is at the set position with the position of the human eye when the human eye moves relative to the display screen, and translating the display image according to the change of the position of the human eye in the image with time when the human eye moves relative to the display screen.
And if the position of the human eye in the shot image is changed along with the time, correspondingly translating the display image according to the position change.
Further, in a specific embodiment, as shown in fig. 8, the method for adjusting the display image of the terminal in this embodiment further includes steps 802 to 806:
step 802, providing standard center coordinates of the display image when the human eyes are at the set positions relative to the display screen, dynamic midpoint coordinates when the two eyes move relative to the display screen, and standard midpoint coordinates when the two eyes are at the set positions relative to the display screen.
And step 804, acquiring a reference center coordinate, wherein the reference center coordinate is a value obtained by subtracting the standard midpoint coordinate from the dynamic midpoint coordinate and adding the standard center coordinate.
Specifically, the reference center coordinates are obtained according to the following formula:
P'(x(t), y(t)) - P'(xo, yo) = P(x(t), y(t)) - P(xo, yo);
wherein, as shown in FIG. 7, P'(x(t), y(t)) is the reference center coordinate; as shown in FIG. 9, P'(xo, yo) is the standard center coordinate; as shown in FIG. 6, P(x(t), y(t)) is the dynamic midpoint coordinate; and, as shown in FIG. 5, P(xo, yo) is the standard midpoint coordinate. FIG. 9 is a schematic diagram of the display parameters of the display image corresponding to FIG. 5, with the human eye in the set position relative to the display screen. FIGS. 5, 6, 7 and 9 all use the lower left corner of the display screen as the origin of coordinates.
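Rearranged, the formula gives the reference center directly: P'(x(t), y(t)) = P(x(t), y(t)) - P(xo, yo) + P'(xo, yo). A sketch with illustrative coordinates:

```python
def reference_center(dynamic_mid, standard_mid, standard_center):
    """Translate the display image so its center follows the eye midpoint:
    P'(t) = P(t) - P_O + P'_O, with all coordinates taken from the lower
    left corner of the display screen as origin, as in the figures."""
    return (dynamic_mid[0] - standard_mid[0] + standard_center[0],
            dynamic_mid[1] - standard_mid[1] + standard_center[1])

# Eyes moved 20 px right and 10 px up from the set position, so the
# display image's center makes the same move:
c = reference_center((150, 210), (130, 200), (540, 960))
```

The displaced center differs from the standard center by exactly the displacement of the eye midpoint, which is what keeps the image stationary relative to the eyes.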
The step of translating the display image of the terminal includes step 806 of translating the display image, the center coordinates of the translated display image being equal to the reference center coordinates.
In one embodiment, the step of scaling the display image of the terminal includes: and comparing the length between the two eyes when the human eyes are at the set position relative to the display screen with the length between the two eyes when the human eyes move relative to the display screen, and zooming the displayed image according to the time-dependent change of the length between the two eyes in the image when the human eyes move relative to the display screen.
For zooming the display image: if the length between the two eyes changes in the image but the size of the eyes themselves does not, the display image may be zoomed along one direction only, specifically along its length direction or its width direction. For example, if the length between the two eyes becomes shorter in the captured image (for example, the eyes converge into a cross-eyed state) while the size of each eye in the image is unchanged, the display image is correspondingly enlarged along its length (the direction of the line between the eyes being taken as consistent with the length direction of the display image); conversely, the length of the display image is reduced.
If the length between the two eyes changes in the image and the size of the eyes also changes, the whole display image may be zoomed, that is, scaled in both its length and width directions. For example, if the length between the two eyes becomes shorter in the captured image and the eyes also appear smaller, the eyes have moved farther from the display screen, and the whole display image is correspondingly enlarged; conversely, if the length between the two eyes becomes longer and the eyes appear larger, the eyes have moved closer to the display screen, and the whole display image is correspondingly reduced.
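The two cases above can be combined into a small decision rule: a change in inter-pupil length with no change in apparent eye size indicates relative eye movement, so scaling is applied along one axis only; when both change together, the viewer has moved closer or farther, so the whole image is scaled. A sketch; the mode names and tolerance are illustrative:

```python
def choose_zoom_mode(length_change, eye_size_change, eps=1e-6):
    """Decide how to scale the display image from the change in
    inter-pupil length and the change in apparent eye size."""
    if abs(length_change) <= eps:
        return "none"
    if abs(eye_size_change) <= eps:
        # Only relative eye movement: scale along the eye-line axis.
        return "directional"
    # Length and eye size change together: the viewer moved closer to
    # or farther from the screen, so scale the whole image uniformly.
    return "uniform"

mode = choose_zoom_mode(5.0, 0.0)
```

In practice the two change values would come from comparing successive captured frames, with `eps` replaced by the preset change thresholds discussed earlier.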
Further, in a specific embodiment, as shown in fig. 10, the method for adjusting the display image of the terminal in this embodiment further includes steps 1002 to 1006:
Step 1002: providing the standard diagonal length of the display image when the human eyes are at the set position relative to the display screen, the dynamic length between the two eyes when the human eyes move relative to the display screen, the standard length between the two eyes when the two eyes are at the set position relative to the display screen, and an image relative fixed parameter.
Step 1004: acquiring a reference diagonal length, which is the value obtained by multiplying the image relative fixed parameter by the standard length and the standard diagonal length and then dividing by the dynamic length; wherein the image relative fixed parameter is greater than zero.
Specifically, the reference diagonal length is obtained according to the following formula: L'(t) × L(t) = K1 × Lo × L'o; K1 > 0;
wherein L'(t) is the reference diagonal length, L(t) is the dynamic length between the two eyes, L'o (as shown in FIG. 9) is the standard diagonal length, Lo is the standard length between the two eyes, and K1 is the image relative fixed coefficient;
the step of scaling the display image of the terminal includes a step 1006 of scaling the display image, a diagonal length of the scaled display image being equal to the reference diagonal length L' (t).
In the method for adjusting the display image of the terminal in this embodiment, the front camera unit of the terminal acquires the image within its shooting range in real time. If the image of the human eyes changes over time, indicating that the eyes are moving relative to the display screen, the display image of the terminal is dynamically adjusted according to the change of the human eyes over time in the image, so that the display image shows a corresponding change as the eyes move. The display image is adjusted to a display image having the reference display parameters (reference tilt angle, reference center coordinate, reference diagonal length), so that whenever the human eyes move relative to the display image, the display image is displayed stably and remains suitable for browsing. Because the method does not require manually adjusting the terminal, it reduces the jitter of the display image caused by manual adjustment, reduces the shaking of the display image caused by the instability of a hand-held terminal, and reduces the blurring of the display image caused by shaking; in particular, it can protect the user's eyesight when the user watches the display screen for a long time.
It should be understood that, although the steps in the flowchart of fig. 2 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and whose order of performance is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 11 is a block diagram illustrating an apparatus for adjusting a display image of a terminal according to an embodiment. As shown in fig. 11, the apparatus for adjusting a display image of a terminal includes:
and an image capturing module 1110 for acquiring a captured image in real time by the image capturing unit.
And an eye recognition module 1120, configured to recognize human eyes from the image and acquire changes of human eyes in the image over time.
And a display image adjusting module 1130, configured to adjust the display image of the terminal according to the change of the human eye in the image over time, so that the display image shows a corresponding change when the human eye moves relative to the display screen.
The manner of adjusting the display image of the terminal includes, but is not limited to, at least one of rotation, translation, and zooming.
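The three adjustments can be combined in one dispatch routine; a sketch of how the display image adjusting module might apply them together, where the dictionary layout and key names are illustrative assumptions rather than structures defined by the patent:

```python
def adjust_display_image(std, dyn, K1=1.0):
    """Compare dynamic parameters with standard parameters and derive the
    reference display parameters (tilt angle, center coordinate, diagonal)."""
    adjusted = {}
    # Rotation: the rotated display image takes the dynamic tilt angle.
    adjusted["tilt"] = dyn["tilt"]
    # Translation: shift the standard center by the change in the midpoint
    # between the two eyes, i.e. P'(t) = P(t) - P(o) + P'(o).
    adjusted["center"] = (std["center"][0] + dyn["mid"][0] - std["mid"][0],
                          std["center"][1] + dyn["mid"][1] - std["mid"][1])
    # Zoom: reference diagonal from L'(t) * L(t) = K1 * Lo * L'o.
    adjusted["diagonal"] = K1 * std["eye_len"] * std["diagonal"] / dyn["eye_len"]
    return adjusted
```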
In one embodiment, the apparatus for adjusting a display image of a terminal further includes:
and the standard parameter providing module is used for providing the standard parameters when the human eyes are in the set positions relative to the display screen.
And the dynamic parameter acquisition module is used for acquiring dynamic parameters of human eyes when the human eyes move relative to the display screen.
The display image adjusting module 1130 includes a comparing module for comparing the dynamic parameter with the standard parameter to dynamically adjust the display image of the terminal.
In one embodiment, the standard parameter and the dynamic parameter each comprise a tilt angle;
the display image adjusting module 1130 includes a rotation module for comparing a standard tilt angle of the human eye when the human eye is at a set position relative to the display screen with a dynamic tilt angle of the human eye when the human eye moves relative to the display screen, and rotating the display image according to a change of the tilt angle of the human eye in the image with time when the human eye moves relative to the display screen, wherein the tilt angle of the rotated display image is equal to the dynamic tilt angle.
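As a minimal sketch of the comparison performed by the rotation module, the tilt angle can be taken as the angle of the line through the two eye centers in image coordinates, and the display image rotated by the change in that angle. The coordinate convention and function names here are illustrative assumptions:

```python
import math

def eye_tilt_degrees(left_eye, right_eye):
    # Angle of the line through the two eye centers, in degrees.
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

def rotation_to_apply(standard_tilt, dynamic_tilt):
    # Rotate the display image by the change in tilt so that the tilt angle
    # of the rotated display image equals the dynamic tilt angle.
    return dynamic_tilt - standard_tilt
```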
In a specific embodiment, the standard parameter and the dynamic parameter both comprise positions, and the positions are positions of human eyes in the display screen;
the display image adjustment module 1130 includes a translation module for comparing the position of the human eye relative to the display screen when in the set position with the position of the human eye when in motion relative to the display screen, and translating the display image according to a change over time in the position of the human eye in the image when in motion relative to the display screen.
Further, in an embodiment, the apparatus for adjusting the display image of the terminal further includes:
and the standard center coordinate providing module is used for providing the standard center coordinates of the display image when the human eyes are at the set positions relative to the display screen, the dynamic midpoint coordinates when the two eyes move relative to the display screen, and the standard midpoint coordinates when the two eyes are at the set positions relative to the display screen.
And the reference center coordinate calculation module is used for acquiring a reference center coordinate, wherein the reference center coordinate is a value obtained by subtracting the standard midpoint coordinate from the dynamic midpoint coordinate and adding the standard center coordinate.
Specifically, the reference center coordinates are obtained according to the following formula:
P’(x(t),y(t))-P’(xo,yo)=P(x(t),y(t))-P(xo,yo);
wherein, P '(x (t), y (t)) are reference center coordinates, P' (xo, yo) are standard center coordinates, P (x (t), y (t)) are dynamic midpoint coordinates when the two eyes move relative to the display screen, and P (xo, yo) are standard midpoint coordinates when the two eyes are at set positions relative to the display screen.
The display image adjustment module 1130 includes a first translation module for translating the display image, wherein the center coordinate of the translated display image is equal to the reference center coordinate.
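The first translation module's computation follows directly from the formula above, P'(x(t), y(t)) = P(x(t), y(t)) − P(xo, yo) + P'(xo, yo); a minimal sketch, with illustrative parameter names:

```python
def reference_center(dyn_mid, std_mid, std_center):
    """Reference center coordinate of the translated display image.

    dyn_mid:    P(x(t), y(t)), midpoint of the two eyes while moving
    std_mid:    P(xo, yo), midpoint of the two eyes at the set position
    std_center: P'(xo, yo), standard center of the display image
    """
    return (dyn_mid[0] - std_mid[0] + std_center[0],
            dyn_mid[1] - std_mid[1] + std_center[1])
```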
In one embodiment, the standard parameter and the dynamic parameter each comprise a length between the eyes;
the display image adjustment module 1130 includes a scaling module for comparing a length between the two eyes when the human eye is in a set position relative to the display screen with a length between the two eyes when the human eye moves relative to the display screen, and scaling the display image according to a change over time in the length between the two eyes in the image when the human eye moves relative to the display screen.
Further, in an embodiment, the apparatus for adjusting the display image of the terminal further includes:
the standard diagonal length providing module is used for providing a standard diagonal length of a displayed image when the human eye is at a set position relative to the display screen, a dynamic length between two eyes when the human eye moves relative to the display screen, a standard length between two eyes when the two eyes are at the set position relative to the display screen, and image relative fixed parameters.
The reference diagonal length calculation module is used for acquiring a reference diagonal length, which is the value obtained by multiplying the image relative fixed parameter by the standard length and the standard diagonal length and then dividing by the dynamic length; wherein the image relative fixed parameter is greater than zero.
Specifically, the reference diagonal length is obtained according to the following formula:
L'(t) × L(t) = K1 × Lo × L'o; K1 > 0;
wherein L'(t) is the reference diagonal length, L(t) is the dynamic length between the two eyes, L'o is the standard diagonal length, Lo is the standard length between the two eyes, and K1 is the image relative fixed coefficient;
the display image adjustment module 1130 includes a first scaling module for scaling the display image, wherein a diagonal length of the scaled display image is equal to a reference diagonal length.
In the apparatus for adjusting the display image of the terminal in this embodiment, the front camera unit of the terminal acquires the image within its shooting range in real time. If the image of the human eyes changes over time, indicating that the eyes are moving relative to the display screen, the display image of the terminal is dynamically adjusted according to the change of the human eyes over time in the image, so that the display image shows a corresponding change to adapt to the movement of the eyes. The display image is adjusted to a display image having the reference display parameters (reference tilt angle, reference center coordinate, reference diagonal length), so that, from the viewing experience, the display image is displayed stably relative to the human eyes and remains suitable for viewing. The apparatus does not require manually adjusting the terminal; it therefore reduces the jitter of the display image caused by manual adjustment, reduces the shaking of the display image caused by the instability of a hand-held terminal, and reduces the blurring of the display image caused by shaking; in particular, it can protect the user's eyesight when the user watches the display screen for a long time.
The division of each module in the apparatus for adjusting the display image of the terminal is only used for illustration, and in other embodiments, the apparatus for adjusting the display image of the terminal may be divided into different modules as needed to complete all or part of the functions of the apparatus for adjusting the display image of the terminal.
For specific limitations of the device for adjusting the display image of the terminal, reference may be made to the above limitations of the method for adjusting the display image of the terminal, and details are not repeated here. All or part of each module in the device for adjusting the display image of the terminal can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
The implementation of each module in the apparatus for adjusting a display image of a terminal provided in the embodiments of the present application may be in the form of a computer program. The computer program may be run on a terminal or a server. The program modules constituted by the computer program may be stored on the memory of the terminal or the server. Which when executed by a processor, performs the steps of the method described in the embodiments of the present application.
The embodiments of the application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the method of adjusting the display image of a terminal.
A computer program product containing instructions which, when run on a computer, cause the computer to perform a method of adjusting a display image of a terminal.
The embodiment of the application also provides an electronic device. As shown in fig. 12, for convenience of explanation, only the portions related to the embodiments of the present application are shown; for specific technical details not disclosed, please refer to the method portion of the embodiments of the present application. The electronic device may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, a wearable device, and the like. The following description takes a mobile phone as an example:
fig. 12 is a block diagram of a partial structure of a mobile phone related to an electronic device provided in an embodiment of the present application. Referring to fig. 12, the cellular phone includes: radio Frequency (RF) circuit 1210, memory 1220, input unit 1230, display unit 1240, sensor 1250, audio circuit 1260, wireless fidelity (WiFi) module 1270, processor 1280, and power supply 1290. Those skilled in the art will appreciate that the handset configuration shown in fig. 12 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The RF circuit 1210 may be configured to receive and transmit signals during information transmission or a call; it may receive downlink information from a base station and deliver it to the processor 1280 for processing, and may also transmit uplink data to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 1210 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
The memory 1220 may be used to store software programs and modules, and the processor 1280 executes various functional applications and data processing of the mobile phone by operating the software programs and modules stored in the memory 1220. The memory 1220 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as an application program for a sound playing function, an application program for an image playing function, and the like), and the like; the data storage area may store data (such as audio data, an address book, etc.) created according to the use of the mobile phone, and the like. Further, the memory 1220 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 1230 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone 1200. Specifically, the input unit 1230 may include a touch panel 1231 and other input devices 1232. The touch panel 1231, which may also be referred to as a touch screen, may collect touch operations performed by a user on or near the touch panel 1231 (e.g., operations performed by the user on or near the touch panel 1231 using any suitable object or accessory such as a finger, a stylus, etc.), and drive the corresponding connection device according to a preset program. In one embodiment, the touch panel 1231 can include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 1280, and can receive and execute commands sent by the processor 1280. In addition, the touch panel 1231 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 1230 may include other input devices 1232 in addition to the touch panel 1231. In particular, other input devices 1232 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), and the like.
The display unit 1240 may be used to display information input by the user or information provided to the user and various menus of the cellular phone. Display unit 1240 may include a display panel 1241. In one embodiment, the Display panel 1241 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. In one embodiment, touch panel 1231 can overlay display panel 1241, and when touch panel 1231 detects a touch operation thereon or nearby, the touch panel 1231 can transmit the touch operation to processor 1280 to determine the type of touch event, and then processor 1280 can provide corresponding visual output on display panel 1241 according to the type of touch event. Although in fig. 12, the touch panel 1231 and the display panel 1241 are implemented as two independent components to implement the input and output functions of the mobile phone, in some embodiments, the touch panel 1231 and the display panel 1241 may be integrated to implement the input and output functions of the mobile phone.
The cell phone 1200 may also include at least one sensor 1250, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the display panel 1241 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 1241 and/or the backlight when the mobile phone moves to the ear. The motion sensor may include an acceleration sensor, which can detect the magnitude of acceleration in each direction and, when the mobile phone is static, the magnitude and direction of gravity; it can be used in applications that identify the attitude of the mobile phone (such as landscape/portrait switching) and in vibration-recognition functions (such as a pedometer or tap detection). The mobile phone may also be provided with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor.
The audio circuit 1260, the speaker 1261, and the microphone 1262 can provide an audio interface between the user and the mobile phone. The audio circuit 1260 can convert received audio data into an electrical signal and transmit it to the speaker 1261, which converts it into a sound signal for output; conversely, the microphone 1262 converts a collected sound signal into an electrical signal, which the audio circuit 1260 receives and converts into audio data; the audio data is then processed by the processor 1280 and either transmitted to another mobile phone through the RF circuit 1210 or output to the memory 1220 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 1270, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and the like; it provides the user with wireless broadband Internet access. Although fig. 12 shows the WiFi module 1270, it is understood that it is not an essential component of the cell phone 1200 and may be omitted as desired.
The processor 1280 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 1220 and calling data stored in the memory 1220, thereby performing overall monitoring of the mobile phone. In one embodiment, the processor 1280 may include one or more processing units. In one embodiment, the processor 1280 may integrate an application processor and a modem processor, wherein the application processor primarily handles operating systems, user interfaces, applications, and the like; the modem processor handles primarily wireless communications. It is to be appreciated that the modem processor described above may not be integrated into the processor 1280.
The mobile phone 1200 further includes a power supply 1290 (e.g., a battery) for supplying power to various components, and preferably, the power supply may be logically connected to the processor 1280 through a power management system, so that the power management system may manage charging, discharging, and power consumption.
In one embodiment, the cell phone 1200 may also include a camera, a bluetooth module, and the like.
In the embodiment of the present application, when the processor 1280 included in the electronic device executes the computer program stored in the memory, it implements the steps of the method of adjusting the display image of the terminal.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above-mentioned embodiments express only several embodiments of the present application, and their description is specific and detailed, but this should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (8)

1. A method of adjusting a display image of a terminal provided with a camera unit, the method comprising:
acquiring a shot image through the camera shooting unit;
identifying human eyes from the image and acquiring the change of the human eyes in the image along with time;
adjusting the display image of the terminal according to the change of the human eyes in the image along with the time, so that the display image shows corresponding change when the human eyes move relative to the display screen;
the method further comprises the following steps:
providing standard parameters when human eyes are at set positions relative to a display screen;
acquiring dynamic parameters of human eyes when the human eyes move relative to the display screen;
the step of adjusting the display image of the terminal according to the change of the human eyes in the image along with the time comprises the following steps: comparing the dynamic parameters with the standard parameters, and adjusting the display image of the terminal;
the standard parameter and the dynamic parameter both comprise positions and/or both comprise lengths between two eyes;
providing a standard diagonal length of a display image when the human eye is at a set position relative to the display screen, a dynamic length between two eyes when the human eye moves relative to the display screen, a standard length between two eyes when the two eyes are at the set position relative to the display screen, and a relatively fixed parameter of the image;
acquiring a reference diagonal length which is a value obtained by multiplying the image relative fixed parameter by the standard length and the standard diagonal length and then dividing the result by the dynamic length; wherein the image relative fixed parameter is greater than zero;
the step of adjusting the display image of the terminal comprises: comparing the position of the human eyes when the human eyes are at the set position relative to the display screen with the position of the human eyes when the human eyes move relative to the display screen, and translating the display image according to the change of the position of the human eyes in the image along with the time when the human eyes move relative to the display screen;
the method for adjusting the display image of the terminal further comprises: comparing the length between the two eyes when the human eyes are at the set position relative to the display screen with the length between the two eyes when the human eyes move relative to the display screen, and zooming the display image according to the change over time of the length between the two eyes in the image when the human eyes move relative to the display screen, wherein the diagonal length of the zoomed display image is equal to the reference diagonal length.
2. The method of claim 1, wherein the standard parameter and the dynamic parameter each comprise a tilt angle;
the step of adjusting the display image of the terminal comprises: comparing the standard inclination angle of the human eyes when they are at the set position relative to the display screen with the dynamic inclination angle of the human eyes when they move relative to the display screen, and rotating the display image according to the change over time of the inclination angle of the human eyes in the image when the human eyes move relative to the display screen, wherein the inclination angle of the rotated display image is equal to the dynamic inclination angle.
3. The method of claim 1, further comprising:
providing standard center coordinates of a display image when the human eyes are at a set position relative to the display screen, dynamic midpoint coordinates when the two eyes move relative to the display screen, and standard midpoint coordinates when the two eyes are at the set position relative to the display screen;
acquiring a reference center coordinate, wherein the reference center coordinate is a value obtained by subtracting the standard midpoint coordinate from the dynamic midpoint coordinate and adding the standard center coordinate;
the step of translating the display image comprises: and translating the display image, wherein the center coordinate of the translated display image is equal to the reference center coordinate.
4. An apparatus for adjusting a display image of a terminal provided with a camera unit, the apparatus comprising:
the image shooting module is used for acquiring a shot image through the camera shooting unit;
the human eye identification module is used for identifying human eyes from the image and acquiring the change of the human eyes in the image along with time;
the display image adjusting module is used for adjusting the display image of the terminal according to the change of the human eyes in the image along with the time, so that the display image shows corresponding change when the human eyes move relative to the display screen;
the standard parameter providing module is used for providing standard parameters when human eyes are at set positions relative to the display screen;
the dynamic parameter acquisition module is used for acquiring dynamic parameters of human eyes when the human eyes move relative to the display screen;
the standard parameter and the dynamic parameter both comprise positions and/or both comprise lengths between two eyes;
the standard diagonal length providing module is used for providing a standard diagonal length of a display image when the human eyes are at a set position relative to the display screen, a dynamic length between two eyes when the human eyes move relative to the display screen, a standard length between two eyes when the two eyes are at the set position relative to the display screen and image relative fixed parameters;
the reference diagonal length calculation module is used for acquiring a reference diagonal length, and the reference diagonal length is a value obtained by multiplying the image relative fixed parameter by the standard length and the standard diagonal length and then dividing the result by the dynamic length; wherein the image relative fixed parameter is greater than zero;
the display image adjusting module comprises a translation module, wherein the translation module is used for comparing the position of the human eyes when they are at the set position relative to the display screen with the position of the human eyes when they move relative to the display screen, and translating the display image according to the change over time of the position of the human eyes in the image when the human eyes move relative to the display screen;
the display image adjusting module further comprises a zooming module for comparing the length between the two eyes when the human eyes are at the set position relative to the display screen with the length between the two eyes when the human eyes move relative to the display screen, zooming the display image according to the change of the length between the two eyes in the image with time when the human eyes move relative to the display screen, wherein the length of the diagonal line of the zoomed display image is equal to the length of the reference diagonal line.
5. The apparatus of claim 4, wherein the standard parameter and the dynamic parameter each comprise a tilt angle;
the display image adjusting module comprises a rotating module for comparing the standard inclination angle of the human eyes when they are at the set position relative to the display screen with the dynamic inclination angle of the human eyes when they move relative to the display screen, and rotating the display image according to the change over time of the inclination angle of the human eyes in the image when the human eyes move relative to the display screen, wherein the inclination angle of the rotated display image is equal to the dynamic inclination angle.
6. The apparatus of claim 4, further comprising:
the standard center coordinate providing module is configured to provide the standard center coordinate of the display image when the human eyes are at the set position relative to the display screen, the dynamic midpoint coordinate of the two eyes while moving relative to the display screen, and the standard midpoint coordinate of the two eyes at the set position relative to the display screen;
a reference center obtaining module, configured to obtain a reference center coordinate, where the reference center coordinate is a value obtained by subtracting the standard midpoint coordinate from the dynamic midpoint coordinate and adding the standard center coordinate;
the display image adjusting module further comprises a first translation module configured to translate the display image such that the center coordinate of the translated display image is equal to the reference center coordinate.
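The reference center computation of claim 6 reduces to per-axis arithmetic: shift the standard center by the displacement of the eye midpoint. A minimal sketch, with illustrative names not taken from the patent:

```python
def reference_center(standard_center, standard_midpoint, dynamic_midpoint):
    """Reference center coordinate per claim 6: the dynamic midpoint of
    the two eyes, minus the standard midpoint, plus the standard center
    of the display image (applied independently on each axis)."""
    return tuple(d - s + c for d, s, c in
                 zip(dynamic_midpoint, standard_midpoint, standard_center))
```

So if the eye midpoint in the camera image drifts by (10, 10) pixels, the display image's center is translated by the same offset, tracking the eyes.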
7. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the method of adjusting a display image of a terminal as claimed in any one of claims 1 to 3.
8. A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, carries out the steps of the method of adjusting the display image of a terminal according to any one of claims 1 to 3.
CN201810125075.0A 2018-02-07 2018-02-07 Method and device for adjusting display image of terminal, electronic equipment and storage medium Expired - Fee Related CN108388849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810125075.0A CN108388849B (en) 2018-02-07 2018-02-07 Method and device for adjusting display image of terminal, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN108388849A CN108388849A (en) 2018-08-10
CN108388849B true CN108388849B (en) 2021-02-02

Family

ID=63074414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810125075.0A Expired - Fee Related CN108388849B (en) 2018-02-07 2018-02-07 Method and device for adjusting display image of terminal, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN108388849B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933189A (en) * 2018-12-28 2019-06-25 惠州Tcl移动通信有限公司 Dynamic desktop layout method, display equipment and computer storage medium
JP7255202B2 (en) * 2019-01-29 2023-04-11 セイコーエプソン株式会社 Display method and display device
CN109885368A (en) * 2019-01-31 2019-06-14 维沃移动通信有限公司 A kind of interface display anti-fluttering method and mobile terminal
CN110605505A (en) * 2019-09-23 2019-12-24 佛山耀立电气有限公司 Control method and device for display duration of screen content of welding machine and welding machine
CN111142660A (en) * 2019-12-10 2020-05-12 广东中兴新支点技术有限公司 Display device, picture display method and storage medium
CN111479068A (en) * 2020-04-22 2020-07-31 维沃移动通信(杭州)有限公司 Photographing method, electronic device, and computer-readable storage medium
CN112040316B (en) * 2020-08-26 2022-05-20 深圳创维-Rgb电子有限公司 Video image display method, device, multimedia equipment and storage medium
CN112689052B (en) * 2020-12-18 2022-05-13 Oppo(重庆)智能科技有限公司 Display control method and device, electronic equipment and computer readable storage medium
CN114816641B (en) * 2022-05-09 2023-05-02 海信视像科技股份有限公司 Display device, multimedia content display method, and storage medium
CN114489539A (en) * 2022-01-04 2022-05-13 佛山市顺德区美的饮水机制造有限公司 Display device, display method, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841725A (en) * 2011-06-21 2012-12-26 鸿富锦精密工业(深圳)有限公司 Electronic device and screen information adjusting method thereof
CN103760980A (en) * 2014-01-21 2014-04-30 Tcl集团股份有限公司 Display method, system and device for conducting dynamic adjustment according to positions of two eyes
CN107305430A (en) * 2016-04-25 2017-10-31 Zhang Xiangyu Scheme for controlling display content changes through human eye deflection

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8300036B2 (en) * 2010-06-29 2012-10-30 Bank Of America Corporation Method and apparatus for reducing glare and/or increasing privacy of a self-service device
CN104007909B (en) * 2013-02-25 2019-03-01 腾讯科技(深圳)有限公司 Page automatic adjusting method and device
CN104122985A (en) * 2013-04-29 2014-10-29 鸿富锦精密工业(深圳)有限公司 Screen video image adjusting system and method
CN104679225B (en) * 2013-11-28 2018-02-02 上海斐讯数据通信技术有限公司 Screen adjustment method, screen adjustment device and the mobile terminal of mobile terminal
CN106572389A (en) * 2015-10-08 2017-04-19 小米科技有限责任公司 Method and apparatus for adjusting display image
CN106843634B (en) * 2016-12-15 2020-11-10 宇龙计算机通信科技(深圳)有限公司 Screen display adjustment method and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102841725A (en) * 2011-06-21 2012-12-26 鸿富锦精密工业(深圳)有限公司 Electronic device and screen information adjusting method thereof
CN103760980A (en) * 2014-01-21 2014-04-30 Tcl集团股份有限公司 Display method, system and device for conducting dynamic adjustment according to positions of two eyes
CN107305430A (en) * 2016-04-25 2017-10-31 Zhang Xiangyu Scheme for controlling display content changes through human eye deflection

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Eye-tracking Study of Reading Speed from LCD Displays: Influence of Type Style and Type Size; Gregor Franken et al.; Journal of Eye Movement Research; 2015-03-30; Vol. 8, No. 1; pp. 1-8 *
Research on Human Eye Tracking Algorithms in Autostereoscopic Display; Xia Guangrong et al.; Optics & Optoelectronic Technology; 2016-02-29; Vol. 14, No. 1; pp. 11-15 *

Also Published As

Publication number Publication date
CN108388849A (en) 2018-08-10

Similar Documents

Publication Publication Date Title
CN108388849B (en) Method and device for adjusting display image of terminal, electronic equipment and storage medium
EP3582487B1 (en) Image stabilisation
CN109348125B (en) Video correction method, video correction device, electronic equipment and computer-readable storage medium
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
US20190080188A1 (en) Facial recognition method and related product
CN108234875B (en) Shooting display method and device, mobile terminal and storage medium
CN108388414B (en) Screen-off control method and device for terminal, computer-readable storage medium and terminal
CN108989672B (en) Shooting method and mobile terminal
CN107124556B (en) Focusing method, focusing device, computer readable storage medium and mobile terminal
CN107566742B (en) Shooting method, shooting device, storage medium and electronic equipment
CN113179370B (en) Shooting method, mobile terminal and readable storage medium
CN107749046B (en) Image processing method and mobile terminal
CN110198413B (en) Video shooting method, video shooting device and electronic equipment
CN110456911B (en) Electronic equipment control method and device, electronic equipment and readable storage medium
WO2020253295A1 (en) Control method and apparatus, and terminal device
US20150077437A1 (en) Method for Implementing Electronic Magnifier and User Equipment
CN111031253B (en) Shooting method and electronic equipment
CN109819166B (en) Image processing method and electronic equipment
CN107995417B (en) Photographing method and mobile terminal
CN116033269A (en) Linkage auxiliary anti-shake shooting method, equipment and computer readable storage medium
CN111031246A (en) Shooting method and electronic equipment
CN111556248B (en) Shooting method, shooting device, storage medium and mobile terminal
CN108628508B (en) Method for adjusting clipping window and mobile terminal
CN108769529B (en) Image correction method, electronic equipment and computer readable storage medium
CN115134527A (en) Processing method, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Haibin Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Haibin Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210202